tutorials/AutomaticDifferentiation.qmd: 8 additions & 5 deletions
@@ -141,7 +141,7 @@ More generally take a change of the metric into account as
 = Df(p)[X] = g_p(\operatorname{grad}f(p), X)
 ```
 
-or in words: we have to change the Riesz representer of the (restricted/projected) differential of $f$ ($\tilde f$) to the one with respect to the Riemannian metric. This is done using [`change_representer`](https://juliamanifolds.github.io/Manifolds.jl/latest/manifolds/metric.html#Manifolds.change_representer-Tuple{AbstractManifold,%20AbstractMetric,%20Any,%20Any}).
+or in words: we have to change the Riesz representer of the (restricted/projected) differential of $f$ ($\tilde f$) to the one with respect to the Riemannian metric. This is done using ``[`change_representer`](@extref `ManifoldsBase.change_representer-Tuple{AbstractManifold, ManifoldsBase.AbstractMetric, Any, Any}`)``{=commonmark}.
 
 ### A continued example
 
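As an editor's aside (not part of the diff above): the change of representer that this hunk links to can be sketched in plain Julia. The sketch assumes the affine-invariant metric $g_p(X,Y) = \operatorname{tr}(p^{-1}Xp^{-1}Y)$ on symmetric positive definite matrices; the names `p`, `E`, `X`, `Y`, and `g` are illustrative, not from the tutorial.

```julia
using LinearAlgebra

# Sketch (assumption: affine-invariant metric g_p(X, Y) = tr(p⁻¹ X p⁻¹ Y)):
# the Riesz representer of the Euclidean differential Y ↦ tr(E'Y) with
# respect to g_p is X = p * E * p, since tr(p⁻¹ (p E p) p⁻¹ Y) = tr(E Y).
p = [2.0 1.0; 1.0 2.0]   # a point on P(2)
E = [1.0 0.0; 0.0 1.0]   # Euclidean gradient (representer w.r.t. the trace inner product)
X = p * E * p            # representer w.r.t. the affine-invariant metric
Y = [0.5 0.2; 0.2 0.1]   # an arbitrary symmetric tangent direction

g(p, X, Y) = tr(inv(p) * X * inv(p) * Y)
g(p, X, Y) ≈ tr(E' * Y)  # both pairings agree on every tangent direction Y
```

The check at the end is exactly the defining property of the representer change: pairing `X` with the Riemannian metric reproduces the Euclidean pairing of `E`.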
@@ -187,7 +187,7 @@ This can be computed for symmetric positive definite matrices by summing the squ
 G(q) = sum(log.(eigvals(Symmetric(q))) .^ 2) / 2
 ```
 
-We can also interpret this as a function on the space of matrices and apply the Euclidean finite differences machinery; in this way we can easily derive the Euclidean gradient. But when computing the Riemannian gradient, we have to change the representer (see again [`change_representer`](https://juliamanifolds.github.io/Manifolds.jl/latest/manifolds/metric.html#Manifolds.change_representer-Tuple{AbstractManifold,%20AbstractMetric,%20Any,%20Any})) after projecting onto the tangent space $T_p\mathcal P(n)$ at $p$.
+We can also interpret this as a function on the space of matrices and apply the Euclidean finite differences machinery; in this way we can easily derive the Euclidean gradient. But when computing the Riemannian gradient, we have to change the representer (see again ``[`change_representer`](@extref `ManifoldsBase.change_representer-Tuple{AbstractManifold, ManifoldsBase.AbstractMetric, Any, Any}`)``{=commonmark}) after projecting onto the tangent space $T_p\mathcal P(n)$ at $p$.
 
 Let's first define a point and the manifold $N=\mathcal P(3)$.
 
@@ -204,7 +204,7 @@ We could first just compute the gradient using `FiniteDifferences.jl`, but this
 FiniteDifferences.grad(central_fdm(5, 1), G, q)
 ```
 
-Instead, we use the [`RiemannianProjectedBackend`](https://juliamanifolds.github.io/Manifolds.jl/latest/features/differentiation.html#Manifolds.RiemannianProjectionBackend) of `Manifolds.jl`, which in this case internally uses `FiniteDifferences.jl` to compute a Euclidean gradient but then uses the conversion explained before to derive the Riemannian gradient.
+Instead, we use the ``[`RiemannianProjectionBackend`](@extref `ManifoldDiff.RiemannianProjectionBackend`)``{=commonmark} of ``[`ManifoldDiff.jl`](@extref ManifoldDiff :std:doc:`index`)``{=commonmark}, which in this case internally uses `FiniteDifferences.jl` to compute a Euclidean gradient but then uses the conversion explained before to derive the Riemannian gradient.
 
 We define this here again as a function `grad_G_FD` that could be used in the `Manopt.jl` framework within a gradient based optimization.
 
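As an editor's aside (not part of the diff): what the projection backend does can be spelled out by hand in plain Julia, without `ManifoldDiff.jl` or `FiniteDifferences.jl`. The sketch assumes the affine-invariant metric on $\mathcal P(n)$ and a simple central finite difference; `egrad_fd` and `rgrad` are hypothetical helper names.

```julia
using LinearAlgebra

# Sketch of the three steps behind the projection backend:
# (1) Euclidean gradient by central finite differences,
# (2) symmetrize, i.e. project onto the tangent space of symmetric matrices,
# (3) change the representer via X ↦ q X q (affine-invariant metric).
function egrad_fd(G, q; h=1e-6)
    n = size(q, 1)
    E = zeros(n, n)
    for i in 1:n, j in 1:n
        dq = zeros(n, n); dq[i, j] = h
        E[i, j] = (G(q + dq) - G(q - dq)) / (2h)  # central difference in entry (i, j)
    end
    return E
end

function rgrad(G, q)
    E = egrad_fd(G, q)
    X = (E + E') / 2   # project onto symmetric matrices
    return q * X * q   # change the representer for the affine-invariant metric
end

G(q) = sum(log.(eigvals(Symmetric(q))) .^ 2) / 2
q = [2.0 0.0; 0.0 3.0]
rgrad(G, q)            # ≈ [2log(2) 0; 0 3log(3)] for this diagonal q
```

For the diagonal test point the result can be checked against the closed-form gradient of the squared distance to the identity, discussed in the next hunk.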
@@ -220,7 +220,7 @@ end
 G1 = grad_G_FD(N, q)
 ```
 
-Now, we can again compare this to the (known) solution of the gradient, namely the gradient of (half of) the distance squared $G(q) = \frac{1}{2}d^2_{\mathcal P(3)}(q,I_3)$ is given by $\operatorname{grad} G(q) = -\operatorname{log}_q I_3$, where $\operatorname{log}$ is the [logarithmic map](https://juliamanifolds.github.io/Manifolds.jl/latest/manifolds/symmetricpositivedefinite.html#Base.log-Tuple{SymmetricPositiveDefinite,%20Vararg{Any,%20N}%20where%20N}) on the manifold.
+Now, we can again compare this to the (known) solution of the gradient, namely the gradient of (half of) the distance squared $G(q) = \frac{1}{2}d^2_{\mathcal P(3)}(q,I_3)$ is given by $\operatorname{grad} G(q) = -\operatorname{log}_q I_3$, where $\operatorname{log}$ is the ``[logarithmic map](@extref Manifolds :jl:method:`Base.log-Tuple{SymmetricPositiveDefinite, Vararg{Any}}`)``{=commonmark} on the manifold.
 
 ```{julia}
 G2 = -log(N, q, Matrix{Float64}(I, 3, 3))
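As an editor's aside (not part of the diff): the comparison value $-\operatorname{log}_q I_3$ can also be written out in plain `LinearAlgebra`, since under the affine-invariant metric $\operatorname{log}_q(r) = q^{1/2}\operatorname{logm}(q^{-1/2} r q^{-1/2})q^{1/2}$, which for $r = I$ gives $-\operatorname{log}_q I = q^{1/2}\operatorname{logm}(q)\,q^{1/2}$. The helpers `spd_fun` and `grad_G_closed` are illustrative names.

```julia
using LinearAlgebra

# Apply a scalar function f to an SPD matrix via its eigendecomposition.
function spd_fun(f, q)
    F = eigen(Symmetric(q))
    return F.vectors * Diagonal(f.(F.values)) * F.vectors'
end

# Closed form -log_q(I) = q^{1/2} * logm(q) * q^{1/2} (affine-invariant metric).
grad_G_closed(q) = (s = spd_fun(sqrt, q); s * spd_fun(log, q) * s)

q = [2.0 0.0; 0.0 3.0]
grad_G_closed(q)   # ≈ [2log(2) 0; 0 3log(3)], i.e. λᵢ log(λᵢ) on the diagonal
```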
@@ -243,11 +243,14 @@ This tutorial is cached. It was last run on the following package versions.
 
 ```{julia}
 #| code-fold: true
+#| echo: false
 using Pkg
 Pkg.status()
 ```
 ```{julia}
 #| code-fold: true
+#| echo: false
+#| output: asis
 using Dates
-now()
+println("This tutorial was last rendered $(Dates.format(now(), "U d, Y, H:M:S")).");
 