6 changes: 6 additions & 0 deletions Changelog.md
@@ -6,6 +6,12 @@ The file was started with Version `0.4`.
The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).

+ ## [0.5.32] January 15, 2026
+
+ ### Fixed
+
+ * Fixed failing precompilation related to the release of Glossaries.jl v0.1.1 (#567).
+
## [0.5.31] January 11, 2026

### Changed
4 changes: 2 additions & 2 deletions Project.toml
@@ -1,6 +1,6 @@
name = "Manopt"
uuid = "0fc0a36d-df90-57f3-8f93-d78a9fc72bb5"
version = "0.5.31"
version = "0.5.32"
authors = ["Ronny Bergmann <manopt@ronnybergmann.net>"]

[workspace]
@@ -47,7 +47,7 @@ ColorTypes = "0.9.1, 0.10, 0.11, 0.12"
Colors = "0.11.2, 0.12, 0.13"
DataStructures = "0.17, 0.18, 0.19"
Dates = "1.10"
Glossaries = "0.1.0"
Glossaries = "0.1.1"
JuMP = "1.15"
LRUCache = "1.4"
LineSearches = "7.2.0"
4 changes: 2 additions & 2 deletions _typos.toml
@@ -2,12 +2,12 @@
methodes = "methodes" # french
Serie = "Serie" # french
sur = "sur" # french
cmo = "cmo" # often ussed abbreciation for constrained manifold objective
cmo = "cmo" # often used abbreviation for constrained manifold objective

[files]
extend-exclude = [
"tutorials/*.html",
# rendered from tutorials/*.qmd - we do not want typos twice
"docs/src/tutorials/*.md",
"joss/paper.md",
- ]
+ ]
4 changes: 2 additions & 2 deletions src/documentation_glossary.jl
@@ -54,7 +54,7 @@ _tex_Cal(letter) = raw"\mathcal{" * "$letter" * "}"
Glossaries.define!(_glossary_tex_terms, :Cal, :math, _tex_Cal)
function _tex_cases(cases...)
return raw"\begin{cases}" *
"$(join([" $(ci)" for ci in c], raw"\\\\ "))" *
"$(join([" $(ci)" for ci in cases], raw"\\\\ "))" *
raw"\end{cases}"
end
Glossaries.define!(_glossary_tex_terms, :cases, :math, _tex_cases)
@@ -80,7 +80,7 @@ Glossaries.define!(_glossary_tex_terms, :inner, :math, _tex_inner)
Glossaries.define!(_glossary_tex_terms, :log, :math, raw"\log")
Glossaries.define!(_glossary_tex_terms, :max, :math, raw"\max")
Glossaries.define!(_glossary_tex_terms, :min, :math, raw"\min")
- _tex_norm(v; index = "") = raw"\lVert " * "$v" * raw" \rVert" * "_{$index}"
+ _tex_norm(v; index = "") = raw"\lVert " * "$v" * raw" \rVert" * (length(index) > 0 ? "_{$index}" : "")
Glossaries.define!(_glossary_tex_terms, :norm, :math, _tex_norm)
_tex_pmatrix(lines...) = raw"\begin{pmatrix} " * join(lines, raw"\\ ") * raw"\end{pmatrix}"
Glossaries.define!(_glossary_tex_terms, :pmatrix, :math, _tex_pmatrix)
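For context, a standalone sketch of the two helpers this hunk fixes: the old `_tex_cases` referenced an undefined variable `c`, so any call raised an `UndefVarError`, and the old `_tex_norm` always appended a subscript, emitting a stray `_{}` when no index was given. Copied here for illustration only, outside the Glossaries setup:

```julia
# Standalone copies of the fixed helpers, for illustration only.
_tex_cases(cases...) = raw"\begin{cases}" *
    "$(join([" $(ci)" for ci in cases], raw"\\\\ "))" * raw"\end{cases}"
_tex_norm(v; index = "") = raw"\lVert " * "$v" * raw" \rVert" *
    (length(index) > 0 ? "_{$index}" : "")

_tex_cases("a & x > 0", "b & x ≤ 0")
# "\begin{cases} a & x > 0\\\\  b & x ≤ 0\end{cases}"
_tex_norm("X")               # "\lVert X \rVert" (no empty "_{}" anymore)
_tex_norm("X"; index = "p")  # "\lVert X \rVert_{p}"
```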
4 changes: 2 additions & 2 deletions src/helpers/checks.jl
@@ -165,13 +165,13 @@ no plot is generated.
# Keyword arguments

* `check_grad=true`:
- verify that ``$(_tex(:grad))f(p) ∈ $(_math(:TangentSpace)))``.
+ verify that ``$(_tex(:grad))f(p) ∈ $(_math(:TangentSpace))``.
* `check_linearity=true`:
verify that the Hessian is linear, see [`is_Hessian_linear`](@ref) using `a`, `b`, `X`, and `Y`
* `check_symmetry=true`:
verify that the Hessian is symmetric, see [`is_Hessian_symmetric`](@ref)
* `check_vector=false`:
- verify that `$(_tex(:Hess)) f(p)[X] ∈ $(_math(:TangentSpace)))`` using `is_vector`.
+ verify that `$(_tex(:Hess)) f(p)[X] ∈ $(_math(:TangentSpace))`` using `is_vector`.
* `mode=:Default`:
specify the mode for the verification; the default assumption is,
that the retraction provided is of second order. Otherwise one can also verify the Hessian
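A hedged usage sketch for these keywords: the cost, gradient, and Hessian below are illustrative (a Rayleigh-quotient example on the sphere, not from this PR); only `check_Hessian` and the keyword names come from the docstring above.

```julia
using Manopt, Manifolds, LinearAlgebra

M = Sphere(2)
A = Diagonal([3.0, 2.0, 1.0])
f(M, p) = p' * A * p
# Riemannian gradient: projection of the Euclidean gradient 2Ap
grad_f(M, p) = 2 .* (A * p .- (p' * A * p) .* p)
# Riemannian Hessian of the Rayleigh quotient on the sphere
Hess_f(M, p, X) = 2 .* project(M, p, A * X) .- 2 .* (p' * A * p) .* X

# runs the gradient, linearity, and symmetry checks described above,
# plus the tangent-vector check enabled via `check_vector`
check_Hessian(M, f, grad_f, Hess_f; check_vector = true)
```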
6 changes: 3 additions & 3 deletions src/plans/conjugate_residual_plan.jl
@@ -3,7 +3,7 @@
# Objective.
_doc_CR_cost = """
```math
- f(X) = $(_tex(:frac, 1, 2)) $(_tex(:norm, _tex(:Cal, "A") * "[X] + b"; index = "p"))^2,\\qquad X ∈ $(_math(:TangentSpace))),
+ f(X) = $(_tex(:frac, 1, 2)) $(_tex(:norm, _tex(:Cal, "A") * "[X] + b"; index = "p"))^2,\\qquad X ∈ $(_math(:TangentSpace)),
```
"""
@doc """
@@ -13,7 +13,7 @@ Model the objective

$(_doc_CR_cost)

- defined on the tangent space ``$(_math(:TangentSpace)))`` at ``p`` on the manifold ``$(_math(:Manifold)))``.
+ defined on the tangent space ``$(_math(:TangentSpace))`` at ``p`` on the manifold ``$(_math(:Manifold))``.

In other words this is an objective to solve ``$(_tex(:Cal, "A")) = -b(p)``
for some linear symmetric operator and a vector function.
@@ -290,7 +290,7 @@ Stop when the relative residual in the [`conjugate_residual`](@ref)
is below a certain threshold, i.e.

```math
- $(_tex(:displaystyle))$(_tex(:frac, _tex(:norm, "r^{(k)"), "c")) ≤ ε,
+ $(_tex(:displaystyle))$(_tex(:frac, _tex(:norm, "r^{(k)}"), "c")) ≤ ε,
```

where ``c = $(_tex(:norm, "b"))`` of the initial vector from the vector field in ``$(_tex(:Cal, "A"))(p)[X] + b(p) = 0_p``,
2 changes: 1 addition & 1 deletion src/plans/first_order_plan.jl
@@ -1002,7 +1002,7 @@ end
PreconditionedDirection(M::AbstractManifold, preconditioner; kwargs...)

Add a preconditioner to a gradient processor following the [motivation for optimization](https://en.wikipedia.org/wiki/Preconditioner#Preconditioning_in_optimization),
- as a linear invertible map ``P: $(_math(:TangentSpace))) → $(_math(:TangentSpace)))`` that usually should be
+ as a linear invertible map ``P: $(_math(:TangentSpace)) → $(_math(:TangentSpace))`` that usually should be

* symmetric: ``⟨X, P(Y)⟩ = ⟨P(X), Y⟩``
* positive definite ``⟨X, P(X)⟩ > 0`` for ``X`` not the zero-vector
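A Euclidean sketch of these two properties, with a hypothetical diagonal map standing in for ``P`` (purely illustrative, not Manopt's API):

```julia
using LinearAlgebra

d = [4.0, 1.0, 0.25]            # positive entries ⇒ P is positive definite
P(X) = d .* X                   # linear, invertible map P = Diagonal(d)

X, Y = randn(3), randn(3)
dot(X, P(Y)) ≈ dot(P(X), Y)     # symmetry: ⟨X, P(Y)⟩ = ⟨P(X), Y⟩
dot(X, P(X)) > 0                # positive definiteness for X ≠ 0
```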
2 changes: 1 addition & 1 deletion src/plans/hessian_plan.jl
@@ -246,7 +246,7 @@ A functor to approximate the Hessian by a finite difference of gradient evaluations

Given a point `p` and a direction `X` and the gradient ``$(_tex(:grad)) f(p)``
of a function ``f`` the Hessian is approximated as follows:
- let ``c`` be a stepsize, ``X ∈ $(_math(:TangentSpace)))`` a tangent vector and ``q = $_doc_ApproxHessian_step``
+ let ``c`` be a stepsize, ``X ∈ $(_math(:TangentSpace))`` a tangent vector and ``q = $_doc_ApproxHessian_step``
be a step in direction ``X`` of length ``c`` following a retraction
Then the Hessian is approximated by the finite difference of the gradients,
where ``$(_math(:VectorTransport))`` is a vector transport.
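In the Euclidean special case (retraction is addition, vector transport the identity) the approximation described here reduces to the following sketch; `grad_f` is an assumed user-supplied gradient function:

```julia
using LinearAlgebra

# Hess f(p)[X] ≈ (‖X‖ / c) * (grad f(q) - grad f(p)), q = p + c * X/‖X‖
function approx_hessian(grad_f, p, X; c = 1e-4)
    q = p .+ c .* (X ./ norm(X))              # step of length c in direction X
    return (norm(X) / c) .* (grad_f(q) .- grad_f(p))
end
```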
2 changes: 1 addition & 1 deletion src/plans/quasi_newton_plan.jl
@@ -515,7 +515,7 @@ end
_doc_QN_B = """
```math
$(_tex(:Cal, "B"))_k^{(0)}[⋅]
- = $(_tex(:frac, "$(_tex(:inner, "s_{k-1}", "y_{k-1}"; index = "p_k"))", "$(_tex(:inner, "y_{k-1}", "y_{k-1}"; index = "p_k"))"))$(_tex(:Id))_{$(_math(:TangentSpace)))}[⋅]
+ = $(_tex(:frac, "$(_tex(:inner, "s_{k-1}", "y_{k-1}"; index = "p_k"))", "$(_tex(:inner, "y_{k-1}", "y_{k-1}"; index = "p_k"))"))$(_tex(:Id))_{$(_math(:TangentSpace))}[⋅]
```
"""

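In coordinates this initial operator is the identity scaled by the Barzilai-Borwein-like factor ``⟨s_{k-1},y_{k-1}⟩/⟨y_{k-1},y_{k-1}⟩``; a one-line Euclidean sketch (names assumed, for orientation only):

```julia
using LinearAlgebra

# factor that scales Id[⋅] before the quasi-Newton update is applied
initial_scaling(s, y) = dot(s, y) / dot(y, y)
```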
4 changes: 2 additions & 2 deletions src/plans/stepsize/stepsize.jl
@@ -182,7 +182,7 @@ end

Specify a step size that performs an Armijo line search. Given a Function ``f: $(_math(:Manifold)) → ℝ``
and its Riemannian Gradient ``$(_tex(:grad))f: $(_math(:Manifold)) → $(_math(:TangentBundle))``,
- the current point ``p∈$(_math(:Manifold))`` and a search direction ``X∈$(_math(:TangentSpace)))``.
+ the current point ``p∈$(_math(:Manifold))`` and a search direction ``X∈$(_math(:TangentSpace))``.

Then the step size ``s`` is found by reducing the initial step size ``s`` until

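The reduction loop stops once the sufficient-decrease (Armijo) condition holds; a Euclidean backtracking sketch (on a manifold the candidate point would be `retract(M, p, s*X)`; all names here are illustrative, and `X` is assumed to be a descent direction):

```julia
using LinearAlgebra

function armijo_stepsize(f, grad_f, p, X; s = 1.0, ρ = 0.5, σ = 1e-4)
    # reduce s until f(p + sX) ≤ f(p) + σ s ⟨grad f(p), X⟩
    while f(p .+ s .* X) > f(p) + σ * s * dot(grad_f(p), X)
        s *= ρ
    end
    return s
end
```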
@@ -1630,7 +1630,7 @@ See [`WolfePowellLinesearch`](@ref) for the math details
$(_fields(:X; name = "candidate_direction"))
$(_fields(:p; name = "candidate_point"))
as temporary storage for candidates
$(_fields(:X, "candidate_tangent"))
$(_fields(:X, name = "candidate_tangent"))
* `last_stepsize::R`
* `max_stepsize::R`
$(_fields(:retraction_method))
4 changes: 2 additions & 2 deletions src/plans/vectorial_plan.jl
@@ -72,7 +72,7 @@ struct ComponentVectorialType <: AbstractVectorialType end
FunctionVectorialType{P<:AbstractPowerRepresentation} <: AbstractVectorialType

A type to indicate that constraints are implemented as one whole function,
- for example ``g(p) ∈ ℝ^m`` or ``$(_tex(:grad)) g(p) ∈ ($(_math(:TangentSpace)))^m``.
+ for example ``g(p) ∈ ℝ^m`` or ``$(_tex(:grad)) g(p) ∈ ($(_math(:TangentSpace))^m``.

This type internally stores the [`AbstractPowerRepresentation`](@extref `ManifoldsBase.AbstractPowerRepresentation`),
when it makes sense, especially for Hessian and gradient functions.
@@ -157,7 +157,7 @@ Putting these gradients into a vector the same way as the functions, yields a

```math
$(_tex(:grad)) f(p) = $(_tex(:Bigl))( $(_tex(:grad)) f_1(p), $(_tex(:grad)) f_2(p), …, $(_tex(:grad)) f_n(p) $(_tex(:Bigr)))^$(_tex(:transp))
- ∈ ($(_math(:TangentSpace))))^n
+ ∈ ($(_math(:TangentSpace)))^n
```

An advantage here is that again the single components can be evaluated individually
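A small Euclidean illustration of that point, with hypothetical component functions: the stacked gradient lives in a power of the tangent space, yet each component stays individually evaluable.

```julia
grad_f1(p) = 2 .* p                     # hypothetical component gradients
grad_f2(p) = cos.(p)

grad_f(p) = [grad_f1(p), grad_f2(p)]    # stacked: one entry per component

p = [0.5, 1.0]
grad_f1(p)      # evaluate a single component on its own
grad_f(p)       # or the whole vector at once
```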
2 changes: 1 addition & 1 deletion src/solvers/Lanczos.jl
@@ -9,7 +9,7 @@ Solve the adaptive regularized subproblem with a Lanczos iteration
# Fields

$(_fields(:stopping_criterion; name = "stop"))
$(_fields(:stopping_criterion, "stop_newton"))
$(_fields(:stopping_criterion, name = "stop_newton"))
used for the inner Newton iteration
* `σ`: the current regularization parameter
* `X`: the Iterate
2 changes: 1 addition & 1 deletion src/solvers/adaptive_regularization_with_cubics.jl
@@ -13,7 +13,7 @@ $(_fields(:p; add_properties = [:as_Iterate]))
* `q`: a point for the candidates to evaluate model and ρ
$(_fields(:X; add_properties = [:as_Gradient]))
* `s`: the tangent vector step resulting from minimizing the model
- problem in the tangent space ``$(_math(:TangentSpace)))``
+ problem in the tangent space ``$(_math(:TangentSpace))``
* `σ`: the current cubic regularization parameter
* `σmin`: lower bound for the cubic regularization parameter
* `ρ_regularization`: regularization parameter for computing ρ.
6 changes: 3 additions & 3 deletions src/solvers/conjugate_residual.jl
@@ -6,10 +6,10 @@ _doc_conjugate_residual = """

Compute the solution of ``$(_tex(:Cal, "A"))(p)[X] + b(p) = 0_p ``, where

* ``$(_tex(:Cal, "A"))`` is a linear, symmetric operator on ``$(_math(:TangentSpace)))``
* ``$(_tex(:Cal, "A"))`` is a linear, symmetric operator on ``$(_math(:TangentSpace))``
* ``b`` is a vector field on the manifold
- * ``X ∈ $(_math(:TangentSpace)))`` is a tangent vector
- * ``0_p`` is the zero vector ``$(_math(:TangentSpace)))``.
+ * ``X ∈ $(_math(:TangentSpace))`` is a tangent vector
+ * ``0_p`` is the zero vector ``$(_math(:TangentSpace))``.

This implementation follows Algorithm 3 in [LaiYoshise:2024](@cite) and
is initialised with ``X^{(0)}`` as the zero vector and
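For intuition, a library-free Euclidean sketch of the conjugate residual iteration for a symmetric positive definite matrix `A`, solving `A*x + b = 0` from the zero vector with the relative-residual stopping rule documented in the plan above (illustrative only, not Manopt's implementation):

```julia
using LinearAlgebra

function conjugate_residual_sketch(A::AbstractMatrix, b::AbstractVector; ε = 1e-8)
    x = zero(b)
    r = -b - A * x               # residual of A x + b = 0, starting from x = 0
    d = copy(r)                  # initial search direction
    c = norm(b)
    while norm(r) / c > ε        # stop once ‖r^(k)‖ / ‖b‖ ≤ ε
        Ad = A * d
        α = dot(r, A * r) / dot(Ad, Ad)
        x += α * d
        r_new = r - α * Ad
        β = dot(r_new, A * r_new) / dot(r, A * r)
        d = r_new + β * d
        r = r_new
    end
    return x
end

x = conjugate_residual_sketch([4.0 1.0; 1.0 3.0], [1.0, -2.0])
```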
2 changes: 1 addition & 1 deletion src/solvers/interior_point_Newton.jl
@@ -2,7 +2,7 @@ _doc_IPN_subsystem = """
```math
$(_tex(:operatorname, "J")) F(p, μ, λ, s)[X, Y, Z, W] = -F(p, μ, λ, s),
$(_tex(:text, " where "))
- X ∈ $(_math(:TangentSpace))), Y,W ∈ ℝ^m, Z ∈ ℝ^n
+ X ∈ $(_math(:TangentSpace)), Y,W ∈ ℝ^m, Z ∈ ℝ^n
```
"""
_doc_IPN = """
2 changes: 1 addition & 1 deletion src/solvers/mesh_adaptive_direct_search.jl
@@ -6,7 +6,7 @@ _doc_mads = """
mesh_adaptive_direct_search!(M, mco::AbstractManifoldCostObjective, p; kwargs..)

The Mesh Adaptive Direct Search (MADS) algorithm minimizes an objective function ``f: $(_math(:Manifold)) → ℝ`` on the manifold `M`.
- The algorithm constructs an implicit mesh in the tangent space ``$(_math(:TangentSpace)))`` at the current candidate ``p``.
+ The algorithm constructs an implicit mesh in the tangent space ``$(_math(:TangentSpace))`` at the current candidate ``p``.
Each iteration consists of a search step and a poll step.

The search step selects points from the implicit mesh and attempts to find an improved candidate solution that reduces the value of ``f``.
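A schematic Euclidean sketch of the two steps just described, purely for orientation; all names are assumptions and Manopt's actual implementation differs:

```julia
function mads_iteration(f, p, search_points, poll_directions)
    for q in search_points            # search step: try points on the mesh
        f(q) < f(p) && return q, true
    end
    for D in poll_directions          # poll step: frame around the incumbent
        q = p .+ D
        f(q) < f(p) && return q, true
    end
    return p, false                   # no improvement: caller refines the mesh
end
```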
2 changes: 1 addition & 1 deletion src/solvers/projected_gradient_method.jl
@@ -187,7 +187,7 @@ $(_args(:p))

# Keyword arguments

$(_kwargs(:stepsize, "backtrack"; default = "`[`ArmijoLinesearchStepsize`](@ref)`(M; stop_increasing_at_step=0)")) to perform the backtracking to determine the ``β_k``.
$(_kwargs(:stepsize; name = "backtrack", default = "`[`ArmijoLinesearchStepsize`](@ref)`(M; stop_increasing_at_step=0)")) to perform the backtracking to determine the ``β_k``.
Note that the method requires ``β_k ≤ 1``, otherwise the projection step no longer provides points within the constraints
$(_kwargs([:evaluation, :retraction_method]))
$(_kwargs(:stepsize; default = "`[`ConstantStepsize`](@ref)`(injectivity_radius(M)/2)")) to perform the candidate projected step.
2 changes: 1 addition & 1 deletion src/solvers/proximal_gradient_method.jl
@@ -46,7 +46,7 @@ $(_kwargs(:stepsize; default = "`[`default_stepsize`](@ref)`(M, `[`ProximalGradi
that by default uses a [`ProximalGradientMethodBacktracking`](@ref).
$(_kwargs(:retraction_method))
$(_kwargs(:stopping_criterion; default = "`[`StopAfterIteration`](@ref)`(100)"))
$(_kwargs(:sub_problem, "sub_problem", "Union{AbstractManoptProblem, F, Nothing}"; default = "nothing"))
$(_kwargs(:sub_problem; type = "Union{AbstractManoptProblem, F, Nothing}", default = "nothing"))
or nothing to take the proximal map from the [`ManifoldProximalGradientObjective`](@ref)
$(_kwargs(:sub_state; default = "evaluation")). This field is ignored, if the `sub_problem` is `Nothing`.

2 changes: 1 addition & 1 deletion src/solvers/stochastic_gradient_descent.jl
@@ -150,7 +150,7 @@ end

# Keyword arguments

$(_kwargs(:X, "initial_gradient"))
$(_kwargs(:X; name = "initial_gradient"))
$(_kwargs(:p; add_properties = [:as_Initial]))

$(_note(:ManifoldDefaultFactory, "StochasticGradientRule"))
2 changes: 1 addition & 1 deletion src/solvers/trust_regions.jl
@@ -292,7 +292,7 @@ $(_kwargs(:evaluation))
$(_kwargs(:retraction_method))
$(_kwargs(:stopping_criterion; default = "`[`StopAfterIteration`](@ref)`(1000)`$(_sc(:Any))[`StopWhenGradientNormLess`](@ref)`(1e-6)"))
$(_kwargs(:sub_kwargs))
$(_kwargs(:stopping_criterion, "sub_stopping_criterion"; default = "`( see [`truncated_conjugate_gradient_descent`](@ref))` "))
$(_kwargs(:stopping_criterion; name = "sub_stopping_criterion", default = "`( see [`truncated_conjugate_gradient_descent`](@ref))` "))
$(_kwargs(:sub_problem; default = "`[`DefaultManoptProblem`](@ref)`(M, `[`ConstrainedManifoldObjective`](@ref)`(subcost, subgrad; evaluation=evaluation))"))
$(_kwargs(:sub_state; default = "`[`QuasiNewtonState`](@ref)` "))
, where [`QuasiNewtonLimitedMemoryDirectionUpdate`](@ref) with [`InverseBFGS`](@ref) is used