src/controller/nonlinmpc.jl (+8 −8)
@@ -196,7 +196,7 @@ This controller allocates memory at each time step for the optimization.
 - `optim=JuMP.Model(Ipopt.Optimizer)` : nonlinear optimizer used in the predictive
   controller, provided as a [`JuMP.Model`](@extref) object (default to [`Ipopt`](https://github.com/jump-dev/Ipopt.jl) optimizer).
 - `gradient=AutoForwardDiff()` : an `AbstractADType` backend for the gradient of the objective
-  function, see [`DifferentiationInterface`](@extref DifferentiationInterface List).
+  function, see [`DifferentiationInterface` doc](@extref DifferentiationInterface List).
 - `jacobian=default_jacobian(transcription)` : an `AbstractADType` backend for the Jacobian
   of the nonlinear constraints, see `gradient` above for the options (default in Extended Help).
 - additional keyword arguments are passed to [`UnscentedKalmanFilter`](@ref) constructor
@@ -247,16 +247,16 @@ NonLinMPC controller with a sample time Ts = 10.0 s, Ipopt optimizer, UnscentedK
 The keyword argument `nc` is the number of elements in `LHS`, and `gc!`, an alias for
 the `gc` argument (both `gc` and `gc!` accepts non-mutating and mutating functions).

-By default, the optimization relies on dense [`ForwardDiff`](https://github.com/JuliaDiff/ForwardDiff.jl)
+By default, the optimization relies on dense [`ForwardDiff`](@extref ForwardDiff)
 automatic differentiation (AD) to compute the objective and constraint derivatives. One
 exception: if `transcription` is not a [`SingleShooting`](@ref), the `jacobian` argument
-defaults to this [sparse backend](@extref DifferentiationInterface Sparsity):
+defaults to this [sparse backend](@extref DifferentiationInterface AutoSparse-object):
 ```julia
-AutoSparse(
-    AutoForwardDiff();
-    sparsity_detector  = TracerSparsityDetector(),
-    coloring_algorithm = GreedyColoringAlgorithm()
-)
+AutoSparse(
+    AutoForwardDiff();
+    sparsity_detector  = TracerSparsityDetector(),
+    coloring_algorithm = GreedyColoringAlgorithm()
+)
 ```
 Optimizers generally benefit from exact derivatives like AD. However, the [`NonLinModel`](@ref)
 state-space functions must be compatible with this feature. See [`JuMP` documentation](@extref JuMP Common-mistakes-when-writing-a-user-defined-operator)
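For context, the sparse backend shown in the hunk above can also be built by hand and passed to the controller explicitly. This is a hedged sketch, not part of the diff: it assumes the `ModelPredictiveControl`, `ADTypes`, `SparseConnectivityTracer` and `SparseMatrixColorings` packages are installed, and `model` is a hypothetical `NonLinModel`:

```julia
# Sketch (assumptions noted above): assemble the sparse Jacobian backend and
# pass it through the `jacobian` keyword argument of `NonLinMPC`.
using ADTypes: AutoSparse, AutoForwardDiff
using SparseConnectivityTracer: TracerSparsityDetector
using SparseMatrixColorings: GreedyColoringAlgorithm

jac_backend = AutoSparse(
    AutoForwardDiff();                              # dense forward-mode AD underneath
    sparsity_detector  = TracerSparsityDetector(),  # detect the Jacobian sparsity pattern
    coloring_algorithm = GreedyColoringAlgorithm()  # compress columns by coloring
)
# mpc = NonLinMPC(model; transcription=MultipleShooting(), jacobian=jac_backend)
```

Exploiting sparsity this way mainly pays off for non-`SingleShooting` transcriptions, where the decision vector and constraint Jacobian grow with the horizon length.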
src/model/linearization.jl (+2 −2)
@@ -75,7 +75,7 @@ Linearize `model` at the operating points `x`, `u`, `d` and return the [`LinMode
 The arguments `x`, `u` and `d` are the linearization points for the state ``\mathbf{x}``,
 manipulated input ``\mathbf{u}`` and measured disturbance ``\mathbf{d}``, respectively (not
 necessarily an equilibrium, details in Extended Help). The Jacobians of ``\mathbf{f}`` and
-``\mathbf{h}`` functions are automatically computed with [`ForwardDiff.jl`](https://github.com/JuliaDiff/ForwardDiff.jl).
+``\mathbf{h}`` functions are automatically computed with [`ForwardDiff`](@extref ForwardDiff).

 !!! warning
     See Extended Help if you get an error like:
@@ -131,7 +131,7 @@ julia> linmodel.A
     equations are similar if the nonlinear model has nonzero operating points.

     Automatic differentiation (AD) allows exact Jacobians. The [`NonLinModel`](@ref) `f` and
-    `h` functions must be compatible with this feature though. See [`JuMP` documentation][@extref JuMP Common-mistakes-when-writing-a-user-defined-operator]
+    `h` functions must be compatible with this feature though. See [`JuMP` documentation](@extref JuMP Common-mistakes-when-writing-a-user-defined-operator)
     for common mistakes when writing these functions.
 """
 function linearize(model::SimModel{NT}; kwargs...) where NT<:Real
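To illustrate what `linearize` computes conceptually, here is a hedged, dependency-free sketch (not part of the diff): the Jacobians of a state function `f` at an operating point give the `A` and `B` matrices of the linear model. Finite differences stand in for AD so the snippet runs with no packages; the package itself uses `ForwardDiff` for exact Jacobians. The dynamics `f` below are a toy pendulum-like example, not from the source.

```julia
# Toy continuous-time dynamics: x₁' = x₂, x₂' = -sin(x₁) + u₁.
f(x, u) = [x[2], -sin(x[1]) + u[1]]

# Central finite-difference Jacobian of g at v (AD would give exact values).
function jacobian_fd(g, v; h=1e-6)
    J = zeros(length(g(v)), length(v))
    for j in eachindex(v)
        vp, vm = copy(v), copy(v)
        vp[j] += h; vm[j] -= h
        J[:, j] = (g(vp) - g(vm)) ./ (2h)
    end
    return J
end

x0, u0 = [0.0, 0.0], [0.0]             # operating point (an equilibrium here)
A = jacobian_fd(x -> f(x, u0), x0)     # ∂f/∂x ≈ [0 1; -cos(x₁) 0] at x₁ = 0
B = jacobian_fd(u -> f(x0, u), u0)     # ∂f/∂u ≈ [0; 1]
```

As the docstring notes, the operating point need not be an equilibrium; the deviation variables then pick up the nonzero residual `f(x0, u0)`.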