Basic Functions
Downhill.optimize
— Function

optimize(
    fdf, x₀;
    method,
    kw...
)

Find an optimizer for fdf, starting with the initial approximation x₀. The method keyword chooses a specific optimization method. See optimize! for the description of the other keywords.
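A minimal usage sketch, assuming Downhill exports a method type named BFGS (the actual exported method names may differ; the fdf contract is the one documented for optimize!):

```julia
using Downhill

# Minimize f(x) = (x₁ - 1)² + 4(x₂ + 2)², writing the gradient into g in place.
function fdf(x, g)
    g[1] = 2 * (x[1] - 1)
    g[2] = 8 * (x[2] + 2)
    return (x[1] - 1)^2 + 4 * (x[2] + 2)^2, g
end

# `BFGS` is an assumed method name — substitute one actually exported by Downhill.
result = optimize(fdf, [0.0, 0.0]; method=BFGS, gtol=1e-8)
```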
Downhill.optimize!
— Function

optimize!(fdf, M::Wrapper, x0)

Find an optimizer for fdf, starting with the initial approximation x0. fdf(x, g) must return a tuple (f(x), ∇f(x)) and, if g is mutable, overwrite it with the gradient.
optimize!(
fdf, M::OptBuffer, x₀;
gtol=1e-6,
convcond=nothing,
maxiter=100,
maxcalls=nothing,
reset=true,
constrain_step=nothing,
tracking=nothing,
verbosity=0
)
Find an optimizer for fdf, starting with the initial approximation x₀.
Arguments
M::OptBuffer: the core method to use for optimization
fdf(x, g)::Function: function to optimize. It must return a tuple (f(x), ∇f(x)) and, if g is mutable, overwrite it with the gradient.
x0: initial approximation
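For illustration, a function satisfying this contract might look as follows (a hypothetical objective f(x) = ‖x‖², not part of the package):

```julia
# f(x) = x₁² + … + xₙ², with gradient 2x written into g in place.
function fdf(x, g)
    g .= 2 .* x                # overwrite the mutable g with ∇f(x)
    return sum(abs2, x), g     # return the tuple (f(x), ∇f(x))
end
```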
Keywords
Convergence criteria
There are two ways to specify the convergence criterion. The default uses gtol; alternatively, a custom stop condition can be given via convcond.

gtol::Real: (default stop criterion) stop optimization when the gradient's 2-norm is less than gtol
convcond=(x, xpre, y, ypre, g)->Bool: a custom stop criterion based on the current and previous argument values, the corresponding function values, and the gradient. If nothing (default), the gtol criterion is used; when specified, the gtol criterion is ignored.

Example (equivalent to the default criterion): convcond = (x, xpre, y, ypre, g) -> norm(g, 2) ≤ gtol.
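As a sketch, a stricter custom criterion could also require the objective value to have stabilized (the tolerances here are arbitrary examples):

```julia
using LinearAlgebra: norm

# Stop when both the gradient is small and the objective change is tiny.
convcond = (x, xpre, y, ypre, g) -> norm(g, 2) ≤ 1e-6 && abs(y - ypre) ≤ 1e-12
```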
Limiting optimization

Limit (or unlimit) the optimization process by specifying maxiter and/or maxcalls.

maxiter::Integer: force stop optimization after this number of iterations (use nothing or a negative value to leave the number of iterations unlimited)
maxcalls::Integer: force stop optimization after this number of function calls (use nothing or a negative value to leave the number of calls unlimited)
Optimization constraints

Inequality constraints on the optimization are handled by constrain_step.

constrain_step(x0, d): a function to constrain a step from x0 in the direction d. It must return a real value α such that x0 + αd is the maximum allowed step.
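For example, a constrain_step keeping every iterate in the positive orthant (a hypothetical helper, not part of the package) would return the largest α for which x0 + αd stays positive:

```julia
# Largest α ≥ 0 such that all components of x0 + α*d stay positive.
function positive_orthant_step(x0, d)
    α = Inf
    for i in eachindex(x0)
        if d[i] < 0
            α = min(α, -x0[i] / d[i])   # component i reaches zero at this α
        end
    end
    return α
end
```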
Initializing
reset=true: a value to pass as a keyword argument to the optimizer's init! method
Optimization path
tracking::Union{Nothing,IO,AbstractString}: an IO stream or a file name to log the optimization process, or nothing to disable logging (default: nothing)
verbosity::Integer: verbosity of logging. 0 (default) disables tracking. 1 logs all points of objective function evaluation with the corresponding values and gradients. 2 shows additional statistics on the line search. This option is ignored if tracking == nothing.
Downhill.solver
— Function

Downhill.solver(
    M::OptBuffer;
    gtol=1e-6,
    maxiter=100,
    maxcalls=nothing,
    constrain_step=nothing,
)

Return the wrapper object for a chosen method to solve an optimization problem with the given parameters. For the description of the keywords, see optimize!.

Downhill.solver(
    M::DataType, x;
    gtol=1e-6, maxiter=100, maxcalls=nothing, constrain_step,
)

Return the wrapper object for a chosen method to solve an optimization problem with the given parameters, compatible with the dimensions of x.
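A usage sketch of the second form, which sizes the method's internal buffers from x (BFGS is again an assumed method type name — substitute one actually exported by Downhill):

```julia
using Downhill

x0 = [0.0, 0.0]
# `BFGS` is an assumed method type name.
opt = Downhill.solver(BFGS, x0; gtol=1e-8, maxiter=50)
# The resulting wrapper can then be passed to optimize!:
# optimize!(fdf, opt, x0)
```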