Optimization Interface
The functions below document the interfaces for accessing the objective function, gradient, Hessian, and related information for an OptimizationProblem from the OptimizationModels.jl package.
Allocation
OptimizationProblems.allocate — Function

    allocate(
        problem::GeneralizedLinearModel;
        type::DataType=Float64,
        obj::Bool=true,
        grad::Bool=true,
        hess::Bool=false,
        weights::Bool=false,
        residual::Bool=false,
        jacobian::Bool=false,
    )

Allocates the storage necessary for computing objective information in a Dict, reducing allocations during runtime.
Arguments
- `problem::GeneralizedLinearModel`, specifies the problem that is being considered.
- `type::DataType`, specifies the target `DataType` for all calculations. Defaults to `Float64`.
- `obj::Bool`, if `true`, adds a key `:obj` to the storage dictionary with value `type(0.0)`. Otherwise, does nothing to the storage dictionary.
- `grad::Bool`, if `true`, adds a key `:grad` to the storage dictionary with value `zeros(type, problem.num_param)`. Otherwise, does nothing to the storage dictionary.
- `hess::Bool`, if `true`, adds a key `:hess` to the storage dictionary with value `zeros(type, problem.num_param, problem.num_param)`. Otherwise, does nothing to the storage dictionary.
- `weights::Bool`, if `true`, adds a key `:weights` to the storage dictionary with value `zeros(type, problem.num_obs)`. Otherwise, does nothing to the storage dictionary.
- `residual::Bool`, if `true`, adds a key `:residual` to the storage dictionary with value `zeros(type, problem.num_obs)`. Otherwise, does nothing to the storage dictionary.
- `jacobian::Bool`, if `true`, adds a key `:jacobian` to the storage dictionary with value `zeros(type, problem.num_obs, problem.num_param)`. Otherwise, does nothing to the storage dictionary.
Returns
- Object of type `Dict{Symbol, Any}`.
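Under the default flags, the returned dictionary holds an objective scalar and a gradient buffer. A minimal self-contained sketch of the structure `allocate` builds (the dimension `num_param = 3` is an illustrative assumption, not taken from a real problem):

```julia
# Sketch of the storage Dict built by allocate under the default flags
# (obj=true, grad=true, all others false). num_param = 3 is an
# illustrative assumption.
type = Float64
num_param = 3

store = Dict{Symbol, Any}()
store[:obj]  = type(0.0)                # scalar objective value
store[:grad] = zeros(type, num_param)   # preallocated gradient buffer

# With hess=true, allocate would additionally add:
# store[:hess] = zeros(type, num_param, num_param)
```

Preallocating these buffers once lets the in-place routines below (`obj!`, `grad!`, `hess!`) overwrite them on every call rather than allocating fresh arrays.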
Objective Information
OptimizationProblems.obj! — Function

    obj!(problem::GeneralizedLinearModel; store::Dict{Symbol, Any},
        x::Vector{T}, reset::Bool=true,
        batch::AbstractVector{Int64}=Base.OneTo(problem.num_obs)
    ) where T

Updates the value of the objective function stored in `store[:obj]` for the problem specified in `problem` using the observations specified in `batch`. If `reset` is `true`, then the value of the objective is first set to zero. Returns `nothing`.
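The in-place accumulation pattern can be sketched with a self-contained example; the least-squares objective, the data `A` and `y`, and the helper `my_obj!` below are illustrative assumptions, not internals of the package:

```julia
using LinearAlgebra  # for dot

# Sketch of the obj! update pattern: accumulate a least-squares objective
# over the observations in `batch`, writing the result into store[:obj].
# my_obj!, A, and y are assumptions for demonstration only.
function my_obj!(store, A, y, x; reset::Bool=true,
                 batch::AbstractVector{Int64}=Base.OneTo(length(y)))
    reset && (store[:obj] = 0.0)           # optionally zero the objective first
    for i in batch
        r = dot(view(A, i, :), x) - y[i]   # residual of observation i
        store[:obj] += 0.5 * r^2
    end
    return nothing
end
```

Passing a strict subset of observation indices as `batch` yields a mini-batch objective, which is the typical use of the `batch` keyword.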
OptimizationProblems.grad! — Function

    grad!(problem::GeneralizedLinearModel; store::Dict{Symbol, Any},
        x::Vector{T}, reset::Bool=true,
        batch::AbstractVector{Int64}=Base.OneTo(problem.num_obs),
        block::AbstractVector{Int64}=eachindex(x)
    ) where T

Updates the value of the gradient stored in `store[:grad]` for the problem specified in `problem` using the observations specified in `batch`, updating only the entries in `block`. If `reset == true`, sets `store[:grad][block]` to zero. Returns `nothing`.
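The role of `block` can be sketched in the same self-contained least-squares setting; `my_grad!` and the data are illustrative assumptions, not the package's implementation:

```julia
using LinearAlgebra  # for dot

# Sketch of the grad! update pattern: only the coordinates listed in
# `block` are reset and accumulated, over the observations in `batch`.
# my_grad!, A, and y are assumptions for demonstration only.
function my_grad!(store, A, y, x; reset::Bool=true,
                  batch::AbstractVector{Int64}=Base.OneTo(length(y)),
                  block::AbstractVector{Int64}=eachindex(x))
    reset && (store[:grad][block] .= 0.0)      # zero only the block entries
    for i in batch
        r = dot(view(A, i, :), x) - y[i]       # residual of observation i
        store[:grad][block] .+= r .* view(A, i, block)
    end
    return nothing
end
```

Entries outside `block` are left untouched, which is the behavior block-coordinate methods rely on.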
OptimizationProblems.objgrad! — Function

    objgrad!(problem::GeneralizedLinearModel; store::Dict{Symbol, Any},
        x::Vector{T}, reset::Bool=true,
        batch::AbstractVector{Int64}=Base.OneTo(problem.num_obs),
        block::AbstractVector{Int64}=eachindex(x)
    ) where T

Updates the objective value and gradient value in `store` for the problem specified in `problem` using the observations specified in `batch`. If `block` is specified, then only the entries of `store[:grad][block]` are updated for the gradient; `block` does not affect the objective update. If `reset == true`, then the objective and `store[:grad][block]` are first set to zero. Returns `nothing`.
OptimizationProblems.hess! — Function

    hess!(problem::GeneralizedLinearModel, store::Dict{Symbol, Any},
        x::Vector{T}, reset::Bool=true,
        batch::AbstractVector{Int64}=Base.OneTo(problem.num_obs),
        block::AbstractVector{Int64}=eachindex(x)
    ) where T

Updates the Hessian in `store[:hess]` for the problem specified in `problem` using the observations specified in `batch`. If `block` is specified, then only the entries of the sub-array `store[:hess][block, block]` are updated. If `reset == true`, then the entries of `store[:hess][block, block]` are first set to zero. Returns `nothing`.
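As with the gradient, the block-restricted Hessian update can be sketched for a least-squares model, whose Hessian is `A'A`; `my_hess!` is an illustrative assumption, not the package's implementation:

```julia
# Sketch of the hess! update pattern: only the sub-array
# store[:hess][block, block] is reset and accumulated over `batch`.
# my_hess! and A are assumptions for demonstration only.
function my_hess!(store, A; reset::Bool=true,
                  batch::AbstractVector{Int64}=Base.OneTo(size(A, 1)),
                  block::AbstractVector{Int64}=Base.OneTo(size(A, 2)))
    reset && (store[:hess][block, block] .= 0.0)
    for i in batch
        ai = view(A, i, block)                   # restricted row of A
        store[:hess][block, block] .+= ai * ai'  # rank-one update
    end
    return nothing
end
```

Restricting both row and column indices to `block` updates a principal sub-matrix in place while leaving the rest of the Hessian buffer intact.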