Optimization Interface

The functions below document the interfaces for accessing the objective function, gradient, Hessian, and related information for an OptimizationProblem from the OptimizationProblems.jl package.

Allocation

OptimizationProblems.allocate — Function
allocate(
    problem::GeneralizedLinearModel;
    type::DataType=Float64,
    obj::Bool=true, 
    grad::Bool=true,
    hess::Bool=false,
    weights::Bool=false,
    residual::Bool=false,
    jacobian::Bool=false,
)

Allocates the storage needed for computing objective information in a Dict, reducing allocations at runtime.

Arguments

  • problem::GeneralizedLinearModel, specifies the problem that is being considered
  • type::DataType, specifies the target DataType for all calculations. Defaults to Float64.
  • obj::Bool, if true adds a key :obj to the storage dictionary with value type(0.0). Otherwise, does nothing to the storage dictionary.
  • grad::Bool, if true adds a key :grad to the storage dictionary with value zeros(type, problem.num_param). Otherwise, does nothing to the storage dictionary.
  • hess::Bool, if true adds a key :hess to the storage dictionary with value zeros(type, problem.num_param, problem.num_param). Otherwise, does nothing to the storage dictionary.
  • weights::Bool, if true adds a key :weights to the storage dictionary with value zeros(type, problem.num_obs). Otherwise, does nothing to the storage dictionary.
  • residual::Bool, if true adds a key :residual to the storage dictionary with value zeros(type, problem.num_obs). Otherwise, does nothing to the storage dictionary.
  • jacobian::Bool, if true adds a key :jacobian to the storage dictionary with value zeros(type, problem.num_obs, problem.num_param). Otherwise, does nothing to the storage dictionary.

Returns

  • Object of type Dict{Symbol, Any}.
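As a concrete illustration, the returned Dict for a hypothetical problem with num_param = 3 and the default flags (obj and grad enabled) would look like the sketch below; the problem object itself is not constructed here, only the shape of the storage.

```julia
# Sketch of the Dict allocate(problem) would return under the default flags,
# assuming problem.num_param == 3 (hypothetical value for illustration).
num_param = 3
store = Dict{Symbol, Any}(
    :obj  => 0.0,                       # scalar objective value
    :grad => zeros(Float64, num_param), # preallocated gradient buffer
)
```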
source

Objective Information

OptimizationProblems.obj! — Function
obj!(problem::GeneralizedLinearModel; store::Dict{Symbol, Any}, x::Vector{T},
    reset::Bool=true, batch::AbstractVector{Int64}=Base.OneTo(problem.num_obs)
) where T

Updates the value of the objective function stored in store[:obj] for the problem specified in problem using the observations specified in batch. If reset is true, then the value of the objective is first set to zero. Returns nothing.
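The reset semantics can be sketched with a plain Dict, without calling the package: setting reset=true zeroes store[:obj] before accumulating, so a batch's contribution replaces rather than adds to any stale value. The per-observation losses below are hypothetical.

```julia
# Sketch of obj!'s reset behavior (no package calls; values are invented).
store = Dict{Symbol, Any}(:obj => 5.0)  # stale objective from an earlier call
per_obs = [1.5, 2.5]                    # hypothetical losses for the batch
store[:obj] = 0.0                       # reset == true: zero first
store[:obj] += sum(per_obs)             # accumulate over the batch
```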

source
OptimizationProblems.grad! — Function
grad!(problem::GeneralizedLinearModel; store::Dict{Symbol, Any}, x::Vector{T},
    reset::Bool=true, batch::AbstractVector{Int64}=Base.OneTo(problem.num_obs),
    block::AbstractVector{Int64}=eachindex(x)
) where T

Updates the gradient stored in store[:grad] for the problem specified in problem using the observations specified in batch, modifying only the entries in block. If reset==true, store[:grad][block] is first set to zero. Returns nothing.
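The block-restricted update can be sketched on a bare vector: entries outside block are never touched, and reset==true zeroes only the block before accumulation. The partial-gradient values below are hypothetical.

```julia
# Sketch of grad!'s block semantics (no package calls; values are invented).
grad = fill(9.0, 4)        # stale gradient buffer, e.g. store[:grad]
block = [2, 4]             # only these entries are updated
grad[block] .= 0.0         # reset == true: zero the block only
grad[block] .+= [0.7, -1.2] # hypothetical batch contribution on `block`
```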

source
OptimizationProblems.objgrad! — Function
objgrad!(problem::GeneralizedLinearModel; store::Dict{Symbol, Any}, x::Vector{T},
    reset::Bool=true, batch::AbstractVector{Int64}=Base.OneTo(problem.num_obs),
    block::AbstractVector{Int64}=eachindex(x)
) where T

Updates the objective value and gradient in store for the problem specified in problem using the observations specified in batch. For the gradient update, only the entries in store[:grad][block] are modified; block does not affect the objective update. If reset==true, then the objective and store[:grad][block] are first set to zero. Returns nothing.
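The asymmetry between the two updates can be sketched directly on the storage Dict: the objective is written in full, while the gradient write is confined to block. All numeric values below are invented for illustration.

```julia
# Sketch of objgrad!'s semantics (no package calls; values are invented).
store = Dict{Symbol, Any}(:obj => 0.0, :grad => zeros(3))
block = 1:2
store[:obj] = 4.2                   # objective: uses the whole batch
store[:grad][block] .= [1.0, -1.0]  # gradient: only `block` entries written
# store[:grad][3] is left untouched at 0.0
```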

source
OptimizationProblems.hess! — Function
hess!(problem::GeneralizedLinearModel; store::Dict{Symbol, Any}, x::Vector{T},
    reset::Bool=true, batch::AbstractVector{Int64}=Base.OneTo(problem.num_obs),
    block::AbstractVector{Int64}=eachindex(x)
) where T

Updates the Hessian stored in store[:hess] for the problem specified in problem using the observations specified in batch. If block is specified, then only the entries of the sub-array store[:hess][block, block] are updated. If reset==true, then the entries of store[:hess][block, block] are first set to zero. Returns nothing.
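The sub-array semantics can be sketched on a bare matrix: only the block-by-block principal sub-array is zeroed and written, and all other entries are preserved. The symmetric contribution matrix below is hypothetical.

```julia
# Sketch of hess!'s block semantics (no package calls; values are invented).
H = fill(7.0, 3, 3)        # stale Hessian buffer, e.g. store[:hess]
block = [1, 3]             # only the [block, block] sub-array is updated
H[block, block] .= 0.0     # reset == true: zero the sub-array only
H[block, block] .+= [2.0 0.5; 0.5 2.0]  # hypothetical batch contribution
```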

source