This class adapts the unconstrained nonlinear minimization algorithms in the minimization package to the task of estimating locally optimal (minimum-cost) parameter sets. This allows us to use algorithms like BFGS (FunctionMinimizerBFGS) to find locally optimal parameters of, for example, a DifferentiableFeedforwardNeuralNetwork. Any first-order-derivative FunctionMinimizer may be dropped into this class.
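To make the idea concrete, here is a minimal, self-contained sketch of first-order parameter estimation: a differentiable cost over a parameter vector, driven to a local minimum by plain gradient descent standing in for a pluggable minimizer. All names here are hypothetical illustrations, not the library's actual API.

```java
public class ParameterCostSketch {
    // Toy cost: f(w) = (w0 - 3)^2 + (w1 + 1)^2, minimized at w = (3, -1).
    static double cost(double[] w) {
        return (w[0] - 3) * (w[0] - 3) + (w[1] + 1) * (w[1] + 1);
    }

    // Analytic gradient of the cost above.
    static double[] gradient(double[] w) {
        return new double[] { 2 * (w[0] - 3), 2 * (w[1] + 1) };
    }

    // Fixed-step gradient descent: the simplest possible stand-in for a
    // first-order FunctionMinimizer.
    static double[] minimize(double[] w, double step, int iterations) {
        for (int i = 0; i < iterations; i++) {
            double[] g = gradient(w);
            w[0] -= step * g[0];
            w[1] -= step * g[1];
        }
        return w;
    }

    public static void main(String[] args) {
        double[] w = minimize(new double[] { 0.0, 0.0 }, 0.1, 200);
        System.out.printf("%.3f %.3f%n", w[0], w[1]);
    }
}
```

A real first-order minimizer such as BFGS would replace the fixed-step loop with curvature estimates and a line search, but the contract is the same: supply the cost and its gradient, receive a locally optimal parameter vector.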
My current preference is to use BFGS (FunctionMinimizerBFGS) for virtually all problems. However, when there are too many parameters, Liu-Storey conjugate gradient (FunctionMinimizerLiuStorey) is another good choice.
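The appeal of conjugate gradient for large parameter counts is that it needs no Hessian approximation, only the previous search direction. Below is a sketch of nonlinear CG with the Liu-Storey beta on a small quadratic cost (with an exact line search, which the quadratic makes easy); the class and method names are illustrative, not the library's API.

```java
public class LiuStoreyCgSketch {
    // Cost f(w) = 1*(w0 - 1)^2 + 10*(w1 + 2)^2, minimum at (1, -2).
    static final double[] C = { 1.0, 10.0 };
    static final double[] T = { 1.0, -2.0 };

    static double[] gradient(double[] w) {
        return new double[] { 2 * C[0] * (w[0] - T[0]), 2 * C[1] * (w[1] - T[1]) };
    }

    static double[] minimize(double[] w, int maxIter) {
        double[] g = gradient(w);
        double[] d = { -g[0], -g[1] };          // start along steepest descent
        for (int k = 0; k < maxIter; k++) {
            if (Math.hypot(g[0], g[1]) < 1e-12) break;
            // Exact line search for this quadratic: alpha = -(g.d) / (d' A d),
            // where A = diag(2*C) is the Hessian of the cost.
            double gd = g[0] * d[0] + g[1] * d[1];
            double dAd = 2 * C[0] * d[0] * d[0] + 2 * C[1] * d[1] * d[1];
            double alpha = -gd / dAd;
            w[0] += alpha * d[0];
            w[1] += alpha * d[1];
            double[] gNew = gradient(w);
            // Liu-Storey update: beta = gNew.(gNew - g) / (-d.g)
            double beta = (gNew[0] * (gNew[0] - g[0])
                         + gNew[1] * (gNew[1] - g[1])) / (-gd);
            d[0] = -gNew[0] + beta * d[0];
            d[1] = -gNew[1] + beta * d[1];
            g = gNew;
        }
        return w;
    }

    public static void main(String[] args) {
        double[] w = minimize(new double[] { 0.0, 0.0 }, 10);
        System.out.printf("%.6f %.6f%n", w[0], w[1]);
    }
}
```

Note that the per-iteration state is just two vectors (gradient and direction), which is why CG scales to parameter counts where maintaining a BFGS inverse-Hessian approximation would be too expensive.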
When first-order derivative information is not available, you may use either automatic differentiation (GradientDescendableApproximator) or the derivative-free minimization routines, such as those used by ParameterDerivativeFreeCostMinimizer.
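One common way to recover an approximate gradient when only cost evaluations are available is finite differencing. The sketch below uses central differences; it illustrates the general idea of wrapping a derivative-free cost so a first-order minimizer can consume it, and is not the library's actual implementation.

```java
import java.util.function.Function;

public class FiniteDifferenceSketch {
    // Central-difference approximation of the gradient of a scalar cost:
    // g[i] ~= (f(w + h*e_i) - f(w - h*e_i)) / (2h)
    static double[] approxGradient(Function<double[], Double> f, double[] w, double h) {
        double[] g = new double[w.length];
        for (int i = 0; i < w.length; i++) {
            double saved = w[i];
            w[i] = saved + h;
            double fPlus = f.apply(w);
            w[i] = saved - h;
            double fMinus = f.apply(w);
            w[i] = saved;                      // restore the parameter
            g[i] = (fPlus - fMinus) / (2 * h);
        }
        return g;
    }

    public static void main(String[] args) {
        // Toy cost f(w) = (w0 - 3)^2 + w1^2; true gradient at (0, 1) is (-6, 2).
        Function<double[], Double> cost =
            w -> (w[0] - 3) * (w[0] - 3) + w[1] * w[1];
        double[] g = approxGradient(cost, new double[] { 0.0, 1.0 }, 1e-5);
        System.out.printf("%.3f %.3f%n", g[0], g[1]);
    }
}
```

The trade-off is cost: each approximate gradient takes 2n cost evaluations for n parameters, which is why purpose-built derivative-free routines are often preferable when evaluations are expensive.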