# Specifying Known Derivatives


FreeFlyer's optimization interfaces provide a variety of ways to customize the optimization process to improve speed and the likelihood of convergence. Users have the option to supply any known derivatives of the constraints and objective function, which can drastically improve performance by taking the burden of numerically computing those derivatives off the Optimizer object.

## Specifying the Jacobian and Gradient

If desired, the user can specify any known derivatives of the problem constraints or objective function by populating the Jacobian (a Matrix) and Gradient (an Array) properties of the Optimizer object. Supplying the Optimizer object with known derivatives can drastically improve runtime for complex problems. If not specified by the user, FreeFlyer will fill the Jacobian and Gradient with a sentinel value (-999) so that any derivatives the user provides can be automatically detected. All derivatives that are not provided by the user will be calculated numerically via finite differencing.
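The sentinel-fill approach described above can be sketched in plain Python (the constant -999 comes from the text; the function and variable names here are illustrative, not FreeFlyer's internals):

```python
# Sketch of how a sentinel fill lets a solver detect user-supplied
# derivatives: entries still equal to the sentinel (-999) are computed
# numerically; the rest are taken as given.
SENTINEL = -999.0

def split_known_unknown(jacobian):
    """Return (row, col) index lists of user-supplied vs. missing entries."""
    known, unknown = [], []
    for i, row in enumerate(jacobian):
        for j, val in enumerate(row):
            (unknown if val == SENTINEL else known).append((i, j))
    return known, unknown

# One constraint, two state variables; the user filled only the x partial.
jac = [[1.0, SENTINEL]]
known, unknown = split_known_unknown(jac)
print(known)    # [(0, 0)] -> supplied analytically
print(unknown)  # [(0, 1)] -> fall back to finite differencing
```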

The Optimizer.Jacobian and Optimizer.Gradient properties should be updated within the evaluation loop on every nominal case evaluation, which can be identified through the Optimizer.OptimizationPhase property. A simple example is presented below: the problem is to find a solution to the equation "x + y^2 = 12" such that "(x + y)^2" is minimized and y is between 0 and 5. There are a number of solutions to this problem that satisfy the constraint, but the optimal solution is x = -4 and y = 4. In this case, the analytic derivatives are easy to calculate, so populating the Jacobian and Gradient properties is simple. For more complex problems that have complicated derivatives, it may be sensible to perform the Jacobian and Gradient calculations within a Procedure that is called inside of the evaluation loop.

```
Variable x;
Variable y;
Optimizer opt;

// Define Problem
opt.AddStateVariable("x", 0);
opt.AddStateVariable("y", 1, 0, 5);

opt.AddConstraint("constraintExpression");
opt.Constraints.SetEqualityBounds(12);

// Load Engine
opt.LoadEngine();

// Evaluation Loop
While (opt.HasNotConverged());

      opt.UpdateStateVariables();

      x = opt.GetStateVariableValue("x");
      y = opt.GetStateVariableValue("y");

      opt.SetConstraintValue("constraintExpression", x + y^2);

      // Supply derivatives on each nominal evaluation
      If (opt.OptimizationPhase == 1);

            opt.Jacobian[0, 0] = 1;      // Derivative of constraintExpression wrt x
            opt.Jacobian[0, 1] = 2*y;    // Derivative of constraintExpression wrt y

            opt.Gradient[0] = 2*(x + y); // Derivative of objective function wrt x
            opt.Gradient[1] = 2*(x + y); // Derivative of objective function wrt y

      End;

      opt.Minimize((x + y)^2);

End;

Report opt.GetBestStateVariableValues();
```
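As a sanity check, the analytic derivatives used in this example can be cross-checked against forward differences. The sketch below is plain Python (not FreeFlyer script) and uses illustrative helper names:

```python
# Cross-check of the example's analytic derivatives via forward differencing.
def constraint(x, y):      # x + y^2, constrained to equal 12
    return x + y**2

def objective(x, y):       # (x + y)^2, to be minimized
    return (x + y)**2

def forward_diff(f, x, y, h=1e-6):
    """First-order forward-difference estimate of (df/dx, df/dy)."""
    return ((f(x + h, y) - f(x, y)) / h,
            (f(x, y + h) - f(x, y)) / h)

x, y = -4.0, 4.0           # the optimal solution quoted in the text
print(forward_diff(constraint, x, y))  # approx (1, 2*y) = (1, 8)
print(forward_diff(objective, x, y))   # approx (2*(x+y), 2*(x+y)) = (0, 0)
```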

The Jacobian and Gradient can also be configured through the Optimizer.SetJacobianValue() and Optimizer.SetGradientValue() methods, which make it easy to keep track of which element is being assigned by referencing the constraint and state variable labels, as shown below:

```
opt.SetJacobianValue("constraintExpression", "x", 1);
opt.SetJacobianValue("constraintExpression", "y", 2*y);

opt.SetGradientValue("x", 2*(x + y));
opt.SetGradientValue("y", 2*(x + y));
```
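The appeal of label-based setters is that the mapping from labels to matrix positions is handled internally. A minimal sketch of that idea in Python (hypothetical class and method names, not FreeFlyer's implementation):

```python
# Labels are resolved to (row, column) positions internally, so the
# caller never has to remember raw indices.
class LabeledJacobian:
    def __init__(self, constraints, variables):
        self.rows = {c: i for i, c in enumerate(constraints)}
        self.cols = {v: j for j, v in enumerate(variables)}
        self.data = [[0.0] * len(variables) for _ in constraints]

    def set(self, constraint, variable, value):
        self.data[self.rows[constraint]][self.cols[variable]] = value

jac = LabeledJacobian(["constraintExpression"], ["x", "y"])
y = 4.0
jac.set("constraintExpression", "x", 1.0)
jac.set("constraintExpression", "y", 2 * y)
print(jac.data)  # [[1.0, 8.0]]
```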

## Derivative Tuning Parameters

Several additional properties on the Optimizer object can be configured to change how FreeFlyer handles derivative calculation.

### FiniteDifferenceMethod

FreeFlyer's Optimizer uses a finite differencing implementation to compute numerical derivatives during the optimization process. The user can adjust the Optimizer.FiniteDifferenceMethod property to choose between a forward or central differencing method (forward differencing is used by default).

```
opt.FiniteDifferenceMethod = 0; // Forward Difference
opt.FiniteDifferenceMethod = 1; // Central Difference
```
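The trade-off between the two schemes can be seen in a quick Python comparison (illustrative only): forward differencing costs one extra function evaluation per variable and is first-order accurate, while central differencing costs two but is second-order accurate.

```python
import math

f = math.sin            # test function; exact derivative is cos(x)
x, h = 1.0, 1e-4
exact = math.cos(x)

forward = (f(x + h) - f(x)) / h              # error ~ O(h)
central = (f(x + h) - f(x - h)) / (2 * h)    # error ~ O(h^2)

print(abs(forward - exact))  # on the order of 1e-5
print(abs(central - exact))  # on the order of 1e-9
```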

### ValidateUserDerivatives

This boolean property indicates whether FreeFlyer should report an error if a user-provided derivative differs significantly from the finite difference derivative evaluated during the sampling process. When set to true, if the user provides a Gradient or Jacobian value that differs by more than 10 percent from the value calculated through finite differencing, the process reports an error. This functionality is turned on by default.

```
opt.ValidateUserDerivatives = 1;
```
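The kind of check this performs can be sketched as a relative-difference comparison. The sketch below is hypothetical Python, using the 10 percent threshold stated above; the function name and zero-handling are assumptions, not FreeFlyer's implementation:

```python
# Compare a user-supplied derivative to a finite-difference estimate
# and flag values that disagree by more than the relative tolerance.
def validate_derivative(user_value, fd_value, rel_tol=0.10):
    if fd_value == 0.0:
        return user_value == 0.0
    return abs(user_value - fd_value) / abs(fd_value) <= rel_tol

print(validate_derivative(8.0, 8.0001))  # True  -- close agreement
print(validate_derivative(8.0, 16.0))    # False -- likely a sign or index bug
```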

### UseJacobianSparsity

The sparsity of a Jacobian matrix is determined by the number of elements that are equal to zero. This boolean property indicates whether the optimization engine should take advantage of the sparsity of the Jacobian to simplify the optimization algorithm, or always process the full dense Jacobian. This functionality is turned on by default, but has no effect when using NLopt, which always uses a dense Jacobian.

```
opt.UseJacobianSparsity = 1;
```
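The benefit of exploiting sparsity is that only the nonzero entries need to be stored and processed. A minimal Python sketch of the idea, using a coordinate-list representation (illustrative, not the engine's internals):

```python
# Convert a dense matrix to a list of (row, col, value) nonzero entries,
# letting a solver skip work on structural zeros.
def to_sparse(dense):
    return [(i, j, v)
            for i, row in enumerate(dense)
            for j, v in enumerate(row)
            if v != 0.0]

dense = [[1.0, 0.0, 0.0],
         [0.0, 0.0, 3.0]]
sparse = to_sparse(dense)
print(sparse)                        # [(0, 0, 1.0), (1, 2, 3.0)]
print(len(sparse), "of 6 entries stored")
```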