Optimizer - Settings

On this property page you can select the algorithm type and the parameters for optimization. You can also set the value bounds and the number of samples for the selected parameters.

General controls

See Optimizer online help.

Algorithm

Choose between the six algorithm types. The Trust Region Framework is the most modern of the implemented algorithms. It uses local linear models on primary data. The Interpolated Quasi Newton optimizer makes use of approximated gradient information to achieve fast convergence rates. However, such algorithms are in general sensitive to the choice of the starting point in the parameter space. This optimizer is fast because it supports interpolation of primary data. If the starting point is close to the desired optimum, or the (unknown) goal function is sufficiently smooth, the local algorithm will converge quickly. The Trust Region Framework is the most robust of the algorithms, because the trust region approach always assures convergence to a stationary point. It is also very efficient: by interpolating primary data it avoids many solver runs without sacrificing accuracy. The Nelder Mead Simplex Algorithm is another local method. It does not need to compute gradient information, which is an advantage when the number of variables grows. It also depends less on the chosen starting point, because it starts with a set of points distributed in the parameter space. Compared with the other local algorithms, this is an advantage if you have a bad starting point, but a disadvantage if your starting point already lies close to the desired optimum.

If a non-smooth goal function is expected, the starting point is far away from the optimum, or a large parameter space is going to be explored, then a global algorithm should be preferred. For the featured global optimizers a maximal number of iterations can be specified. Therefore the maximal number of goal function evaluations, and thus the optimization time, can be determined a priori. Another advantage of the global optimizers is that the number of evaluations is independent of the number of parameters. Therefore the choice of a global optimizer over a local one can pay off if the optimization problem has a large number of parameters. The CMA Evolutionary Strategy, the most sophisticated of the implemented global optimizers, uses a statistical model in combination with a step size parameter. In addition, the history of successful optimization steps is exploited. This improves the algorithm's performance without losing its global optimization properties.

 

Trust Region Framework: Selects a local optimization technique embedded in a trust region framework. The algorithm starts by building a linear model on primary data in a "trust" region around the starting point. Fast optimizations are performed on this local model to obtain a candidate for a new solver evaluation. The new point is accepted if it is superior to the anchors of the model. If the model is not accurate enough, the radius of the trust region is decreased and a model on the new trust region is created. The algorithm has converged once the trust region radius or the distance to the next predicted optimum becomes smaller than the specified domain accuracy.
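The accept-or-shrink loop described above can be pictured with the following sketch. This is a simplified illustration only, not the product's implementation: the linear model, step rule, and shrink factor are assumptions.

```python
import math

def trust_region_sketch(goal, x0, radius=0.5, domain_accuracy=1e-3, max_iter=50):
    """Illustrative trust-region loop: build a linear model around the
    current point, step toward the model's minimum inside the region,
    accept the step if the true goal improves, otherwise shrink."""
    x = list(x0)
    n = len(x)
    for _ in range(max_iter):
        fx = goal(x)
        # Linear model from finite differences (the "anchor" evaluations).
        g = []
        for i in range(n):
            xi = x[:]
            xi[i] += radius
            g.append((goal(xi) - fx) / radius)
        norm = math.sqrt(sum(gi * gi for gi in g))
        if norm == 0.0:
            break
        # Candidate: a step of length `radius` along the descent direction.
        candidate = [x[i] - radius * g[i] / norm for i in range(n)]
        if goal(candidate) < fx:
            x = candidate      # model prediction confirmed: accept the point
        else:
            radius *= 0.5      # model too coarse: shrink the trust region
        if radius < domain_accuracy:
            break              # converged in the parameter domain
    return x
```

On a smooth goal function the loop steps quickly toward the optimum and then refines the region radius until the domain accuracy is reached.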

Nelder Mead Simplex Algorithm: Selects the local Simplex optimization algorithm by Nelder and Mead. If N is the number of parameters, it starts with N+1 points distributed in the parameter space.
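The N+1 starting points form a so-called simplex. One common way to construct such an initial set, shown here purely as an illustration (the spread value and construction are assumptions, not the documented behavior), is the initial point plus one point offset along each parameter axis:

```python
def initial_simplex(x0, spread=0.1):
    """Return N+1 starting points for an N-parameter Nelder-Mead run:
    the initial point plus one point offset along each axis."""
    points = [list(x0)]
    for i in range(len(x0)):
        p = list(x0)
        p[i] += spread       # offset along parameter axis i
        points.append(p)
    return points
```

For two parameters this yields three points, which is why the method is comparatively insensitive to a single bad starting point.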

CMA Evolutionary Strategy: Selects the global covariance matrix adaptation evolutionary strategy.

Genetic Algorithm: Selects the global genetic optimizer.

Particle Swarm Optimization: Selects the global particle swarm optimizer.

Interpolated Quasi Newton: Selects the local optimizer supporting interpolation of primary data. In addition, you can set the number N of optimizer passes (1 to 10) for this optimizer type. A number N greater than 1 forces the optimizer to start over (N-1) times. Within each optimizer pass the minimum and maximum settings of the parameters (see Optimizer - Settings) are changed, approaching the optimal parameter setting. Increase the number of passes to values greater than 1 (e.g., 2 or 3) to obtain more accurate results. The corresponding numerical solver will only be evaluated for the defined samples. All other parameter combinations will be evaluated using the interpolation of primary data. At the end of each optimization pass the optimum predicted by this approach is verified by another evaluation of the numerical solver (e.g., "Update all tasks").
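The narrowing of the parameter bounds over successive passes can be sketched as follows. The re-centering rule and the shrink factor here are hypothetical; the documentation above only states that the bounds approach the optimal parameter setting with each pass.

```python
def shrink_bounds(lo, hi, best, factor=0.5):
    """After a pass, re-center the parameter interval on the predicted
    optimum and shrink it, while keeping it inside the original bounds."""
    half = factor * (hi - lo) / 2.0
    new_lo = max(lo, best - half)
    new_hi = min(hi, best + half)
    return new_lo, new_hi
```

Applying this once per extra pass narrows the search interval around the best point found so far, which is why additional passes tend to give more accurate results.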

 

Settings: If a global algorithm or the Nelder Mead Simplex Algorithm is selected, the settings dialog for the corresponding optimizer will be opened.

Reset min/max

Specify a percentage value in the associated edit field and press this button to quickly modify the bounds of the parameters selected for optimization. The lower/upper bound is then set to the initial value minus/plus the given percentage of that value.
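For an initial value v and a percentage p, the resulting bounds are presumably computed as sketched below (an illustration of the stated rule, not the product's code; the handling of negative initial values via the absolute value is an assumption):

```python
def reset_min_max(initial, percent):
    """Lower/upper bound = initial value minus/plus the given
    percentage of the initial value."""
    delta = abs(initial) * percent / 100.0   # percentage of the initial value
    return initial - delta, initial + delta
```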

Use current as initial values

Activate this check button to initialize the optimizer with the current parameter values. This allows you to continue the optimization process, starting the solver with the previously achieved parameter results. However, if you want to run the optimizer several times with the same initial parameter conditions, you must disable this check button.

Use data of previous calculations

Activate this check button to trigger the import of previously calculated results for new optimizations, to speed up the optimization process. If the result templates on which the optimizer goals are based have already been evaluated, and the corresponding parameter combinations lie in the defined parameter space, the results might be imported without the need for recalculation. For the local algorithms it is possible that the initial point is replaced if a more suitable point is found in advance. For the algorithms that use a set of initial points, multiple initial points may be replaced by points that lie close by or have a better goal value than the points in their close neighbourhood. This may disturb the selected distribution type, but the algorithm will find a good compromise between points with good goal values and a well distributed set of starting points in the parameter space. Keep in mind that this feature makes optimizations harder to reproduce, because after an optimization more potential imports are available than before.

Parameter list

Check box: Select the parameters that are varied during the optimization run.

Parameter: Displays the name of the parameter.

Min/Max: Set the interval considered during the optimization process for the respective parameter here. Note that the minimum/maximum value must be less/greater than the initial parameter value.

Samples: Set the number of samples used for interpolation for the respective parameter here. For a larger parameter range, a higher sample value may lead to more accurate results. Note that a minimum of 3 is required. If the Genetic or the Particle Swarm optimizer is used and the interpolation is switched off, this setting will have no effect.
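The number of true solver runs needed to build the interpolation model grows with the per-parameter sample counts. As a rough illustration, assuming a full tensor grid of anchors (an assumption for this sketch, not a documented detail):

```python
def anchor_count(samples_per_parameter):
    """Full-grid anchor count: the product of the per-parameter
    sample numbers, each of which must be at least 3."""
    total = 1
    for s in samples_per_parameter:
        if s < 3:
            raise ValueError("at least 3 samples per parameter are required")
        total *= s
    return total
```

This is why, as noted for the global optimizers below, interpolation only pays off when the parameter space is not too high dimensional.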

Initial: Set the initial value for the respective parameter here.

Current: Shows the current parameter values.

Best: Shows the current optimal parameter values.

The following settings are available depending on the chosen algorithm type:

Properties

If a global algorithm or the Nelder Mead Simplex Algorithm is selected, the properties dialog for the corresponding optimizer will be opened.

Use Interpolation

For both global optimizers it is possible to switch on the Interpolation of Primary Data. If the interpolation is applied, the only true solver runs performed are those for the evaluation of the specified anchors and a final solver run for the estimated best parameters. All other goal function evaluations will be interpolated.

Please note that global optimization algorithms are likely to explore most of the parameter space. Thus it is most likely that all or nearly all anchor points will actually be evaluated. Keep in mind that the number of solver runs needed for interpolation depends on the number of parameters, whereas the number of solver runs needed by the two global optimization algorithms is independent of the number of parameters. Because of this, the interpolation feature will only pay off if the parameter space is not too high dimensional or a large number of iterations is planned.

Since the goal functions that can be defined always have non-negative values, the optimization will automatically be stopped if one of the anchor evaluations yields a goal value equal to zero.

Include anchor in Initial Point Set

If this feature is switched on, the point defined as the initial point on the parameter settings property page will be included in the initial data set of the algorithm. If the current parameter settings are already quite good, it makes sense to include this point in the starting set. After the set of initial points is generated, the closest point from the automatically generated set is substituted with the predefined point. However, if the current point was created by a previous optimization run of a local optimizer and a second optimization is planned on a reduced parameter space, this setting should be turned off, because it increases the risk that the second optimization converges to the same local optimum as before. In this case the second optimization won't yield any improvement.

Optimizer passes

This setting is only available for the Interpolated Quasi Newton. Set the number of optimizer passes for the Interpolated Quasi Newton optimizer here.

Domain accuracy

This setting is only available for the Trust Region Framework. Set the accuracy of the optimizer in the parameter space, where all parameter ranges are mapped to the interval [0,1].
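Measuring the accuracy in normalized coordinates means every parameter range is first mapped to [0,1]; as a small illustrative sketch:

```python
def normalize(value, lo, hi):
    """Map a parameter value from its range [lo, hi] to [0, 1],
    so the domain accuracy is comparable across parameters."""
    return (value - lo) / (hi - lo)
```

A domain accuracy of, say, 0.001 therefore corresponds to one thousandth of each parameter's min/max range, regardless of the parameter's physical units.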

 

See also

Optimizer Overview, Optimizer, Optimizer - Goals, Optimizer - Algorithm Settings, Optimizer - Info