Depending on which algorithm has been chosen on the optimizer's settings page, this dialog box lets you change the optimization settings for the CMA Evolution Strategy, the Nelder Mead Simplex Algorithm, the Genetic Algorithm, or the Particle Swarm Optimization.
Nelder Mead Simplex Algorithm
This algorithm features several stopping criteria because it combines properties of local and global optimization techniques. Like the global optimization algorithms, it starts with a set of points distributed in the parameter space; as the optimization run proceeds, it converges to some optimum, which may be a local extremum.
Goal Function Level: A desired goal function level can be specified for the Nelder Mead Simplex Algorithm. The algorithm stops as soon as the goal function value falls below the specified level. However, if the optimization runs distributed, this criterion is only checked after all parallel evaluations have been calculated. If the desired level is set to zero, the algorithm stops only when one of the other criteria is satisfied.
Minimal Simplex Size: For optimization, the parameter space is mapped onto the unit cube. The simplex is a geometrical figure that moves in this multidimensional space. The algorithm stops as soon as the largest edge of the simplex is smaller than the specified size. If the optimization is defined over just one parameter in the interval [0;1], this setting corresponds to the desired accuracy in the parameter space.
Maximal Number of Evaluations: Depending on the optimization problem definition, it is possible that the specified goal function level cannot be reached. If, in addition, a small value is chosen for the minimal simplex size, the algorithm may need many goal function evaluations to converge. In this case it is convenient to define a maximal number of function evaluations to restrict the optimization time a priori. This number must be greater than one.
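The three stopping criteria above can be sketched as a single check. This is a minimal illustration in Python; the function and parameter names are hypothetical and do not reflect the optimizer's actual internals:

```python
import itertools

def should_stop(best_value, simplex, n_evals,
                goal_level=0.0, min_simplex_size=1e-3, max_evals=500):
    """Return True if any Nelder Mead stopping criterion is met.

    simplex is a list of points (tuples) in the unit cube; all names
    here are illustrative, not the optimizer's actual API.
    """
    # 1. Goal function level reached (disabled when the level is zero).
    if goal_level != 0.0 and best_value < goal_level:
        return True

    # 2. Largest simplex edge shorter than the minimal simplex size.
    def edge(p, q):
        return sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5
    largest = max(edge(p, q) for p, q in itertools.combinations(simplex, 2))
    if largest < min_simplex_size:
        return True

    # 3. Evaluation budget exhausted.
    return n_evals >= max_evals
```

Note that with a goal level of zero, only the simplex size and the evaluation budget can terminate the run, matching the behavior described above.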
CMA Evolution Strategy
Maximal Number of Evaluations: Depending on the optimization problem definition, it is possible that the specified goal function level cannot be reached, so that the algorithm needs many goal function evaluations to converge. In this case it is convenient to define a maximal number of function evaluations to restrict the optimization time a priori. This number must be greater than one.
Sigma: The initial step size (standard deviation) of the search distribution. A larger value favors global exploration of the parameter space; a smaller value favors local refinement around the starting point.
Genetic Algorithm
Population Size: The population size of the algorithm can be specified. Keep in mind that a small population size increases the risk that the genes become depleted, while a large population size requires more solver evaluations for the calculation of each generation.
Particle Swarm Optimization
Swarm Size: The swarm size of the algorithm can be specified. Keep in mind that a small swarm size increases the risk that the desired improvement needs more iterations, while a large swarm size requires more solver evaluations for each iteration.
Maximal Number of Iterations: The Particle Swarm Algorithm stops after the maximal number of iterations has been performed. This makes it possible to estimate the maximal optimization time a priori: if "n" is the swarm size and "m" is the maximal number of iterations, at most "m*n + 1" solver runs will be performed. However, this estimate is not valid if the interpolation feature is switched on, the optimization is aborted, or the desired accuracy is reached.
Goal Function Level: A desired goal function level can be specified for the Particle Swarm Algorithm. The algorithm stops as soon as the goal function value falls below the specified level. However, if the optimization runs distributed, this criterion is only checked after the complete swarm has been calculated. If the desired level is set to zero, the Maximal Number of Iterations is the only stopping condition.
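The "m*n + 1" estimate of the solver-run budget can be illustrated with a small helper (the function name is hypothetical; as noted above, the bound does not hold when interpolation is on, the run is aborted, or the goal level is reached early):

```python
def max_solver_runs(swarm_size, max_iterations):
    """Upper bound on solver runs for Particle Swarm Optimization:
    one initial run plus one run per particle per iteration,
    i.e. the m*n + 1 estimate. Illustrative helper only."""
    return max_iterations * swarm_size + 1

# For example, a swarm of 20 particles over at most 50 iterations
# requires at most 50 * 20 + 1 = 1001 solver runs.
print(max_solver_runs(20, 50))  # 1001
```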
General Settings for Optimization:
Choice of the Initial Data Set
For the featured global optimization techniques and the Nelder Mead Simplex Algorithm, a set of initial points in the parameter space is necessary. These points are generated automatically, either by a uniform random distribution generator or by the Latin Hypercube approach.
Uniform Random Numbers: The starting points are chosen by a pseudo-random number generator, uniformly distributed over the parameter space.
Latin Hypercube: Randomly chosen points sometimes have the disadvantage that they do not fill the parameter space well. Latin Hypercube sampling has the special property that the projection onto each parameter interval yields an equidistant sampling.
The initial points are distributed similarly to the Latin Hypercube Distribution, but a perturbation is added to each point. This distribution type has similar space-filling properties to the Latin Hypercube Distribution, but the generated point set is less regular. It is only available for the Nelder Mead Simplex Algorithm.
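Latin Hypercube sampling, with the optional perturbation described for the Nelder Mead variant, can be sketched as follows. This is a minimal illustration with hypothetical names, not the optimizer's actual generator:

```python
import random

def latin_hypercube(n_points, n_dims, perturb=0.0, rng=None):
    """Generate n_points samples in the n_dims-dimensional unit cube.

    Each parameter axis is split into n_points equal bins and every
    bin is used exactly once per axis, so the projection onto each
    axis is equidistant. perturb > 0 jitters each coordinate inside
    its bin, giving the less regular point set described above.
    """
    rng = rng or random.Random()
    # One independently shuffled column of bin indices per dimension.
    columns = []
    for _ in range(n_dims):
        bins = list(range(n_points))
        rng.shuffle(bins)
        columns.append(bins)
    points = []
    for i in range(n_points):
        point = []
        for d in range(n_dims):
            center = (columns[d][i] + 0.5) / n_points  # bin center
            jitter = rng.uniform(-perturb, perturb) / n_points
            point.append(min(1.0, max(0.0, center + jitter)))
        points.append(tuple(point))
    return points
```

With perturb=0.0 the projection onto every axis is exactly the set of bin centers; with perturb > 0 the points stay inside (or near) their bins but lose the strict regularity.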
Cancel
Close the optimizer dialog box without applying any changes.
OK
Stores the current settings and closes the dialog box. Please note that the settings made in this dialog box for the Genetic Algorithm or the Particle Swarm Optimization only take effect once Start or Apply is pressed afterwards.
Help
Shows this help text.
See also