Ansoft Designer / Ansys Designer Online Help Documentation:


Optimetrics >
   Setting up an Optimization Analysis >
       Available Optimizers >
           Optimizers in Planar EM and Nexxim               


Optimizers in Planar EM and Nexxim

In Planar EM and Nexxim analyses, you can choose among the following optimizers: Sequential Nonlinear Programming, Sequential Mixed-Integer Nonlinear Programming, Quasi Newton, Pattern Search, and Genetic Algorithm. In most cases, the Sequential Nonlinear Programming optimizer is recommended.

Quasi Newton

If the Sequential Nonlinear Programming optimizer has difficulty, and if numerical noise is insignificant during the solution process, use the Quasi Newton optimizer to obtain the results. This optimizer searches for the minimum of a user-defined cost function by approximating the function's gradient. The gradient approximation is accurate only if little noise is involved in the cost function calculation; that calculation involves Finite Element Analysis (FEA), which has finite accuracy.
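
As an illustration of the gradient-approximation idea, here is a minimal Python sketch of a quasi-Newton-style search driven by finite-difference gradients. The cost function, step damping, and update details are illustrative assumptions, not the exact algorithm Optimetrics implements.

    # Minimal sketch: quasi-Newton search with finite-difference gradients.
    # The cost function is a hypothetical stand-in for an expensive,
    # FEA-derived cost; noise in a real FEA cost would corrupt the
    # finite-difference gradient below.
    import numpy as np

    def cost(x):
        return (x[0] - 1.0)**2 + 10.0 * (x[1] + 0.5)**2

    def fd_gradient(f, x, h=1e-6):
        # forward-difference gradient approximation; accurate only when
        # numerical noise in f is well below h-scale changes
        g = np.zeros_like(x)
        fx = f(x)
        for i in range(len(x)):
            xp = x.copy()
            xp[i] += h
            g[i] = (f(xp) - fx) / h
        return g

    x = np.array([3.0, 2.0])
    B_inv = np.eye(2)                      # inverse-Hessian estimate
    for _ in range(100):
        g = fd_gradient(cost, x)
        s = -0.1 * (B_inv @ g)             # damped quasi-Newton step
        g_new = fd_gradient(cost, x + s)
        y = g_new - g
        if y @ s > 1e-12:                  # BFGS-style inverse-Hessian update
            rho = 1.0 / (y @ s)
            I = np.eye(2)
            B_inv = ((I - rho * np.outer(s, y)) @ B_inv @
                     (I - rho * np.outer(y, s)) + rho * np.outer(s, s))
        x = x + s
    print(x)                               # approaches [1.0, -0.5]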

Pattern Search

If the noise is significant in the nominal project, use the Pattern Search optimizer to obtain the results. It performs a grid-based simplex search, which makes use of simplices: a triangle in a 2D search space, a tetrahedron in 3D, and so on. The cost value is calculated at the vertices of the simplex. Following mathematical guidelines, the optimizer mirrors the simplex across one of its faces and checks whether the new simplex provides a better result. If it does, the step is accepted and the new simplex replaces the original one. If it does not, the next face is used for mirroring, and the pattern continues. If no face produces an improvement, the grid is refined. Pattern search algorithms are less sensitive to noise than the other methods.
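
The poll-and-refine structure is visible in this minimal Python sketch of a compass-style pattern search (polling along the +/- coordinate directions rather than mirroring a simplex, a simplification of the variant the help text describes). The cost function and its noise term are hypothetical stand-ins for a noisy FEA-derived cost.

    # Minimal sketch: derivative-free pattern search (compass variant).
    # Poll the +/- coordinate directions; refine the grid on failure.
    import numpy as np

    def cost(x):
        # hypothetical noisy stand-in for an FEA-derived cost
        return (x[0] - 1.0)**2 + (x[1] + 0.5)**2 + 1e-3 * np.random.rand()

    x = np.array([3.0, 2.0])
    step = 1.0
    fx = cost(x)
    while step > 1e-3:
        improved = False
        for d in (np.array([1.0, 0.0]), np.array([-1.0, 0.0]),
                  np.array([0.0, 1.0]), np.array([0.0, -1.0])):
            trial = x + step * d
            ft = cost(trial)
            if ft < fx:                # accept the first improving poll point
                x, fx = trial, ft
                improved = True
                break
        if not improved:
            step *= 0.5                # no improvement anywhere: refine the grid
    print(x)                           # near [1.0, -0.5] despite the noise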

Sequential Non-Linear Programming

The main advantage of Sequential Nonlinear Programming (SNLP) over Quasi Newton is that it models the optimization problem in greater depth. This optimizer assumes that the optimization variables span a continuous space.

Like the Quasi Newton optimizer, the SNLP optimizer assumes that the noise is not significant. It does reduce the effect of noise, but the noise filtering is not strong. The SNLP optimizer approximates the FEA characterization with response surfaces. Combining this FEA approximation with inexpensive evaluation of the rest of the cost function gives SNLP a good approximation of the cost function in terms of the optimization variables, which allows it to estimate the location of improving points. Because the overall cost approximations are more accurate, the SNLP optimizer achieves a faster practical convergence speed than Quasi Newton.

The SNLP optimizer attempts to solve a series of Nonlinear Programming (NLP) problems on inexpensive, local surrogates. Direct application of a Nonlinear Programming solver is impractical because each cost evaluation involves finite element analysis (FEA), which uses extensive computational resources.

The SNLP method is similar to the Sequential Quadratic Programming (SQP) method in two ways: both are sequential, and both use local, inexpensive surrogates. In the SNLP case, however, the surrogate can be of a higher order and is more generally constrained. The inexpensive surrogate model is obtained by response surface (RS) techniques. The goal is a surrogate model that is accurate enough on a wider scale, so that the search procedure is well led by the surrogate even for relatively large steps. All functions calculated by the supporting finite element product (for example, Maxwell 3D or Designer) are assumed to be expensive, while the rest of the cost calculation (for example, an extra user-defined expression), which is implemented in Optimetrics, is assumed to be inexpensive. For this reason, it makes sense to remove inexpensive evaluations from the finite element problem and, instead, implement them in Optimetrics. This optimizer holds several advantages over the Quasi Newton and Pattern Search optimizers.

Most importantly, due to the separation of expensive and inexpensive evaluations in the cost calculation, the SNLP optimizer is more tightly integrated with the supporting FEA tools. This tight integration provides more insight into the optimization problem, resulting in a significantly faster optimization process. A second advantage is that the SNLP optimizer does not require cost derivatives to be approximated, protecting against uncertainties (noise) in cost evaluations. Beyond making RS-based SNLP derivative-free, the RS technique also has noise-suppression properties. Finally, this optimizer allows you to use nonlinear constraints, making this approach much more general than either of the other two optimizers.
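
To make the surrogate idea concrete, here is a minimal Python sketch of a sequential response-surface loop in one variable: fit a cheap quadratic to the expensive samples gathered so far, jump to the surrogate's minimizer, and pay for one new expensive evaluation per iteration. The cost function, bounds, and fallback rule are illustrative assumptions; the production optimizer uses richer response surfaces and handles constraints.

    # Minimal sketch: sequential optimization on a response-surface surrogate.
    import numpy as np

    def expensive_cost(x):
        # hypothetical stand-in for an FEA solve
        return (x - 2.0)**2 + 0.5 * np.sin(5.0 * x)

    xs = [0.0, 1.0, 4.0]                       # initial expensive samples
    fs = [expensive_cost(x) for x in xs]
    for _ in range(10):
        # cheap surrogate: least-squares quadratic fit a*x^2 + b*x + c
        a, b, c = np.polyfit(xs, fs, 2)
        if a <= 0:                             # surrogate not convex: nudge the best sample
            x_new = xs[int(np.argmin(fs))] + 0.1
        else:
            x_new = -b / (2.0 * a)             # analytic minimizer of the surrogate
        x_new = float(np.clip(x_new, -5.0, 8.0))  # keep inside the variable bounds
        xs.append(x_new)
        fs.append(expensive_cost(x_new))       # one expensive evaluation per iteration
    print(xs[-1], min(fs))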

Sequential Mixed Integer Non-Linear Programming

To optimize on a discrete quantity such as the number of turns or quarter turns, the optimizer must handle discrete optimization variables. This optimizer can mix continuous and integer variables, can work with integers only, and also works when all variables are continuous. The setup resembles that for SNLP, except that you must flag the integer variables. You can set up internal variables based on an integer optimization variable.

For example, consider N to be an integer optimization variable; by definition, it can assume only integer values. You can establish another variable that depends on it, such as K = 2.345 * N or K = sin(30 * N). In this way K has a discrete value but is not necessarily an integer. Alternatively, you can use N directly as a design parameter.
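
A minimal Python sketch of this setup follows, with a hypothetical cost and a brute-force loop over N standing in for the optimizer's far more efficient integer search.

    # Minimal sketch: an integer variable N driving a derived discrete
    # (but non-integer) variable K, mixed with a continuous variable w.
    def cost(K, w):
        # hypothetical cost mixing a discrete parameter and a continuous one
        return (K - 7.0)**2 + (w - 0.3)**2

    best = None
    for N in range(1, 6):                  # integer optimization variable
        K = 2.345 * N                      # derived discrete, non-integer variable
        w = 0.3                            # trivial continuous sub-problem: optimum is 0.3
        f = cost(K, w)
        if best is None or f < best[0]:
            best = (f, N, K, w)
    print(best)                            # N = 3 gives K = 7.035, closest to 7.0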

Genetic Algorithm

The Genetic Algorithm (GA) search is an iterative process that runs through a number of generations. In each generation, new individuals (Children / Number of Individuals) are created, and the enlarged population then undergoes a selection (natural-selection) process that reduces the population to the desired size (Next Generation / Number of Individuals).

When a smaller set of individuals must be created from a bigger set, the GA selects individuals from the original set. During this process, individuals that are fitter with respect to the cost function are preferred. With elitist selection, simply the required number of best individuals is selected; if you turn on roulette selection, the selection process is relaxed. An iterative process selects individuals to fill the resulting set, but instead of always taking the best ones, it spins a roulette wheel on which each selection candidate owns a slice proportional to its fitness (relative to the cost function). The fitter an individual is, the larger its probability of survival.
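
Here is a minimal Python sketch of the roulette-wheel step, assuming lower cost means fitter. The cost-to-fitness transform and the selection-without-replacement detail are illustrative choices, not the exact rules of the GA in Optimetrics.

    # Minimal sketch: roulette-wheel selection. Each candidate gets a wheel
    # slice proportional to its fitness, so fitter individuals survive with
    # higher probability but weaker ones are not excluded outright (unlike
    # elitist selection, which simply keeps the best ones).
    import random

    def roulette_select(population, costs, n_survivors):
        # lower cost = fitter; convert costs to positive fitness weights
        worst = max(costs)
        fitness = [worst - c + 1e-9 for c in costs]
        pool = list(zip(population, fitness))
        survivors = []
        for _ in range(n_survivors):
            total = sum(f for _, f in pool)
            pick = random.uniform(0.0, total)   # spin the wheel
            acc = 0.0
            for i, (ind, f) in enumerate(pool):
                acc += f
                if acc >= pick:
                    survivors.append(ind)
                    pool.pop(i)                 # select without replacement
                    break
        return survivors

    pop = ["A", "B", "C", "D", "E", "F"]
    costs = [0.1, 0.9, 0.5, 0.2, 0.8, 0.4]
    print(roulette_select(pop, costs, 3))       # low-cost individuals dominate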



