
What are the strengths of each of CST's four optimization algorithms?

05-08
In which situations should each of CST's four optimization algorithms be used — the genetic algorithm, the particle swarm algorithm, some Newton iteration method, and so on? Which one generally reaches the optimal result fastest?

The CST help file explains this in detail:
Choose between the five optimizer types. The Interpolated Quasi Newton algorithm makes use of approximated gradient information, and the Powell optimizer of partial derivatives, to achieve faster convergence rates. However, these algorithms are sensitive to the choice of the starting point in the parameter space. If the starting point is close to the desired optimum, or the (unknown) goal function is sufficiently smooth, then the local algorithms will converge quickly.
The Interpolated Quasi Newton optimizer is fast due to its support of interpolation of primary data, but in some cases it may not be as accurate as the slower Classic Powell optimizer.
The Nelder Mead Simplex Algorithm has a set of starting points and does not need gradient information to determine its search direction. This is an advantage over the other local algorithms as soon as the number of variables grows.
If a non-smooth goal function is expected, the starting point is far away from the optimum, or a large parameter space is going to be explored, then a global algorithm should be preferred. For the featured global optimizers a maximal number of iterations can be specified. Therefore the maximal number of goal function evaluations, and thus the optimization time, can be determined a priori. Another advantage of the global optimizers is that the number of evaluations is independent of the number of parameters. Therefore the choice of a global optimizer over a local one can pay off if the optimization problem has a large number of parameters.
Genetic Algorithm: Selects the global genetic optimizer.
Particle Swarm Optimization: Selects the global particle swarm optimizer.
Nelder Mead Simplex Algorithm: Selects the local Simplex optimization algorithm by Nelder and Mead. This method is a local optimization technique. If N is the number of parameters, it starts with N+1 points distributed in the parameter space.
Interpolated Quasi Newton: Selects the local optimizer supporting interpolation of primary data. This optimizer is fast in comparison to the Classic Powell optimizer but may be less accurate. In addition, you can set the number N of optimizer passes (1 to 10) for this optimizer type. A number N greater than 1 forces the optimizer to start over (N-1) times. Within each optimizer pass the minimum and maximum settings of the parameters (see Optimizer Parameters) are changed approaching the optimal parameter setting. Increase the number of passes to values greater than 1 (e.g., 2 or 3) to obtain more accurate results.
Classic Powell: Selects the local optimizer without interpolation of primary data. In addition, it is necessary to set the accuracy, which affects the accuracy of the optimal parameter settings and the time of termination of the optimization process. For optimizations with more than one parameter, the Interpolated Quasi Newton or the Nelder Mead Simplex Algorithm should be preferred to this technique.
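To make the "a priori evaluation budget" property of the global optimizers concrete, here is a minimal particle swarm sketch in plain Python. This is an illustration of the general technique only, not CST's internal implementation; the toy goal function, swarm size, and coefficients are all assumptions chosen for the example. Note that the total number of goal-function evaluations is n_particles × (iters + 1), fixed before the run starts and independent of the number of parameters, exactly as the help text describes.

```python
import random

def pso(goal, bounds, n_particles=20, iters=60, w=0.7, c1=1.5, c2=1.5, seed=1):
    """Minimal particle swarm optimizer: minimizes goal over box bounds."""
    rng = random.Random(seed)
    dim = len(bounds)
    # Random initial positions, zero initial velocities.
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                 # each particle's best position
    pbest_val = [goal(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]  # swarm's best position
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                # Inertia + pull toward personal best + pull toward swarm best.
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                lo, hi = bounds[d]
                pos[i][d] = min(max(pos[i][d] + vel[i][d], lo), hi)
            v = goal(pos[i])
            if v < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], v
                if v < gbest_val:
                    gbest, gbest_val = pos[i][:], v
    return gbest, gbest_val

# Toy goal function: squared distance of (x, y) from the optimum (1.0, 2.0).
best, val = pso(lambda p: (p[0] - 1.0) ** 2 + (p[1] - 2.0) ** 2,
                bounds=[(-5.0, 5.0), (-5.0, 5.0)])
```

In a real CST run the goal function is of course a full field simulation per evaluation, which is why the fixed evaluation budget matters so much for planning optimization time.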
I haven't used Optimisation myself, so I have no experience to share. Going by the help text above, there are two global optimisers (Genetic Algorithm, Particle Swarm Optimisation) and three local optimisers (Nelder Mead Simplex Algorithm, Interpolated Quasi Newton, Classic Powell). IQN is faster than CP, but its accuracy may not be as high. If the optimization involves many parameters, it is best to also use a global optimiser.

OK, thanks!

No problem. Judging from your earlier posts, you seem to be trying optimization simulations with MWS and DS. My personal suggestion is to try each of these optimization algorithms and see, for your specific case (e.g., the example you are working on now), which combination of algorithms gives the most satisfactory result (balancing speed and accuracy).
I hope that once your experiments succeed you will start a dedicated thread describing the workflow and summarizing what you learned.

To be honest, the results have been poor the whole time and it's driving me crazy. The optimization takes very long, yet the improvement is marginal. I keep suspecting there is something wrong with my model or my parameter settings. If I gain any experience I will definitely share it! Haha

Optimization with CST's optimizer takes a long time, especially when the model is electrically large or complex — the wait can be agonizing. The genetic algorithm and the particle swarm algorithm are global methods: they need many iterations to converge reasonably well, and the count also depends on how the variables are sampled. The other two algorithms are local methods; with few variables a local algorithm converges faster, while with many variables a global algorithm tends to converge faster. As far as I can tell, CST currently offers no hybrid optimization method. In short, field optimization simply takes a long time. This is just my personal impression from using CST's optimizer.

In short, CST's optimization capability is relatively weak.

I rarely use the optimiser, so I have no personal opinion on this. I myself prefer to tune parameters manually with parameter sweep. Just a personal habit — for reference only!

Sweep first, then optimize — that's my suggestion~
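The sweep-then-optimize workflow can be sketched in plain Python. Everything here is illustrative: the goal function is a hypothetical stand-in for a CST goal (e.g., reflection at the design frequency as a function of two geometry parameters), and the coordinate-descent refinement merely stands in for one of CST's local optimizers started from the best sweep point.

```python
def goal(p):
    # Hypothetical stand-in for a CST goal function of two geometry parameters.
    x, y = p
    return (x - 1.3) ** 2 + (y - 0.7) ** 2

# Step 1: coarse parameter sweep over the whole range (like CST's Parameter
# Sweep) to locate a promising region.
grid = [(x * 0.5, y * 0.5) for x in range(-4, 5) for y in range(-4, 5)]
start = min(grid, key=goal)

# Step 2: local refinement from the best sweep point — simple coordinate
# descent with a shrinking step, playing the role of a local optimizer.
p, step = list(start), 0.25
while step > 1e-4:
    improved = False
    for d in range(2):
        for s in (step, -step):
            q = p[:]
            q[d] += s
            if goal(q) < goal(p):   # keep the move only if it improves the goal
                p, improved = q, True
    if not improved:
        step *= 0.5                 # no move helped: refine the step size
```

The sweep gives the local stage a starting point close to the optimum, which is exactly the condition under which the help text says the local algorithms converge quickly.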

Post #8 is right — at the very least the optimization needs a reasonably well-defined parameter range; only then can it converge quickly.
