CAD/CAE/CAM 2017 – Chapter 6 (.ppt)

Published 2018-06-10 in Shanghai
N-D Search

N-Dimension Optimization Methods

* Non-Gradient-Based Optimization Methods
  - Univariate Search
  - Direct Search (hunt and peck)
  - Random Search
  - Conjugate Direction Search
  - Method of Hooke-Jeeves
* Gradient-Based Search Schemes
  - Steepest Descent
  - Conjugate Gradient Search (Fletcher-Reeves)
  - Newton's Method
  - Quasi-Newton Methods
    -- DFP (Davidon-Fletcher-Powell Method)
    -- BFGS (Broyden-Fletcher-Goldfarb-Shanno Scheme)

The non-gradient-based search schemes work better on ill-behaved objective functions, but they are less efficient. The gradient-based search schemes are more efficient, but they are more sensitive to the shape of the objective function.

N-Dimensional Search

* The problem now has N design variables.
* Solve the multiple-design-variable optimization (minimization) problem using the 1-D search methods discussed previously.
* This is carried out in one of two ways:
  - Deal with one variable at a time, in sequential order: easy, but it takes a long time.
  - Introduce a new variable that changes all variables simultaneously: more complex, but quicker.

N-D Search Methods: Non-Gradient-Based Search Methods

Univariate Search

The search is carried out in a sequence of fixed, prespecified directions (usually the coordinate directions).

Small-step search:
* Hold all x_j constant except x_i.
* Compare f(x_i + ε) with f(x_i):
  - if the new f is lower than the old f, accept the step;
  - otherwise, move on to the next variable.
* Repeat until a stopping rule is satisfied.

Large-Step 1-D Search

* Two key questions:
  - Search direction: in what direction should we move (search)?
  - Next point: how far should we go, or where do we stop, along this search direction?
* Search along the coordinate directions.
* Introduce a new variable α that represents how far we go along the selected search direction.
* The value of α is determined by solving a 1-D optimization problem.
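The small-step univariate search described above can be sketched as follows. This is a minimal illustration, not the slide's reference implementation: the step size `eps`, the sweep limit, and the example objective are assumptions chosen for demonstration.

```python
# Small-step univariate (coordinate) search: perturb one variable at a time
# by a fixed step eps, keeping the step only if it lowers the objective.

def univariate_search(f, x, eps=0.1, max_sweeps=1000):
    x = list(x)
    for _ in range(max_sweeps):
        improved = False
        for i in range(len(x)):          # hold all x_j constant except x_i
            for step in (eps, -eps):     # compare f(x_i + eps) with f(x_i)
                trial = list(x)
                trial[i] += step
                if f(trial) < f(x):      # new f < old f -> accept the step
                    x = trial
                    improved = True
                    break
        if not improved:                 # stopping rule: no coordinate helps
            break
    return x

# Illustrative objective (an assumption): minimum at (1, -2).
sphere = lambda v: (v[0] - 1.0) ** 2 + (v[1] + 2.0) ** 2
print(univariate_search(sphere, [0.0, 0.0]))   # approaches [1.0, -2.0]
```

Note that with a fixed step size the search stalls within `eps` of the minimum; the large-step variant below removes this limitation by optimizing the step length itself.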
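The large-step idea, where the step length α along each coordinate direction is itself found by a 1-D optimization, can be sketched as follows. The golden-section routine, the bracket `[-span, span]`, and the example objective are illustrative assumptions, not taken from the source.

```python
import math

def golden_section(g, a, b, tol=1e-6):
    """Minimize a 1-D function g on the bracket [a, b] by golden-section search."""
    inv_phi = (math.sqrt(5.0) - 1.0) / 2.0        # 1/phi, about 0.618
    c, d = b - inv_phi * (b - a), a + inv_phi * (b - a)
    while abs(b - a) > tol:
        if g(c) < g(d):                            # minimum lies in [a, d]
            b, d = d, c
            c = b - inv_phi * (b - a)
        else:                                      # minimum lies in [c, b]
            a, c = c, d
            d = a + inv_phi * (b - a)
    return 0.5 * (a + b)

def coordinate_descent(f, x, span=5.0, sweeps=20):
    """Large-step search: along each coordinate direction e_i, choose the
    step alpha by minimizing g(alpha) = f(x + alpha * e_i)."""
    x = list(x)
    for _ in range(sweeps):
        for i in range(len(x)):
            def g(alpha, i=i):                     # objective along direction i
                trial = list(x)
                trial[i] += alpha
                return f(trial)
            x[i] += golden_section(g, -span, span)
    return x

# Illustrative objective (an assumption): minimum at (3, -1).
quad = lambda v: (v[0] - 3.0) ** 2 + 2.0 * (v[1] + 1.0) ** 2
print(coordinate_descent(quad, [0.0, 0.0]))        # approaches [3.0, -1.0]
```

Because the example objective is separable, one pass over the coordinates already lands on the minimizer; for coupled objectives the sweeps must be repeated until the moves become negligible.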
