Optimization by Direct Search: New Perspectives on Some Classical and Modern Methods

Abstract

Direct search methods are best known as unconstrained optimization techniques that do not explicitly use derivatives. Direct search methods were formally proposed and widely applied in the 1960s but fell out of favor with the mathematical optimization community by the early 1970s because they lacked coherent mathematical analysis. Nonetheless, users remained loyal to these methods, most of which were easy to program, some of which were reliable. In the past fifteen years, these methods have seen a revival due, in part, to the appearance of mathematical analysis, as well as to interest in parallel and distributed computing. This review begins by briefly summarizing the history of direct search methods and considering the special properties of problems for which they are well suited. Our focus then turns to a broad class of methods for which we provide a unifying framework that lends itself to a variety of convergence results. The underlying principles allow generalization to handle bound constraints and linear constraints. We also discuss extensions to problems with nonlinear constraints.
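
One of the simplest members of this class, and the introductory example used in the paper, is compass search (coordinate search): sample the objective along the coordinate directions, move to any point that improves the current value, and shrink the step length when no direction helps. Below is a minimal Python sketch of that idea; the function name, tolerances, maximum iteration count, and the step-halving factor are illustrative choices, not the paper's exact algorithm statement.

def compass_search(f, x0, step=1.0, tol=1e-6, max_iters=10_000):
    """Minimize f over R^n without using derivatives.

    At each iteration, f is sampled at the 2n points x +/- step * e_i.
    A successful poll moves to the first improving point found; otherwise
    the step length is halved. Iteration stops once step falls below tol.
    """
    x = list(x0)
    fx = f(x)
    n = len(x)
    iters = 0
    while step > tol and iters < max_iters:
        improved = False
        for i in range(n):
            for sign in (+1.0, -1.0):
                trial = x[:]
                trial[i] += sign * step
                ft = f(trial)
                if ft < fx:              # simple decrease is enough to accept
                    x, fx = trial, ft
                    improved = True
                    break
            if improved:
                break
        if not improved:
            step *= 0.5                  # contract after an unsuccessful poll
        iters += 1
    return x, fx

# Example: minimize the Rosenbrock function with no gradient information.
if __name__ == "__main__":
    rosen = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
    xmin, fmin = compass_search(rosen, [-1.2, 1.0])
    print(xmin, fmin)

The generating-set-search framework developed in the paper generalizes this pattern: provided the search directions positively span R^n and the step length is controlled as above, suitable smoothness assumptions yield convergence of a subsequence of iterates to a stationary point.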

Publication
SIAM Review
Date
August 2003
Citation
T. G. Kolda, R. M. Lewis, and V. Torczon. Optimization by Direct Search: New Perspectives on Some Classical and Modern Methods. SIAM Review, Vol. 45, No. 3, pp. 385–482, 2003. https://doi.org/10.1137/S003614450242889

Keywords

nonlinear programming, nonlinear optimization, direct search, pattern search, simplex search, positive bases, global convergence analysis, local convergence analysis, generating set search

BibTeX

@article{KoLeTo03,
  author  = {Tamara G. Kolda and Robert Michael Lewis and Virginia Torczon},
  title   = {Optimization by Direct Search: New Perspectives on Some Classical and Modern Methods},
  journal = {SIAM Review},
  volume  = {45},
  number  = {3},
  pages   = {385--482},
  month   = {August},
  year    = {2003},
  doi     = {10.1137/S003614450242889},
}