DIRECT using local search on surrogates

Hemker, Thomas and Werner, Christian (2011):
DIRECT using local search on surrogates.
In: Pacific Journal of Optimization, 7 (3), p. 443, [Article]

Abstract

The solution of noisy nonlinear optimization problems with nonlinear constraints and without derivative information is becoming increasingly important, as many practical applications, e.g., in engineering, can be described by this type of problem. Existing local optimization methods show good convergence properties, but they often depend on sufficiently good starting points and/or the approximation of gradients. Global derivative-free methods, in turn, need no starting values to be initialized, but require many evaluations of the objective function, particularly in the vicinity of the solution. A derivative-free optimization algorithm is developed that combines the advantages of both local and global methods. The DIRECT algorithm, often used as a kind of brute-force start for problems where no prior knowledge is available, is extended by an inner loop that applies a surrogate-based optimization method. The local search on the surrogate function determines better sampling candidates than the hypercube center points chosen by DIRECT, especially when constraints are present. This inner loop needs no additional evaluations of the original problem. Standard test problems and a computationally more expensive test problem are chosen to demonstrate the performance of the new algorithm.
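The core idea of the abstract, replacing DIRECT's default center-point sample with the minimizer of a surrogate fitted to already-evaluated points, can be illustrated with a toy sketch. This is not the authors' implementation: the 1-D trisection, the quadratic least-squares surrogate, and the simplified interval-selection rule are all illustrative assumptions standing in for DIRECT's actual potentially-optimal-rectangle selection and the paper's surrogate model.

```python
import numpy as np

def objective(x):
    # Hypothetical toy objective (stand-in for an expensive simulation)
    return (x - 0.7) ** 2

def fit_quadratic_surrogate(xs, ys):
    # Least-squares fit y ~ a*x^2 + b*x + c to all evaluated points
    A = np.vstack([xs**2, xs, np.ones_like(xs)]).T
    coeffs, *_ = np.linalg.lstsq(A, ys, rcond=None)
    return coeffs

def surrogate_minimum(coeffs, lo, hi):
    # Inner local search: minimize the surrogate over the subinterval.
    a, b, _c = coeffs
    if a > 0:
        return min(max(-b / (2 * a), lo), hi)
    # Non-convex fit: fall back to the interval center, as plain DIRECT would
    return 0.5 * (lo + hi)

def direct_with_surrogate(lo, hi, n_iter=8):
    xs = np.array([0.5 * (lo + hi)])
    ys = np.array([objective(xs[0])])
    intervals = [(lo, hi)]
    for _ in range(n_iter):
        # Simplified selection: split the interval holding the current best
        # point (a crude stand-in for DIRECT's potentially-optimal rule)
        best_x = xs[np.argmin(ys)]
        sel = next(iv for iv in intervals if iv[0] <= best_x <= iv[1])
        intervals.remove(sel)
        a, b = sel
        w = (b - a) / 3
        for s_lo, s_hi in [(a, a + w), (a + w, a + 2 * w), (a + 2 * w, b)]:
            if len(xs) >= 3:
                # Surrogate search uses only stored data: no extra
                # evaluations of the original objective are spent here
                cand = surrogate_minimum(
                    fit_quadratic_surrogate(xs, ys), s_lo, s_hi)
            else:
                cand = 0.5 * (s_lo + s_hi)  # DIRECT's center point
            xs = np.append(xs, cand)
            ys = np.append(ys, objective(cand))
            intervals.append((s_lo, s_hi))
    return xs[np.argmin(ys)], ys.min()

x_best, f_best = direct_with_surrogate(0.0, 1.0)
```

The point of the sketch is the division of labor: the original objective is only ever evaluated once per new sample, while the surrogate fit and its minimization, which are cheap, decide where that sample goes.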

Item Type: Article
Published: 2011
Creators: Hemker, Thomas and Werner, Christian
Title: DIRECT using local search on surrogates
Language: English
Journal or Publication Title: Pacific Journal of Optimization
Volume: 7
Number: 3
Divisions: 20 Department of Computer Science > Simulation, Systems Optimization and Robotics Group
20 Department of Computer Science
Date Deposited: 20 Jun 2016 23:26
Identification Number: hemker2011