Abstract:
Delaunay-based derivative-free optimization, ∆-DOGS, is an efficient and provably convergent global optimization method for problems whose objective functions are computationally expensive to evaluate and for which no analytical expression of the objective is available. ∆-DOGS is a novel optimization scheme in the family of response surface methods (RSMs); however, it suffers from the curse of dimensionality, since its computational cost increases dramatically as the number of design parameters grows. As a result, the number of design parameters that ∆-DOGS can handle is relatively low (n ≲ 10). To address this limitation, this paper proposes a combination of derivative-free optimization, which seeks the global minimizer of an expensive and nonconvex objective function f(x), with the active subspace method, which detects the directions of greatest variability using evaluations of the gradient; the contribution of the remaining directions to the objective function is bounded by a sufficiently small constant. The new algorithm iteratively applies Delaunay-based derivative-free optimization to seek the minimizer on the d-dimensional active subspace that captures most of the variation of the function. An inverse mapping is needed to project points from the active subspace back to the full-dimensional model in order to evaluate function values; this task is accomplished by solving an inequality-constrained problem based on the response surface of the objective function. Test results show that this strategy is effective on a handful of optimization problems.
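For context, a minimal sketch of the active subspace construction the abstract refers to, written in the standard gradient-covariance form; the symbols ρ, C, W, and Λ are notational assumptions for illustration and are not taken from this paper:

C = \int \nabla f(x)\,\nabla f(x)^{\mathsf T}\,\rho(x)\,dx = W \Lambda W^{\mathsf T}, \qquad W = [\,W_1 \;\; W_2\,],

where ρ is a probability density on the design space, the columns of W_1 ∈ R^{n×d} associated with the d largest eigenvalues in Λ span the active subspace, and the variation of f along the directions in W_2 is controlled by the trailing eigenvalues, which are assumed to be bounded by a sufficiently small constant.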