JPL Technical Report Server

An Active Subspace Method for Accelerating Convergence in Delaunay-Based Optimization via Dimension Reduction


dc.contributor.author Zhao, Muhan
dc.contributor.author Alimo, Shahrouz Ryan
dc.contributor.author Bewley, Thomas R.
dc.date.accessioned 2020-06-02T17:45:40Z
dc.date.available 2020-06-02T17:45:40Z
dc.date.issued 2018-12-17
dc.identifier.citation 2018 IEEE Conference on Decision and Control (CDC), Miami Beach, Florida, December 17 - 19, 2018 en_US
dc.identifier.clearanceno 18-5151
dc.identifier.uri http://hdl.handle.net/2014/48698
dc.description.abstract Delaunay-based derivative-free optimization, ∆-DOGS, is an efficient and provably convergent global optimization method for problems whose objective function is computationally expensive to evaluate and for which no analytical expression of the objective is available. ∆-DOGS is a novel optimization scheme in the family of response surface methods (RSMs); however, it suffers from the curse of dimensionality, since its computational cost increases dramatically as the number of design parameters grows. As a result, the number of design parameters that ∆-DOGS can handle is relatively low (n ≲ 10). To address this limitation, this paper proposes combining derivative-free optimization, which seeks the global minimizer of an expensive and nonconvex objective function f(x), with the active subspace method, which detects the directions of greatest variability using evaluations of the gradient; the contribution of the remaining directions to the objective function is bounded by a sufficiently small constant. The new algorithm iteratively applies Delaunay-based derivative-free optimization to seek the minimizer on the d-dimensional active subspace that captures most of the function's variation. An inverse mapping is needed to project points from the active subspace back to the full space in order to evaluate the objective function; this task is accomplished by solving an inequality-constrained problem involving the response surface of the objective function. Test results show that this strategy is effective on a handful of optimization problems. en_US
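For readers unfamiliar with the active subspace construction mentioned in the abstract, the following is a minimal illustrative sketch, not code from the paper: it estimates the gradient covariance matrix C = E[∇f(x) ∇f(x)ᵀ] = W Λ Wᵀ by Monte Carlo sampling and keeps the leading d eigenvectors as the active directions. The function names, the sample count m, and the example objective are assumptions introduced here for illustration only.

```python
# Illustrative sketch (assumed, not the paper's implementation):
# estimate a d-dimensional active subspace of f: R^n -> R from
# gradient samples via the standard eigendecomposition
#   C = E[grad f(x) grad f(x)^T] = W Lambda W^T.
import numpy as np

def estimate_active_subspace(grad_f, n, d, m=100, seed=0):
    """Return an (n, d) matrix W1 whose columns span the active subspace.

    grad_f : callable returning the gradient of f at a point in [-1, 1]^n
    m      : number of Monte Carlo gradient samples (assumed parameter)
    """
    rng = np.random.default_rng(seed)
    X = rng.uniform(-1.0, 1.0, size=(m, n))   # sample the design domain
    G = np.array([grad_f(x) for x in X])      # m gradient samples, shape (m, n)
    C = G.T @ G / m                           # Monte Carlo estimate of C
    eigvals, eigvecs = np.linalg.eigh(C)      # eigenvalues in ascending order
    order = np.argsort(eigvals)[::-1]         # re-sort descending
    return eigvecs[:, order[:d]]              # leading d eigenvectors

# Example: f(x) = exp(0.7 x1 + 0.3 x2) varies only along a = (0.7, 0.3, 0, 0, 0),
# so the estimated one-dimensional active direction should align with a.
if __name__ == "__main__":
    a = np.array([0.7, 0.3, 0.0, 0.0, 0.0])
    grad_f = lambda x: a * np.exp(a @ x)
    W1 = estimate_active_subspace(grad_f, n=5, d=1)
    print(W1.ravel())  # approximately proportional to a / ||a||
```

In the scheme the abstract describes, the Delaunay-based search would then operate on the reduced variables y = W1ᵀx, with the inverse mapping back to the full space handled by the inequality-constrained problem mentioned above.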
dc.description.sponsorship NASA/JPL en_US
dc.language.iso en_US en_US
dc.publisher Pasadena, CA: Jet Propulsion Laboratory, National Aeronautics and Space Administration, 2018 en_US
dc.title An Active Subspace Method for Accelerating Convergence in Delaunay-Based Optimization via Dimension Reduction en_US
dc.type Preprint en_US

