1. Derivative-free Algorithm Design
The main part of my Ph.D. research is the development of a new Delaunay-based Derivative-free Optimization via Global Surrogates algorithm, dubbed Δ-DOGS. Δ-DOGS is a response surface method that efficiently and globally solves black-box, computationally expensive, nonconvex optimization problems. At each iteration, Δ-DOGS evaluates the truth function at the minimizer of a response surface s(x), which indicates where the minimizer of the truth function is most likely to be located. This response surface models the relationship between the input-output data and is built from two components: an interpolant p(x), which captures the trend of the underlying function, and an artificially generated uncertainty function e(x), which quantifies how uncertain the model is in regions that have not yet been explored.
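As a rough illustration of this idea, the sketch below implements one simplified surrogate-search iteration, assuming the common Δ-DOGS search function s(x) = p(x) − K·e(x), where p(x) is approximated here by a radial basis function interpolant and e(x) is the piecewise-quadratic uncertainty built on the Delaunay triangulation of the data (e(x) = R² − |x − Z|², with R and Z the circumradius and circumcenter of the simplex containing x). The candidate-sampling minimization, the helper names, the tuning constant K, and the unit-box domain are my own simplifying assumptions, not details taken from the papers, which minimize s(x) exactly on each simplex.

```python
import numpy as np
from scipy.spatial import Delaunay
from scipy.interpolate import RBFInterpolator

def circumsphere(V):
    """Circumcenter z and circumradius r of a simplex with vertices V ((d+1, d))."""
    A = 2.0 * (V[1:] - V[0])
    b = np.sum(V[1:]**2, axis=1) - np.sum(V[0]**2)
    z = np.linalg.solve(A, b)
    return z, np.linalg.norm(z - V[0])

def delta_dogs_step(X, f, K=1.0, n_cand=2000):
    """One simplified surrogate-search step over the unit box [0, 1]^d.

    X: (N, d) previously evaluated points; f: (N,) truth-function values at X.
    Returns the minimizer of s(x) = p(x) - K*e(x) among random candidates.
    """
    p = RBFInterpolator(X, f, kernel='cubic')   # interpolant p(x) of the data
    tri = Delaunay(X)                           # Delaunay triangulation of the data
    cand = np.random.rand(n_cand, X.shape[1])   # candidate points (assumes [0,1]^d)
    inside = tri.find_simplex(cand)
    cand, inside = cand[inside >= 0], inside[inside >= 0]  # keep points in the hull
    e = np.empty(len(cand))
    for i, (x, s_idx) in enumerate(zip(cand, inside)):
        z, r = circumsphere(X[tri.simplices[s_idx]])
        e[i] = r**2 - np.sum((x - z)**2)        # e(x) >= 0 in the simplex, 0 at data
    s = p(cand) - K * e                         # search function s(x) = p(x) - K*e(x)
    return cand[np.argmin(s)]                   # next point at which to query the truth
```

Larger K weights the uncertainty term more heavily, pushing the search toward unexplored regions; smaller K concentrates it near the current best interpolated values.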
1.1 Dimension reduction of Δ-DOGS via the Active Subspace Method [1]:
It is well known that derivative-free algorithms suffer severely from the curse of dimensionality: the cost of exploring the parameter space grows exponentially as the dimension of the input increases. Dimension reduction techniques are therefore an essential tool for applying derivative-free optimization to moderate-dimensional parameter spaces. Δ-DOGS is best suited to problems with a relatively small number of input parameters (say, fewer than 10). By reducing the dimension of the parameter space with the Active Subspace Method, we extended the practical limit on the number of design parameters from 10 to 20, which is comparable to the limit on the number of parameters that Bayesian optimization can handle.
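For reference, the active subspace is commonly estimated as the leading eigenspace of the matrix C = E[∇f ∇fᵀ], computed from sampled gradients. The minimal sketch below shows this standard construction; the function name and the assumption that gradient samples (or finite-difference approximations) are available are mine, not details taken from [1].

```python
import numpy as np

def active_subspace(grads, k):
    """Estimate a k-dimensional active subspace from gradient samples.

    grads: (M, n) array of (approximate) gradients of f at M sampled points.
    Returns W1, an (n, k) orthonormal basis spanning the active subspace.
    """
    C = grads.T @ grads / len(grads)   # C ~ E[grad f grad f^T]
    w, V = np.linalg.eigh(C)           # eigenvalues in ascending order
    return V[:, ::-1][:, :k]           # k leading eigenvectors

# The optimizer then searches over the reduced variable y = W1.T @ x,
# mapping back to the full space via x ~ W1 @ y before each evaluation.
```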
1.2 Safe learning algorithm S-DOGS: Safety-guaranteed Derivative-free Optimization via Global Surrogates [2]:
In autonomous systems, it is often necessary to tune parameters in order to optimize a given objective. However, users often do not know in advance the region(s) of the otherwise feasible parameter space which, if explored during the optimization process, could severely damage the experimental system. For such problems, it is useful to develop algorithms that automatically minimize a performance measure of unknown mathematical form while assuring, at each function evaluation, that the safety of the system is guaranteed. In this work, a new algorithm is developed such that any user-supplied safety constraints are guaranteed to be satisfied at every step of the optimization process. The algorithm, dubbed Safety-guaranteed Derivative-free Optimization via Global Surrogates (S-DOGS), automatically, efficiently, and safely learns the boundary of the underlying safe region while simultaneously identifying the global minimizer of the objective function. To the best of the authors' knowledge, S-DOGS is the first safe-learning, data-driven algorithm that does not rely on Gaussian processes.
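To make the idea of certifying safety before evaluation concrete, the sketch below shows a simple Lipschitz-based safety certificate in the spirit of safe-learning methods: if the safety margin ψ is L-Lipschitz and was observed to be nonnegative at previously evaluated points, a candidate can be proven safe before it is queried. This is an illustrative simplification under my own assumptions (the function name, the Lipschitz model, and the ψ ≥ 0 margin convention are not taken from [2]).

```python
import numpy as np

def certified_safe(x, X_safe, psi_safe, L):
    """Certify a candidate point as safe before evaluating it.

    x:        (n,) candidate point.
    X_safe:   (M, n) previously evaluated points known to be safe.
    psi_safe: (M,) observed safety margins psi(x_i) >= 0 at those points.
    L:        assumed Lipschitz constant of the safety function psi.

    If psi is L-Lipschitz, then psi(x) >= psi(x_i) - L*||x - x_i|| for each i,
    so a nonnegative lower bound certifies x as safe to evaluate.
    """
    bounds = psi_safe - L * np.linalg.norm(X_safe - x, axis=1)
    return bounds.max() >= 0.0
```

Restricting the surrogate search to certified candidates lets the known safe set expand gradually as new safe evaluations are added to the dataset.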
To demonstrate its efficiency, S-DOGS is applied to tune the parameters of a nonlinear controller in a quadrotor trajectory-following test case.
1.3 Multi-fidelity optimization of stochastic functions approximated via time-averaged statistics:
Under preparation.
1.4 Function feasibility via binary oracle calls:
Under preparation.
1.5 Open/infinite domains:
Under preparation.
2. Nonlinear Controller Design
[1] An active subspace method for accelerating convergence in Delaunay-based optimization via dimension reduction
Muhan Zhao, Shahrouz Ryan Alimo, and Thomas R. Bewley
2018 IEEE Conference on Decision and Control (CDC), Miami Beach, FL, 2018, pp. 2765-2770. doi: 10.1109/CDC.2018.8619219
[2] Delaunay-based Derivative-free Optimization via Global Surrogates with Safe and Exact Function Evaluations
Muhan Zhao, Shahrouz Ryan Alimo, Pooriya Beyhaghi, and Thomas R. Bewley
Accepted; to appear in the 2019 IEEE Conference on Decision and Control (CDC).
[3] A Delaunay-based method for optimizing infinite time averages of numerical discretizations of ergodic systems
Pooriya Beyhaghi, Shahrouz Ryan Alimo, Muhan Zhao, and Thomas R. Bewley
Accepted; to appear in the AIAA SciTech 2020 Forum.