Center for Applied Mathematics Colloquium

Katya Scheinberg
Cornell Operations Research and Information Engineering
Overview of adaptive stochastic optimization methods

Friday, February 10, 2023 - 3:45pm
Rhodes 655

Abstract: Continuous optimization is a mature field that has recently undergone major expansion and change. One of the key new directions is the development of methods that do not require exact information about the objective function. Nevertheless, the majority of these methods, from stochastic gradient descent to "zeroth-order" methods, use some kind of approximate first-order information. We will introduce a general definition of a stochastic oracle and show how this definition applies in a variety of familiar settings. We will then discuss how these oracles can be used by stochastic variants of adaptive methods, including stochastic step-search, trust-region, and cubically regularized Newton methods. Such methods adapt the step size parameter and use it to dictate the accuracy required of the oracles; the requirements on the oracles are thus also adaptive. The step size parameters in these methods can increase and decrease based on the perceived progress, but unlike in the deterministic case they are not bounded away from zero. This creates obstacles in the complexity analysis of such methods. We will show how one can develop bounds on expected complexity that also hold with high probability. We will discuss various stochastic settings where the framework readily applies, such as expectation minimization, black-box and simulation optimization, and expectation minimization with corrupted samples.
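
To make the adaptive mechanism described in the abstract concrete, here is a minimal Python sketch of a stochastic step-search loop in which the step size parameter both drives the update and sets the accuracy demanded of a stochastic oracle. The function names, the parameters gamma and theta, and the noisy quadratic oracle are illustrative assumptions for this sketch, not the speaker's exact algorithm or analysis.

import numpy as np

def stochastic_step_search(x0, oracle, alpha0=1.0, gamma=2.0, theta=0.1,
                           alpha_max=10.0, max_iter=200):
    # oracle(x, alpha) returns estimates (f_est, g_est) of the objective
    # value and gradient at x; in this sketch the accuracy it must deliver
    # is tied to the current step size parameter alpha.
    x, alpha = np.asarray(x0, dtype=float), alpha0
    for _ in range(max_iter):
        f_est, g_est = oracle(x, alpha)
        x_trial = x - alpha * g_est
        f_trial, _ = oracle(x_trial, alpha)
        # Sufficient-decrease test on the *estimated* values.
        if f_trial <= f_est - theta * alpha * np.dot(g_est, g_est):
            x = x_trial
            alpha = min(gamma * alpha, alpha_max)  # perceived progress: expand step
        else:
            alpha /= gamma                         # no progress: shrink step
            # Note: alpha may keep shrinking and is not bounded away from
            # zero, which is the source of the complexity-analysis obstacles
            # mentioned in the abstract.
    return x

# Toy expectation-minimization example: a noisy quadratic oracle whose noise
# level scales with alpha (an assumption made purely for illustration, so that
# a smaller step size parameter corresponds to a more accurate oracle).
def noisy_quadratic_oracle(x, alpha, rng=np.random.default_rng(1)):
    g = 2.0 * x + alpha * rng.standard_normal(x.shape)
    f = float(np.dot(x, x)) + alpha * rng.standard_normal()
    return f, g

x_star = stochastic_step_search(np.ones(5), noisy_quadratic_oracle)
print(x_star)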