Videos

Applying Subgradient Methods -- and Accelerated Gradient Methods -- to Efficiently Solve General Convex, Conic Optimization Problems

Presenter
January 26, 2016
Keywords:
  • convex conic optimization, subgradient methods, hyperbolic programming, accelerated first-order methods
MSC:
  • 37D40
Abstract
Recently we introduced a framework for applying subgradient methods to solve general convex, conic optimization problems. The framework, once seen, is "obvious," yet it had not previously appeared in the literature, a blind spot. Quite recently we posted a refinement of the framework for the special case of hyperbolic programming. Hyperbolicity cones have algebraic structure that is ideal for "smoothing." Once a hyperbolic program is smoothed, virtually any accelerated method can be applied, and if this is done with care, the result is a first-order algorithm with a "best-possible" iteration bound. We provide an overview of these developments, then briefly discuss where we are now working to deepen and broaden the results, both in the pure theory and in the design of algorithms aimed at practice.
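
For readers unfamiliar with the setting, the following is a minimal sketch of standard background (not taken from the talk itself): the general form of a conic program and the textbook definition of a hyperbolicity cone, the feasible-region object underlying hyperbolic programming. The notation $\Lambda_{+}(p, e)$ and the two worked examples are standard choices, included here only for orientation.

```latex
% Standard background sketch (not from the talk): conic programs and
% hyperbolicity cones, with two classical examples.
\documentclass{article}
\usepackage{amsmath, amssymb}
\begin{document}

A conic optimization problem over a closed convex cone
$\mathcal{K} \subseteq \mathbb{R}^n$ has the form
\[
  \min_{x \in \mathbb{R}^n} \; c^{\top} x
  \quad \text{subject to} \quad A x = b, \;\; x \in \mathcal{K}.
\]

A homogeneous polynomial $p$ is \emph{hyperbolic} in direction $e$ if
$p(e) > 0$ and, for every $x$, the univariate polynomial
$t \mapsto p(x - t e)$ has only real roots. The associated hyperbolicity
cone is
\[
  \Lambda_{+}(p, e)
  = \{\, x : \text{all roots of } t \mapsto p(x - t e)
    \text{ are nonnegative} \,\},
\]
and a hyperbolic program is a conic program with
$\mathcal{K} = \Lambda_{+}(p, e)$. Two standard examples:
$p(x) = x_1 x_2 \cdots x_n$ with $e = (1, \dots, 1)$ gives the nonnegative
orthant (linear programming), and $p(X) = \det X$ with $e = I$ gives the
positive semidefinite cone (semidefinite programming).

\end{document}
```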