Electrical Engineering Systems Seminar
This talk lies at the interface of geometry and optimization. I will discuss geodesically convex optimization problems, a rich class of non-convex optimization problems that admit tractable global optimization. I will provide background on this class along with some motivating examples. Beyond a general introduction to the topic, I will dive deeper into a recent discovery of a long-sought result: an accelerated gradient method for Riemannian manifolds. Toward developing this method, we will revisit Nesterov's (Euclidean) estimate-sequence technique and present a conceptually simpler alternative. We will then generalize this alternative to the Riemannian setting and, combined with a new geometric inequality, obtain the first globally accelerated Riemannian gradient method. I will also comment on some very recent developments on this topic.