KAUST Research Workshop on Optimization and Big Data
Politehnica University of Bucharest
Ion Necoara received the MSc degree in optimization and statistics from the University of Bucharest, Romania, in 2002 and the PhD degree in applied sciences (cum laude) from Delft University of Technology, The Netherlands, in 2006. From 2007 to 2009 he held a postdoctoral fellowship at the Katholieke Universiteit Leuven, Belgium. He is currently a professor at the Department of Automatic Control and Systems Engineering, University Politehnica Bucharest, Romania. He is the author of about 100 research papers in optimization and control, for which he has received several awards: the National Order Faithful Service from the Romanian Presidency (2017), the Excellence in Research award from Ad Astra (2016), the Best Paper Award from the Journal of Global Optimization (2015), and the Romanian Academy Award (2015). His research interests cover various topics in the development of new optimization algorithms, with a focus on structure exploitation and on applying optimization techniques to design advanced algorithms for complex systems.
For convex optimization problems, deterministic first-order methods have linear convergence provided that the objective function is smooth (i.e., has Lipschitz continuous gradient) and strongly convex. Under the same conditions, smoothness and strong convexity, stochastic first-order methods have sublinear convergence rates. However, in many applications (machine learning, statistics, control, signal processing) the smoothness/strong convexity conditions do not hold, yet the objective function still has special structure (e.g., the composition of a strongly convex function with a linear map). In this talk we replace the smoothness/strong convexity assumptions with several other, less conservative conditions under which we prove that several (stochastic) first-order methods converge linearly. We also provide necessary conditions for linear convergence of the (stochastic) gradient method. Finally, we give examples of functional classes satisfying our new conditions and discuss several applications of these results (the Lasso problem, linear systems, linear programming, convex feasibility, etc.).
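A minimal numerical sketch of the structured setting mentioned above (this is an illustration of the function class, not of the methods presented in the talk): take f(x) = ½‖Ax − b‖², i.e., a strongly convex function composed with a linear map A. When A is rank-deficient, f is not strongly convex in x, yet plain gradient descent with step 1/L still exhibits linear (geometric) convergence, since such compositions satisfy a Polyak-Lojasiewicz-type condition.

```python
import numpy as np

# f(x) = 0.5*||A x - b||^2 with rank-deficient A: a strongly convex
# function composed with a linear map, hence NOT strongly convex in x,
# but still admitting linear convergence of gradient descent.
rng = np.random.default_rng(0)
m, n, r = 30, 50, 10                      # r < min(m, n): rank(A) = r
A = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))
b = A @ rng.standard_normal(n)            # consistent system, so min f = 0

L = np.linalg.norm(A, 2) ** 2             # Lipschitz constant of grad f
f = lambda x: 0.5 * np.linalg.norm(A @ x - b) ** 2

x = np.zeros(n)
vals = []
for _ in range(200):
    x -= (1.0 / L) * (A.T @ (A @ x - b))  # gradient step with step 1/L
    vals.append(f(x))

# f(x_k) decays geometrically despite the absence of strong convexity
print(vals[0], vals[-1])
```

The decay rate is governed by the smallest nonzero singular value of A rather than by a strong convexity modulus, which is the kind of refined condition the abstract alludes to.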