KAUST Research Workshop on Optimization and Big Data
Prof. Takac received his B.S. (2008) and M.S. (2010) degrees in Mathematics from Comenius University, Slovakia, and his Ph.D. (2014) degree in Mathematics from The University of Edinburgh, United Kingdom. He received several awards during this period, including the Best Ph.D. Dissertation Award from the OR Society (2014), the Leslie Fox Prize (2nd Prize; 2013) from the Institute for Mathematics and its Applications, and the INFORMS Computing Society Best Student Paper Award (runner-up; 2012). Since 2014, he has been a Tenure Track Assistant Professor in the Department of Industrial and Systems Engineering at Lehigh University, USA. His current research interests include the design, analysis, and application of algorithms for machine learning, optimization, high-performance computing, operations research, and energy systems. Prof. Takac is a core faculty member of the OptML Group and an affiliated faculty member of the Cognitive Science Program.
In this paper, we propose a StochAstic Recursive grAdient algoritHm (SARAH), as well as its practical variant SARAH+, as a novel approach to finite-sum minimization problems. Different from vanilla SGD and other modern stochastic methods such as SVRG, S2GD, SAG, and SAGA, SARAH admits a simple recursive framework for updating stochastic gradient estimates; compared with SAG/SAGA, SARAH does not require storage of past gradients. The linear convergence rate of SARAH is proven under a strong convexity assumption. We also prove a linear convergence rate (in the strongly convex case) for an inner loop of SARAH, a property that SVRG does not possess. Convergence rates for the convex and non-convex cases are also discussed. Numerical experiments demonstrate the efficiency of our algorithm.
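To illustrate the recursive framework mentioned in the abstract, the following is a minimal, hedged Python sketch of a SARAH outer iteration on a least-squares finite sum. The recursive estimate v_t = ∇f_{i_t}(w_t) − ∇f_{i_t}(w_{t−1}) + v_{t−1} follows the SARAH update described by the authors; the test problem, step size, function names, and loop lengths are illustrative assumptions, not part of the talk.

```python
# Sketch of one SARAH outer iteration on f(w) = (1/n) * sum_i 0.5*(a_i^T w - b_i)^2.
# The inner loop uses the recursive gradient estimate
#   v_t = grad f_{i_t}(w_t) - grad f_{i_t}(w_{t-1}) + v_{t-1},
# so, unlike SAG/SAGA, no table of past gradients is stored.
import numpy as np

def sarah_outer_iteration(A, b, w0, step, m, rng):
    """Full gradient at w0, then m recursive stochastic steps (illustrative)."""
    n = A.shape[0]
    w_prev = w0.copy()
    v = A.T @ (A @ w_prev - b) / n          # v_0: full gradient at w_0
    w = w_prev - step * v
    for _ in range(m):
        i = rng.integers(n)                  # sample one component function
        g_new = A[i] * (A[i] @ w - b[i])     # grad f_i at current iterate
        g_old = A[i] * (A[i] @ w_prev - b[i])  # grad f_i at previous iterate
        v = g_new - g_old + v                # recursive gradient estimate
        w_prev, w = w, w - step * v
    return w

# Small synthetic example (assumed data, for illustration only).
rng = np.random.default_rng(0)
n, d = 200, 10
A = rng.standard_normal((n, d))
b = rng.standard_normal(n)
w = np.zeros(d)
for _ in range(20):                          # each outer pass restarts v with a full gradient
    w = sarah_outer_iteration(A, b, w, step=0.02, m=n, rng=rng)
print("final objective:", 0.5 * np.mean((A @ w - b) ** 2))
```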