School of Mathematics and Statistics Academic Seminar [2019] No. 123
(High-Level University Development Seminar Series, No. 353)
Title: Stochastic Gradient Hamiltonian Monte Carlo for Non-Convex Stochastic Optimization
Speaker: Lingjiong Zhu, Assistant Professor (Florida State University)
Time: December 18, 2019, 10:00-11:00
Venue: Room 514, Huixing Building (Science and Technology Building)
Abstract:
Stochastic gradient Hamiltonian Monte Carlo (SGHMC) is a variant of stochastic gradient descent with momentum in which controlled, properly scaled Gaussian noise is added to the stochastic gradients to steer the iterates towards a global minimum. Many works have reported its empirical success in solving stochastic non-convex optimization problems; in particular, it has been observed to outperform methods based on the overdamped Langevin diffusion, such as stochastic gradient Langevin dynamics (SGLD), in many applications. Although the asymptotic global convergence properties of SGHMC are well known, its finite-time performance is not well understood. In this work, we study two variants of SGHMC based on two alternative discretizations of the underdamped Langevin diffusion. We provide finite-time performance bounds, with explicit constants, for the global convergence of both SGHMC variants on stochastic non-convex optimization problems. Our results lead to non-asymptotic guarantees for both population and empirical risk minimization. For a fixed target accuracy level, on a class of non-convex problems, we obtain complexity bounds for SGHMC that can be tighter than those for SGLD. These results show that acceleration with momentum is possible in the context of global non-convex optimization. This is based on joint work with Xuefeng Gao and Mert Gurbuzbalaban.
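To make the update rule concrete, the following is a minimal sketch of an SGHMC iteration on a toy double-well objective. It uses one simple Euler-type discretization of the underdamped Langevin diffusion, not necessarily either of the two variants analyzed in the talk; the objective, step size, friction, and inverse temperature are illustrative choices, not taken from the abstract.

```python
import numpy as np

def sghmc(grad_estimate, x0, eta=1e-3, gamma=1.0, beta=1e3,
          n_iters=10000, seed=0):
    """SGHMC sketch: momentum SGD plus properly scaled Gaussian noise,
    obtained by an Euler-type discretization of underdamped Langevin
    dynamics.  All hyperparameters here are illustrative."""
    rng = np.random.default_rng(seed)
    x = np.array(x0, dtype=float)
    v = np.zeros_like(x)  # momentum variable
    for _ in range(n_iters):
        g = grad_estimate(x, rng)  # noisy gradient estimate
        noise = rng.standard_normal(x.shape)
        # momentum update: friction gamma, injected Gaussian noise
        # scaled by sqrt(2 * gamma * eta / beta)
        v = v - eta * (gamma * v + g) + np.sqrt(2.0 * gamma * eta / beta) * noise
        x = x + eta * v  # position update
    return x

# Toy non-convex objective f(x) = (x^2 - 1)^2, global minima at x = +/-1,
# with a small Gaussian perturbation standing in for minibatch noise.
def noisy_grad(x, rng):
    return 4.0 * x * (x * x - 1.0) + 0.1 * rng.standard_normal(x.shape)

x_final = sghmc(noisy_grad, x0=[2.0])
print(x_final)  # settles near one of the global minimizers, x = +/-1
```

The injected noise lets the iterates escape shallow local minima, while the friction term gamma dissipates energy so the dynamics eventually settle into a well; this is the mechanism behind the global convergence guarantees discussed in the talk.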
Speaker Biography:
Lingjiong Zhu received his bachelor's degree from the University of Cambridge in 2008 and his Ph.D. in mathematics from New York University in 2013. He subsequently held positions at Morgan Stanley in New York and at the University of Minnesota before joining Florida State University in 2015. His research interests include applied probability, data science, financial engineering, and operations research. He has published more than thirty papers in journals including SIFIN, FS, QF, AAP, SPA, Bernoulli, IME, INFORMS Journal on Computing, and Queueing Systems. He received the Kurt Friedrichs Prize and is the principal investigator of a U.S. National Science Foundation grant.
All interested faculty and students are welcome to attend!
School of Mathematics and Statistics
December 16, 2019