

Optimal Individual Convergence Rate of the Heavy-Ball-Based Momentum Methods
Abstract: The momentum method is widely used as an acceleration technique to improve the convergence rate of first-order gradient algorithms. So far, the momentum methods discussed in most of the literature are limited to the accelerated method proposed by Nesterov, while the Heavy-ball momentum method proposed by Polyak has been seldom studied. In particular, for non-smooth objective functions, the Nesterov accelerated method has been shown to achieve the optimal individual convergence rate, and it performs well in solving sparse optimization problems. In contrast, while the Heavy-ball momentum method has been proved to attain an optimal convergence rate, the result holds only for its averaged outputs; to the best of our knowledge, whether its individual convergence is optimal remains unknown. In this paper, we focus on non-smooth optimization problems. By skillfully setting a time-varying step size, we prove that the Heavy-ball momentum method achieves the optimal individual convergence rate, which shows that Heavy-ball momentum can accelerate the individual convergence of the projected subgradient method to optimality. As an application, we consider hinge loss optimization problems constrained within an l1-norm ball. In comparison with other optimization algorithms of the same kind, experiments verify the correctness of our theoretical analysis and the good performance of the proposed algorithms in maintaining sparsity.
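The Heavy-ball update discussed in the abstract can be sketched on a toy non-smooth problem. The diminishing step size alpha_t = 1/sqrt(t) and the constant momentum coefficient beta below are illustrative assumptions; the paper's contribution is a specific time-varying step-size schedule (not given in the abstract) that yields the optimal individual rate.

```python
# Minimal sketch of Polyak's Heavy-ball momentum on a toy
# non-smooth objective f(w) = |w - 3|, whose minimizer is w* = 3.
# Step-size schedule and beta are illustrative assumptions, not
# the schedule derived in the paper.
import math

def subgradient(w):
    # A subgradient of f(w) = |w - 3|.
    if w > 3:
        return 1.0
    if w < 3:
        return -1.0
    return 0.0

def heavy_ball(w0, steps=5000, beta=0.5):
    w_prev = w0
    w = w0
    for t in range(1, steps + 1):
        alpha = 1.0 / math.sqrt(t)  # diminishing step size (assumed)
        # Heavy-ball update: subgradient step plus momentum term.
        w_next = w - alpha * subgradient(w) + beta * (w - w_prev)
        w_prev, w = w, w_next
    return w  # the *individual* (last) iterate, not an average

print(heavy_ball(0.0))
```

Note that the function returns the last iterate rather than an average of iterates; the distinction between individual (last-iterate) and averaged-output convergence is exactly what the paper analyzes. In the constrained setting of the paper, each update would additionally be projected onto the l1-norm ball.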
Authors: Cheng Yujia; Tao Wei; Liu Yuxiang; Tao Qing (Department of Information Engineering, Army Academy of Artillery and Air Defense of PLA, Hefei 230031; College of Command and Control Engineering, Army Engineering University of PLA, Nanjing 210007)
Source: Journal of Computer Research and Development (《计算机研究与发展》), indexed in EI and CSCD, Peking University Core journal, 2019, Issue 8, pp. 1686-1694 (9 pages).
Funding: National Natural Science Foundation of China (61673394).
Keywords: first-order gradient methods; momentum methods; individual convergence rate; Heavy-ball methods; sparsity
About the authors: Cheng Yujia, born in 1996. Master candidate. Her main research interests include convex optimization algorithms and their applications in machine learning. Tao Wei, born in 1991. PhD candidate. His main research interests include convex optimization algorithms and their applications in machine learning. Liu Yuxiang, born in 1992. Master candidate. His main research interests include convex optimization algorithms and their applications in machine learning. Corresponding author: Tao Qing, born in 1965. Professor and PhD supervisor. Senior member of CCF. His main research interests include pattern recognition, machine learning, and applied mathematics (qing.tao@ia.ac.cn).