Speaker: Prof. Chao Zhang (张超)
Title: Adaptive smoothing mini-batch stochastic accelerated gradient method for nonsmooth convex stochastic composite optimization
Abstract: This talk considers a class of constrained nonsmooth convex stochastic composite optimization problems whose objective function is the sum of a differentiable convex component and a general nonsmooth convex component. The nonsmooth component is not required to have an easily computable proximal operator, nor the max structure needed for Nesterov's smoothing technique. To solve such problems, we propose an adaptive smoothing mini-batch stochastic accelerated gradient (AdaSMSAG) method, which combines the stochastic approximation method, Nesterov's accelerated gradient method, and smoothing methods that allow general smoothing approximations. Convergence of the method is established. Moreover, the order of its worst-case iteration complexity is better than that of state-of-the-art stochastic approximation methods. Numerical results on risk management in portfolio optimization and on a family of Wasserstein distributionally robust support vector machine problems with real data illustrate the efficiency of the proposed AdaSMSAG method.
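The abstract describes combining three ingredients: a smooth approximation of the nonsmooth component with an adaptively shrinking smoothing parameter, Nesterov-style momentum, and mini-batch stochastic gradients. A minimal illustrative sketch of this combination is below; the problem instance (least squares plus an l1 term smoothed by its Huber/Moreau envelope), the schedule mu_k = mu0/k, and the step-size rule are all assumptions for illustration, not the algorithm from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative problem (assumed, not from the talk):
#   min_x  (1/n) * sum_i 0.5*(a_i^T x - b_i)^2  +  lam * ||x||_1
n, d = 500, 20
A = rng.standard_normal((n, d))
x_true = np.zeros(d)
x_true[:3] = [2.0, -1.0, 0.5]
b = A @ x_true + 0.1 * rng.standard_normal(n)
lam = 0.1

def grad_f_batch(x, batch):
    """Mini-batch stochastic gradient of the smooth least-squares part."""
    Ab = A[batch]
    return Ab.T @ (Ab @ x - b[batch]) / len(batch)

def grad_huber_l1(x, mu):
    """Gradient of the Huber (Moreau-envelope) smoothing of ||x||_1."""
    return np.clip(x / mu, -1.0, 1.0)

def smoothed_accelerated_sgd(iters=2000, batch_size=32, mu0=1.0):
    # Lipschitz constant of the full smooth part's gradient.
    L_f = np.linalg.norm(A, 2) ** 2 / n
    x = x_prev = np.zeros(d)
    t_prev = 1.0
    for k in range(1, iters + 1):
        mu = mu0 / k                                   # shrink smoothing parameter
        t = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t_prev**2))
        y = x + ((t_prev - 1.0) / t) * (x - x_prev)    # Nesterov momentum step
        batch = rng.choice(n, size=batch_size, replace=False)
        g = grad_f_batch(y, batch) + lam * grad_huber_l1(y, mu)
        step = 1.0 / (L_f + lam / mu)                  # step for smoothed objective
        x_prev, x = x, y - step * g
        t_prev = t
    return x

x_hat = smoothed_accelerated_sgd()
```

As the smoothing parameter mu shrinks, the smoothed term approximates the l1 norm ever more tightly, at the cost of a larger gradient Lipschitz constant (and hence a smaller step); balancing this trade-off against the stochastic gradient noise is exactly the kind of adaptivity the AdaSMSAG method addresses.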
Speaker bio: Chao Zhang is a professor and doctoral supervisor in the Department of Mathematics, School of Science, Beijing Jiaotong University, and received a Ph.D. from Hirosaki University, Japan. Research interests include optimization theory, algorithms, and applications, as well as operations research and statistical analysis. Prof. Zhang has published papers in leading international journals including SIAM Journal on Scientific Computing, SIAM Journal on Optimization, Mathematical Programming, IEEE Transactions on Image Processing, and Transportation Research.
Time: 15:00-17:00, December 9, 2021
Format: Tencent Meeting (腾讯会议); meeting ID: 820744044