Academic Report

2022-08-26 Hao Wang: Relating lp regularization and reweighted l1 regularization

Published: 2022-08-26 10:24

Title: Relating lp regularization and reweighted l1 regularization

Speaker: Hao Wang, ShanghaiTech University

Time: August 26, 2022, 16:00-17:00

Format: Online (Tencent Meeting ID: 473-204-457)

Abstract:

The iteratively reweighted l1 algorithm is a widely used method for solving various regularization problems, which generally minimize a differentiable loss function combined with a convex or nonconvex regularizer to induce sparsity in the solution. However, the convergence and complexity of iteratively reweighted l1 algorithms are generally difficult to analyze, especially for non-Lipschitz-differentiable regularizers such as the lp norm regularizer with 0 < p < 1. In this paper, we propose, analyze, and test a reweighted l1 algorithm combined with an extrapolation technique, under the assumption that the proximal function of the perturbed objective satisfies the Kurdyka-Lojasiewicz (KL) property. Our method requires neither Lipschitz differentiability of the regularizers nor that the smoothing parameters in the weights be bounded away from 0. We show that the proposed algorithm converges to a unique stationary point of the regularization problem, with local linear convergence when the KL exponent is at most 1/2 and local sublinear convergence when the KL exponent is greater than 1/2. We also provide results on calculating KL exponents and discuss cases in which the KL exponent is at most 1/2. Numerical experiments demonstrate the efficiency of the proposed method.
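For readers unfamiliar with the algorithm family discussed in the abstract, the following is a minimal Python sketch of a basic iteratively reweighted l1 scheme, assuming a least-squares loss and the smoothed penalty sum_i (|x_i| + eps)^p with 0 < p < 1. It only illustrates the reweighting idea (linearizing the concave penalty and taking a weighted soft-thresholding step); it is not the extrapolated variant or the KL-based analysis presented in the talk.

```python
import numpy as np

def irl1_lp(A, b, lam=0.1, p=0.5, eps=1e-3, max_iter=500, tol=1e-8):
    """Illustrative sketch: minimize 0.5*||Ax - b||^2 + lam * sum_i (|x_i| + eps)^p
    by iteratively reweighted l1 with one proximal-gradient step per reweighting."""
    n = A.shape[1]
    x = np.zeros(n)
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the smooth part
    step = 1.0 / L
    for _ in range(max_iter):
        # Reweighting: linearize the concave penalty around the current iterate
        w = lam * p * (np.abs(x) + eps) ** (p - 1)
        # One proximal-gradient step on the weighted l1 subproblem
        grad = A.T @ (A @ x - b)
        z = x - step * grad
        x_new = np.sign(z) * np.maximum(np.abs(z) - step * w, 0.0)
        if np.linalg.norm(x_new - x) <= tol * max(1.0, np.linalg.norm(x)):
            x = x_new
            break
        x = x_new
    return x

# Small synthetic sparse-recovery example (hypothetical data for illustration)
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))
x_true = np.zeros(100)
x_true[:5] = rng.standard_normal(5)
b = A @ x_true + 0.01 * rng.standard_normal(40)
x_hat = irl1_lp(A, b)
print("nonzeros recovered:", int(np.sum(np.abs(x_hat) > 1e-4)))
```

In this sketch the smoothing parameter eps is held fixed; the talk addresses the harder setting in which such parameters need not stay bounded away from 0, and adds extrapolation to accelerate the iterations.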

Speaker biography: Dr. Hao Wang is a Shanghai Young Eastern Scholar and is currently an Assistant Professor in the School of Information Science and Technology at ShanghaiTech University. He received his Ph.D. from the Department of Industrial Engineering at Lehigh University (USA) in May 2015, and his M.S. and B.S. in Mathematics and Applied Mathematics from Beihang University in 2010 and 2007, respectively. His current research focuses on nonlinear optimization, nonconvex regularized problems, and related machine learning problems and algorithms. His main results have been published in journals such as SIAM Journal on Optimization, Journal of Machine Learning Research, and IEEE Transactions on Computers.

