Announcement: Academic Talk by Professor Jinshan Zeng

Time: 9:30 AM, Thursday, November 16

Venue: Conference Room 501, South Section of the First Teaching Building (一教南501)

Title: Global Optimality of Stochastic Semi-definite Optimization with Application to Ordinal Embedding

All faculty and students are welcome to attend and exchange ideas!


Abstract: Nonconvex reformulations via low-rank factorization of stochastic convex semi-definite optimization problems have attracted increasing attention due to their empirical efficiency and scalability. However, this raises a new question: despite their empirical success in applications, under what conditions can nonconvex stochastic algorithms find the population minimizer to within the optimal statistical precision? In this talk, we show that the stochastic gradient descent (SGD) method can be adapted to solve the nonconvex reformulation of the original convex problem with global linear convergence when a fixed step size is used; that is, in the restricted strongly convex case and with a proper initial choice, SGD converges exponentially fast to the population minimizer within the optimal statistical precision. If a diminishing step size is adopted instead, the adverse effect of the gradient variance on the optimization error can be eliminated, but the rate drops to sublinear. We then propose an accelerated stochastic algorithm, namely SVRG, and establish its global linear convergence. Finally, we apply the developed stochastic algorithms to the ordinal embedding problem and demonstrate their effectiveness.
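To make the setup concrete, the sketch below runs plain SGD with a fixed step size on the factored variable U (with X = U U^T) of a tiny noisy semi-definite sensing problem. Everything here is an illustrative assumption for exposition — the dimensions, the random symmetric sensing model, the step size, and the near-truth initialization — and is not the talk's actual formulation or experiments.

```python
import numpy as np

rng = np.random.default_rng(0)
d, r, n = 10, 2, 2000            # matrix dimension, rank, number of samples

# Ground-truth low-rank PSD matrix X* = U* U*^T (toy instance for illustration)
U_star = rng.normal(size=(d, r)) / np.sqrt(d)
X_star = U_star @ U_star.T

# Random symmetric sensing matrices A_i and noisy measurements b_i = <A_i, X*> + noise
A = rng.normal(size=(n, d, d))
A = (A + A.transpose(0, 2, 1)) / 2
b = np.einsum('ijk,jk->i', A, X_star) + 0.01 * rng.normal(size=n)

def rel_err(U):
    # Relative Frobenius-norm error of the recovered matrix U U^T
    return np.linalg.norm(U @ U.T - X_star) / np.linalg.norm(X_star)

# SGD on the factored variable U with a fixed step size, started near the
# truth (playing the role of the "proper initial choice" in the abstract)
U = U_star + 0.1 * rng.normal(size=(d, r)) / np.sqrt(d)
err0 = rel_err(U)
step = 1e-3
for _ in range(30000):
    i = rng.integers(n)
    resid = np.sum(A[i] * (U @ U.T)) - b[i]   # <A_i, U U^T> - b_i
    U -= step * resid * 2 * A[i] @ U          # stochastic gradient step on U

err = rel_err(U)
print(f"relative error: {err0:.3f} -> {err:.3f}")
```

With the fixed step size, the iterates contract quickly toward the ground truth but settle in a neighborhood whose radius is governed by the gradient variance (here, the measurement noise), which is the trade-off the abstract contrasts with the diminishing-step-size regime.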

About the speaker (Jinshan Zeng):

Jinshan Zeng is a distinguished professor at the School of Computer and Information Engineering, Jiangxi Normal University, a master's supervisor, and holds a Ph.D. in science. He received his doctorate from Xi'an Jiaotong University in 2015 under the supervision of Academician Zongben Xu. In 2017 he was selected for Jiangxi Normal University's first high-level talent cultivation program (Excellent Young Scholar tier). He was a visiting scholar at the Department of Mathematics, University of California, Los Angeles, from November 2013 to November 2014, and at the Department of Mathematics, Hong Kong University of Science and Technology, from April 2017 to March 2018. Over the past five years he has published more than twenty SCI papers, including nearly ten in mainstream international journals such as IEEE TSP, IEEE TCYB, Neural Networks, and Neural Computation. His most-cited single paper has 86 citations (Google Scholar). He is the principal investigator of one National Natural Science Foundation of China grant and a participant in several others. He served on the program committee of the IEEE CYBER 2017 international conference and is a reviewer for several mainstream international journals.