
Academic Seminars


[Academic Seminar] Gradient Descent with Random Initialization for Symmetric Tensor Decomposition

Published: 2021-05-28

hjc888老品牌黄金城 Academic Seminar

Gradient Descent with Random Initialization for Symmetric Tensor Decomposition

Dr. Haixia Liu

(Huazhong University of Science and Technology)

Time: Tuesday, June 1, 2021, 2:00–3:00 PM

Venue: Shahe E404 (in person); Tencent Meeting ID 127 416 879 (online)


Abstract: Symmetric tensor decomposition is of great importance in applications. Several studies have employed a greedy approach for tensors of order m > 2: first find a best rank-one approximation of the given tensor, subtract the corresponding component, and repeat the process. In this talk we focus on finding a best rank-one approximation of an order-3 symmetric tensor, formulated as a nonconvex optimization problem. First, we give a geometric landscape analysis of the nonconvex objective function. In particular, we show that any local minimizer must be a factor in the low-rank decomposition, and that all other critical points are linear combinations of the factors. Then, starting from a random initialization, we solve the nonconvex optimization problem by gradient descent. We prove that the algorithm converges to one factor of the CP decomposition. Combined with the landscape analysis, this result shows that the greedy algorithm, with randomly initialized gradient descent, recovers the CP low-rank decomposition of a symmetric tensor. Numerical results agree with the theoretical analysis.
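To make the greedy pipeline in the abstract concrete (a rank-one fit by randomly initialized gradient descent, then deflation), here is a minimal NumPy sketch. It is an illustrative assumption, not the speaker's implementation: the function names rank_one_gd and greedy_cp, the step size, and the stopping rule are hypothetical choices for the objective f(x) = ||T - x⊗x⊗x||_F^2, whose gradient for symmetric T is 6||x||^4 x - 6 T(·, x, x).

```python
import numpy as np

def rank_one_gd(T, step=0.02, iters=10000, tol=1e-10, seed=None):
    """Gradient descent with random initialization on the nonconvex
    objective f(x) = ||T - x (outer) x (outer) x||_F^2, where T is a
    symmetric order-3 tensor. Step size and budget are assumptions."""
    rng = np.random.default_rng(seed)
    n = T.shape[0]
    x = rng.standard_normal(n) / np.sqrt(n)        # random initialization
    for _ in range(iters):
        Txx = np.einsum('ijk,j,k->i', T, x, x)     # contraction T(., x, x)
        grad = 6.0 * np.dot(x, x) ** 2 * x - 6.0 * Txx
        if np.linalg.norm(grad) < tol:
            break
        x = x - step * grad
    return x

def greedy_cp(T, r, **kw):
    """Greedy deflation: find one rank-one component, subtract, repeat."""
    factors, R = [], T.copy()
    for _ in range(r):
        x = rank_one_gd(R, **kw)
        factors.append(x)
        R = R - np.einsum('i,j,k->ijk', x, x, x)   # remove found component
    return factors

# Quick check on a synthetic orthogonally decomposable tensor.
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((8, 3)))   # 3 orthonormal factors
T = sum(np.einsum('i,j,k->ijk', q, q, q) for q in Q.T)
for x in greedy_cp(T, 3, seed=1):
    print(max(abs(np.dot(x, q)) for q in Q.T))     # ~1 if a factor is found
```

On such an orthogonally decomposable test tensor, each recovered vector should align (up to numerical error) with one of the planted factors, mirroring the talk's claim that randomly initialized gradient descent converges to a CP factor.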


Speaker bio: Haixia Liu received her Ph.D. from The Chinese University of Hong Kong, supervised by Prof. Raymond Chan. Before joining HUST, she worked as a postdoctoral researcher at The Hong Kong University of Science and Technology, mentored by Prof. Yang Wang. Her research interests focus on algorithm design and theoretical analysis for data science problems and their applications.


Host: Jiaxin Xie

