Academic Talk: Best Rank-One Approximation of Higher-Order Multi-Partially Symmetric Tensors by Neural Network
Posted: July 4, 2018

Speaker: Associate Professor Wang Xuezhong (王学忠)

        Hexi University

Title: Best Rank-One Approximation of Higher-Order Multi-Partially Symmetric Tensors by Neural Network

Time: 16:00 (4:00 p.m.), July 6, 2018

Venue: Room 105, Haiyun Experimental Building

Abstract: Our purpose is to compute the multi-partially symmetric rank-one approximations of higher-order multi-partially symmetric tensors. A special case is the partially symmetric rank-one approximation of fourth-order partially symmetric tensors, which is related to the biquadratic optimization problem. For this special case, we implement the neural network model via ordinary differential equations (ODEs), which yields a class of continuous-time recurrent neural networks. Several properties of the network states are established. We prove that the solution of the ODE is locally asymptotically stable by constructing an appropriate Lyapunov function under mild conditions. Similarly, we consider how to compute the multi-partially symmetric rank-one approximations of multi-partially symmetric tensors via neural networks. Finally, we define the restricted M-singular values and the corresponding restricted M-singular vectors of higher-order multi-partially symmetric tensors and design neural network models to compute them. Numerical results show that the neural network models are efficient.
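To illustrate the biquadratic setting mentioned in the abstract, the sketch below runs a simple continuous-time projected-gradient flow for a fourth-order partially symmetric tensor A in R^{n×n×m×m} with A[i,j,k,l] = A[j,i,k,l] = A[i,j,l,k], integrated with explicit Euler steps. This is a minimal illustrative sketch only; the function names, step size, and discretization are assumptions, and it is not the speaker's actual neural network model or stability analysis.

import numpy as np

def gradients(A, x, y):
    # Partial contractions of the partially symmetric tensor A (up to a factor 2,
    # these are the partial gradients of f(x, y) = sum A[i,j,k,l] x_i x_j y_k y_l).
    g_x = np.einsum('ijkl,j,k,l->i', A, x, y, y)
    g_y = np.einsum('ijkl,i,j,l->k', A, x, x, y)
    return g_x, g_y, x @ g_x  # last entry is the objective value f(x, y)

def rank_one_flow(A, steps=5000, dt=1e-2, seed=0):
    # Illustrative projected-gradient flow on the unit spheres ||x|| = ||y|| = 1,
    # discretized by explicit Euler steps (hypothetical parameters).
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(A.shape[0]); x /= np.linalg.norm(x)
    y = rng.standard_normal(A.shape[2]); y /= np.linalg.norm(y)
    for _ in range(steps):
        g_x, g_y, f = gradients(A, x, y)
        x += dt * (g_x - (x @ g_x) * x)  # flow direction projected onto the sphere's tangent space
        y += dt * (g_y - (y @ g_y) * y)
        x /= np.linalg.norm(x); y /= np.linalg.norm(y)  # renormalize against numerical drift
    _, _, f = gradients(A, x, y)
    return f, x, y  # candidate rank-one approximation: f * (x ⊗ x ⊗ y ⊗ y)

Running the flow from a few random starting points and keeping the largest objective value gives a candidate partially symmetric rank-one approximation; the ODE models and Lyapunov-based stability results presented in the talk go beyond this sketch.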

About the speaker: Wang Xuezhong, male, Ph.D., Associate Professor. He received his Ph.D. from the School of Mathematical Sciences, Fudan University, in June 2018, and then joined the School of Mathematics and Statistics, Hexi University. His research interests include numerical linear algebra, tensor analysis, and neural networks. He has published more than 20 papers in major academic journals such as Neural Computation, Neurocomputing, and Journal of Computational and Applied Mathematics.

Contact: Professor Bai Zhengjian (白正简)

All faculty members and students are welcome to attend!