Nonparametric Tensor Completion Based on Gradient Descent and Nonconvex Penalty
1 Department of Computer Science and Technology, Shantou University, 243 Daxue Road, Shantou 515063, China
2 Key Laboratory of Intelligent Manufacturing Technology (Shantou University), Ministry of Education, Shantou University, 243 Daxue Road, Shantou 515063, China
* Author to whom correspondence should be addressed.
Symmetry 2019, 11(12), 1512; https://doi.org/10.3390/sym11121512
Received: 7 November 2019 / Revised: 30 November 2019 / Accepted: 10 December 2019 / Published: 12 December 2019
(This article belongs to the Special Issue Iterative Numerical Functional Analysis with Applications)
Existing tensor completion methods all rely on hyperparameters, which largely determine their performance yet are difficult to tune. In this paper, we propose a novel nonparametric tensor completion method, which formulates tensor completion as an unconstrained optimization problem and solves it with an efficient iterative method. In each iteration, we not only estimate the missing entries with the aid of data correlation, but also account for the low-rank structure of the tensor and the convergence speed of the iteration. The iteration is based on gradient descent and approximates the descent direction using tensor matricization and singular value decomposition. Because every dimension of a tensor plays a symmetric role, the optimal unfolding direction may differ from one iteration to the next, so we select it in each iteration using the scaled latent nuclear norm. Moreover, we derive a formula for the iteration step size based on a nonconvex penalty. During the iterative process, we store the tensor in a sparse format and adopt the power method to compute the maximum singular value quickly. Experiments on image inpainting and link prediction show that our method is competitive with six state-of-the-art methods.
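The abstract mentions two concrete building blocks, mode-n matricization and the power method for the maximum singular value. The following is a minimal, hypothetical NumPy sketch of those two pieces, not the authors' implementation: it assumes a dense array (the paper stores the tensor sparsely), and the mode-selection example at the end simply picks the unfolding with the largest leading singular value as a stand-in for the paper's scaled-latent-nuclear-norm rule.

```python
# Hypothetical sketch of mode-n unfolding and the power method for the
# largest singular value; assumes a dense NumPy tensor.
import numpy as np

def unfold(tensor, mode):
    """Mode-`mode` matricization: move axis `mode` to the front, flatten the rest."""
    return np.moveaxis(tensor, mode, 0).reshape(tensor.shape[mode], -1)

def max_singular_value(A, n_iter=100, tol=1e-8):
    """Estimate the largest singular value of A by power iteration on A^T A."""
    rng = np.random.default_rng(0)
    v = rng.standard_normal(A.shape[1])
    v /= np.linalg.norm(v)
    sigma = 0.0
    for _ in range(n_iter):
        u = A @ v                      # left step
        sigma_new = np.linalg.norm(u)  # current singular-value estimate
        v = A.T @ u                    # right step (one pass of A^T A)
        v /= np.linalg.norm(v)
        if abs(sigma_new - sigma) < tol * max(sigma_new, 1.0):
            return sigma_new
        sigma = sigma_new
    return sigma

# Illustration only: choose the unfolding mode with the largest leading
# singular value (a stand-in for the scaled latent nuclear norm selection).
T = np.random.rand(10, 12, 8)
sigmas = [max_singular_value(unfold(T, m)) for m in range(T.ndim)]
best_mode = int(np.argmax(sigmas))
```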
This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
MDPI and ACS Style
Xu, K.; Xiong, Z. Nonparametric Tensor Completion Based on Gradient Descent and Nonconvex Penalty. Symmetry 2019, 11, 1512. https://doi.org/10.3390/sym11121512