Open Access Article

Nonparametric Tensor Completion Based on Gradient Descent and Nonconvex Penalty

by Kai Xu 1 and Zhi Xiong 1,2,*
1 Department of Computer Science and Technology, Shantou University, 243 Daxue Road, Shantou 515063, China
2 Key Laboratory of Intelligent Manufacturing Technology (Shantou University), Ministry of Education, Shantou University, 243 Daxue Road, Shantou 515063, China
* Author to whom correspondence should be addressed.
Symmetry 2019, 11(12), 1512; https://doi.org/10.3390/sym11121512
Received: 7 November 2019 / Revised: 30 November 2019 / Accepted: 10 December 2019 / Published: 12 December 2019
(This article belongs to the Special Issue Iterative Numerical Functional Analysis with Applications)
Existing tensor completion methods all require hyperparameters, yet these hyperparameters largely determine each method's performance and are difficult to tune. In this paper, we propose a novel nonparametric tensor completion method, which formulates tensor completion as an unconstrained optimization problem and designs an efficient iterative method to solve it. In each iteration, we not only estimate the missing entries with the aid of data correlation, but also account for the low rank of the tensor and the convergence speed of the iteration. Our iteration is based on the gradient descent method and approximates the descent direction with tensor matricization and singular value decomposition. Because every dimension of a tensor plays a symmetric role, the optimal unfolding direction may differ from iteration to iteration, so we select it in each iteration using the scaled latent nuclear norm. Moreover, we design a step-size formula based on a nonconvex penalty. During the iterative process, we store the tensor in a sparse format and adopt the power method to compute the maximum singular value quickly. Experiments on image inpainting and link prediction show that our method is competitive with six state-of-the-art methods.
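The abstract only sketches the iteration at a high level. The following NumPy snippet is a minimal, illustrative reading of such a loop, not the authors' released code: the `mask` interface, the 5% singular-value cutoff, the step-size rule with the hypothetical penalty parameter `gamma`, and all function names are assumptions introduced here for illustration.

```python
import numpy as np

def unfold(X, mode):
    """Matricize tensor X along `mode` (mode-n unfolding)."""
    return np.moveaxis(X, mode, 0).reshape(X.shape[mode], -1)

def fold(M, mode, shape):
    """Inverse of `unfold`: restore the tensor from its mode-n unfolding."""
    full = [shape[mode]] + [s for i, s in enumerate(shape) if i != mode]
    return np.moveaxis(M.reshape(full), 0, mode)

def power_max_sigma(M, n_iter=50):
    """Estimate the largest singular value of M with the power method."""
    v = np.random.default_rng(0).standard_normal(M.shape[1])
    for _ in range(n_iter):
        v = M.T @ (M @ v)
        v /= np.linalg.norm(v)
    return np.linalg.norm(M @ v)

def complete(T_obs, mask, n_iter=100, gamma=5.0):
    """T_obs: tensor with zeros at missing entries; mask: boolean tensor of observed positions.
    gamma: hypothetical shape parameter of a nonconvex penalty used for the step size."""
    X = T_obs.copy()
    for _ in range(n_iter):
        # Choose the unfolding mode by a scaled-nuclear-norm score
        # (a stand-in for the scaled latent nuclear norm mentioned in the abstract).
        scores = [np.linalg.norm(unfold(X, m), 'nuc') / np.sqrt(X.shape[m])
                  for m in range(X.ndim)]
        mode = int(np.argmin(scores))

        # Low-rank surrogate of the descent direction via SVD of the chosen unfolding.
        M = unfold(X, mode)
        U, s, Vt = np.linalg.svd(M, full_matrices=False)
        sigma_max = power_max_sigma(M)            # fast estimate of the largest singular value
        step = 1.0 / (1.0 + sigma_max / gamma)    # hypothetical nonconvex-penalty-based step size
        r = max(1, int(np.sum(s > 0.05 * s[0])))  # keep only dominant singular values
        low_rank = fold(U[:, :r] @ np.diag(s[:r]) @ Vt[:r, :], mode, X.shape)

        # Gradient-descent-style update, then re-impose the observed entries.
        X = X + step * (low_rank - X)
        X[mask] = T_obs[mask]
    return X
```

A practical implementation would additionally keep the observed entries in a sparse data structure, as the abstract notes, rather than in the dense arrays used in this sketch.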
Keywords: tensor completion; iterative solution; nonparametric; gradient descent; nonconvex penalty
MDPI and ACS Style

Xu, K.; Xiong, Z. Nonparametric Tensor Completion Based on Gradient Descent and Nonconvex Penalty. Symmetry 2019, 11, 1512.
