Open Access Article
Algorithms 2016, 9(2), 28; doi:10.3390/a9020028

Alternating Direction Method of Multipliers for Generalized Low-Rank Tensor Recovery

School of Science, Xi’an University of Architecture and Technology, Xi’an 710055, China
* Author to whom correspondence should be addressed.
Academic Editor: Stephan Chalup
Received: 23 November 2015 / Revised: 26 March 2016 / Accepted: 13 April 2016 / Published: 19 April 2016
(This article belongs to the Special Issue Manifold Learning and Dimensionality Reduction)

Abstract

Low-Rank Tensor Recovery (LRTR), the higher-order generalization of Low-Rank Matrix Recovery (LRMR), is especially suitable for analyzing multi-linear data with gross corruptions, outliers and missing values, and it has attracted broad attention in the fields of computer vision, machine learning and data mining. This paper considers a generalized model of LRTR and attempts to simultaneously recover the low-rank, sparse and small-disturbance components from partial entries of a given data tensor. Specifically, we first formulate generalized LRTR as a tensor nuclear norm optimization problem that minimizes a weighted combination of the tensor nuclear norm, the l1-norm and the Frobenius norm under linear constraints. Then, the Alternating Direction Method of Multipliers (ADMM) is employed to solve the proposed minimization problem. Next, we discuss the weak convergence of the proposed iterative algorithm. Finally, experimental results on synthetic and real-world datasets validate the efficiency and effectiveness of the proposed method.
Keywords: low-rank tensor recovery; low-rank matrix recovery; nuclear norm minimization; alternating direction method of multipliers
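The model described in the abstract combines a tensor nuclear norm, an l1 penalty and a Frobenius-norm penalty under linear constraints, and ADMM works here because each block of variables has a closed-form proximal update. As a rough illustration only, the sketch below implements the fully observed matrix (single-unfolding) special case in NumPy; the function names, the parameter values lam, tau and mu, and the reduction from tensors to matrices are assumptions for illustration, not the algorithm or settings from the paper.

```python
# Minimal ADMM sketch for a simplified special case of the recovery model:
# given X = L + S + E, recover a low-rank part L, a sparse part S and a
# small dense disturbance E by minimizing
#   ||L||_* + lam*||S||_1 + tau*||E||_F^2   s.t.   L + S + E = X.
# Parameter values below are illustrative assumptions, not from the paper.
import numpy as np

def soft_threshold(A, t):
    """Entrywise soft-thresholding: proximal operator of the l1-norm."""
    return np.sign(A) * np.maximum(np.abs(A) - t, 0.0)

def svt(A, t):
    """Singular value thresholding: proximal operator of the nuclear norm."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return U @ np.diag(np.maximum(s - t, 0.0)) @ Vt

def admm_low_rank_sparse_noise(X, lam=0.1, tau=10.0, mu=1.0, n_iter=200):
    """Alternating proximal updates plus a dual ascent step on L + S + E = X."""
    L = np.zeros_like(X)
    S = np.zeros_like(X)
    E = np.zeros_like(X)
    Y = np.zeros_like(X)  # Lagrange multiplier for the equality constraint
    for _ in range(n_iter):
        # L-step: nuclear-norm proximal update on the current residual
        L = svt(X - S - E + Y / mu, 1.0 / mu)
        # S-step: l1 proximal update (entrywise shrinkage)
        S = soft_threshold(X - L - E + Y / mu, lam / mu)
        # E-step: closed-form ridge update for the Frobenius-norm penalty
        E = (mu * (X - L - S) + Y) / (2.0 * tau + mu)
        # Dual ascent on the constraint L + S + E = X
        Y = Y + mu * (X - L - S - E)
    return L, S, E

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, r = 100, 5
    low_rank = rng.standard_normal((n, r)) @ rng.standard_normal((r, n))
    sparse = 10.0 * (rng.random((n, n)) < 0.05) * rng.standard_normal((n, n))
    noise = 0.01 * rng.standard_normal((n, n))
    X = low_rank + sparse + noise
    L, S, E = admm_low_rank_sparse_noise(X)
    print("relative error of low-rank part:",
          np.linalg.norm(L - low_rank) / np.linalg.norm(low_rank))
```

Each sub-problem above is a standard proximal step (singular value thresholding, soft-thresholding, and a ridge-type closed form), which is the structural property that makes ADMM attractive for this family of recovery problems; extending the sketch to tensors would replace the single SVT step with updates on the mode-wise unfoldings.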
This is an open access article distributed under the Creative Commons Attribution (CC BY 4.0) license, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Share & Cite This Article

MDPI and ACS Style

Shi, J.; Yin, Q.; Zheng, X.; Yang, W. Alternating Direction Method of Multipliers for Generalized Low-Rank Tensor Recovery. Algorithms 2016, 9, 28.


Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.
