Non-Convex Sparse and Low-Rank Based Robust Subspace Segmentation for Data Mining
Abstract: Parsimony, including sparsity and low-rankness, has proven important for data mining in social networks, particularly for tasks such as segmentation and recognition. Such models are traditionally fit with iterative algorithms that minimize an objective function under convex l1-norm or nuclear-norm constraints. However, the solutions obtained by convex optimization are usually suboptimal relative to those of the original sparse or low-rank problems. In this paper, a novel robust subspace segmentation algorithm is proposed that integrates lp-norm and Schatten p-norm constraints. The resulting affinity graph better captures both the local geometrical structure and the global information of the data; as a consequence, the algorithm is more generative, discriminative, and robust. An efficient linearized alternating direction method is derived to solve the model. Extensive segmentation experiments on public datasets show the proposed algorithm to be more effective and robust than five existing algorithms.
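The core non-convex ingredients the abstract names are an lp-norm (0 < p < 1) penalty and a Schatten p-norm penalty on the representation matrix. A minimal NumPy sketch of the two proximal steps such a solver typically relies on is below; the function names are hypothetical, and the generalized soft-thresholding fixed-point iteration is one common approach, not necessarily the authors' exact update:

```python
import numpy as np

def schatten_p_norm(A, p):
    """Schatten-p quasi-norm of A: (sum_i sigma_i^p)^(1/p)."""
    s = np.linalg.svd(A, compute_uv=False)
    return np.sum(s ** p) ** (1.0 / p)

def lp_shrink(x, tau, p, iters=20):
    """Approximate prox of tau*|.|^p via a generalized soft-thresholding
    fixed-point iteration. For p = 1 this reduces to ordinary soft
    thresholding: max(|x| - tau, 0) * sign(x)."""
    y = np.maximum(np.abs(x) - tau, 0.0)  # l1 shrinkage as initialization
    for _ in range(iters):
        # Fixed-point update; the floor on y guards the p-1 < 0 exponent,
        # and entries driven below the threshold collapse to exact zero.
        y = np.maximum(np.abs(x) - tau * p * np.maximum(y, 1e-12) ** (p - 1.0), 0.0)
    return np.sign(x) * y

def schatten_p_shrink(A, tau, p):
    """Approximate prox of tau*||.||_Sp^p: apply lp shrinkage to the
    singular values, keeping the singular vectors fixed."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return U @ np.diag(lp_shrink(s, tau, p)) @ Vt

# Toy demonstration: shrinking with p = 0.5 zeroes out the small
# singular value, lowering the rank of the representation matrix.
A = np.diag([5.0, 3.0, 0.1])
Z = schatten_p_shrink(A, tau=1.0, p=0.5)
print(np.linalg.matrix_rank(A), "->", np.linalg.matrix_rank(Z))

# In the segmentation pipeline, a coefficient matrix Z of this kind is
# symmetrized into an affinity graph W for spectral clustering:
W = (np.abs(Z) + np.abs(Z).T) / 2.0
```

With p < 1 the singular-value shrinkage penalizes large singular values less than the nuclear norm does, which is why the non-convex model can recover solutions closer to the original low-rank problem than its convex relaxation.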
Cheng, W.; Zhao, M.; Xiong, N.; Chui, K.T. Non-Convex Sparse and Low-Rank Based Robust Subspace Segmentation for Data Mining. Sensors 2017, 17, 1633.
Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.