Rank-Restricted Hierarchical Alternating Least Squares Algorithm for Matrix Completion with Applications
Abstract
1. Introduction
2. Related Works and Our Contribution
- A novel optimal relaxation of (3) that enforces adjustable sparsity and incorporates an orthogonality constraint is proposed, yielding more accurate and computationally efficient results than the procedure in (9). Specifically, the orthogonality constraint is included to enhance computational efficiency, while a new shrinkage function is derived to enforce sparsity in the solution.
- To improve computational efficiency within a single iteration, we apply hierarchical alternating minimization to the proposed model, enabling faster computation through column-wise updates rather than full-matrix updates. This method remains effective even with a relatively rough estimate of the rank and is faster than the procedure in (7).
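The column-wise (hierarchical) update scheme described above can be sketched generically. The following is a minimal HALS-style completion loop, not the paper's exact algorithm (which additionally imposes an orthogonality constraint and a shrinkage step); the function name, the random initialization, and the re-imputation strategy for unobserved entries are illustrative assumptions.

```python
import numpy as np

def hals_completion(M, mask, r=5, iters=50):
    """Generic HALS-style matrix completion sketch (illustrative, not the paper's method).

    M    : matrix with observed entries (unobserved entries may hold anything)
    mask : boolean array, True where M is observed
    r    : rough estimate of the target rank
    """
    m, n = M.shape
    rng = np.random.default_rng(0)
    U = rng.standard_normal((m, r))
    V = rng.standard_normal((n, r))
    # Working matrix: observed entries fixed, missing entries imputed from the factors.
    X = np.where(mask, M, U @ V.T)
    for _ in range(iters):
        # Hierarchical updates: refresh one rank-one factor at a time,
        # holding the other r-1 factors fixed.
        for j in range(r):
            # Residual excluding the j-th rank-one term; independent of U[:, j], V[:, j].
            R = X - U @ V.T + np.outer(U[:, j], V[:, j])
            U[:, j] = R @ V[:, j] / (V[:, j] @ V[:, j] + 1e-12)
            V[:, j] = R.T @ U[:, j] / (U[:, j] @ U[:, j] + 1e-12)
        # Re-impose the observed entries after the column sweep.
        X = np.where(mask, M, U @ V.T)
    return U, V, X
```

Each column update is a one-dimensional least-squares problem, which is why a rough rank estimate suffices: overestimating r only adds cheap rank-one sweeps rather than inflating a full-matrix solve.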
3. Rank-Restricted Hierarchical Alternating Least Squares Algorithm
3.1. Rank-Restricted Hierarchical Alternating Least Squares with Orthogonality Constraint
3.2. Sparsity Constraint and Imposing the Boundary Condition
Algorithm 1: Rank-Restricted Hierarchical Alternating Least Squares for matrix completion. (The mathematical expressions in the original listing did not survive extraction; only the control structure is recoverable.) After initialization, each outer iteration k first computes the quantities defined in (14); the two factor matrices are then updated column by column for j = 1, 2, …, r; a shrinkage step, using the shrinkage operator defined in (18), enforces sparsity; the iterate is updated, and the outer loop terminates once the stopping criterion is satisfied. The final iterate is returned.
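The paper's shrinkage operator is defined in (18) and is not reproduced here. As a point of reference for what such a step looks like, the classic soft-thresholding shrinkage used throughout sparse recovery is shown below; this is a standard prototype, not necessarily the operator derived in the paper.

```python
import numpy as np

def soft_shrink(x, tau):
    # Soft-thresholding: shrink each entry toward zero by tau,
    # zeroing out entries with magnitude below tau.
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)
```

Entries smaller than the threshold are set exactly to zero, which is how a shrinkage step converts a dense iterate into a sparse one with an adjustable sparsity level (via tau).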
3.3. Computational and Memory Complexity
4. Numerical Experiments
4.1. Image Completion Problem
4.2. Recommender System
5. Conclusions and Future Works
Supplementary Materials
Funding
Data Availability Statement
Conflicts of Interest
Abbreviations
HALMC | Hierarchical Alternating Least Squares for Matrix Completion |
References
- Ramlatchan, A.; Yang, M.; Liu, Q.; Li, M.; Wang, J.; Li, Y. A Survey of Matrix Completion Methods for Recommendation Systems. Big Data Min. Anal. 2018, 1, 308–323. [Google Scholar] [CrossRef]
- Chen, Z.; Wang, S. A review on matrix completion for recommender systems. Knowl. Inf. Syst. 2022, 64, 1–34. [Google Scholar] [CrossRef]
- Jam, J.; Kendrick, C.; Walker, K.; Drouard, V.; Hsu, J.G.; Yap, M.H. A comprehensive review of past and present image inpainting methods. Comput. Vis. Image Underst. 2021, 203, 103147. [Google Scholar] [CrossRef]
- Li, J.; Li, M.; Fan, H. Image Inpainting Algorithm Based on Low-Rank Approximation and Texture Direction. Math. Prob. Eng. 2014. [Google Scholar] [CrossRef]
- Xu, J.; Chen, Y.; Zhang, X. Color image inpainting based on low-rank quaternion matrix factorization. J. Ind. Manag. Optim. 2024, 20, 825–837. [Google Scholar] [CrossRef]
- Fan, J.; Cheung, J. Matrix completion by deep matrix factorization. Neural Netw. 2018, 98, 34–41. [Google Scholar] [CrossRef] [PubMed]
- Xu, M.; Jin, R.; Zhou, Z. Speedup matrix completion with side information: Application to multi-label learning. In Proceedings of the NIPS’13: 26th International Conference on Neural Information Processing Systems, Lake Tahoe, NV, USA, 5–10 December 2013; Volume 2, pp. 2301–2309. [Google Scholar]
- Radhakrishnan, A.; Stefanakis, G.; Belkin, M.; Uhler, C. Simple, fast, and flexible framework for matrix completion with infinite width neural networks. Proc. Natl. Acad. Sci. USA 2022, 119, e2115064119. [Google Scholar] [CrossRef] [PubMed]
- Candès, E.; Eldar, Y.; Strohmer, T. Phase retrieval via matrix completion. SIAM Rev. 2015, 52, 225–251. [Google Scholar] [CrossRef]
- Kalogerias, D.S.; Petropulu, A.P. Matrix Completion in Colocated MIMO Radar: Recoverability, Bounds & Theoretical Guarantees. IEEE Trans. Signal Process. 2013, 62, 309–321. [Google Scholar] [CrossRef]
- Sun, S.; Zhang, Y.D. 4D Automotive Radar Sensing for Autonomous Vehicles: A Sparsity-Oriented Approach. IEEE J. Sel. Top. Signal Process. 2021, 15, 879–891. [Google Scholar] [CrossRef]
- Ha, J.; Li, C.; Luo, X.; Wang, Z. Matrix completion via modified Schatten 2/3-norm. Eurasip J. Adv. Signal Process. 2023, 2023, 62. [Google Scholar] [CrossRef]
- Tanner, J.; Wei, K. Low rank matrix completion by alternating steepest descent methods. Appl. Comput. Harmon. Anal. 2016, 40, 417–429. [Google Scholar] [CrossRef]
- Cai, J.F.; Candès, E.J.; Shen, Z. A Singular Value Thresholding Algorithm for Matrix Completion. SIAM J. Optim. 2010, 20, 1956–1982. [Google Scholar] [CrossRef]
- Recht, B.; Fazel, M.; Parrilo, P.A. Guaranteed minimum-rank solution of linear matrix equations via nuclear norm minimization. SIAM Rev. 2010, 52, 471–501. [Google Scholar] [CrossRef]
- Shi, Q.; Lu, H.; Cheung, Y. Rank-One Matrix Completion With Automatic Rank Estimation via L1-Norm Regularization. IEEE Trans. Neural Netw. Learn. Syst. 2017, 29, 4744–4757. [Google Scholar] [CrossRef] [PubMed]
- Xiao, J.; Huang, T.; Deng, L.; Dou, H. A Novel Nonconvex Rank Approximation with Application to the Matrix Completion. East Asian J. Appl. Math. 2025, 15, 741–769. [Google Scholar] [CrossRef]
- Cui, A.; Peng, J.; Li, H. Exact recovery low-rank matrix via transformed affine matrix rank minimization. Neurocomputing 2018, 319, 1–12. [Google Scholar] [CrossRef]
- Josse, J.; Sardy, S. Adaptive shrinkage of singular values. Stat. Comput. 2016, 26, 715–724. [Google Scholar] [CrossRef]
- Zhang, H.; Gong, C.; Qian, J.; Zhang, B.; Xu, C.; Yang, J. Efficient recovery of low-rank via double nonconvex nonsmooth rank minimization. IEEE Trans. Neural Netw. Learn. Syst. 2019, 30, 2916–2925. [Google Scholar] [CrossRef]
- Hastie, T.; Mazumder, R.; Lee, J.D.; Zadeh, R. Matrix Completion and Low-Rank SVD via Fast Alternating Least Squares. J. Mach. Learn. Res. 2015, 16, 3367–3402. [Google Scholar]
- Li, C.; Che, H.; Leung, M.F.; Liu, C.; Yan, Z. Robust multi-view non-negative matrix factorization with adaptive graph and diversity constraints. Inf. Sci. 2023, 634, 587–607. [Google Scholar] [CrossRef]
- Yang, X.; Che, H.; Leung, M.F.; Liu, C. Adaptive graph nonnegative matrix factorization with the self-paced regularization. Appl. Intell. 2023, 53, 15818–15835. [Google Scholar] [CrossRef]
- Xu, K.; Zhang, Y.; Dong, Z.; Li, Z.; Fang, B. Hybrid Matrix Completion Model for Improved Images Recovery and Recommendation Systems. IEEE Access 2021, 9, 149349–149359. [Google Scholar] [CrossRef]
- Xu, D.; Ruan, C.; Korpeoglu, E.; Kumar, S.; Achan, K. Rethinking Neural vs. Matrix-Factorization Collaborative Filtering: The Theoretical Perspectives. In Proceedings of the 38th International Conference on Machine Learning, Virtual, 18–24 July 2021; PMLR: Cambridge, UK, 2021; Volume 139. [Google Scholar]
- Ding, C.; Li, T.; Peng, W.; Park, H. Orthogonal nonnegative matrix t-factorizations for clustering. In Proceedings of the KDD ’06: 12th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, Philadelphia, PA, USA, 20–23 August 2006; pp. 126–135. [Google Scholar]
- Gan, J.; Liu, T.; Li, L.; Zhang, J. Non-negative Matrix Factorization: A Survey. Comput. J. 2021, 64, 1080–1092. [Google Scholar] [CrossRef]
- Kimura, K.; Tanaka, Y.; Kudo, M. A Fast Hierarchical Alternating Least Squares Algorithm for Orthogonal Nonnegative Matrix Factorization. In Proceedings of the Sixth Asian Conference on Machine Learning, Ho Chi Minh City, Vietnam, 20–22 November 2015; Volume 39, pp. 129–141. [Google Scholar]
- Cichocki, A.; Zdunek, R.; Phan, A.H.; Amari, S.I. Nonnegative Matrix and Tensor Factorizations: Applications to Exploratory Multi-Way Data Analysis and Blind Source Separation; John Wiley & Sons, Ltd.: Hoboken, NJ, USA, 2009. [Google Scholar]
- Mairal, J.; Bach, F.; Ponce, J.; Sapiro, G.; Zisserman, A. Non-local sparse models for image restoration. In Proceedings of the IEEE 12th International Conference on Computer Vision, Kyoto, Japan, 29 September–2 October 2009; pp. 2272–2279. [Google Scholar]
- Dabov, K.; Foi, A.; Katkovnik, V.; Egiazarian, K. Image Denoising by Sparse 3-D Transform-Domain Collaborative Filtering. IEEE Trans. Image Process. 2007, 16, 2080–2095. [Google Scholar] [CrossRef]
- Blumensath, T.; Davies, M.E. Iterative hard thresholding for compressed sensing. Appl. Comput. Harmon. Anal. 2009, 27, 265–274. [Google Scholar] [CrossRef]
- Hansen, P.C.; Nagy, J.G.; O’Leary, D.P. Deblurring Images: Matrices, Spectra, and Filtering; SIAM: Philadelphia, PA, USA, 2006. [Google Scholar]
- Fan, Y.W.; Nagy, J.G. Synthetic boundary conditions for image deblurring. Linear Algebra Appl. 2011, 434, 2244–2268. [Google Scholar] [CrossRef]
- Zhou, X.; Zhou, F.; Bai, X.; Xue, B. A boundary condition based deconvolution framework for image deblurring. J. Comput. Appl. Math. 2014, 261, 14–29. [Google Scholar] [CrossRef]
- Nguyen, L.T.; Kim, J.; Shim, B. Low-Rank Matrix Completion: A Contemporary Survey. IEEE Access 2019, 7, 94215–94237. [Google Scholar] [CrossRef]
- Bertalmio, M.; Sapiro, G.; Caselles, V.; Ballester, C. Image inpainting. In Proceedings of the 27th Annual Conference on Computer Graphics and Interactive Techniques, New Orleans, LA, USA, 23–28 July 2000; ACM Press/Addison-Wesley Co.: New Orleans, LA, USA, 2000; pp. 417–424. [Google Scholar]
- Chan, T.F.; Shen, J. Nontexture inpainting by curvature-driven diffusions. J. Vis. Commun. Image Represent. 2001, 12, 436–449. [Google Scholar] [CrossRef]
- Zhang, L.; Zhang, L.; Mou, X.; Zhang, D. FSIM: A Feature Similarity Index for Image Quality Assessment. IEEE Trans. Image Process. 2011, 20, 2378–2386. [Google Scholar] [CrossRef]
Missing Rate | Method | Iter. | Time | MSE | PSNR | SSIM | FSIM
---|---|---|---|---|---|---|---
30% | incomp. | - | - | 0.2976 | 12.9949 | 0.9971 | 0.9468
 | SVT | 25.1 | 0.0405 | 0.0161 | 25.7045 | 0.9999 | 0.9962
 | SoftImp | 8.2 | 0.0111 | 0.0169 | 25.4353 | 0.9998 | 0.9961
 | LAMC | 102.6 | 4.7897 | 0.3074 | 13.0869 | 0.9953 | 0.9695
 | HALMC | 50.5 | 0.0245 | 0.0173 | 25.3861 | 0.9999 | 0.9954
 | TFR | 16.2 | 0.0259 | 0.0129 | 26.6635 | 0.9999 | 0.9968
50% | incomp. | - | - | 0.5015 | 10.7283 | 0.9931 | 0.9209
 | SVT | 21.4 | 0.0364 | 0.0431 | 21.3984 | 0.9996 | 0.9888
 | SoftImp | 19.3 | 0.0184 | 0.0405 | 21.6562 | 0.9996 | 0.9894
 | LAMC | 99.2 | 4.8873 | 0.3030 | 13.1135 | 0.9955 | 0.9667
 | HALMC | 26.7 | 0.0141 | 0.0377 | 21.9917 | 0.9997 | 0.9893
 | TFR | 32.0 | 0.0504 | 0.0319 | 22.7804 | 0.9998 | 0.9911
70% | incomp. | - | - | 0.69903 | 9.2864 | 0.9875 | 0.9016
 | SVT | 500.0 | 0.7911 | 0.1527 | 16.0030 | 0.9979 | 0.9616
 | SoftImp | 17.3 | 0.0147 | 0.0947 | 17.9690 | 0.9988 | 0.9724
 | LAMC | 93.0 | 4.7726 | 0.2925 | 13.2024 | 0.9957 | 0.9633
 | HALMC | 22.0 | 0.0112 | 0.0880 | 18.2990 | 0.9991 | 0.9729
 | TFR | 67.1 | 0.1028 | 0.0908 | 18.2937 | 0.9993 | 0.9738
Missing Rate | Method | Iter. | Time | MSE | PSNR | SSIM | FSIM
---|---|---|---|---|---|---|---
30% | incomp. | - | - | 0.2998 | 10.5942 | 0.9950 | 0.8889
 | SVT | 178.3 | 1.1371 | 0.0082 | 26.2113 | 0.9998 | 0.9915
 | SoftImp | 8.8 | 0.0347 | 0.0073 | 26.7184 | 0.9998 | 0.9905
 | LAMC | 71.7 | 18.7550 | 0.2675 | 11.5962 | 0.9928 | 0.9379
 | HALMC | 86.9 | 0.1484 | 0.0038 | 29.5150 | 0.9999 | 0.9959
 | TFR | 35.0 | 0.2162 | 0.0039 | 29.3969 | 0.9999 | 0.9960
50% | incomp. | - | - | 0.4998 | 8.3751 | 0.9881 | 0.8663
 | SVT | 161.3 | 1.0556 | 0.0398 | 19.3630 | 0.9992 | 0.9625
 | SoftImp | 13.3 | 0.0481 | 0.0142 | 23.8497 | 0.9997 | 0.9819
 | LAMC | 66.8 | 18.4806 | 0.2593 | 11.6512 | 0.9929 | 0.9367
 | HALMC | 60.1 | 0.1064 | 0.0097 | 25.4886 | 0.9999 | 0.9879
 | TFR | 59.0 | 0.3657 | 0.0127 | 24.3281 | 0.9998 | 0.9850
70% | incomp. | - | - | 0.6999 | 6.9127 | 0.9783 | 0.8561
 | SVT | 500.0 | 3.2191 | 0.0352 | 20.0334 | 0.9994 | 0.9578
 | SoftImp | 21.7 | 0.0578 | 0.0284 | 20.8240 | 0.9993 | 0.9632
 | LAMC | 61.4 | 18.7958 | 0.2395 | 11.8678 | 0.9931 | 0.9363
 | HALMC | 55.3 | 0.0458 | 0.0219 | 21.9476 | 0.9996 | 0.9689
 | TFR | 101.3 | 0.6076 | 0.0650 | 17.2651 | 0.9987 | 0.9345
Missing Rate | Method | Iter. | Time | MSE | PSNR | SSIM | FSIM
---|---|---|---|---|---|---|---
30% | incomp. | - | - | 0.2998 | 11.1183 | 0.9956 | 0.9342
 | SVT | 266.5 | 7.8123 | 0.0040 | 29.8465 | 0.9999 | 0.9988
 | SoftImp | 10.4 | 0.1647 | 0.0104 | 25.7064 | 0.9998 | 0.9949
 | LAMC | 96.0 | 97.9771 | 0.2373 | 12.8977 | 0.9945 | 0.9772
 | HALMC | 22.5 | 0.2248 | 0.0072 | 27.3277 | 0.9999 | 0.9970
 | TFR | 53.5 | 1.4965 | 0.0107 | 25.5943 | 0.9999 | 0.9958
50% | incomp. | - | - | 0.5002 | 8.8956 | 0.9895 | 0.9026
 | SVT | 248.5 | 7.5049 | 0.0114 | 25.3140 | 0.9998 | 0.9961
 | SoftImp | 17.1 | 0.2751 | 0.0189 | 23.1023 | 0.9997 | 0.9907
 | LAMC | 89.0 | 99.2489 | 0.2256 | 12.9940 | 0.9946 | 0.9766
 | HALMC | 53.0 | 0.5171 | 0.0125 | 24.9203 | 0.9998 | 0.9952
 | TFR | 189.3 | 5.3344 | 0.0154 | 24.1440 | 0.9998 | 0.9939
70% | incomp. | - | - | 0.6998 | 7.4372 | 0.9809 | 0.8717
 | SVT | 500.0 | 14.0607 | 0.0376 | 20.1641 | 0.9994 | 0.9832
 | SoftImp | 29.0 | 0.3896 | 0.0374 | 20.1628 | 0.9992 | 0.9804
 | LAMC | 78.9 | 98.6612 | 0.2235 | 12.9410 | 0.9945 | 0.9745
 | HALMC | 18.7 | 0.1928 | 0.0368 | 20.2316 | 0.9994 | 0.9820
 | TFR | 220.1 | 5.9861 | 0.0640 | 17.8678 | 0.9991 | 0.9714
Missing Rate | Method | Iter. | Time | MSE | PSNR | SSIM | FSIM
---|---|---|---|---|---|---|---
30% | incomp. | - | - | 0.3001 | 14.2484 | 0.9978 | 0.9849
 | SVT | 500.0 | 68.8015 | 0.0125 | 28.0383 | 0.9999 | 0.9995
 | SoftImp | 12.0 | 1.2559 | 0.0173 | 26.6350 | 0.9998 | 0.9978
 | LAMC | 76.0 | 369.7370 | 0.1622 | 16.9316 | 0.9978 | 0.9867
 | HALMC | 56.1 | 1.8325 | 0.0105 | 28.8003 | 0.9999 | 0.9995
 | TFR | 72.7 | 10.6755 | 0.0137 | 27.6484 | 0.9999 | 0.9992
50% | incomp. | - | - | 0.4997 | 12.0339 | 0.9948 | 0.9687
 | SVT | 500.0 | 72.2569 | 0.0296 | 24.3104 | 0.9998 | 0.9980
 | SoftImp | 19.0 | 1.7221 | 0.0288 | 24.4284 | 0.9997 | 0.9952
 | LAMC | 73.0 | 370.1930 | 0.1447 | 17.4208 | 0.9982 | 0.9872
 | HALMC | 54.4 | 1.7604 | 0.0191 | 26.2006 | 0.9999 | 0.9984
 | TFR | 81.0 | 11.9213 | 0.2502 | 15.5245 | 0.9981 | 0.9822
70% | incomp. | - | - | 0.7002 | 10.5691 | 0.9905 | 0.9454
 | SVT | 500.0 | 71.9124 | 0.0523 | 21.8857 | 0.9997 | 0.9940
 | SoftImp | 29.0 | 2.1861 | 0.0494 | 22.0774 | 0.9995 | 0.9918
 | LAMC | 74.0 | 652.0080 | 0.1294 | 17.9027 | 0.9984 | 0.9876
 | HALMC | 37.0 | 1.2327 | 0.0393 | 23.0746 | 0.9997 | 0.9936
 | TFR | 82.0 | 11.8185 | 0.4199 | 13.2978 | 0.9957 | 0.9633
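The image-completion tables above report MSE and PSNR alongside SSIM and FSIM. For reference, a minimal sketch of the standard MSE and PSNR definitions follows, assuming images scaled to [0, peak]; the paper's exact normalization of "MSE" may differ from the plain pixel-wise mean used here.

```python
import numpy as np

def mse(x, y):
    # Pixel-wise mean squared error between two equally sized images.
    return float(np.mean((x - y) ** 2))

def psnr(x, y, peak=1.0):
    # Peak signal-to-noise ratio in dB; higher means a closer reconstruction.
    return float(10.0 * np.log10(peak ** 2 / mse(x, y)))
```

SSIM and FSIM are perceptual similarity indices with more involved definitions (see the FSIM reference by Zhang et al. above) and are typically taken from an existing implementation rather than re-derived.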
Missing Rate | Method | Iter. | Time | MAE | RMSE
---|---|---|---|---|---
20% | incomp. | - | - | 3.5280 | 3.7034
 | SVT | 210.90 | 30.1586 | 1.0732 | 1.2894
 | SoftImp | 50.00 | 4.3338 | 1.0772 | 1.2988
 | TSNMR | 500.00 | 88.6608 | 0.9778 | 1.1729
 | HALMC | 22.00 | 0.8978 | 0.9848 | 1.1927
 | TFR | 169.67 | 24.7094 | 1.1993 | 1.4351
40% | incomp. | - | - | 3.5274 | 3.7032
 | SVT | 185.00 | 24.9133 | 1.1314 | 1.3569
 | SoftImp | 52.00 | 4.1866 | 1.1479 | 1.3789
 | TSNMR | 500.00 | 83.1942 | 1.0178 | 1.2298
 | HALMC | 22.40 | 0.9102 | 1.0152 | 1.2281
 | TFR | 162.50 | 23.3946 | 1.2675 | 1.5077
60% | incomp. | - | - | 3.5291 | 3.7044
 | SVT | 155.20 | 21.0388 | 1.2425 | 1.4824
 | SoftImp | 56.00 | 4.4691 | 1.2589 | 1.5029
 | TSNMR | 500.00 | 84.1506 | 1.2208 | 1.4323
 | HALMC | 23.00 | 0.9184 | 1.0789 | 1.3025
 | TFR | 174.00 | 24.1157 | 1.3800 | 1.6268
Missing Rate | Method | Iter. | Time | MAE | RMSE
---|---|---|---|---|---
20% | incomp. | - | - | 3.5812 | 3.7516
 | SVT | 500.00 | 6399.8700 | 1.0244 | 1.2265
 | SoftImp | 51.00 | 268.8490 | 1.2549 | 1.4883
 | TSNMR | 500.00 | 6860.5200 | 0.9359 | 1.1240
 | HALMC | 29.00 | 21.6903 | 1.0621 | 1.2591
 | TFR | 27.00 | 358.8730 | 2.6127 | 2.8424
40% | incomp. | - | - | 3.5819 | 3.7519
 | SVT | 500.00 | 6401.6500 | 1.0884 | 1.2999
 | SoftImp | 55.00 | 285.1180 | 1.3028 | 1.5395
 | TSNMR | 500.00 | 6849.0600 | 0.9466 | 1.1321
 | HALMC | 28.50 | 21.6172 | 1.0820 | 1.2827
 | TFR | 35.00 | 463.2450 | 2.4723 | 2.7358
60% | incomp. | - | - | 3.5813 | 3.7515
 | SVT | 434.00 | 5529.1200 | 1.1519 | 1.3707
 | SoftImp | 60.00 | 296.5940 | 1.4163 | 1.6599
 | TSNMR | 500.00 | 6821.0300 | 0.9495 | 1.1301
 | HALMC | 29.50 | 22.2253 | 1.1213 | 1.3269
 | TFR | 46.00 | 606.4590 | 2.4454 | 2.6799
Missing Rate | Method | Iter. | Time | MAE | RMSE
---|---|---|---|---|---
20% | incomp. | - | - | 7.9513 | 8.1353
 | SVT | 500.00 | 3866.0800 | 6.4336 | 6.6708
 | SoftImp | 52.20 | 261.1780 | 5.4328 | 5.7538
 | TSNMR | 500.00 | 4775.2700 | 2.9857 | 3.3236
 | HALMC | 91.80 | 72.7517 | 3.0800 | 3.4051
 | TFR | 200.00 | 1750.8500 | 5.1806 | 5.4939
40% | incomp. | - | - | 7.9601 | 8.1434
 | SVT | 500.00 | 3863.8100 | 6.9228 | 7.1359
 | SoftImp | 50.00 | 255.3880 | 5.7713 | 6.0657
 | TSNMR | 500.00 | 4682.0200 | 2.9569 | 3.3006
 | HALMC | 85.50 | 68.5895 | 3.6792 | 4.0054
 | TFR | 211.25 | 1756.0200 | 5.5961 | 5.8833
60% | incomp. | - | - | 7.9568 | 8.1399
 | SVT | 500.00 | 3805.2900 | 7.3241 | 7.5232
 | SoftImp | 47.00 | 242.5140 | 6.3374 | 6.5932
 | TSNMR | 500.00 | 4802.9000 | 2.9562 | 3.2997
 | HALMC | 80.00 | 64.4128 | 5.7102 | 5.9981
 | TFR | 204.00 | 1815.1400 | 5.7946 | 6.0672
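The recommender-system tables report MAE and RMSE on the recovered ratings. A minimal sketch of the standard definitions, assuming predictions are evaluated only on held-out observed entries (the usual protocol, though the paper's exact setup is not restated here):

```python
import numpy as np

def mae(pred, truth):
    # Mean absolute error over the evaluated entries.
    return float(np.mean(np.abs(pred - truth)))

def rmse(pred, truth):
    # Root mean squared error; penalizes large rating errors more than MAE.
    return float(np.sqrt(np.mean((pred - truth) ** 2)))
```

Because RMSE squares the residuals before averaging, RMSE ≥ MAE always holds, which is consistent with every row of the tables above.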
Citation
Lee, G. Rank-Restricted Hierarchical Alternating Least Squares Algorithm for Matrix Completion with Applications. Appl. Sci. 2025, 15, 8876. https://doi.org/10.3390/app15168876