Article

On the Redundancy in the Rank of Neural Network Parameters and Its Controllability

by Chanhee Lee, Young-Bum Kim, Hyesung Ji, Yeonsoo Lee, Yuna Hur and Heuiseok Lim
1 Department of Computer Science and Engineering, Korea University, 145, Anam-ro, Seongbuk-gu, Seoul 02841, Korea
2 Alexa AI, Amazon, 410 Terry Ave. North, Seattle, WA 98109-5210, USA
3 NC Soft Corp., 12, Daewangpangyo-ro 644beon-gil, Bundang-gu, Seongnam-si, Gyeonggi-do 13494, Korea
* Author to whom correspondence should be addressed.
Appl. Sci. 2021, 11(2), 725; https://doi.org/10.3390/app11020725
Received: 14 December 2020 / Revised: 7 January 2021 / Accepted: 12 January 2021 / Published: 13 January 2021
(This article belongs to the Section Computing and Artificial Intelligence)
In this paper, we show, both theoretically and empirically, that the parameters of a neural network can have redundancy in their ranks. When viewed as a function from one space to another, a neural network can exhibit feature correlation and slower training because of this redundancy. Motivated by this, we propose a novel regularization method that reduces the redundancy in the rank of parameters. It combines an objective function that drives a parameter toward rank-deficiency with a dynamic low-rank factorization algorithm that gradually shrinks the parameter by fusing linearly dependent vectors together. This regularization-by-pruning approach yields a neural network with better training dynamics and fewer trainable parameters. We also present experimental results that verify our claims. When applied to a neural network trained for image classification, the method provides a statistically significant improvement in accuracy and a 7.1-fold speedup in the number of training steps required. Furthermore, it has the side benefit of reducing the network size, yielding a model with 30.65% fewer trainable parameters.
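The abstract sketches two ingredients: a loss term that pushes a weight matrix toward rank-deficiency, and a factorization step that fuses linearly dependent vectors to shrink the parameter. The authors' actual objective and algorithm are given in the full text; the snippet below is only a minimal, hypothetical sketch of that general recipe in PyTorch, where the names rank_penalty and fuse_dependent_vectors and the thresholds k, tol, and lam are illustrative choices of ours, not the paper's.

```python
# Hypothetical sketch, not the authors' implementation: a penalty that
# encourages rank-deficiency, plus an SVD step that fuses linearly
# dependent directions into a smaller factorized parameter.
import torch

def rank_penalty(W: torch.Tensor, k: int) -> torch.Tensor:
    # Sum of singular values beyond the k-th: it is zero iff rank(W) <= k,
    # so minimizing it drives W toward rank-deficiency.
    s = torch.linalg.svdvals(W)
    return s[k:].sum()

@torch.no_grad()
def fuse_dependent_vectors(W: torch.Tensor, tol: float = 1e-3):
    # Keep only singular directions whose magnitude exceeds tol relative
    # to the largest one; near-linearly-dependent rows/columns collapse
    # into the retained directions.
    U, s, Vh = torch.linalg.svd(W, full_matrices=False)
    r = int((s > tol * s[0]).sum())       # effective rank
    A = U[:, :r] * s[:r]                  # (m, r) factor
    B = Vh[:r, :]                         # (r, n) factor
    return A, B                           # W ~= A @ B with r*(m+n) parameters

# Usage: add `lam * rank_penalty(layer.weight, k)` to the training loss,
# then periodically factorize to shrink the layer.
W = torch.randn(256, 512)
penalty = rank_penalty(W, k=64)
A, B = fuse_dependent_vectors(W)
print(penalty.item(), A.shape, B.shape)
```

Replacing a dense m-by-n matrix with two rank-r factors reduces the parameter count whenever r(m + n) < mn, which is the kind of size reduction the abstract reports.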
Keywords: matrix rank; neural network; pruning; redundancy; regularization
MDPI and ACS Style

Lee, C.; Kim, Y.-B.; Ji, H.; Lee, Y.; Hur, Y.; Lim, H. On the Redundancy in the Rank of Neural Network Parameters and Its Controllability. Appl. Sci. 2021, 11, 725. https://doi.org/10.3390/app11020725

AMA Style

Lee C, Kim Y-B, Ji H, Lee Y, Hur Y, Lim H. On the Redundancy in the Rank of Neural Network Parameters and Its Controllability. Applied Sciences. 2021; 11(2):725. https://doi.org/10.3390/app11020725

Chicago/Turabian Style

Lee, Chanhee, Young-Bum Kim, Hyesung Ji, Yeonsoo Lee, Yuna Hur, and Heuiseok Lim. 2021. "On the Redundancy in the Rank of Neural Network Parameters and Its Controllability" Applied Sciences 11, no. 2: 725. https://doi.org/10.3390/app11020725
