A Review of Binarized Neural Networks
Abstract: In this work, we review Binarized Neural Networks (BNNs). BNNs are deep neural networks that use binary values for activations and weights instead of full-precision values. With binary values, BNNs can execute computations using bitwise operations, which reduces execution time. BNN model sizes are also much smaller than their full-precision counterparts. While BNN models are generally less accurate than full-precision models, BNNs have been closing the accuracy gap and are becoming more accurate on larger datasets such as ImageNet. Their bitwise efficiency also makes BNNs good candidates for deep learning implementations on FPGAs and ASICs. We give a tutorial on the general BNN methodology and review various contributions, implementations, and applications of BNNs.
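The bitwise operations the abstract refers to are typically an XNOR followed by a population count, which together replace the multiply-accumulate of a real-valued dot product. The following is a minimal sketch of that idea, not the specific implementation of any work surveyed in the review; the function names and the toy vector length are illustrative assumptions.

```python
# Hedged sketch: XNOR-popcount dot product, the bitwise kernel that lets
# BNNs replace multiply-accumulate with cheap integer operations.
# All names here are illustrative, not taken from the reviewed works.

def binarize(x):
    """Map a real value to {-1, +1} via sign (0 maps to +1 by convention)."""
    return 1 if x >= 0 else -1

def pack_bits(values):
    """Encode a list of +/-1 values as an integer bitmask: +1 -> bit 1, -1 -> bit 0."""
    mask = 0
    for i, v in enumerate(values):
        if v == 1:
            mask |= 1 << i
    return mask

def xnor_popcount_dot(a_bits, b_bits, n):
    """Dot product of two {-1, +1} vectors of length n, given their bitmasks.

    XNOR yields a 1 bit wherever the signs agree; each agreement contributes
    +1 to the dot product and each disagreement -1, so
    dot = 2 * popcount(agreements) - n.
    """
    agreements = ~(a_bits ^ b_bits) & ((1 << n) - 1)  # XNOR, masked to n bits
    return 2 * bin(agreements).count("1") - n

# Toy usage: binarize two small real-valued vectors and compare the
# bitwise dot product against the plain sum of elementwise products.
a = [0.5, -1.2, 3.0, -0.1]
b = [-0.7, -2.0, 1.1, 0.9]
a_bin = [binarize(v) for v in a]
b_bin = [binarize(v) for v in b]
dot = xnor_popcount_dot(pack_bits(a_bin), pack_bits(b_bin), len(a))
assert dot == sum(x * y for x, y in zip(a_bin, b_bin))
```

On hardware, the packed bitmasks let a single machine word hold 32 or 64 weights, so one XNOR plus one popcount instruction processes that many "multiplications" at once, which is the source of the speed and model-size savings the abstract describes.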
Cite This Article
Simons, T.; Lee, D.-J. A Review of Binarized Neural Networks. Electronics 2019, 8, 661.