Review
Peer-Review Record

A Review of Binarized Neural Networks

by Taylor Simons and Dah-Jye Lee *
Reviewer 1: Anonymous
Reviewer 2: Anonymous
Electronics 2019, 8(6), 661; https://doi.org/10.3390/electronics8060661
Submission received: 14 May 2019 / Revised: 3 June 2019 / Accepted: 5 June 2019 / Published: 12 June 2019

Round 1

Reviewer 1 Report

The article presents a review of Binarized Neural Networks (BNNs), including various BNN architectures, implementations, and applications. The paper is a valuable contribution that summarizes the current state of the art, achievements, and trends in this rapidly developing area of research.

Comments:

Section 2: With regard to the methods to compress CNNs, take note of SqueezeNet.

Section 3: This section is out of place. Move it before Section 2 or to the Appendices.

Section 5: Provide a summary of the reviewed neural networks in the form of a table, with respect to activation function, optimization, and other criteria.

Sections 6 and 7 deal with the same topic of BNN improvements and should be merged into one section. A discussion of learning control for BNNs should also be included; see the following article for reference:

Multi-threaded learning control mechanism for neural networks. Future Generation Computer Systems, 87, 16–34. doi:10.1016/j.future.2018.04.050

Section 8.1: Provide more details about the datasets (number of data samples, number of classes, etc.).

Section 8.2: Illustrate the topologies with figures.

Table 2: The topologies for [44] and [50] are missing; enter "n/k" if they are not known. The same remark applies to Tables 3, 4, and 6.

Table 6: The header of the last column is missing.

Table 9: The entries for the IN dataset are listed in decreasing order of accuracy; they should be in increasing order, consistent with the entries for the other datasets.

Sections 9 and 10: I suggest merging these under the title "Hardware Implementations".

Section 10: A comparison of ASIC implementations in tabular form (similar to Table 9 for the FPGA implementations) is missing.

The robustness of BNNs should also be discussed. Add some discussion of attacks on BNNs. It is known that neural networks may be overly sensitive to "attacks" (tiny adversarial changes in the input), which may be detrimental to their use in safety-critical domains. How is this problem addressed in BNNs?
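
To make the concern concrete, below is a minimal sketch (not drawn from the manuscript) of an FGSM-style perturbation applied to a toy binarized linear classifier; the model, data, and step size eps are all hypothetical and serve only to illustrate how a tiny, gradient-aligned change to the input can move a prediction.

    # Minimal sketch (not from the manuscript): an FGSM-style perturbation on
    # a toy binarized linear classifier. Model, data, and epsilon are all
    # hypothetical; the point is only that a tiny, gradient-aligned change to
    # the input can shift the prediction.
    import numpy as np

    rng = np.random.default_rng(0)
    W_B = np.sign(rng.normal(size=(2, 16)))    # binarized weights, 2 classes

    def softmax(z):
        e = np.exp(z - z.max())
        return e / e.sum()

    x = rng.normal(size=16)                    # toy input
    y_true = 0                                 # assumed true label

    # Gradient of the cross-entropy loss w.r.t. the input.
    p = softmax(W_B @ x)
    grad_x = W_B.T @ (p - np.eye(2)[y_true])

    eps = 0.05                                 # tiny perturbation budget
    x_adv = x + eps * np.sign(grad_x)          # FGSM step

    print("clean prediction:      ", int((W_B @ x).argmax()))
    print("adversarial prediction:", int((W_B @ x_adv).argmax()))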

Conclusions: Be more specific; provide specific numerical results showing how much better BNNs are than traditional neural networks, and for which tasks.

Author Response

See attached PDF file.

Author Response File: Author Response.pdf

Reviewer 2 Report

This paper provides a comprehensive review of binarized neural networks (BNNs), focusing on the deep learning network frameworks of the past decade. It presents a strong background, the original implementation of BNNs, and the important variants of that original implementation. It covers the advantages of BNNs over their counterparts and addresses various issues as well, such as activation, weight updates, execution time, and accuracy. I found a couple of minor typos, but otherwise it looks good for publication.

Eq(1): W_b -> W_B and W_r -> W_R

Line 142: Directly calculating the gradient of the loss w.r.t. the real valued weights is ...
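
For context, and assuming Eq. (1) refers to the deterministic binarization W_B = sign(W_R) that is standard in the BNN literature, the sketch below illustrates that binarization together with the straight-through estimator (STE) that the quoted sentence on line 142 appears to motivate; the function names and shapes are illustrative and not taken from the manuscript.

    # Minimal sketch, assuming Eq. (1) is the usual deterministic binarization
    # W_B = sign(W_R) and that line 142 motivates the straight-through
    # estimator (STE); names and shapes here are illustrative only.
    import numpy as np

    def binarize(W_R):
        # Forward pass: W_B = sign(W_R), mapping zeros to +1 by convention.
        return np.where(W_R >= 0, 1.0, -1.0)

    def ste_grad(grad_W_B, W_R, clip=1.0):
        # Backward pass: sign() has zero gradient almost everywhere, so the
        # STE passes the incoming gradient straight through to W_R and
        # cancels it where |W_R| exceeds the clipping threshold.
        return grad_W_B * (np.abs(W_R) <= clip)

    rng = np.random.default_rng(0)
    W_R = rng.normal(scale=0.1, size=(4, 4))   # real-valued latent weights
    grad_W_B = rng.normal(size=(4, 4))         # pretend gradient w.r.t. W_B

    W_R -= 0.01 * ste_grad(grad_W_B, W_R)      # update real-valued weights
    print(binarize(W_R))                       # re-binarize for inference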

Please also refer to the following papers, which should be included:

Hubara, Itay, et al. "Quantized neural networks: Training neural networks with low precision weights and activations." The Journal of Machine Learning Research 18.1 (2017): 6869-6898.

Li, Fengfu, Bo Zhang, and Bin Liu. "Ternary weight networks." arXiv preprint arXiv:1605.04711 (2016).

Galloway, Angus, Graham W. Taylor, and Medhat Moussa. "Attacking binarized neural networks." arXiv preprint arXiv:1711.00449 (2017).

Khalil, Elias B., Amrita Gupta, and Bistra Dilkina. "Combinatorial attacks on binarized neural networks." arXiv preprint arXiv:1810.03538 (2018).

Author Response

See attached PDF file.

Author Response File: Author Response.pdf
