Article
Peer-Review Record

Towards Low-Power Machine Learning Architectures Inspired by Brain Neuromodulatory Signalling

J. Low Power Electron. Appl. 2022, 12(4), 59; https://doi.org/10.3390/jlpea12040059
by Taylor Barton 1,†, Hao Yu 2,†, Kyle Rogers 2,*,†, Nancy Fulda 2, Shiuh-hua Wood Chiang 1, Jordan Yorgason 3 and Karl F. Warnick 1
Submission received: 30 September 2022 / Revised: 22 October 2022 / Accepted: 1 November 2022 / Published: 4 November 2022

Round 1

Reviewer 1 Report

This paper presents a novel task transfer algorithm, termed neuromodulatory tuning, for machine learning based on biologically inspired principles. On image recognition tasks, neuromodulatory tuning requires four orders of magnitude fewer active training parameters than traditional fine-tuning methods. In addition, the authors present a circuit design of neurons that implements neuromodulatory tuning, a potential layout for the use of such neurons on an analog chip, and post-layout validation of its capabilities.
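For context, the parameter savings described above can be illustrated with a few lines of PyTorch: freeze a pretrained backbone and update only the bias terms. The model below is a hypothetical stand-in, not the authors' architecture or their spiking implementation; the sketch only shows why the active-parameter count drops so sharply under bias-only tuning.

import torch
import torch.nn as nn

# Hypothetical pretrained backbone stand-in (illustrative only).
model = nn.Sequential(
    nn.Linear(784, 512), nn.ReLU(),
    nn.Linear(512, 512), nn.ReLU(),
    nn.Linear(512, 10),
)

# Freeze everything except the bias terms.
for name, param in model.named_parameters():
    param.requires_grad = name.endswith("bias")

active = sum(p.numel() for p in model.parameters() if p.requires_grad)
total = sum(p.numel() for p in model.parameters())
print(f"trainable: {active:,} of {total:,} parameters")

# The optimizer sees only the bias parameters.
optimizer = torch.optim.Adam(
    [p for p in model.parameters() if p.requires_grad], lr=1e-3
)

Even for this small stand-in network, the trainable count falls by roughly three orders of magnitude; for larger backbones the gap widens toward the four orders of magnitude reported in the paper.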

1. Some conclusions need to be supported by experimental data. Section 3.2 proposes that, in spiking networks, directly modifying bias weights works well. This conclusion needs to be supported by experimental results.

2. I think keeping the abbreviations consistent would greatly improve the readability of the article. For example, the abbreviations in Table 2, Table 5, and Table 6 should be kept as consistent as possible.

3. In the experiments, the article considers only the influence of different learning rates on the results. I think the batch size also needs attention when adjusting the learning rate, since the two hyperparameters interact (see the sketch below).
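One common heuristic for coupling the two hyperparameters is the linear scaling rule, sketched below with hypothetical base values. It is a starting point for a joint sweep, not a rule the paper itself prescribes.

# Linear scaling rule (heuristic): when the batch size grows by a factor k,
# scale the learning rate by roughly k as an initial guess, then fine-tune.
base_lr, base_batch = 1e-3, 32   # hypothetical baseline hyperparameters

for batch in [32, 64, 128, 256]:
    lr = base_lr * batch / base_batch
    print(f"batch={batch:3d} -> suggested starting lr={lr:.1e}")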

Author Response

Please see the attachment.

Author Response File: Author Response.pdf

Reviewer 2 Report

In this work, the authors present a novel structure for analog computing with training. Although it is interesting work, I suggest that the authors compare and benchmark the chip's performance against related chips worldwide, for example in terms of power-computing efficiency (TOPS/W). That is the key factor for evaluating whether the system is workable at the system level.
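For reference, power-computing efficiency is the ratio of throughput to power draw. The numbers below are made up purely to show the arithmetic; they are not measurements from this or any other chip.

# Illustrative TOPS/W arithmetic with hypothetical numbers:
# efficiency (TOPS/W) = throughput (TOPS) / power (W).
throughput_tops = 0.5   # hypothetical: 0.5 tera-operations per second
power_watts = 0.025     # hypothetical: 25 mW total chip power
efficiency_tops_per_watt = throughput_tops / power_watts
print(f"{efficiency_tops_per_watt:.1f} TOPS/W")  # 20.0 TOPS/W for these values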

The reference format also needs attention.

Author Response

Please see the attachment.

Author Response File: Author Response.pdf

Round 2

Reviewer 1 Report

The authors have answered all my questions and the manuscript can be published now.

Reviewer 2 Report

The authors have addressed my questions well in the revised version. Therefore, I suggest accepting this work for publication.
