Open Access Article

Stochastic Selection of Activation Layers for Convolutional Neural Networks

Department of Information Engineering, University of Padua, viale Gradenigo 6, 35131 Padua, Italy
DISI, Università di Bologna, Via dell’università 50, 47521 Cesena, Italy
Author to whom correspondence should be addressed.
Sensors 2020, 20(6), 1626;
Received: 14 February 2020 / Revised: 11 March 2020 / Accepted: 12 March 2020 / Published: 14 March 2020
(This article belongs to the Special Issue Machine Learning for Biomedical Imaging and Sensing)
In recent years, deep learning has achieved considerable success in pattern recognition, image segmentation, and many other classification fields, with many studies and practical applications in image, video, and text classification. Activation functions play a crucial role in the discriminative capabilities of deep neural networks, and the design of new “static” or “dynamic” activation functions is an active area of research. The main difference between the two classes is that “static” activations treat all neurons and layers as identical, while “dynamic” activations learn the parameters of the activation function independently for each layer or even each neuron. Although “dynamic” activation functions perform better in some applications, the increased number of trainable parameters requires more computational time and can lead to overfitting. In this work, we propose a mixture of “static” and “dynamic” activation functions that are stochastically selected at each layer. Our model design is based on changing some layers along the lines of the functional blocks of the best-performing CNN models, with the aim of designing new models to be used as stand-alone networks or as components of an ensemble. We propose to replace each activation layer of a CNN (usually a ReLU layer) with a different activation function stochastically drawn from a set of candidates: in this way, the resulting CNN has a different set of activation function layers.
Keywords: Convolutional Neural Networks; ensemble of classifiers; activation functions; image classification; skin detection
MDPI and ACS Style

Nanni, L.; Lumini, A.; Ghidoni, S.; Maguolo, G. Stochastic Selection of Activation Layers for Convolutional Neural Networks. Sensors 2020, 20, 1626.

Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.

