Open Access Article
Entropy 2018, 20(4), 249;

Simulation Study on the Application of the Generalized Entropy Concept in Artificial Neural Networks

Department of Informatics, Faculty of Applied Informatics and Mathematics, Warsaw University of Life Sciences-SGGW, Nowoursynowska 159, 02-787 Warsaw, Poland
Author to whom correspondence should be addressed.
Received: 25 January 2018 / Revised: 23 March 2018 / Accepted: 30 March 2018 / Published: 3 April 2018


Artificial neural networks are currently among the most commonly used classifiers, and over recent years they have been successfully applied in many practical domains, including banking and finance, health and medicine, and engineering and manufacturing. A large number of error functions have been proposed in the literature to achieve better predictive power. However, only a few works employ Tsallis statistics, although the method itself has been successfully applied in other machine learning techniques. This paper examines the q-generalized function based on Tsallis statistics as an alternative error measure in neural networks. In order to validate different performance aspects of the proposed function and to identify its strengths and weaknesses, an extensive simulation was prepared based on an artificial benchmarking dataset. The results indicate that the Tsallis entropy error function can be successfully introduced in neural networks, yielding satisfactory results and handling class imbalance, noisy data, and non-informative predictors.
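The paper's exact formulation of the error measure is not reproduced in the abstract; as a rough illustration of the idea, the sketch below builds a q-generalized cross-entropy for binary targets from the Tsallis q-logarithm, which reduces to the ordinary log-loss as q approaches 1. The function names and the binary-classification setting are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def log_q(x, q):
    """Tsallis q-logarithm: ln_q(x) = (x^(1-q) - 1) / (1 - q).
    Recovers the natural logarithm in the limit q -> 1."""
    if abs(q - 1.0) < 1e-12:
        return np.log(x)
    return (np.power(x, 1.0 - q) - 1.0) / (1.0 - q)

def tsallis_cross_entropy(y_true, y_pred, q=1.5, eps=1e-12):
    """q-generalized cross-entropy for binary targets (illustrative sketch).
    For q -> 1 this reduces to the standard cross-entropy error."""
    # Clip predictions away from 0 and 1 to keep the q-logarithm finite.
    y_pred = np.clip(y_pred, eps, 1.0 - eps)
    return -np.mean(y_true * log_q(y_pred, q)
                    + (1.0 - y_true) * log_q(1.0 - y_pred, q))
```

Varying the entropic index q changes how strongly confident mistakes are penalized, which is one plausible mechanism behind the robustness to class imbalance and noise reported in the study.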
Keywords: artificial neural network; simulation study; generalized entropy


This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

MDPI and ACS Style

Gajowniczek, K.; Orłowski, A.; Ząbkowski, T. Simulation Study on the Application of the Generalized Entropy Concept in Artificial Neural Networks. Entropy 2018, 20, 249.


Entropy EISSN 1099-4300, published by MDPI AG, Basel, Switzerland.