Mathematics Behind Machine Learning

A special issue of Axioms (ISSN 2075-1680).

Deadline for manuscript submissions: closed (20 February 2022) | Viewed by 9752

Special Issue Editor


Dr. Juan Gabriel Avina-Cervantes
Guest Editor
Engineering Division of the Campus Irapuato-Salamanca, University of Guanajuato, Salamanca 36885, Mexico
Interests: computer vision; pattern recognition; optimization methods; automatic control; machine and deep learning
Special Issues, Collections and Topics in MDPI journals

Special Issue Information

Dear Colleagues,

Machine learning processes have been extensively used to accurately determine the hyperparameters of many optimization processes and to estimate the parameters of complex mathematical models representing real processes. The mathematical modeling of biological, chemical, medical, electrical, and electronic phenomena helps to interpret the available data and make optimal decisions. Many mathematical foundations used in machine learning are fundamental and are sometimes prioritized over heuristic or metaheuristic schemes when the problem's formal conditions are available. This Special Issue on "Mathematics behind Machine Learning" focuses on important contributions to mathematical modeling, parameter estimation, and optimization processes using deterministic approaches or probabilistic decision schemes related to machine learning methods, with potential applications in areas such as microgrids, biomedical image analysis, signal and image processing, sensors and measurements, robotics, controller design, and renewable energy sources, among others.

Dr. Juan Gabriel Avina-Cervantes
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Axioms is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2400 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • Machine learning
  • Optimization models
  • Deterministic models
  • Exhaustive search
  • Pattern recognition

Published Papers (2 papers)


Research

18 pages, 306 KiB  
Article
Mathematical Neural Networks
by Julia García Cabello
Axioms 2022, 11(2), 80; https://doi.org/10.3390/axioms11020080 - 17 Feb 2022
Cited by 3 | Viewed by 4231
Abstract
ANNs succeed in several tasks for real scenarios due to their high learning abilities. This paper focuses on theoretical aspects of ANNs to enhance the capacity of implementing those modifications that make ANNs absorb the defining features of each scenario. This work may also be encompassed within the trend devoted to providing mathematical explanations of ANN performance, with special attention to activation functions. The base algorithm has been mathematically decoded to analyse the required features of activation functions regarding their impact on the training process and on the applicability of the Universal Approximation Theorem. In particular, significant new results are presented here to identify those activation functions that are free from some usual failings (i.e., gradient-preserving functions). This is the first paper—to the best of the author's knowledge—that stresses the role of injectivity for activation functions, which has received scant attention in the literature but has a great impact on ANN performance. Along these lines, a characterization of injective activation functions has been provided, related to monotonic functions which satisfy the classical contractive condition as a particular case of Lipschitz functions. A summary table is also provided, targeted at documenting how to select the best activation function for each situation. Full article
(This article belongs to the Special Issue Mathematics Behind Machine Learning)
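The injectivity and contractive (Lipschitz) conditions highlighted in the abstract can be probed numerically. The sketch below is illustrative only and does not reproduce the paper's constructions; the function names and the grid-based checks are assumptions of this example. A strictly monotonic function is injective, and a finite-difference slope bounded below 1 indicates the classical contractive condition.

```python
import numpy as np

def is_injective_on_grid(f, lo=-5.0, hi=5.0, n=10001):
    """Probe injectivity numerically: a strictly monotonic function
    has strictly increasing (or strictly decreasing) outputs on a fine grid."""
    y = f(np.linspace(lo, hi, n))
    d = np.diff(y)
    return bool(np.all(d > 0) or np.all(d < 0))

def lipschitz_estimate(f, lo=-5.0, hi=5.0, n=10001):
    """Estimate the Lipschitz constant on [lo, hi] via finite differences;
    a value strictly below 1 signals the classical contractive condition."""
    x = np.linspace(lo, hi, n)
    y = f(x)
    return float(np.max(np.abs(np.diff(y) / np.diff(x))))

relu = lambda x: np.maximum(x, 0.0)
sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))

print(is_injective_on_grid(np.tanh))   # strictly increasing -> injective
print(is_injective_on_grid(relu))      # flat for x < 0 -> not injective
print(lipschitz_estimate(sigmoid))     # max slope is 0.25 < 1: contractive
```

On this reading, the sigmoid is both injective and contractive, while ReLU fails injectivity precisely because it collapses the whole negative half-line to zero, which is one way such a characterization can guide activation-function choice.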

27 pages, 25974 KiB  
Article
Image Encryption and Decryption System through a Hybrid Approach Using the Jigsaw Transform and Langton’s Ant Applied to Retinal Fundus Images
by Andrés Romero-Arellano, Ernesto Moya-Albor, Jorge Brieva, Ivan Cruz-Aceves, Juan Gabriel Avina-Cervantes, Martha Alicia Hernandez-Gonzalez and Luis Miguel Lopez-Montero
Axioms 2021, 10(3), 215; https://doi.org/10.3390/axioms10030215 - 7 Sep 2021
Cited by 8 | Viewed by 4460
Abstract
In this work, a new medical image encryption/decryption algorithm was proposed. It is based on three main parts: the Jigsaw transform, Langton's ant, and a novel way to add deterministic noise. The Jigsaw transform was used to hide visual information effectively, whereas Langton's ant and the deterministic noise algorithm give a reliable and secure approach. As a case study, the proposal was applied to high-resolution retinal fundus images, where a zero mean square error was obtained between the original and decrypted images. The method's performance has been proven through several testing methods, such as statistical analysis (histograms and correlation distributions), entropy computation, keyspace assessment, robustness to differential attack, and key sensitivity analysis, showing a high security level in each one. In addition, the method was compared against other works, showing competitive performance, notably its large keyspace (>1×10^1,134,190.38). Moreover, the method has demonstrated adequate handling of high-resolution images, obtaining entropy values between 7.999988 and 7.999989, an average Number of Pixel Change Rate (NPCR) of 99.5796%±0.000674, and a mean Uniform Average Change Intensity (UACI) of 33.4469%±0.00229. Finally, a small change in the key does not yield any additional information with which to decrypt the image. Full article
(This article belongs to the Special Issue Mathematics Behind Machine Learning)
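The NPCR and UACI figures reported above follow the standard definitions used in image-encryption evaluation: NPCR is the percentage of pixel positions that differ between two cipher images, and UACI is the mean absolute intensity difference normalised by the dynamic range. A minimal sketch, assuming 8-bit grayscale images as NumPy arrays; the random images below are placeholders for illustration, not the authors' data or code:

```python
import numpy as np

def npcr(c1, c2):
    """Number of Pixel Change Rate: percentage of differing pixels
    between two cipher images (higher indicates better diffusion)."""
    return 100.0 * np.mean(c1 != c2)

def uaci(c1, c2, levels=255):
    """Uniform Average Change Intensity: mean absolute intensity
    difference, normalised by the 8-bit dynamic range."""
    diff = np.abs(c1.astype(np.float64) - c2.astype(np.float64))
    return 100.0 * np.mean(diff / levels)

# Two independent random 8-bit images stand in for cipher images.
rng = np.random.default_rng(0)
c1 = rng.integers(0, 256, size=(256, 256), dtype=np.uint8)
c2 = rng.integers(0, 256, size=(256, 256), dtype=np.uint8)

# Theoretical values for independent uniform images: NPCR ~ 99.61%, UACI ~ 33.46%
print(npcr(c1, c2))
print(uaci(c1, c2))
```

Values close to the theoretical ideals (≈99.61% NPCR and ≈33.46% UACI for 8-bit images), as reported in the abstract, indicate that a one-pixel change in the plaintext diffuses across the whole cipher image.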
