Machine Learning in Data Analytics and Prediction

A special issue of Electronics (ISSN 2079-9292). This special issue belongs to the section "Artificial Intelligence".

Deadline for manuscript submissions: 15 June 2025 | Viewed by 4953

Special Issue Editors


Guest Editor
Computer Science and Communication Research Centre (CIIC), Polytechnic of Leiria, 2411-901 Leiria, Portugal
Interests: big data; machine learning; cybersecurity and privacy; data integration and quality; data analytics; forecasting

Guest Editor
Computer Science Engineering Department at Superior School of Technology and Management, Polytechnic of Leiria, 2411-901 Leiria, Portugal
Interests: cybersecurity; information and networks security; Internet of Things; intrusion detection systems; computer forensics

Guest Editor
Computer Science Engineering Department, Superior School of Technology and Management, Polytechnic of Leiria, 2411-901 Leiria, Portugal
Interests: information and networks security; information security management systems; security incident response systems for Industry 4.0; next generation networks and services; wireless networks

Special Issue Information

Dear Colleagues,

We are pleased to announce a new Special Issue, "Machine Learning in Data Analytics and Prediction". Its aim is to let researchers and practitioners from different research areas share their experience in developing state-of-the-art machine-learning-based analytics and prediction solutions, whether through new methods, novel architectures and systems, or real-world applications that could benefit from such solutions. Researchers are invited to submit works describing innovative methods, algorithms, and platforms covering any facet of machine learning in data analytics and prediction. Application papers detailing industrial implementations, as well as design and deployment experience reports on how machine learning can solve relevant practical problems in analytics and prediction, are also welcome.

We welcome technical, experimental, and methodological manuscripts, as well as contributions to applied data science, addressing advances in machine-learning-based analytics and prediction methods, systems, and applications. Topics of interest include data integration, deep neural networks, explainable AI, computational intelligence, and concept drift management, in fields such as management, business, engineering, computer science, and the physical, social, and life sciences.

Application scenarios of interest include, but are not limited to:

  • Cybersecurity and privacy maintenance.
  • Management and marketing.
  • Economics, finance, and accounting.
  • Life sciences and healthcare.
  • Internet of Things (IoT).
  • Energy and Industry 4.0.
  • Business and societal challenges.
  • Environmental sustainability.

Dr. Rogério Luís De Carvalho Costa
Prof. Dr. Leonel Filipe Simões Santos
Prof. Dr. Carlos Rabadão
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once registered, use the submission form to submit your manuscript. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Electronics is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2400 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • machine learning
  • big data
  • data analytics
  • forecasting
  • knowledge extraction

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • e-Book format: Special Issues with more than 10 articles can be published as dedicated e-books, ensuring wide and rapid dissemination.

Further information on MDPI's Special Issue policies can be found here.

Published Papers (4 papers)


Research

21 pages, 13080 KiB  
Article
Color Normalization Through a Simulated Color Checker Using Generative Adversarial Networks
by Albert Siré Langa, Ramón Reig Bolaño, Sergi Grau Carrión and Ibon Uribe Elorrieta
Electronics 2025, 14(9), 1746; https://doi.org/10.3390/electronics14091746 - 25 Apr 2025
Viewed by 157
Abstract
Digital cameras often struggle to reproduce the true colors perceived by the human eye due to lighting geometry and illuminant color. This research proposes an innovative approach for color normalization in digital photographs. A machine learning algorithm combined with an external physical color checker achieves color normalization. To address the limitations of relying on a physical color checker, our approach employs a generative adversarial network capable of replicating the color normalization process without the need for a physical reference. This network (GAN-CN-CC) incorporates a custom loss function specifically designed to minimize errors in color generation. The proposed algorithm yields the lowest coefficient of variation in the normalized median intensity (NMI), while maintaining a standard deviation comparable to that of conventional methods such as Gray World and Max-RGB. The algorithm eliminates the need for a color checker in color normalization, making it more practical in scenarios where inclusion of the checker is challenging. The proposed method has been fine-tuned and validated, demonstrating high effectiveness and adaptability.
(This article belongs to the Special Issue Machine Learning in Data Analytics and Prediction)
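The abstract above evaluates normalization quality by the coefficient of variation of the normalized median intensity (NMI) across images. The sketch below illustrates that metric under an assumed common NMI definition (median intensity divided by the 95th-percentile intensity); the exact variant and the function names here are illustrative, not taken from the paper.

```python
import numpy as np

def normalized_median_intensity(image):
    """NMI of an RGB image: median pixel intensity divided by the
    95th-percentile intensity (one common definition; the paper's
    variant may differ in detail)."""
    intensity = image.mean(axis=-1)  # average over the color channels
    return np.median(intensity) / np.percentile(intensity, 95)

def nmi_cv(images):
    """Coefficient of variation of NMI across a set of images:
    lower values mean more consistent colors after normalization."""
    nmis = np.array([normalized_median_intensity(im) for im in images])
    return nmis.std() / nmis.mean()
```

A perfectly consistent normalizer would map all images to the same NMI, driving the coefficient of variation to zero.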

25 pages, 2222 KiB  
Article
Multiple Kernel Transfer Learning for Enhancing Network Intrusion Detection in Encrypted and Heterogeneous Network Environments
by Abdelfattah Amamra and Vincent Terrelonge
Electronics 2025, 14(1), 80; https://doi.org/10.3390/electronics14010080 - 27 Dec 2024
Viewed by 762
Abstract
Conventional supervised machine learning is widely used for intrusion detection without packet payload inspection, showing good accuracy in detecting known attacks. However, these methods require large labeled datasets, which are scarce due to privacy concerns, and struggle with generalizing to real-world traffic and adapting to domain shifts. Additionally, they are ineffective against zero-day attacks and need frequent retraining, making them difficult to maintain in dynamic network environments. To overcome the limitations of traditional machine learning methods, we propose novel Deterministic (DetMKTL) and Stochastic Multiple-Kernel Transfer Learning (StoMKTL) algorithms that are based on transfer learning. These algorithms leverage multiple kernel functions to capture complex, non-linear relationships in network traffic, enhancing adaptability and accuracy while reducing dependence on large labeled datasets. The proposed algorithms demonstrated good accuracy, particularly in cross-domain evaluations, achieving accuracy rates exceeding 90%. This highlights the robustness of the models in handling diverse network environments and varying data distributions. Moreover, our models exhibited superior performance in detecting multiple types of cyber attacks, including zero-day threats. Specifically, the detection rates reached up to 87% for known attacks and approximately 75% for unseen attacks or their variants. This emphasizes the ability of our algorithms to generalize well to novel and evolving threat scenarios, which are often overlooked by traditional systems. Additionally, the proposed algorithms performed effectively in encrypted traffic analysis, achieving an accuracy of 86%. This result demonstrates the ability of our models to identify malicious activities within encrypted communications without compromising data privacy.
(This article belongs to the Special Issue Machine Learning in Data Analytics and Prediction)
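The core idea behind the multiple-kernel approach mentioned above, combining several kernel functions so the model can capture non-linear structure at different scales, can be sketched generically. This is a simplified illustration with assumed RBF base kernels and fixed mixture weights, not the authors' DetMKTL/StoMKTL implementation.

```python
import numpy as np

def rbf_kernel(X, Y, gamma):
    # Gaussian (RBF) kernel matrix between the rows of X and Y.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

def combined_kernel(X, Y, gammas, weights):
    # Convex combination of base kernels: multiple-kernel methods learn
    # (deterministically) or sample (stochastically) these weights rather
    # than picking a single bandwidth by hand. Here they are simply given.
    return sum(w * rbf_kernel(X, Y, g) for g, w in zip(gammas, weights))
```

A kernel learner such as an SVM can then consume the combined Gram matrix directly; when the weights sum to one, the combination remains a valid positive semi-definite kernel.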

17 pages, 5689 KiB  
Article
Advanced Predictive Modeling of Tight Gas Production Leveraging Transfer Learning Techniques
by Xianlin Ma, Shilong Chang, Jie Zhan and Long Zhang
Electronics 2024, 13(23), 4750; https://doi.org/10.3390/electronics13234750 - 30 Nov 2024
Viewed by 1182
Abstract
Accurate production forecasting of tight gas reservoirs plays a critical role in effective gas field development and management. Recurrent-based deep learning models typically require extensive historical production data to achieve robust forecasting performance. This paper presents a novel approach that integrates transfer learning with the neural basis expansion analysis time series (N-BEATS) model to forecast gas well production, thereby addressing the limitations of traditional models and reducing the reliance on large historical datasets. The N-BEATS model was pre-trained on the M4 competition dataset, which consists of 100,000 time series spanning multiple domains. Subsequently, the pre-trained model was transferred to forecast the daily production rates of two gas wells over short-term, medium-term, and long-term horizons in the S block of the Sulige gas field, China's largest tight gas field. Comparative analysis demonstrates that the N-BEATS transfer model consistently outperforms the attention-based LSTM (A-LSTM) model, exhibiting greater accuracy across all forecast periods, with root mean square error improvements of 19.5%, 19.8%, and 26.8% for Well A1 over the short-, medium-, and long-term horizons, respectively. The results indicate that the pre-trained N-BEATS model effectively mitigates the data scarcity challenges that hinder the predictive performance of LSTM-based models. This study highlights the potential of the N-BEATS transfer learning framework in the petroleum industry, particularly for production forecasting in tight gas reservoirs with limited historical data.
(This article belongs to the Special Issue Machine Learning in Data Analytics and Prediction)
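The pre-train-then-fine-tune workflow described above can be sketched in miniature with a linear autoregressive model standing in for N-BEATS: fit on a long, data-rich source series, then warm-start training on a short target series. The series, lag, and training parameters below are illustrative assumptions, not the paper's setup.

```python
import numpy as np

def make_ar_dataset(series, lag):
    # Sliding windows: predict the next value from the previous `lag` values.
    X = np.array([series[i:i + lag] for i in range(len(series) - lag)])
    y = series[lag:]
    return X, y

def fit_linear(X, y, w=None, lr=0.01, steps=500):
    # Least-squares autoregressive model trained by gradient descent;
    # passing `w` warm-starts from pre-trained weights (the transfer step).
    if w is None:
        w = np.zeros(X.shape[1])
    for _ in range(steps):
        w = w - lr * X.T @ (X @ w - y) / len(y)
    return w

# "Pre-train" on a long, data-rich source series, then fine-tune the
# resulting weights on a short target series (the data-scarce well).
rng = np.random.default_rng(1)
source = np.sin(np.linspace(0, 40, 400)) + 0.05 * rng.standard_normal(400)
target = np.sin(np.linspace(0, 4, 40)) + 0.05 * rng.standard_normal(40)
Xs, ys = make_ar_dataset(source, lag=5)
Xt, yt = make_ar_dataset(target, lag=5)
w_pretrained = fit_linear(Xs, ys)
w_finetuned = fit_linear(Xt, yt, w=w_pretrained, steps=100)
```

The fine-tuning step starts from the source-trained weights instead of from scratch, which is what lets the target model benefit despite having only a few dozen observations.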

40 pages, 29439 KiB  
Article
A Multivariate Time Series Prediction Method Based on Convolution-Residual Gated Recurrent Neural Network and Double-Layer Attention
by Chuxin Cao, Jianhong Huang, Man Wu, Zhizhe Lin and Yan Sun
Electronics 2024, 13(14), 2834; https://doi.org/10.3390/electronics13142834 - 18 Jul 2024
Cited by 2 | Viewed by 1990
Abstract
In multivariate and multistep time series prediction research, we often face the problems of insufficient spatial feature extraction and insufficient mining of time dependencies in historical series data, which brings great challenges to multivariate time series analysis and prediction. Inspired by the attention mechanism and the residual module, this study proposes a multivariate time series prediction method based on a convolutional-residual gated recurrent hybrid model (CNN-DA-RGRU) with a two-layer attention mechanism to address these two problems. Specifically, the convolution module of the proposed model extracts the relational features among the sequences, and the two-layer attention mechanism assigns higher weights to the relevant variables while suppressing irrelevant features. The residual gated recurrent module extracts the time-varying features of the sequences; its residual block provides direct connections that enhance the expressive power of the model, mitigate gradient explosion and vanishing, and facilitate gradient propagation. Experiments were conducted on two public datasets to determine the model hyperparameters, and ablation experiments were conducted to verify the effectiveness of the model; in comparison with several baseline models, the proposed model achieves good results in multivariate time series forecasting tasks.
(This article belongs to the Special Issue Machine Learning in Data Analytics and Prediction)
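The variable-weighting role of the attention mechanism described above can be sketched as follows: each input series gets a relevance score, and a softmax turns the scores into weights that down-weight irrelevant variables. This is a generic simplification with given (not learned) scores, not the authors' CNN-DA-RGRU implementation.

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax: subtract the max before exponentiating.
    e = np.exp(z - z.max())
    return e / e.sum()

def variable_attention(X, scores):
    """Reweight the columns of X, shape (time, n_variables), by one
    relevance score per variable; higher-scored variables keep more
    of their signal, lower-scored ones are attenuated."""
    w = softmax(np.asarray(scores, dtype=float))
    return X * w, w
```

In the full model these scores would be produced by a learned attention layer and applied at two stages (over variables and over time steps); the reweighted input is then fed to the recurrent module.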
