Search Results (56)

Search Parameters:
Keywords = Matlab App

28 pages, 9652 KB  
Article
A Heritage Information System Based on Point-Clouds: Research and Intervention Analyses Made Accessible
by Paula Redweik, Manuel Sánchez-Fernández, María José Marín-Miranda and José Juan Sanjosé-Blasco
Heritage 2026, 9(2), 77; https://doi.org/10.3390/heritage9020077 - 17 Feb 2026
Abstract
Heritage buildings can now be surveyed in great detail using geospatial techniques such as photogrammetry and TLS to produce dense point-clouds. For the purposes of research and building analyses, data about interventions and other relevant semantic data from the building are available from many sources, though not always in a well-organized way. Allying semantic data to point-clouds requires the elaboration of an ontology and the segmentation and classification of the point-clouds in accordance with that ontology. The present paper deals with an approach to make semantically classified point-clouds accessible to researchers, heritage managers and members of the public who wish to explore the 3D point-cloud data with ease and without the need for geospatial expertise. The app presented here, ‘HISTERIA’ (Heritage Information System Tool to Enable Research and Intervention Analysis), was developed with MATLAB 2023 App Designer, an object-oriented programming environment. HISTERIA has an interface in which users can choose which parts of the heritage building they wish to analyze according to several criteria presented in pre-defined queries. The result of most queries is shown in a point-cloud viewer window inside the app. A point can also be selected in the viewer, and all the values attached to it can be accessed in the different classes. HISTERIA is intended to add value to the exploration of semantic heritage data in 3D in a simplified way.
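A minimal MATLAB sketch of the query-and-display step described above, assuming a per-point class-label vector; the file names, label field, and class code are hypothetical placeholders, and pcread/select/pcshow require the Computer Vision Toolbox:

```matlab
% Hypothetical query: load a classified point cloud, keep only the points
% whose label matches one ontology class, and show the result.
pc = pcread("facade.ply");                 % XYZ (+RGB) survey point cloud
S  = load("facade_labels.mat");            % assumed per-point label vector
labels = S.labels;

queryClass = 3;                            % placeholder class code
subset = select(pc, find(labels == queryClass));

figure;
pcshow(subset);
title("Query result: class " + queryClass);
```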

31 pages, 8764 KB  
Article
Using Deep Learning for Predictive Maintenance: A Study on Exhaust Backpressure and Power Loss
by Soulaimane Idiri, Mohammed Said Boukhryss, Abdellah Azmani, Jabir El Aaraj and Said Amghar
Vehicles 2025, 7(4), 134; https://doi.org/10.3390/vehicles7040134 - 21 Nov 2025
Viewed by 1089
Abstract
This paper details the development of an embedded system for vehicle data acquisition using the On-Board Diagnostics version 2 (OBD2) protocol, with the objective of predicting power loss caused by exhaust gas backpressure (EBP). The system decodes and preprocesses vehicle data for subsequent analysis using predictive artificial intelligence algorithms. The Powertrain Blockset in MATLAB R2023b, along with the pre-built “Compression Ignition Dynamometer Reference Application (CIDynoRefApp)” model, was used to simulate engine behavior and its subsystems. This model facilitated the control of various engine subsystems and enabled simulation of dynamic environmental factors, including wind. Manipulation of the exhaust backpressure orifice revealed a consistent correlation between backpressure and power loss, consistent with theoretical expectations and prior research. For predictive analysis, two deep learning models—Long Short-Term Memory (LSTM) and Gated Recurrent Unit (GRU)—were applied to the generated sensor data. The models were evaluated based on their ability to predict engine states, focusing on prediction accuracy and performance. The results showed that GRU achieved lower Mean Absolute Error (MAE) and Mean Squared Error (MSE), making it the more effective model for power loss prediction in automotive applications. These findings highlight the potential of using synthetic data and deep learning techniques to improve predictive maintenance in the automotive industry.
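For orientation, a hedged Deep Learning Toolbox sketch of a GRU sequence regressor of the kind compared above; layer sizes, feature count, and training options are placeholders rather than the paper's configuration (XTrain/XTest are cell arrays of [numFeatures x T] sequences, YTrain/YTest the matching [1 x T] power-loss targets):

```matlab
numFeatures = 6;                             % assumed OBD2-derived channels
layers = [
    sequenceInputLayer(numFeatures)
    gruLayer(64, OutputMode="sequence")      % GRU variant of the compared pair
    fullyConnectedLayer(1)
    regressionLayer];
opts = trainingOptions("adam", MaxEpochs=30, MiniBatchSize=16, ...
    Shuffle="every-epoch");
net   = trainNetwork(XTrain, YTrain, layers, opts);
YPred = predict(net, XTest);
mae   = mean(abs(YPred{1} - YTest{1}));      % MAE on the first test sequence
```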

28 pages, 1976 KB  
Article
ECG Signal Analysis and Abnormality Detection Application
by Ales Jandera, Yuliia Petryk, Martin Muzelak and Tomas Skovranek
Algorithms 2025, 18(11), 689; https://doi.org/10.3390/a18110689 - 29 Oct 2025
Viewed by 1365
Abstract
The electrocardiogram (ECG) signal carries information crucial for health assessment, but its analysis can be challenging due to noise and signal variability; therefore, automated processing focused on noise removal and detection of key features is necessary. This paper introduces an ECG signal analysis and abnormality detection application developed to process single-lead ECG signals. In this study, the Lobachevsky University database (LUDB) was used as the source of ECG signals, as it includes annotated recordings using a multi-class, multi-label taxonomy that covers several diagnostic categories, each with specific diagnoses that reflect clinical ECG interpretation practices. The main aim of the paper is to provide a tool that efficiently filters noisy ECG data, accurately detects the QRS complex, PQ and QT intervals, calculates heart rate, and compares these values with normal ranges based on age and gender. Additionally, a multi-class, multi-label SVM-based model was developed and integrated into the application for heart abnormality diagnostics, i.e., assigning one or several diagnoses from various diagnostic categories. The MATLAB-based application is capable of processing raw ECG signals, allowing the use of ECG records not only from LUDB but also from other databases.
(This article belongs to the Special Issue Algorithms for Computer Aided Diagnosis: 2nd Edition)
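As a rough illustration of the filtering and QRS-detection stage, a band-pass-plus-peak-picking sketch (Signal Processing Toolbox); the 5-15 Hz band and the thresholds are common textbook choices, not necessarily the application's exact pipeline:

```matlab
fs = 500;                                    % Hz, LUDB sampling rate
[b, a] = butter(3, [5 15]/(fs/2), "bandpass");
ecgF = filtfilt(b, a, ecg);                  % zero-phase noise removal
sq   = ecgF.^2;                              % emphasize QRS energy
[~, qrsLocs] = findpeaks(sq, ...
    MinPeakDistance = round(0.25*fs), ...    % ~250 ms refractory period
    MinPeakHeight   = 0.3*max(sq));
heartRate = 60 ./ (diff(qrsLocs)/fs);        % beat-to-beat HR, bpm
```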

24 pages, 4431 KB  
Article
Fault Classification in Power Transformers Using Dissolved Gas Analysis and Optimized Machine Learning Algorithms
by Vuyani M. N. Dladla and Bonginkosi A. Thango
Machines 2025, 13(8), 742; https://doi.org/10.3390/machines13080742 - 20 Aug 2025
Viewed by 1108
Abstract
Power transformers are critical assets in electrical power systems, yet their fault diagnosis often relies on conventional dissolved gas analysis (DGA) methods such as the Duval Pentagon and Triangle, Key Gas, and Rogers Ratio methods. Even though these methods are commonly used, they present limitations in classification accuracy, concurrent fault identification, and manual sample handling. In this study, a framework of optimized machine learning algorithms that integrates Chi-squared statistical feature selection with Random Search hyperparameter optimization was developed to enhance transformer fault classification accuracy using DGA data, thereby addressing the limitations of conventional methods and improving diagnostic precision. Utilizing the R2024b MATLAB Classification Learner App, seven optimized machine learning algorithms were trained and tested using 282 transformer oil samples with varying DGA gas concentrations obtained from industrial transformers, the IEC TC10 database, and the literature. The optimized and assessed models are Linear Discriminant, Naïve Bayes, Decision Trees, Support Vector Machine, Neural Networks, k-Nearest Neighbor, and the Ensemble Algorithm. Of the proposed models, the best-performing algorithm, Optimized k-Nearest Neighbor, achieved an overall performance accuracy of 92.478%, followed by the Optimized Neural Network at 89.823%. To assess their performance against the conventional methods, the same dataset used for the optimized machine learning algorithms was used to evaluate the Duval Triangle and Duval Pentagon methods using VAISALA DGA software version 1.1.0; the proposed models outperformed the conventional methods, which achieved classification accuracies of only 35.757% and 30.818%, respectively. This study concludes that the application of the proposed optimized machine learning algorithms can enhance the classification accuracy of DGA-based faults in power transformers, supporting more reliable diagnostics and proactive maintenance strategies.
(This article belongs to the Section Electrical Machines and Drives)
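A sketch, under assumed variable names, of the chi-squared ranking plus random-search tuning described above (Statistics and Machine Learning Toolbox); the evaluation budget and feature cutoff are placeholders:

```matlab
% X: gas-concentration matrix (one row per oil sample), Y: fault labels
[rank, scores] = fscchi2(X, Y);              % chi-squared feature ranking
Xsel = X(:, rank(1:5));                      % keep the top-ranked gases

mdl = fitcknn(Xsel, Y, ...
    OptimizeHyperparameters = {'NumNeighbors', 'Distance'}, ...
    HyperparameterOptimizationOptions = struct( ...
        Optimizer = "randomsearch", MaxObjectiveEvaluations = 30));
acc = 1 - loss(mdl, Xsel, Y);                % resubstitution accuracy check
```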

19 pages, 2558 KB  
Article
Development of Patient-Specific Lattice Structured Femoral Stems Based on Finite Element Analysis and Machine Learning
by Rashwan Alkentar, Sándor Manó, Dávid Huri and Tamás Mankovits
Crystals 2025, 15(7), 650; https://doi.org/10.3390/cryst15070650 - 15 Jul 2025
Cited by 1 | Viewed by 1557 | Correction
Abstract
Hip implant optimization is increasingly receiving attention due to developments in manufacturing technology and the growing role of artificial intelligence in current research. This study investigates the development of hip implant stem designs incorporating lattice structures, and the use of the MATLAB Regression Learner app to find the best predictive regression model for calculating the mechanical behavior of the implant’s stem from selected design parameters. Many cases of latticed hip implants (using a 3D lattice infill type) were designed in the ANSYS software and then 3D printed for simulations and lab experiments. A surrogate model of the implant was used in the finite element analysis (FEA) instead of the geometrically latticed model to save computation time. The model was then generalized and used to calculate the mechanical behavior for new hip implant stem variables, and a database was generated for surgeons so they can choose the lattice parameters that yield the desired mechanical behavior. This study shows that neural network algorithms achieved the highest accuracy in predicting the mechanical behavior, exceeding 90%. Patients’ weight and shell thickness were proven to be the factors with the greatest effect on the implant’s mechanical behavior.
(This article belongs to the Special Issue Celebrating the 10th Anniversary of International Crystallography)
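A hedged sketch of fitting a regression neural network to an FEA results table, in the spirit of the Regression Learner workflow above; the file, variable names, and layer sizes are invented placeholders:

```matlab
tbl = readtable("lattice_fea_results.csv");  % design params + FEA response
mdl = fitrnet(tbl, "MaxDeformation", ...     % assumed response variable
    LayerSizes = [16 16], Standardize = true, ...
    CrossVal = "on", KFold = 5);
rmse = sqrt(kfoldLoss(mdl));                 % cross-validated error estimate
```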

25 pages, 6573 KB  
Article
Remote Real-Time Monitoring and Control of Small Wind Turbines Using Open-Source Hardware and Software
by Jesus Clavijo-Camacho, Gabriel Gomez-Ruiz, Reyes Sanchez-Herrera and Nicolas Magro
Appl. Sci. 2025, 15(12), 6887; https://doi.org/10.3390/app15126887 - 18 Jun 2025
Cited by 5 | Viewed by 2865
Abstract
This paper presents a real-time remote-control platform for small wind turbines (SWTs) equipped with a permanent magnet synchronous generator (PMSG). The proposed system integrates a DC–DC boost converter controlled by an Arduino® microcontroller, a Raspberry Pi® hosting a WebSocket server, and a desktop application developed using MATLAB® App Designer (version R2024b). The platform enables seamless remote monitoring and control by allowing upper layers to select the turbine’s operating mode—either Maximum Power Point Tracking (MPPT) or Power Curtailment—based on real-time wind speed data transmitted via the WebSocket protocol. The communication architecture follows the IEC 61400-25 standard for wind power system communication, ensuring reliable and standardized data exchange. Experimental results demonstrate high accuracy in controlling the turbine’s operating points. The platform offers a user-friendly interface for real-time decision-making while ensuring robust and efficient system performance. This study highlights the potential of combining open-source hardware and software technologies to optimize SWT operations and improve their integration into distributed renewable energy systems. The proposed solution addresses the growing demand for cost-effective, flexible, and remote-control technologies in small-scale renewable energy applications.
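The operating-mode decision the upper layer makes can be pictured with a small sketch; the rated wind speed and the bare if/else are illustrative stand-ins, since the actual platform drives the converter over the IEC 61400-25-style WebSocket link:

```matlab
function [mode, pRef] = selectOperatingMode(windSpeed, pRated)
% Choose MPPT below an assumed rated wind speed, curtail above it.
vRated = 11;                         % m/s, hypothetical rated wind speed
if windSpeed < vRated
    mode = "MPPT";                   % track the maximum power point
    pRef = NaN;                      % no explicit power reference needed
else
    mode = "Curtailment";            % cap output at the rated power
    pRef = pRated;
end
end
```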

14 pages, 14349 KB  
Article
A Novel Study for Machine-Learning-Based Ship Energy Demand Forecasting in Container Port
by Alper Seyhan
Sustainability 2025, 17(12), 5612; https://doi.org/10.3390/su17125612 - 18 Jun 2025
Cited by 3 | Viewed by 1698
Abstract
Maritime transportation is crucial for global trade, yet it is a significant source of emissions. This study aims to enhance the operational efficiency and sustainability of container ports by accurately estimating energy needs. Data from 440 ships visiting a container port within a year, including parameters such as main engine (ME) power, auxiliary engine (AE) power, gross registered tonnage (GRT), twenty-foot equivalent unit (TEU), and hoteling time, were analyzed using regression techniques within MATLAB’s Regression Learner App. The model predicted future energy demands with an accuracy of 82%, providing a robust framework for energy management and infrastructure investment. Strategic planning based on these predictions supports sustainability goals and enhances energy supply reliability. The study highlights the dual benefit for port and ship owners of precise energy need assessments, enabling cost-effective energy management. This research offers valuable insights for stakeholders, paving the way for greener and more efficient port operations.
(This article belongs to the Special Issue Sustainable Fuel, Carbon Emission and Sustainable Green Energy)
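A minimal linear-regression sketch over the predictors named above; the table and variable names are placeholders, and the paper's Regression Learner workflow may well have selected a different model family:

```matlab
tbl = readtable("port_calls.csv");   % hypothetical file: 440 ship visits
mdl = fitlm(tbl, "Energy ~ MEPower + AEPower + GRT + TEU + HotelingTime");
disp(mdl.Rsquared.Ordinary)          % compare against the reported 82%
yhat = predict(mdl, tbl(1:5, :));    % predicted demand for the first visits
```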

19 pages, 4646 KB  
Article
Computational Tool for Curve Smoothing Methods Analysis and Surface Plasmon Resonance Biosensor Characterization
by Mariana Rodrigues Villarim, Andréa Willa Rodrigues Villarim, Mario Gazziro, Marco Roberto Cavallari, Diomadson Rodrigues Belfort and Oswaldo Hideo Ando Junior
Inventions 2025, 10(2), 31; https://doi.org/10.3390/inventions10020031 - 18 Apr 2025
Cited by 4 | Viewed by 2226
Abstract
Biosensors based on the surface plasmon resonance (SPR) technique are widely used for analyte detection due to their high selectivity and real-time detection capabilities. However, conventional SPR spectrum analysis can be affected by experimental noise and environmental variations, reducing the accuracy of results. To address these limitations, this study presents the development of an open-source computational tool to optimize SPR biosensor characterization, implemented using MATLAB App Designer (Version R2024b). The tool enables the importation of experimental data, application of different smoothing methods, and integration of traditional and hybrid approaches to enhance accuracy in determining the resonance angle. The proposed tool offers several innovations, such as integration of both traditional and hybrid (angle vs. wavelength) analysis modes, implementation of four advanced curve smoothing techniques, including Gaussian filter, Savitzky–Golay, smoothing splines, and EWMA, as well as a user-friendly graphical interface supporting real-time data visualization, experimental data import, and result export. Unlike conventional approaches, the hybrid framework enables multidimensional optimization of SPR parameters, resulting in greater accuracy and robustness in detecting resonance conditions. Experimental validation demonstrated a marked reduction in spectral noise and improved consistency in resonance angle detection across conditions. The results confirm the effectiveness and practical relevance of the tool, contributing to the advancement of SPR biosensor analysis.
(This article belongs to the Section Inventions and Innovation in Biotechnology and Materials)
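Two of the smoothing methods named above, plus resonance-angle picking, in a short sketch; window lengths and the EWMA factor are placeholders (sgolayfilt needs the Signal Processing Toolbox, smoothdata is base MATLAB):

```matlab
% reflectance: measured SPR curve, theta: matching angle vector
ySG = sgolayfilt(reflectance, 3, 21);            % Savitzky-Golay, order 3
yG  = smoothdata(reflectance, "gaussian", 15);   % Gaussian window
a   = 0.2;                                       % EWMA smoothing factor
yE  = filter(a, [1 -(1-a)], reflectance, (1-a)*reflectance(1));

[~, i]   = min(ySG);                             % SPR dip = reflectance minimum
thetaRes = theta(i);                             % estimated resonance angle
```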

12 pages, 10206 KB  
Proceeding Paper
Portable Biomedical System for Acquisition, Display and Analysis of Cardiac Signals (SCG, ECG, ICG and PPG)
by Valery Sofía Zúñiga Gómez, Adonis José Pabuena García, Breiner David Solorzano Ramos, Saúl Antonio Pérez Pérez, Jean Pierre Coll Velásquez, Pablo Daniel Bonaveri and Carlos Gabriel Díaz Sáenz
Eng. Proc. 2025, 83(1), 19; https://doi.org/10.3390/engproc2025083019 - 23 Jan 2025
Viewed by 2166
Abstract
This study introduces a mechatronic biomedical device engineered for concurrent acquisition and analysis of four cardiac non-invasive signals: Electrocardiogram (ECG), Phonocardiogram (PCG), Impedance Cardiogram (ICG), and Photoplethysmogram (PPG). The system enables assessment of individual and simultaneous waveforms, allowing for detailed scrutiny of cardiac electrical and mechanical dynamics, encompassing heart rate variability, systolic time intervals, pre-ejection period (PEP), and aortic valve opening and closing timings (ET) through an application programmed with MATLAB App Designer, which applies derivative filters, smoothing, and FIR digital filters and evaluates the delay of each one, allowing the synchronization of all signals. These metrics are indispensable for deriving critical hemodynamic indices such as Stroke Volume (SV) and Cardiac Output (CO), paramount in the diagnostic armamentarium against cardiovascular pathologies. The device integrates an assembly of components including five electrodes, operational and instrumental amplifiers, infrared opto-couplers, accelerometers, and advanced filtering subsystems, synergistically tailored for precision and fidelity in signal processing. Rigorous validation utilizing a cohort of healthy subjects and benchmarking against established commercial instrumentation substantiates an accuracy threshold below 4.3% and an Intraclass Correlation Coefficient (ICC) surpassing 0.9, attesting to the instrument’s exceptional reliability and robustness in quantification. These findings underscore the clinical potency and technical prowess of the developed device, empowering healthcare practitioners with an advanced toolset for refined diagnosis and management of cardiovascular disorders.
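The delay bookkeeping mentioned above (evaluating each filter's delay so the channels can be synchronized) reduces, for a linear-phase FIR filter, to shifting by half the filter order; a sketch with placeholder band edges and sampling rate:

```matlab
fs = 1000;                                    % Hz, assumed sampling rate
b  = fir1(100, [0.5 40]/(fs/2), "bandpass");  % linear-phase FIR, order 100
d  = (numel(b) - 1)/2;                        % constant group delay, samples
y  = filter(b, 1, [x; zeros(d, 1)]);          % pad so the tail is preserved
y  = y(d+1:end);                              % remove delay: aligned with x
```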

20 pages, 4809 KB  
Article
Design and Evaluation of Noise Simulation Algorithm Using MATLAB Ray Tracing Engine for Noise Assessment and Prediction
by Precin Kalisalvan, Mohd Sayuti Ab Karim and Siti Nurmaya Musa
Appl. Sci. 2025, 15(3), 1009; https://doi.org/10.3390/app15031009 - 21 Jan 2025
Cited by 1 | Viewed by 2109
Abstract
The Malaysian Department of Occupational Safety and Health (DOSH) reported that noise-induced hearing loss (NIHL) accounted for 92% of occupational diseases in 2019. To address this, accurate risk assessment is crucial. The current noise evaluation methods are complex and time-consuming, relying on manual calculations and field measurements. An easy-to-use, open-source noise simulator that directly compares the output with national standards would help mitigate this issue. This research aims to develop an advanced noise evaluation tool to assess and predict unregulated workplace noise, providing tailored safety recommendations. Using a representative plant layout, the Sound Pressure Level (SPL) is calculated using MATLAB’s ray tracing propagation model. The model simulates all possible transmission paths from the source to the receiver to derive the resultant SPL. A noise simulation application featuring a graphical user interface (GUI) built with MATLAB’s App Designer (version: R2024a) automates these computations. The simulation results are validated against the DOSH’s safety standards in Malaysia. Additional safety metrics, such as the recommended maximum exposure time and the required Noise Reduction Rating (NRR) for hearing protection, are calculated based on the SPLs for hazardous locations. The simulation algorithm’s functionality is validated against manual calculations, with an average deviation of just 3.06 dB, demonstrating the model’s precision. This tool can assess and predict indoor noise levels, provide information on optimal exposure limits, and recommend necessary protective measures, ultimately reducing the risk of NIHL in factory environments. It can potentially optimise plant floor operations for existing and new facilities, ensuring safer shift operations and reducing worker noise hazard exposure.
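The per-receiver SPL follows from an energetic sum over the traced paths; a sketch with placeholder path levels, criterion level, and exchange rate (illustrative values, not the DOSH limit figures):

```matlab
Lpaths = [78.2 71.5 69.9];                 % dB SPL per transmission path
Ltotal = 10*log10(sum(10.^(Lpaths/10)));   % combined SPL at the receiver
Lc = 85;  q = 3;                           % assumed criterion / exchange rate
Tmax = 8 / 2^((Ltotal - Lc)/q);            % recommended max exposure, hours
fprintf("SPL %.1f dB -> max exposure %.2f h\n", Ltotal, Tmax);
```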

15 pages, 1376 KB  
Article
Dynamic Prediction of Physical Exertion: Leveraging AI Models and Wearable Sensor Data During Cycling Exercise
by Aref Smiley and Joseph Finkelstein
Diagnostics 2025, 15(1), 52; https://doi.org/10.3390/diagnostics15010052 - 28 Dec 2024
Cited by 6 | Viewed by 2440
Abstract
Background/Objectives: This study aimed to explore machine learning approaches for predicting physical exertion using physiological signals collected from wearable devices. Methods: Both traditional machine learning and deep learning methods for classification and regression were assessed. The research involved 27 healthy participants engaged in controlled cycling exercises. Physiological data, including ECG, heart rate, oxygen saturation, and pedal speed (RPM), were collected during these sessions, which were divided into eight two-minute segments. Heart rate variability (HRV) was also calculated to serve as a predictive indicator. We employed two feature selection algorithms to identify the most relevant features for model training: Minimum Redundancy Maximum Relevance (MRMR) for both classification and regression, and Univariate Feature Ranking for Classification. A total of 34 traditional models were developed using MATLAB’s Classification Learner App, utilizing 20% of the data for testing. In addition, Long Short-Term Memory (LSTM) networks were trained on the top features selected by the MRMR and Univariate Feature Ranking algorithms to enhance model performance. Finally, the MRMR-selected features were used for regression to train the LSTM model for predicting continuous outcomes. Results: The LSTM model for regression demonstrated robust predictive capabilities, achieving a mean squared error (MSE) of 0.8493 and an R-squared value of 0.7757. The classification models also showed promising results, with the highest testing accuracy reaching 89.2% and an F1 score of 91.7%. Conclusions: These results underscore the effectiveness of combining feature selection algorithms with advanced machine learning (ML) and deep learning techniques for predicting physical exertion levels using wearable sensor data.
(This article belongs to the Special Issue Advances in Artificial Intelligence in Healthcare)
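A sketch of MRMR ranking feeding an LSTM classifier, mirroring the pipeline above; feature counts, layer sizes, class count, and variable names are placeholders (fscmrmr requires R2019b or later):

```matlab
[rank, ~] = fscmrmr(featTbl, "ExertionLevel");   % MRMR feature ranking
topVars   = featTbl.Properties.VariableNames(rank(1:10));

numClasses = 3;                                  % assumed exertion levels
layers = [
    sequenceInputLayer(10)                       % top-10 selected features
    lstmLayer(50, OutputMode="last")
    fullyConnectedLayer(numClasses)
    softmaxLayer
    classificationLayer];
opts = trainingOptions("adam", MaxEpochs=40, ...
    ValidationData={XVal, YVal});
net = trainNetwork(XTrain, YTrain, layers, opts);
```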

19 pages, 5545 KB  
Article
Edge Computing for AI-Based Brain MRI Applications: A Critical Evaluation of Real-Time Classification and Segmentation
by Khuhed Memon, Norashikin Yahya, Mohd Zuki Yusoff, Rabani Remli, Aida-Widure Mustapha Mohd Mustapha, Hilwati Hashim, Syed Saad Azhar Ali and Shahabuddin Siddiqui
Sensors 2024, 24(21), 7091; https://doi.org/10.3390/s24217091 - 4 Nov 2024
Cited by 5 | Viewed by 4751
Abstract
Medical imaging plays a pivotal role in diagnostic medicine with technologies like Magnetic Resonance Imaging (MRI), Computed Tomography (CT), Positron Emission Tomography (PET), and ultrasound scans being widely used to assist radiologists and medical experts in reaching concrete diagnosis. Given the recent massive uplift in the storage and processing capabilities of computers, and the publicly available big data, Artificial Intelligence (AI) has also started contributing to improving diagnostic radiology. Edge computing devices and handheld gadgets can serve as useful tools to process medical data in remote areas with limited network and computational resources. In this research, the capabilities of multiple platforms are evaluated for the real-time deployment of diagnostic tools. MRI classification and segmentation applications developed in previous studies are used for testing the performance using different hardware and software configurations. Cost–benefit analysis is carried out using a workstation with an NVIDIA Graphics Processing Unit (GPU), Jetson Xavier NX, Raspberry Pi 4B, and Android phone, using MATLAB, Python, and Android Studio. The mean computational times for the classification app on the PC, Jetson Xavier NX, and Raspberry Pi are 1.2074, 3.7627, and 3.4747 s, respectively. On the low-cost Android phone, this time is observed to be 0.1068 s using the Dynamic Range Quantized TFLite version of the baseline model, with slight degradation in accuracy. For the segmentation app, the times are 1.8241, 5.2641, 6.2162, and 3.2023 s, respectively, when using JPEG inputs. The Jetson Xavier NX and Android phone stand out as the best platforms due to their compact size, fast inference times, and affordability.
(This article belongs to the Section Biomedical Sensors)
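The mean computational times quoted above come from repeated single-image inference; a generic MATLAB timing harness of that kind might look as follows, with net and img standing in for the trained model and a preprocessed MRI slice:

```matlab
nRuns = 100;
classify(net, img);                  % warm-up run (library/GPU init)
t = zeros(nRuns, 1);
for k = 1:nRuns
    tic;
    label = classify(net, img);      % single-image inference
    t(k) = toc;
end
fprintf("mean inference time: %.4f s (std %.4f s)\n", mean(t), std(t));
```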

9 pages, 2138 KB  
Proceeding Paper
An Intelligent System Approach for Predicting the Risk of Heart Failure
by Imran Raihan Khan Rabbi, Hamza Zouaghi and Wei Peng
Eng. Proc. 2024, 76(1), 23; https://doi.org/10.3390/engproc2024076023 - 18 Oct 2024
Viewed by 1033
Abstract
Heart failure, a chronic and progressive condition in which the heart muscle fails to pump sufficient blood for the body’s needs, leads to complications like irregular heartbeat and organ damage. It is a leading cause of death worldwide, with 17.9 million annual fatalities. Because it is often diagnosed late due to complex, costly screenings, and current treatments are less effective at advanced stages, novel early detection methods are needed. This research develops intelligent systems using a Fuzzy Inference System (FIS) and a Feed Forward Back Propagation Neural Network, focusing on eleven heart-affecting parameters. The study shows that artificial intelligence-based models outperform current medical diagnostics in early heart disease detection. The models were evaluated using 221 datasets. The results demonstrate that the FIS model provides superior performance compared to the ANN model. The developed FIS system’s accuracy, precision, sensitivity, and specificity are 90.50%, 90.91%, 90.50%, and 90.31%, respectively. A graphical user interface (GUI) was developed using the MATLAB App Designer tool to facilitate the system’s practical applicability for end users.
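A toy Mamdani FIS (Fuzzy Logic Toolbox) showing the structure of such a system; the two inputs, membership ranges, and single rule are invented stand-ins for the paper's eleven parameters and full rule base:

```matlab
fis = mamfis(Name="HeartRisk");
fis = addInput(fis, [0 200], Name="restingBP");
fis = addMF(fis, "restingBP", "trapmf", [120 140 200 200], Name="high");
fis = addInput(fis, [100 400], Name="cholesterol");
fis = addMF(fis, "cholesterol", "trapmf", [200 240 400 400], Name="high");
fis = addOutput(fis, [0 1], Name="risk");
fis = addMF(fis, "risk", "trimf", [0.6 0.8 1], Name="high");
fis = addRule(fis, "restingBP==high & cholesterol==high => risk=high");

riskScore = evalfis(fis, [150 260]);   % crisp risk score for one patient
```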

26 pages, 2545 KB  
Article
An Inquiry into the Evolutionary Game among Tripartite Entities and Strategy Selection within the Framework of Personal Information Authorization
by Jie Tang, Zhiyi Peng and Wei Wei
Big Data Cogn. Comput. 2024, 8(8), 90; https://doi.org/10.3390/bdcc8080090 - 8 Aug 2024
Viewed by 1968
Abstract
Mobile applications (Apps) serve as vital conduits for information exchange in the mobile internet era, yet they also engender significant cybersecurity risks due to their real-time handling of vast quantities of data. This manuscript constructs a tripartite evolutionary game model, “users-App providers-government”, to illuminate a pragmatic pathway for orderly information circulation within the App marketplace and sustainable industry development. It then scrutinizes the evolutionary process and emergence conditions of their stabilizing equilibrium strategies and employs simulation analysis via MATLAB. The findings reveal that (1) there exists a high degree of coupling among the strategic selections of the three parties, wherein any alteration in one actor’s decision-making trajectory exerts an impact on the evolutionary course of the remaining two actors. (2) The initial strategies significantly influence the pace of evolutionary progression and its outcome. Broadly speaking, the higher the initial probabilities of users opting for information authorization, App providers adopting compliant data solicitation practices, and the government enforcing stringent oversight, the more facile the attainment of an evolutionarily optimal solution. (3) The strategic preferences of the triadic stakeholders are subject to a composite influence of respective costs, benefits, and losses. Of these, users’ perceived benefits serve as the impetus for their strategic decisions, while privacy concerns act as a deterrent. App providers’ strategy decisions are influenced by a number of important elements, including their corporate reputation and fines levied by the government. Costs associated with government regulations are the main barrier to the adoption of strict supervision practices. Drawing upon these analytical outcomes, we posit several feasible strategies.
(This article belongs to the Special Issue Research on Privacy and Data Security)
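A replicator-dynamics sketch of such a tripartite simulation; the linear payoff differences below are invented placeholders, not the paper's payoff matrices:

```matlab
% p = [x; y; z]: P(user authorizes), P(provider compliant), P(strict oversight)
dx  = @(x,y,z) x.*(1-x).*(2.0*y + 1.0*z - 1.5);   % users
dy  = @(x,y,z) y.*(1-y).*(1.5*x + 2.0*z - 2.0);   % App providers
dz  = @(x,y,z) z.*(1-z).*(1.0 - 1.2*y);           % government
rhs = @(t,p) [dx(p(1),p(2),p(3)); dy(p(1),p(2),p(3)); dz(p(1),p(2),p(3))];

[t, P] = ode45(rhs, [0 50], [0.4; 0.3; 0.5]);     % initial strategy mix
plot(t, P); legend("users", "App providers", "government");
xlabel("time"); ylabel("strategy probability");
```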

19 pages, 6482 KB  
Article
Field-Programmable Gate Array Architecture for the Discrete Orthonormal Stockwell Transform (DOST) Hardware Implementation
by Martin Valtierra-Rodriguez, Jose-Luis Contreras-Hernandez, David Granados-Lieberman, Jesus Rooney Rivera-Guillen, Juan Pablo Amezquita-Sanchez and David Camarena-Martinez
J. Low Power Electron. Appl. 2024, 14(3), 42; https://doi.org/10.3390/jlpea14030042 - 7 Aug 2024
Cited by 2 | Viewed by 2306
Abstract
Time–frequency analysis is critical in studying linear and non-linear signals that exhibit variations across both time and frequency domains. Such analysis not only facilitates the identification of transient events and extraction of key features but also aids in displaying signal properties and pattern recognition. Recently, the Discrete Orthonormal Stockwell Transform (DOST) has become increasingly utilized in many specialized signal processing tasks. Given its growing importance, this work proposes a reconfigurable field-programmable gate array (FPGA) architecture designed to efficiently implement the DOST algorithm on cost-effective FPGA chips. An accompanying MATLAB app enables the automatic configuration of the DOST method for varying sizes (64, 128, 256, 512, and 1024 points). For the implementation, a Cyclone V series FPGA device from Intel Altera, featuring the 5CSEMA5F31C6N chip, is used. To provide a complete hardware solution, the proposed DOST core has been integrated into a hybrid ARM-HPS (Advanced RISC Machine–Hard Processor System) control unit, which allows the control of different peripherals, such as communication protocols and VGA-based displays. Results show that less than 5% of the chip’s resources are occupied, indicating a low-cost architecture that can be easily integrated into other FPGA structures or hardware systems for diverse applications. Moreover, the accuracy of the proposed FPGA-based implementation is underscored by a root mean squared error of 6.0155 × 10⁻³ when compared with results from floating-point processors.
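For reference, a simplified one-sided DOST sketch built from dyadic partitioning of the orthonormal FFT, with an inverse FFT per band; normalization conventions and negative-frequency handling vary between published implementations, so this is illustrative only:

```matlab
function S = dostSketch(x)
% One-sided DOST of a power-of-two-length signal (illustrative sketch).
N = numel(x);
X = fft(x(:)) / sqrt(N);              % orthonormal DFT
S = zeros(N, 1);
S(1) = X(1);                          % DC voice
S(2) = X(2);                          % first voice, bandwidth 1
idx = 3;
for p = 1:log2(N) - 2                 % dyadic bands of width beta = 2^p
    beta = 2^p;
    band = X(beta + 1 : 2*beta);      % frequencies beta .. 2*beta - 1
    S(idx : idx + beta - 1) = ifft(band) * sqrt(beta);
    idx = idx + beta;
end
end
```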