Search Results (730)

Search Parameters:
Journal = ASI

27 pages, 19279 KiB  
Article
Smart Hydroponic Cultivation System for Lettuce (Lactuca sativa L.) Growth Under Different Nutrient Solution Concentrations in a Controlled Environment
by Raul Herrera-Arroyo, Juan Martínez-Nolasco, Enrique Botello-Álvarez, Víctor Sámano-Ortega, Coral Martínez-Nolasco and Cristal Moreno-Aguilera
Appl. Syst. Innov. 2025, 8(4), 110; https://doi.org/10.3390/asi8040110 (registering DOI) - 7 Aug 2025
Abstract
The inclusion of the Internet of Things (IoT) in indoor agricultural systems has become a fundamental tool for improving cultivation by providing key information for decision-making in pursuit of better performance. This article presents the design and implementation of an IoT-based agricultural system installed in a plant growth chamber for hydroponic cultivation under controlled conditions. The growth chamber is equipped with sensors for air temperature, relative humidity (RH), carbon dioxide (CO2) and photosynthetically active photon flux, as well as control mechanisms such as humidifiers, full-spectrum Light Emitting Diode (LED) lamps, a mini-split air conditioner and pumps, plus a Wi-Fi surveillance camera, remote monitoring via a web application and three Nutrient Film Technique (NFT) hydroponic systems with a capacity of ten plants each. An ATmega2560 microcontroller manages the smart system using the MODBUS RS-485 communication protocol. To validate the proper functionality of the proposed system, a case study was conducted using lettuce crops, in which the impact of different nutrient solution concentrations (50%, 75% and 100%) on the phenotypic development and nutritional content of the plants was evaluated. The results obtained from the cultivation experiment, analyzed through analysis of variance (ANOVA), show that the treatment with 75% nutrient concentration provides an appropriate balance between resource use and nutritional quality, without affecting the chlorophyll content. This system represents a scalable and replicable alternative for protected agriculture. Full article
(This article belongs to the Special Issue Smart Sensors and Devices: Recent Advances and Applications Volume II)
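To make the control side of such a chamber concrete, here is a minimal, hypothetical sketch of a threshold-with-hysteresis climate loop. The sensor/actuator functions, setpoints and polling interval are placeholders for illustration, not the paper's Modbus register map or control law.

```python
import random
import time

# Illustrative setpoints for a lettuce growth chamber (assumed values, not the paper's).
TEMP_SETPOINT_C = 22.0   # air temperature target
RH_SETPOINT_PCT = 70.0   # relative humidity target
DEADBAND = 1.0           # hysteresis band to avoid actuator chattering

def read_sensor(name: str) -> float:
    """Stand-in for a Modbus RS-485 register read (hypothetical; simulated here)."""
    return {"air_temperature": random.uniform(20.0, 26.0),
            "relative_humidity": random.uniform(55.0, 80.0)}[name]

def set_actuator(name: str, on: bool) -> None:
    """Stand-in for a Modbus RS-485 coil write (hypothetical; just logs here)."""
    print(f"{name} -> {'ON' if on else 'OFF'}")

def control_step() -> None:
    temp = read_sensor("air_temperature")
    rh = read_sensor("relative_humidity")
    # On/off control with hysteresis for the mini-split air conditioner.
    if temp > TEMP_SETPOINT_C + DEADBAND:
        set_actuator("air_conditioner", True)
    elif temp < TEMP_SETPOINT_C - DEADBAND:
        set_actuator("air_conditioner", False)
    # The humidifier follows the same pattern in the opposite direction.
    if rh < RH_SETPOINT_PCT - DEADBAND:
        set_actuator("humidifier", True)
    elif rh > RH_SETPOINT_PCT + DEADBAND:
        set_actuator("humidifier", False)

if __name__ == "__main__":
    for _ in range(3):   # a few demonstration cycles instead of an endless loop
        control_step()
        time.sleep(1)
```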

20 pages, 1971 KiB  
Article
FFG-YOLO: Improved YOLOv8 for Target Detection of Lightweight Unmanned Aerial Vehicles
by Tongxu Wang, Sizhe Yang, Ming Wan and Yanqiu Liu
Appl. Syst. Innov. 2025, 8(4), 109; https://doi.org/10.3390/asi8040109 - 4 Aug 2025
Viewed by 228
Abstract
Target detection is essential in intelligent transportation and the autonomous control of unmanned aerial vehicles (UAVs), with single-stage detection algorithms widely used due to their speed. However, these algorithms face limitations in detecting small targets, especially in UAV aerial photography, where small targets are often occluded, multi-scale semantic information is easily lost, and there is a trade-off between real-time processing and computational resources. Existing algorithms struggle to effectively extract multi-dimensional features and deep semantic information from images and to balance detection accuracy with model complexity. To address these limitations, we developed FFG-YOLO, a lightweight small-target detection method for UAVs based on YOLOv8. FFG-YOLO incorporates three modules: a feature enhancement block (FEB), a feature concat block (FCB), and a global context awareness block (GCAB). These modules strengthen feature extraction from small targets, resolve semantic bias in multi-scale feature fusion, and help differentiate small targets from complex backgrounds. We also improved the positioning accuracy of small targets using the Wasserstein distance loss function. Experiments showed that FFG-YOLO outperformed other algorithms, including YOLOv8n, in small-target detection while remaining lightweight, meeting the stringent real-time performance and deployment requirements of UAVs. Full article
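The Wasserstein-distance idea for small boxes is often implemented as the normalized Gaussian Wasserstein distance (NWD) from the tiny-object detection literature; the sketch below follows that formulation under assumed settings (including the normalizing constant) and is not necessarily the exact loss used in FFG-YOLO.

```python
import torch

def nwd_loss(pred, target, c: float = 12.8):
    """1 - NWD between boxes given as (cx, cy, w, h) tensors of shape (N, 4).

    Boxes are modeled as 2D Gaussians N((cx, cy), diag(w^2/4, h^2/4)); the squared
    2-Wasserstein distance between two such Gaussians has the closed form
    ||(cx1, cy1, w1/2, h1/2) - (cx2, cy2, w2/2, h2/2)||^2. The constant c is a
    dataset-dependent normalizer (12.8 is an assumed value here).
    """
    p = torch.stack([pred[:, 0], pred[:, 1], pred[:, 2] / 2, pred[:, 3] / 2], dim=1)
    t = torch.stack([target[:, 0], target[:, 1], target[:, 2] / 2, target[:, 3] / 2], dim=1)
    w2 = torch.sum((p - t) ** 2, dim=1)          # squared Wasserstein distance
    nwd = torch.exp(-torch.sqrt(w2 + 1e-7) / c)  # normalized similarity in (0, 1]
    return (1.0 - nwd).mean()

# Example: two predicted boxes vs. their ground truths
pred = torch.tensor([[10.0, 10.0, 4.0, 4.0], [50.0, 40.0, 6.0, 8.0]])
gt = torch.tensor([[11.0, 10.5, 4.0, 4.0], [48.0, 41.0, 6.0, 7.0]])
print(nwd_loss(pred, gt))
```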

25 pages, 2082 KiB  
Article
XTTS-Based Data Augmentation for Profanity Keyword Recognition in Low-Resource Speech Scenarios
by Shin-Chi Lai, Yi-Chang Zhu, Szu-Ting Wang, Yen-Ching Chang, Ying-Hsiu Hung, Jhen-Kai Tang and Wen-Kai Tsai
Appl. Syst. Innov. 2025, 8(4), 108; https://doi.org/10.3390/asi8040108 - 31 Jul 2025
Viewed by 191
Abstract
As voice cloning technology rapidly advances, the risk of personal voices being misused by malicious actors for fraud or other illegal activities has significantly increased, making the collection of speech data increasingly challenging. To address this issue, this study proposes a data augmentation method based on XText-to-Speech (XTTS) synthesis to tackle the challenges of small-sample, multi-class speech recognition, using profanity as a case study to achieve high-accuracy keyword recognition. Two models were therefore evaluated: a CNN model (Proposed-I) and a CNN-Transformer hybrid model (Proposed-II). Proposed-I leverages local feature extraction, improving accuracy on a real human speech (RHS) test set from 55.35% without augmentation to 80.36% with XTTS-enhanced data. Proposed-II integrates CNN’s local feature extraction with Transformer’s long-range dependency modeling, further boosting test set accuracy to 88.90% while reducing the parameter count by approximately 41%, significantly enhancing computational efficiency. Compared to a previously proposed incremental architecture, the Proposed-II model achieves an 8.49% higher accuracy while reducing parameters by about 98.81% and MACs by about 98.97%, demonstrating exceptional resource efficiency. By utilizing XTTS and public corpora to generate a novel keyword speech dataset, this study enhances sample diversity and reduces reliance on large-scale original speech data. Experimental analysis reveals that an optimal synthetic-to-real speech ratio of 1:5 significantly improves the overall system accuracy, effectively addressing data scarcity. Additionally, the Proposed-I and Proposed-II models achieve accuracies of 97.54% and 98.66%, respectively, in distinguishing real from synthetic speech, demonstrating their strong potential for speech security and anti-spoofing applications. Full article
(This article belongs to the Special Issue Advancements in Deep Learning and Its Applications)
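As a rough illustration of the CNN-plus-Transformer idea behind Proposed-II (the layer sizes, pooling and head below are assumptions for a self-contained sketch, not the authors' architecture), a keyword classifier might pair a 1D convolutional front end over mel-spectrogram features with a Transformer encoder:

```python
import torch
import torch.nn as nn

class CNNTransformerKeywordNet(nn.Module):
    """Illustrative hybrid: Conv1d front end + TransformerEncoder + linear classifier."""

    def __init__(self, n_mels=40, n_classes=10, d_model=64, n_heads=4, n_layers=2):
        super().__init__()
        # Local feature extraction over the time axis of a mel-spectrogram.
        self.cnn = nn.Sequential(
            nn.Conv1d(n_mels, d_model, kernel_size=5, padding=2),
            nn.BatchNorm1d(d_model),
            nn.ReLU(),
            nn.MaxPool1d(2),
        )
        # Long-range temporal dependencies via self-attention.
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, dim_feedforward=128, batch_first=True
        )
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=n_layers)
        self.head = nn.Linear(d_model, n_classes)

    def forward(self, x):                 # x: (batch, n_mels, time)
        z = self.cnn(x)                   # (batch, d_model, time/2)
        z = z.transpose(1, 2)             # (batch, time/2, d_model) for batch_first attention
        z = self.encoder(z)
        return self.head(z.mean(dim=1))   # mean-pool over time, then classify

model = CNNTransformerKeywordNet()
logits = model(torch.randn(8, 40, 100))   # batch of 8 dummy spectrograms
print(logits.shape)                       # torch.Size([8, 10])
```
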
20 pages, 1426 KiB  
Article
Hybrid CNN-NLP Model for Detecting LSB Steganography in Digital Images
by Karen Angulo, Danilo Gil, Andrés Yáñez and Helbert Espitia
Appl. Syst. Innov. 2025, 8(4), 107; https://doi.org/10.3390/asi8040107 - 30 Jul 2025
Viewed by 304
Abstract
This paper proposes a hybrid model that combines convolutional neural networks with natural language processing techniques for least significant bit-based steganography detection in grayscale digital images. The proposed approach identifies hidden messages by analyzing subtle alterations in the least significant bits and validates the linguistic coherence of the extracted content using a semantic filter implemented with spaCy. The system is trained and evaluated on datasets ranging from 5000 to 12,500 images per class, consistently using an 80% training and 20% validation partition. As a result, the model achieves a maximum accuracy and precision of 99.96%, outperforming recognized architectures such as Xu-Net, Yedroudj-Net, and SRNet. Unlike traditional methods, the model reduces false positives by discarding statistically suspicious but semantically incoherent outputs, which is essential in forensic contexts. Full article
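For orientation, the payload channel such a detector targets is the least-significant-bit plane of the image. A minimal extraction sketch follows (the CNN classifier and the spaCy semantic filter described in the abstract are not shown); the function name and parameters are illustrative.

```python
import numpy as np

def extract_lsb_text(gray_image: np.ndarray, max_chars: int = 256) -> str:
    """Read the LSB plane of a grayscale image in row-major order and decode it as ASCII."""
    bits = (gray_image.flatten() & 1).astype(np.uint8)     # least significant bits
    usable = bits[: (len(bits) // 8) * 8].reshape(-1, 8)   # group bits into bytes
    byte_vals = np.packbits(usable, axis=1).flatten()      # 8 bits -> 1 byte
    chars = [chr(b) for b in byte_vals[:max_chars] if 32 <= b < 127]
    return "".join(chars)

# With a random "cover" image the recovered string should look like noise;
# a semantically coherent sentence here is the tell-tale sign of an embedded payload.
cover = np.random.randint(0, 256, size=(64, 64), dtype=np.uint8)
print(extract_lsb_text(cover)[:40])
```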

36 pages, 1411 KiB  
Review
A Critical Analysis and Roadmap for the Development of Industry 4-Oriented Facilities for Education, Training, and Research in Academia
by Ziyue Jin, Romeo M. Marian and Javaan S. Chahl
Appl. Syst. Innov. 2025, 8(4), 106; https://doi.org/10.3390/asi8040106 - 29 Jul 2025
Viewed by 537
Abstract
The development of Industry 4-oriented facilities in academia for training and research purposes is playing a significant role in pushing forward the Fourth Industrial Revolution. This study serves academic staff who intend to build their own Industry 4 facilities by helping them better understand the key features, constraints, and opportunities. This paper presents a systematic literature review of 145 peer-reviewed studies published between 2011 and 2023, identified across Scopus, SpringerLink, and Web of Science. As a result, we emphasise the significance of developing Industry 4 learning facilities in academia and outline the main design principles of Industry 4 ecosystems. We also investigate and discuss the key Industry 4-related technologies that have been extensively used and represented in the reviewed literature, and summarise the challenges and roadblocks that current participants are facing. From these insights, we identify research gaps, outline technology mapping and maturity levels, and propose a strategic roadmap for the future implementation of Industry 4 facilities. The results of the research are expected to support current and future participants in increasing their awareness of the significance of such development, clarifying their research scope and objectives, and preparing them to deal with inherent complexity and skills issues. Full article

18 pages, 1072 KiB  
Article
Complexity of Supply Chains Using Shannon Entropy: Strategic Relationship with Competitive Priorities
by Miguel Afonso Sellitto, Ismael Cristofer Baierle and Marta Rinaldi
Appl. Syst. Innov. 2025, 8(4), 105; https://doi.org/10.3390/asi8040105 - 29 Jul 2025
Viewed by 256
Abstract
Entropy is a foundational concept across scientific domains, playing a role in understanding disorder, randomness, and uncertainty within systems. This study applies Shannon's entropy from information theory to evaluate and manage complexity in industrial supply chain (SC) management. The purpose of the study is to propose a quantitative modeling method, employing Shannon's entropy model as a proxy to assess the complexity of SCs. The underlying assumption is that information entropy serves as a proxy for the complexity of the SC. The research method is quantitative modeling, applied to four focal companies from the agrifood and metalworking industries in Southern Brazil. The results showed that companies prioritizing cost and quality exhibit lower complexity compared to those emphasizing flexibility and dependability. Additionally, information flows related to specially engineered products and deliveries show significant differences in average entropies, indicating that organizational complexities vary according to competitive priorities. These findings suggest that a focus on cost and quality in SC management may lead to lower complexity than a focus on flexibility and dependability, influencing strategic decision making in industrial contexts. This research introduces a novel application of information entropy to assess and control complexity within industrial SCs. Future studies can explore and validate these insights, contributing to the evolving field of supply chain management. Full article
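For reference, the Shannon entropy used as the complexity proxy is H = -sum(p_i * log2 p_i) over the probability distribution of states observed in an information flow. A minimal sketch with purely illustrative counts:

```python
import numpy as np

def shannon_entropy(counts) -> float:
    """H = -sum(p_i * log2 p_i) over observed state frequencies, in bits."""
    counts = np.asarray(counts, dtype=float)
    p = counts / counts.sum()
    p = p[p > 0]                      # ignore empty states (0 * log 0 := 0)
    return float(-(p * np.log2(p)).sum())

# Illustrative example: order-status messages for two hypothetical focal companies.
# A flow dominated by one routine state carries less entropy (lower complexity)
# than one spread evenly across expediting, rescheduling and engineering changes.
cost_quality_focus = [90, 5, 3, 2]       # mostly routine confirmations
flexibility_focus = [30, 25, 25, 20]     # information spread across many states
print(shannon_entropy(cost_quality_focus))   # ~0.62 bits
print(shannon_entropy(flexibility_focus))    # ~1.99 bits
```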

23 pages, 3847 KiB  
Article
Optimizing Sentiment Analysis in Multilingual Balanced Datasets: A New Comparative Approach to Enhancing Feature Extraction Performance with ML and DL Classifiers
by Hamza Jakha, Souad El Houssaini, Mohammed-Alamine El Houssaini, Souad Ajjaj and Abdelali Hadir
Appl. Syst. Innov. 2025, 8(4), 104; https://doi.org/10.3390/asi8040104 - 28 Jul 2025
Viewed by 364
Abstract
Social network platforms have a significant impact on the development of companies by influencing clients' behaviors and sentiments, which directly affect corporate reputations. Analyzing this feedback has become an essential component of business intelligence, supporting the improvement of long-term marketing strategies on a larger scale. The implementation of powerful sentiment analysis models requires a comprehensive and in-depth examination of each stage of the process. In this study, we present a new comparative approach for several feature extraction techniques, including TF-IDF, Word2Vec, FastText, and BERT embeddings. These methods are applied to three multilingual datasets collected from hotel review platforms in the tourism sector, in English, French, and Arabic. The datasets were preprocessed through cleaning, normalization, labeling, and balancing before being used to train various machine learning and deep learning algorithms. The effectiveness of each feature extraction method was evaluated using metrics such as accuracy, F1-score, precision, recall, the ROC AUC curve, and a new metric that measures the execution time for generating word representations. Our extensive experiments demonstrate excellent results, achieving accuracy rates of approximately 99% for the English dataset, 94% for the Arabic dataset, and 89% for the French dataset. These findings confirm the strong impact of vectorization techniques on the performance of sentiment analysis models. They also highlight the close relationship between balanced datasets, effective feature extraction methods, and the choice of classification algorithms. This study therefore aims to simplify the selection of feature extraction methods and appropriate classifiers for each language, thereby contributing to advancements in sentiment analysis. Full article
(This article belongs to the Topic Social Sciences and Intelligence Management, 2nd Volume)
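A minimal sketch of one cell of such a comparison, timing TF-IDF feature extraction and scoring a classifier with scikit-learn; the tiny corpus and model choices are illustrative assumptions, and the other embeddings and languages would be plugged into the same loop.

```python
import time
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score, f1_score
from sklearn.model_selection import train_test_split

# Tiny illustrative corpus; the study uses balanced hotel reviews in EN/FR/AR.
texts = ["great room and friendly staff", "dirty bathroom, never again",
         "lovely breakfast, will return", "rude reception and noisy rooms"] * 50
labels = [1, 0, 1, 0] * 50

X_train, X_test, y_train, y_test = train_test_split(
    texts, labels, test_size=0.2, stratify=labels, random_state=42)

t0 = time.perf_counter()
vectorizer = TfidfVectorizer(ngram_range=(1, 2))
Xtr = vectorizer.fit_transform(X_train)          # the feature-extraction step being timed
Xte = vectorizer.transform(X_test)
extraction_time = time.perf_counter() - t0

clf = LogisticRegression(max_iter=1000).fit(Xtr, y_train)
pred = clf.predict(Xte)
print(f"extraction time: {extraction_time:.4f}s, "
      f"accuracy: {accuracy_score(y_test, pred):.3f}, "
      f"F1: {f1_score(y_test, pred):.3f}")
```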

26 pages, 984 KiB  
Article
Assessing and Prioritizing Service Innovation Challenges in UAE Government Entities: A Network-Based Approach for Effective Decision-Making
by Abeer Abuzanjal and Hamdi Bashir
Appl. Syst. Innov. 2025, 8(4), 103; https://doi.org/10.3390/asi8040103 - 28 Jul 2025
Viewed by 388
Abstract
Public service innovation research often focuses on the private or general public sectors, leaving the distinct challenges government entities face unexplored. An empirical study was carried out to bridge this gap using survey results from United Arab Emirates (UAE) government entities. The present study builds on that research by further analyzing the relationships among these challenges through a social network approach, visualizing and analyzing the connections between them by utilizing betweenness centrality and eigenvector centrality as key metrics. Based on this analysis, the challenges were classified into different categories; 8 out of 22 challenges were identified as critical due to their high values in both metrics. Addressing these critical challenges is expected to create a cascading impact, helping to resolve many others. Targeted strategies are proposed, and leveraging open innovation is highlighted as an effective and versatile solution to address and mitigate these challenges. This study is one of the few to adopt a social network analysis perspective to visualize and analyze the relationships among challenges, enabling the identification of critical ones. This research offers novel and valuable insights that could assist decision-makers in UAE government entities, and in countries with similar contexts, with actionable strategies to advance public service innovation. Full article
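A minimal sketch of the centrality computation behind that two-metric screening, using networkx on a hypothetical challenge-relationship graph (the node names, edges and cut-offs are placeholders, not the study's data):

```python
import networkx as nx

# Hypothetical relationships between innovation challenges (C1..C6 are placeholders).
edges = [("C1", "C2"), ("C1", "C3"), ("C2", "C3"), ("C3", "C4"),
         ("C4", "C5"), ("C3", "C5"), ("C5", "C6")]
G = nx.Graph(edges)

betweenness = nx.betweenness_centrality(G)
eigenvector = nx.eigenvector_centrality(G, max_iter=1000)

# Flag a challenge as "critical" when it scores highly on both metrics
# (here: at or above the median of each), mirroring the two-metric screening idea.
b_cut = sorted(betweenness.values())[len(betweenness) // 2]
e_cut = sorted(eigenvector.values())[len(eigenvector) // 2]
critical = [n for n in G if betweenness[n] >= b_cut and eigenvector[n] >= e_cut]
print("critical challenges:", critical)
```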

18 pages, 500 KiB  
Article
Hybrid Model-Based Traffic Network Control Using Population Games
by Sindy Paola Amaya, Pablo Andrés Ñañez, David Alejandro Martínez Vásquez, Juan Manuel Calderón Chávez and Armando Mateus Rojas
Appl. Syst. Innov. 2025, 8(4), 102; https://doi.org/10.3390/asi8040102 - 25 Jul 2025
Viewed by 249
Abstract
Modern traffic management requires sophisticated approaches to address the complexities of urban road networks, which continue to grow due to increasing urbanization and vehicle usage. Traditional methods often fall short in mitigating congestion and optimizing traffic flow, prompting the exploration of innovative traffic control strategies based on advanced theoretical frameworks. In this work, we explore different game theory-based control strategies in an eight-intersection traffic network modeled by means of hybrid systems and graph theory, using a software simulator that combines the multi-modal traffic simulation software VISSIM with MATLAB to integrate traffic network parameters and population game criteria. Across five distinct network scenarios with varying saturation conditions, we explore a fixed-time signaling scheme by means of fictitious play dynamics, as well as adaptive schemes using dynamics such as Smith, replicator, Logit and Brown–Von Neumann–Nash (BNN). Results show better performance for the Smith and replicator dynamics in terms of traffic parameters for both fixed and variable signaling times, with fictitious play performing notably well compared with BNN and Logit. Full article
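As a small illustration of the population-game machinery involved (not the paper's VISSIM-coupled model), replicator dynamics let strategies with above-average payoff gain population share; the payoffs and plans below are hypothetical.

```python
import numpy as np

def replicator_step(x, payoff, dt=0.05):
    """One Euler step of the replicator dynamics dx_i/dt = x_i * (f_i - f_bar)."""
    f_bar = float(np.dot(x, payoff))
    x = x + dt * x * (payoff - f_bar)
    return x / x.sum()   # numerical safeguard: stay on the simplex

# Three candidate signal-timing plans with hypothetical payoffs
# (e.g., negative average vehicle delay in seconds).
payoff = np.array([-42.0, -35.0, -51.0])
x = np.full(3, 1 / 3)                     # initial population shares

for _ in range(300):
    x = replicator_step(x, payoff)
print(np.round(x, 3))   # mass concentrates on the second plan, whose payoff is above average
```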

22 pages, 2652 KiB  
Article
Niching-Driven Divide-and-Conquer Hill Exploration
by Junchen Wang, Changhe Li and Yiya Diao
Appl. Syst. Innov. 2025, 8(4), 101; https://doi.org/10.3390/asi8040101 - 22 Jul 2025
Viewed by 310
Abstract
Optimization problems often feature local optima whose basins of attraction (BoAs) differ significantly in size, making evolutionary computation methods prone to discarding solutions located in less-attractive BoAs and thereby posing challenges to the search for optima in these BoAs. To enhance the ability to find these optima, various niching methods have been proposed to restrict the competition scope of individuals to their specific neighborhoods. However, these methods cannot promote the necessary searches in less-attractive BoAs without simultaneously promoting redundant searches in more-attractive BoAs. To address this issue, we propose a general framework for niching methods named niching-driven divide-and-conquer hill exploration (NDDCHE). By gradually learning BoAs from the search results of a niching method and dividing the problem into subproblems with a much smaller number of optima, NDDCHE aims to bring about a more balanced distribution of searches across the BoAs of the optima found so far, and thus enhance the niching method's ability to find optima in less-attractive BoAs. Through experiments in which niching methods based on different categories of niching techniques are integrated with NDDCHE and tested on problems with significant differences in BoA size, the effectiveness and generalization ability of NDDCHE are demonstrated. Full article
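For background, a typical niching step restricts competition to a neighborhood. A minimal "clearing" sketch, in which only the best individual within each niche radius keeps its fitness, is shown below; the radius, capacity and toy objective are illustrative, not part of NDDCHE.

```python
import numpy as np

def clearing(population, fitness, radius=0.5, capacity=1):
    """Classic clearing niching: within each radius-neighborhood, only the best
    `capacity` individuals keep their fitness; the rest are cleared to -inf."""
    fitness = fitness.astype(float).copy()
    order = np.argsort(-fitness)                  # best individuals first
    for i in order:
        if np.isneginf(fitness[i]):
            continue                              # already cleared by a better neighbor
        dists = np.linalg.norm(population - population[i], axis=1)
        neighbors = np.where((dists <= radius) & np.isfinite(fitness))[0]
        winners = neighbors[np.argsort(-fitness[neighbors])][:capacity]
        losers = np.setdiff1d(neighbors, winners)
        fitness[losers] = -np.inf
    return fitness

rng = np.random.default_rng(0)
pop = rng.uniform(-2, 2, size=(20, 2))            # 20 individuals in 2-D
fit = -np.sum(pop**2, axis=1)                     # toy objective with a peak at the origin
print(np.sum(np.isfinite(clearing(pop, fit))), "niche winners survive clearing")
```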

25 pages, 1344 KiB  
Article
Cloud-Based Data-Driven Framework for Optimizing Operational Efficiency and Sustainability in Tube Manufacturing
by Michael Maiko Matonya and István Budai
Appl. Syst. Innov. 2025, 8(4), 100; https://doi.org/10.3390/asi8040100 - 22 Jul 2025
Viewed by 350
Abstract
Modern manufacturing strives for peak efficiency while facing pressing demands for environmental sustainability. Balancing these often-conflicting objectives represents a fundamental trade-off in modern manufacturing, as traditional methods typically address them in isolation, leading to suboptimal outcomes. Process mining offers operational insights but often lacks dynamic environmental indicators, while standard Life Cycle Assessment (LCA) provides environmental evaluation but uses static data unsuitable for real-time optimization. Frameworks integrating real-time data for dynamic multi-objective optimization are scarce. This study proposes a comprehensive, data-driven, cloud-based framework that overcomes these limitations. It uniquely combines three key components: (1) real-time Process Mining for actual workflows and operational KPIs; (2) dynamic LCA using live sensor data for instance-level environmental impacts (energy, emissions, waste); and (3) Multi-Objective Optimization (NSGA-II) to identify Pareto-optimal solutions balancing efficiency and sustainability. TOPSIS assists decision-making by ranking these solutions. Validated using extensive real-world data from a tube manufacturing facility, covering over 390,000 events, the framework demonstrated significant, quantifiable improvements. The optimization yielded a Pareto front of solutions that surpassed baseline performance (87% efficiency; 2007.5 kg CO2/day). The optimal balanced solution identified by TOPSIS simultaneously increased operational efficiency by 5.1% and reduced carbon emissions by 12.4%. Further analysis quantified the efficiency-sustainability trade-offs, and sensitivity analysis confirmed the framework's adaptability to varying strategic priorities. By moving beyond the limitations of disconnected tools, the validated framework provides a powerful, data-driven approach that enables manufacturers to improve operational efficiency and environmental sustainability in a unified manner, and it is recommended for industrial applications seeking continuous improvement in both economic and environmental performance. Full article
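A minimal sketch of the TOPSIS step that ranks the Pareto-optimal solutions; the numbers and equal weights below are illustrative, not the study's Pareto front.

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """Rank alternatives: rows = solutions, columns = criteria.
    benefit[j] is True when larger values of criterion j are better."""
    M = matrix / np.linalg.norm(matrix, axis=0)          # vector normalization
    V = M * weights                                      # weighted normalized matrix
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    anti = np.where(benefit, V.min(axis=0), V.max(axis=0))
    d_plus = np.linalg.norm(V - ideal, axis=1)
    d_minus = np.linalg.norm(V - anti, axis=1)
    return d_minus / (d_plus + d_minus)                  # closeness coefficient

# Hypothetical Pareto front: [operational efficiency %, kg CO2 per day]
front = np.array([[92.1, 1905.0],
                  [91.2, 1820.0],
                  [88.5, 1710.0]])
weights = np.array([0.5, 0.5])
scores = topsis(front, weights, benefit=np.array([True, False]))
print("best-balanced solution index:", int(np.argmax(scores)), scores.round(3))
```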

32 pages, 1156 KiB  
Article
A Study of the Response Surface Methodology Model with Regression Analysis in Three Fields of Engineering
by Hsuan-Yu Chen and Chiachung Chen
Appl. Syst. Innov. 2025, 8(4), 99; https://doi.org/10.3390/asi8040099 - 21 Jul 2025
Viewed by 399
Abstract
Researchers conduct experiments to discover the factors influencing the experimental subjects, so the experimental design is essential. The response surface methodology (RSM) is a special experimental design used to evaluate the factors that significantly affect a process and to determine the optimal conditions for different factors. The relationship between response values and influencing factors is mainly established using regression analysis techniques. These equations are then used to generate contour and surface response plots to provide researchers with further insights. However, the impact of regression techniques on RSM model building has not been studied in detail. This study uses complete regression techniques to analyze sixteen datasets from the literature on semiconductor manufacturing, steel materials, and nanomaterials. Whether each variable significantly affected the response value was assessed using backward elimination and a t-test. The complete regression techniques used in this study included considering the significant influencing variables of the model, testing for normality and constant variance, using predictive performance criteria, and examining influential data points. The results revealed several problems with model building in RSM studies from these three engineering fields, including the direct use of complete equations without statistical testing, the deletion of variables with p-values above a preset value without further examination, untested non-normality and non-constant variance in the datasets, and the presence of influential data points that were left unexamined. Researchers should strengthen their training in regression techniques to enhance the RSM model-building process. Full article
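A minimal sketch of the kind of workflow the authors recommend: fit the full second-order model, eliminate terms backward by p-value, and then check residual normality. The data, factor names and 0.05 threshold are illustrative assumptions.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from scipy.stats import shapiro

rng = np.random.default_rng(1)
x1, x2 = rng.uniform(-1, 1, 60), rng.uniform(-1, 1, 60)
y = 5 + 2 * x1 - 3 * x2 + 1.5 * x1 * x2 + rng.normal(0, 0.5, 60)   # toy response

# Full second-order RSM model: linear, interaction and quadratic terms.
X = pd.DataFrame({"x1": x1, "x2": x2, "x1:x2": x1 * x2,
                  "x1^2": x1**2, "x2^2": x2**2})
X = sm.add_constant(X)

# Backward elimination: drop the least significant term until all p-values < 0.05.
while True:
    model = sm.OLS(y, X).fit()
    pvals = model.pvalues.drop("const")
    if pvals.max() < 0.05:
        break
    X = X.drop(columns=pvals.idxmax())

print(model.summary().tables[1])
print("Shapiro-Wilk p-value (residual normality):", shapiro(model.resid).pvalue)
```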

25 pages, 4186 KiB  
Review
Total Productive Maintenance and Industry 4.0: A Literature-Based Path Toward a Proposed Standardized Framework
by Zineb Mouhib, Maryam Gallab, Safae Merzouk, Aziz Soulhi and Mario Di Nardo
Appl. Syst. Innov. 2025, 8(4), 98; https://doi.org/10.3390/asi8040098 - 21 Jul 2025
Viewed by 611
Abstract
In the context of Industry 4.0, Total Productive Maintenance (TPM) is undergoing a major shift driven by digital technologies such as the IoT, AI, cloud computing, and Cyber–Physical systems. This study explores how these technologies reshape traditional TPM pillars and practices through a two-phase methodology: bibliometric analysis, which reveals global research trends, key contributors, and emerging themes, and a systematic review, which discusses how core TPM practices are being transformed by advanced technologies. It also identifies key challenges of this transition, including data aggregation, a lack of skills, and resistance. However, despite the growing body of research on digital TPM, a major gap persists: the lack of a standardized model applicable across industries. Existing approaches are often fragmented or too context-specific, limiting scalability. Addressing this gap requires a structured approach that aligns technological advancements with TPM’s foundational principles. Taking a cue from these findings, this article formulates a systematic and scalable framework for TPM 4.0 deployment. The framework is based on four pillars: modular technological architecture, phased deployment, workforce integration, and standardized performance indicators. The ultimate goal is to provide a basis for a universal digital TPM standard that enhances the efficiency, resilience, and efficacy of smart maintenance systems. Full article
(This article belongs to the Section Industrial and Manufacturing Engineering)

27 pages, 2527 KiB  
Review
A Systematic Review of Responsible Artificial Intelligence Principles and Practice
by Lakshitha Gunasekara, Nicole El-Haber, Swati Nagpal, Harsha Moraliyage, Zafar Issadeen, Milos Manic and Daswin De Silva
Appl. Syst. Innov. 2025, 8(4), 97; https://doi.org/10.3390/asi8040097 - 21 Jul 2025
Viewed by 756
Abstract
The accelerated development of Artificial Intelligence (AI) capabilities and systems is driving a paradigm shift in productivity, innovation and growth. Despite this generational opportunity, AI is fraught with significant challenges and risks. To address these challenges, responsible AI has emerged as a modus operandi that ensures protections while not stifling innovations. Responsible AI minimizes risks to people, society, and the environment. However, responsible AI principles and practice are impacted by ‘principle proliferation’ as they are diverse and distributed across the applications, stakeholders, risks, and downstream impact of AI systems. This article presents a systematic review of responsible AI principles and practice with the objectives of discovering the current state, the foundations and the need for responsible AI, followed by the principles of responsible AI, and translation of these principles into the responsible practice of AI. Starting with 22,711 relevant peer-reviewed articles from comprehensive bibliographic databases, the review filters through to 9700 at de-duplication, 5205 at abstract screening, 1230 at semantic screening and 553 at final full-text screening. The analysis of this final corpus is presented as six findings that contribute towards the increased understanding and informed implementation of responsible AI. Full article

26 pages, 3622 KiB  
Article
Shear Strength Prediction for RCDBs Utilizing Data-Driven Machine Learning Approach: Enhanced CatBoost with SHAP and PDPs Analyses
by Imad Shakir Abbood, Noorhazlinda Abd Rahman and Badorul Hisham Abu Bakar
Appl. Syst. Innov. 2025, 8(4), 96; https://doi.org/10.3390/asi8040096 - 10 Jul 2025
Viewed by 412
Abstract
Reinforced concrete deep beams (RCDBs) provide significant strength and serviceability for building structures. However, a simple, general, and universally accepted procedure for predicting their shear strength (SS) has yet to be established. This study proposes a novel data-driven approach to predicting the SS of RCDBs using an enhanced CatBoost (CB) model. For this purpose, a new comprehensive database of RCDBs with shear failure, including 950 experimental specimens, was established and adopted. The model was developed through a customized procedure including feature selection, data preprocessing, hyperparameter tuning, and model evaluation. The CB model was further evaluated against three data-driven models (Random Forest, Extra Trees, and AdaBoost) as well as three prominent mechanics-driven models (ACI 318, CSA A23.3, and EU2). Finally, the SHAP algorithm was employed for interpretation to increase the model's reliability. The results revealed that the CB model yielded superior accuracy and outperformed all other models. In addition, the interpretation results showed similar trends between the CB model and the mechanics-driven models. The geometric dimensions and concrete properties are the most influential input features on the SS, followed by reinforcement properties; the SS can be significantly improved by increasing the beam width and concrete strength and by reducing the shear span-to-depth ratio. Thus, the proposed interpretable data-driven model has high potential as an alternative approach for design practice in structural engineering. Full article
(This article belongs to the Special Issue Recent Developments in Data Science and Knowledge Discovery)
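A minimal sketch of a CatBoost-plus-SHAP pipeline of the kind described above; the feature set, synthetic target and hyperparameters are placeholders standing in for the 950-specimen database and the authors' tuned model.

```python
import numpy as np
import shap
from catboost import CatBoostRegressor
from sklearn.model_selection import train_test_split

# Placeholder features loosely mirroring the abstract: geometry, concrete, reinforcement.
rng = np.random.default_rng(7)
n = 400
X = np.column_stack([
    rng.uniform(150, 500, n),    # beam width b (mm)
    rng.uniform(300, 1200, n),   # effective depth d (mm)
    rng.uniform(0.5, 2.5, n),    # shear span-to-depth ratio a/d
    rng.uniform(20, 80, n),      # concrete strength f'c (MPa)
    rng.uniform(0.5, 3.0, n),    # longitudinal reinforcement ratio (%)
])
# Synthetic shear-strength target for illustration only.
y = 0.8 * X[:, 0] + 0.5 * X[:, 3] * X[:, 1] / 100 - 150 * X[:, 2] + rng.normal(0, 50, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=7)
model = CatBoostRegressor(iterations=500, depth=6, learning_rate=0.05, verbose=0)
model.fit(X_tr, y_tr, eval_set=(X_te, y_te))

# SHAP interpretation: which features drive the predicted shear strength.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X_te)
print("mean |SHAP| per feature:", np.abs(shap_values).mean(axis=0).round(2))
print(f"R^2 on hold-out: {model.score(X_te, y_te):.3f}")
```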
