Search Results (1,145)

Search Parameters:
Keywords = research software engineers

17 pages, 2920 KiB  
Article
Device Reliability Analysis of NNBI Beam Source System Based on Fault Tree
by Qian Cao and Lizhen Liang
Appl. Sci. 2025, 15(15), 8556; https://doi.org/10.3390/app15158556 (registering DOI) - 1 Aug 2025
Abstract
Negative Ion Source Neutral Beam Injection (NNBI), as a critical auxiliary heating system for magnetic confinement fusion devices, directly affects the plasma heating efficiency of tokamak devices through the reliability of its beam source system. The single-shot experiment constitutes a significant experimental program for NNBI. This study addresses the frequent equipment failures encountered by the NNBI beam source system during a cycle of experiments, employing fault tree analysis (FTA) to conduct a systematic reliability assessment. Using the AutoFTA 3.9 software platform, a fault tree model of the beam source system was established, and minimal cut set analysis was performed to identify the system’s weak points. AutoFTA 3.9 was employed for both qualitative analysis and quantitative calculations, yielding the failure probabilities of critical components. Furthermore, the Fussell–Vesely (F-V) importance measure and the mean time between failures (MTBF) were applied to analyze the system. This provides a theoretical basis and practical engineering guidance for enhancing the operational reliability of the NNBI system. The evaluation methodology developed in this study can be extended and applied to the reliability analysis of other high-power particle acceleration systems.
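As an illustration of the quantitative step described above (minimal cut sets, component failure probabilities, F-V importance, MTBF), the sketch below works through the standard formulas on invented basic events and cut sets; the component names, probabilities, and failure rate are placeholders, not values from the paper, which performs these calculations in AutoFTA 3.9.

```python
# Hedged sketch: quantitative fault-tree step on illustrative minimal cut sets.
# Basic-event probabilities and cut sets are made up, not taken from the paper.
basic_events = {          # hypothetical per-shot failure probabilities
    "rf_source": 0.002,
    "hv_supply": 0.004,
    "grid_cooling": 0.001,
    "cs_oven": 0.003,
}
minimal_cut_sets = [      # hypothetical minimal cut sets from qualitative FTA
    {"hv_supply"},
    {"rf_source", "cs_oven"},
    {"grid_cooling"},
]

def cut_set_prob(cs):
    p = 1.0
    for e in cs:
        p *= basic_events[e]
    return p

# Rare-event approximation of the top-event probability.
p_top = sum(cut_set_prob(cs) for cs in minimal_cut_sets)

# Fussell-Vesely importance: share of top-event probability involving event e.
def fv_importance(e):
    return sum(cut_set_prob(cs) for cs in minimal_cut_sets if e in cs) / p_top

for e in basic_events:
    print(f"{e}: FV = {fv_importance(e):.3f}")

# MTBF from a constant failure rate (failures per operating hour), also illustrative.
failure_rate = 1.2e-3
print(f"MTBF ≈ {1.0 / failure_rate:.0f} h")
```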

22 pages, 3483 KiB  
Review
The Paradigm Shift in Scientific Interest on Flood Risk: From Hydraulic Analysis to Integrated Land Use Planning Approaches
by Ángela Franco and Salvador García-Ayllón
Water 2025, 17(15), 2276; https://doi.org/10.3390/w17152276 (registering DOI) - 31 Jul 2025
Abstract
Floods are the natural hazards with the greatest socioeconomic impact worldwide, given that 23% of the global population lives in urban areas at risk of flooding. In this field of research, flood risk has traditionally been analyzed mainly through approaches specific to civil engineering, such as hydraulics and hydrology. In recent years, however, these research patterns appear to be changing, with a growing trend towards methodological approaches oriented to urban planning and land use management. In this context, this study analyzes the evolution of these research patterns in the field by developing a bibliometric meta-analysis of 2694 scientific publications on this topic published in recent decades. Evaluating keyword co-occurrence using VOSviewer version 1.6.20, we analyzed how phenomena such as climate change have modified the way this problem is studied, giving growing weight to integrated approaches that improve territorial planning or implement adaptive strategies, as opposed to the more traditional vision of previous decades, which focused only on the construction of hydraulic infrastructure for flood control.
(This article belongs to the Special Issue Spatial Analysis of Flooding Phenomena: Challenges and Case Studies)
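The bibliometric core of this kind of study is a keyword co-occurrence count, which VOSviewer then clusters and maps. A minimal sketch of that counting step is shown below, on a few invented publication records rather than the 2694 papers analyzed here.

```python
# Hedged sketch: keyword co-occurrence counting of the kind VOSviewer visualizes.
# The publication records below are placeholders, not the papers in the study.
from collections import Counter
from itertools import combinations

records = [
    ["flood risk", "hydraulic modelling", "urban planning"],
    ["flood risk", "climate change", "land use planning"],
    ["climate change", "urban planning", "flood risk"],
]

co_occurrence = Counter()
for keywords in records:
    for a, b in combinations(sorted(set(keywords)), 2):
        co_occurrence[(a, b)] += 1

for pair, count in co_occurrence.most_common(5):
    print(pair, count)
```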

48 pages, 2275 KiB  
Article
Intersectional Software Engineering as a Field
by Alicia Julia Wilson Takaoka, Claudia Maria Cutrupi and Letizia Jaccheri
Software 2025, 4(3), 18; https://doi.org/10.3390/software4030018 - 30 Jul 2025
Abstract
Intersectionality is a concept used to explain the power dynamics and inequalities that some groups experience owing to the interconnection of social differences such as gender, sexual identity, poverty status, race, geographic location, disability, and education. The relation between software engineering, feminism, and intersectionality has been addressed by some studies thus far, but it has never been codified before. In this paper, we employ the commonly used ABC Framework for empirical software engineering to show the contributions of intersectional software engineering (ISE) as a field of software engineering. In addition, we highlight the power dynamics unique to ISE studies and define gender-forward intersectionality as a way to use gender as a starting point to identify and examine inequalities and discrimination. We show that ISE is a field of study in software engineering that uses gender-forward intersectionality to produce knowledge about power dynamics in software engineering in its specific domains and environments. Employing empirical software engineering research strategies, we explain the importance of recognizing and evaluating ISE through four dimensions of dynamics: people, processes, products, and policies. Beginning with a set of 10 seminal papers that enable us to define the initial concepts and the search query, we conduct a systematic mapping study that leads to a dataset of 140 primary papers, of which 15 are chosen as example papers. We apply the principles of ISE to these example papers to show how the field functions. Finally, we conclude the paper by advocating the recognition of ISE as a specialized field of study in software engineering.
(This article belongs to the Special Issue Women’s Special Issue Series: Software)

31 pages, 4576 KiB  
Article
Detection, Isolation, and Identification of Multiplicative Faults in a DC Motor and Amplifier Using Parameter Estimation Techniques
by Sanja Antić, Marko Rosić, Branko Koprivica, Alenka Milovanović and Milentije Luković
Appl. Sci. 2025, 15(15), 8322; https://doi.org/10.3390/app15158322 - 26 Jul 2025
Abstract
The increasing complexity of modern control systems highlights the need for reliable and robust fault detection, isolation, and identification (FDII) methods, particularly in safety-critical and industrial applications. This study focuses on the FDII of multiplicative faults in a DC motor and its electronic amplifier. To simulate such scenarios, a complete laboratory platform was developed for real-time FDII, using relay-based switching and custom software developed in LabVIEW 2009. This platform enables real-time experimentation and represents an important component of the study. Two estimation-based fault detection (FD) algorithms were implemented: the Sliding Window Algorithm (SWA) for discrete-time models and a modified Sliding Integral Algorithm (SIA) for continuous-time models. The modification introduced to the SIA limits the data length used in least squares estimation, thereby reducing the impact of transient effects on parameter accuracy. Both algorithms achieved high agreement between model output and measured signals: up to 98.6% under nominal conditions and above 95% in almost all fault scenarios. Moreover, the proposed fault isolation and identification methods, including a decision algorithm and an indirect estimation approach, successfully isolated and identified faults in key components such as amplifier resistors (R1, R9, R12), a capacitor (C8), and motor parameters, including armature resistance (Ra), inertia (J), and friction coefficient (B). The decision algorithm, based on continuous-time model coefficients, demonstrated reliable fault isolation and identification, while the reduced Jacobian-based approach in the discrete model enhanced fault magnitude estimation, with deviations typically below 10%. Additionally, the platform supports remote experimentation, offering a valuable resource for advancing model-based FDII research and engineering education.
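The SWA described above re-estimates model parameters by least squares over a moving data window so that a sudden parameter change signals a fault. The sketch below illustrates that idea on a simulated first-order discrete-time model; the model structure, window length, and data are assumptions for illustration, not the paper's implementation.

```python
# Hedged sketch of a sliding-window least-squares estimator for a first-order
# discrete-time motor model y[k] = a*y[k-1] + b*u[k-1]; model order, window
# length, and data are illustrative.
import numpy as np

rng = np.random.default_rng(0)
N, a_true, b_true = 400, 0.95, 0.08
u = np.sign(np.sin(0.05 * np.arange(N)))          # excitation signal
y = np.zeros(N)
for k in range(1, N):
    y[k] = a_true * y[k - 1] + b_true * u[k - 1] + 1e-3 * rng.standard_normal()

window = 50
estimates = []
for k in range(window, N):
    Y = y[k - window + 1:k + 1]                    # outputs in the window
    Phi = np.column_stack([y[k - window:k], u[k - window:k]])  # regressors
    theta, *_ = np.linalg.lstsq(Phi, Y, rcond=None)
    estimates.append(theta)                        # [a_hat, b_hat] per window

print("last estimate:", estimates[-1])             # a jump here would flag a fault
```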

16 pages, 944 KiB  
Article
Artificial Intelligence in the Oil and Gas Industry: Applications, Challenges, and Future Directions
by Marcelo dos Santos Póvoas, Jéssica Freire Moreira, Severino Virgínio Martins Neto, Carlos Antonio da Silva Carvalho, Bruno Santos Cezario, André Luís Azevedo Guedes and Gilson Brito Alves Lima
Appl. Sci. 2025, 15(14), 7918; https://doi.org/10.3390/app15147918 - 16 Jul 2025
Abstract
This study aims to provide a comprehensive overview of the application of artificial intelligence (AI) methods to solve real-world problems in the oil and gas sector. The methodology involved a two-step process for analyzing AI applications. In the first step, an initial exploration of scientific articles in the Scopus database was conducted using keywords related to AI and computational intelligence, resulting in a total of 11,296 articles. The bibliometric analysis conducted using VOSviewer version 1.6.15 revealed an average annual growth of approximately 15% in the number of publications related to AI in the sector between 2015 and 2024, indicating the growing importance of this technology. In the second step, the research focused on the OnePetro database, widely used by the oil industry, selecting articles with terms associated with production and drilling, such as “production system”, “hydrate formation”, “machine learning”, “real-time”, and “neural network”. The results highlight the transformative impact of AI on production operations, with key applications including optimizing operations through real-time data analysis, predictive maintenance to anticipate failures, advanced reservoir management through improved modeling, image and video analysis for continuous equipment monitoring, and enhanced safety through immediate risk detection. The bibliometric analysis identified a significant concentration of publications at Society of Petroleum Engineers (SPE) events, which accounted for approximately 40% of the selected articles. Overall, the integration of AI into production operations has driven significant improvements in efficiency and safety, and its continued evolution is expected to advance industry practices further and address emerging challenges.

26 pages, 871 KiB  
Review
Addressing Challenges in Large-Scale Bioprocess Simulations: A Circular Economy Approach Using SuperPro Designer
by Juan Silvestre Aranda-Barradas, Claudia Guerrero-Barajas and Alberto Ordaz
Processes 2025, 13(7), 2259; https://doi.org/10.3390/pr13072259 - 15 Jul 2025
Abstract
Bioprocess simulation is a powerful tool for leveraging circular economy principles in the analysis of large-scale bioprocesses, enhancing decision-making for efficient and sustainable production. By simulating different process scenarios, researchers and engineers can evaluate the techno-economic feasibility of alternative approaches. This enables the identification of cost-effective and sustainable solutions, optimizing resource use and minimizing waste, thereby enhancing the overall efficiency and viability of bioprocesses within a circular economy framework. In this review, we provide an overview of circular economy concepts and trends before discussing design methodologies and challenges in large-scale bioprocesses. The analysis highlights the application and advantages of using process simulators like SuperPro Designer v.14 in bioprocess development. Process design methodologies have evolved to use specialized software that integrates chemical and biochemical processes, physical properties, and economic and environmental considerations. By embracing circular economy principles, these methodologies evaluate projects that transform waste into valuable products, aiming to reduce pollution and resource use, thereby shifting from a linear to a circular economy. In process engineering, exciting perspectives are emerging, particularly in large-scale bioprocess simulations, which are expected to contribute to the improvement of bioprocess technology and computer applications.
(This article belongs to the Special Issue Trends in Biochemical Processing Techniques)

34 pages, 6467 KiB  
Article
Predictive Sinusoidal Modeling of Sedimentation Patterns in Irrigation Channels via Image Analysis
by Holger Manuel Benavides-Muñoz
Water 2025, 17(14), 2109; https://doi.org/10.3390/w17142109 - 15 Jul 2025
Abstract
Sediment accumulation in irrigation channels poses a significant challenge to water resource management, impacting hydraulic efficiency and agricultural sustainability. This study introduces an innovative multidisciplinary framework that integrates advanced image analysis (FIJI/ImageJ 1.54p), statistical validation (RStudio), and vector field modeling with a novel Sinusoidal Morphodynamic Bedload Transport Equation (SMBTE) to predict sediment deposition patterns with high precision. Conducted along the Malacatos River in La Tebaida Linear Park, Loja, Ecuador, the research captured a natural sediment transport event under controlled flow conditions, transitioning from pressurized pipe flow to free-surface flow. Observed sediment deposition reduced the hydraulic cross-section by approximately 5 cm, notably altering flow dynamics and water distribution. The final SMBTE model (Model 8) demonstrated exceptional predictive accuracy, achieving an RMSE of 0.0108, R2 and NSE values of 0.8689, an MAE of 0.0093, and a correlation coefficient exceeding 0.93. Complementary analyses, including heatmaps, histograms, and vector fields, revealed spatial heterogeneity, local gradients, and oscillatory trends in sediment distribution. These tools identified high-concentration sediment zones and quantified variability, providing actionable insights for optimizing canal design, maintenance schedules, and sediment control strategies. By leveraging open-source software and real-world validation, this methodology offers a scalable, replicable framework applicable to diverse water conveyance systems. The study advances understanding of sediment dynamics under subcritical (Fr ≈ 0.07) and turbulent flow conditions (Re ≈ 41,000), contributing to improved irrigation efficiency, system resilience, and sustainable water management. This research establishes a robust foundation for future advancements in sediment transport modeling and hydrological engineering, addressing critical challenges in agricultural water systems.
(This article belongs to the Section Water Erosion and Sediment Transport)
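For readers unfamiliar with the reported goodness-of-fit measures (RMSE, MAE, NSE, and the correlation coefficient), the short sketch below shows how they are computed from paired observed and predicted values; the numbers used are placeholders, not the study's data.

```python
# Hedged sketch: goodness-of-fit metrics of the kind reported for the SMBTE
# model, computed on placeholder observed/predicted values.
import numpy as np

observed = np.array([0.012, 0.030, 0.045, 0.038, 0.021])   # illustrative values
predicted = np.array([0.014, 0.028, 0.047, 0.035, 0.024])

err = predicted - observed
rmse = np.sqrt(np.mean(err ** 2))
mae = np.mean(np.abs(err))
ss_res = np.sum(err ** 2)
ss_tot = np.sum((observed - observed.mean()) ** 2)
nse = 1.0 - ss_res / ss_tot                 # Nash-Sutcliffe efficiency
r = np.corrcoef(observed, predicted)[0, 1]  # correlation coefficient

print(f"RMSE={rmse:.4f} MAE={mae:.4f} NSE={nse:.4f} r={r:.3f}")
```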

24 pages, 3294 KiB  
Review
Trends and Applications of Principal Component Analysis in Forestry Research: A Literature and Bibliometric Review
by Gabriel Murariu, Lucian Dinca and Dan Munteanu
Forests 2025, 16(7), 1155; https://doi.org/10.3390/f16071155 - 13 Jul 2025
Abstract
Principal component analysis (PCA) is a widely applied multivariate statistical technique across scientific disciplines, with forestry being one of its most dynamic areas of use. Its primary strength lies in reducing data dimensionality and classifying parameters within complex ecological datasets. This study provides the first comprehensive bibliometric and literature review focused exclusively on PCA applications in forestry. A total of 96 articles published between 1993 and 2024 were analyzed using the Web of Science database and visualized using VOSviewer software, version 1.6.20. The bibliometric analysis revealed that the most active scientific fields were environmental sciences, forestry, and engineering, and that the journals publishing most frequently on the topic were Forests and Sustainability. Contributions came from 198 authors across 44 countries, with China, Spain, and Brazil identified as leading contributors. PCA has been employed in a wide range of forestry applications, including species classification, biomass modeling, environmental impact assessment, and forest structure analysis. It is increasingly used to support decision-making in forest management, biodiversity conservation, and habitat evaluation. In recent years, emerging research has demonstrated innovative integrations of PCA with advanced technologies such as hyperspectral imaging, LiDAR, unmanned aerial vehicles (UAVs), and remote sensing platforms. These integrations have led to substantial improvements in forest fire detection, disease monitoring, and species discrimination. Furthermore, PCA has been combined with other analytical methods and machine learning models, including Lasso regression, support vector machines, and deep learning algorithms, resulting in enhanced data classification, feature extraction, and ecological modeling accuracy. These hybrid approaches underscore PCA’s adaptability and relevance in addressing contemporary challenges in forestry research. By systematically mapping the evolution, distribution, and methodological innovations associated with PCA, this study fills a critical gap in the literature. It offers a foundational reference for researchers and practitioners, highlighting both current trends and future directions for leveraging PCA in forest science and environmental monitoring.
(This article belongs to the Section Forest Ecology and Management)
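As a minimal illustration of the PCA workflow that recurs across the reviewed applications (standardize the variables, extract components, inspect explained variance and loadings), the sketch below uses scikit-learn on a synthetic plot-level feature matrix; the variable names and data are hypothetical.

```python
# Hedged sketch: PCA-based dimensionality reduction of forest-plot variables,
# in the spirit of the reviewed applications; the data matrix is synthetic.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
# Hypothetical plot-level features: height, DBH, canopy cover, NDVI, slope.
X = rng.normal(size=(120, 5))

X_std = StandardScaler().fit_transform(X)   # standardize before PCA
pca = PCA(n_components=2)
scores = pca.fit_transform(X_std)           # plot coordinates in PC space

print("explained variance ratio:", pca.explained_variance_ratio_)
print("loadings (components):", pca.components_)
```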

19 pages, 26396 KiB  
Article
Development of a Networked Multi-Participant Driving Simulator with Synchronized EEG and Telemetry for Traffic Research
by Poorendra Ramlall, Ethan Jones and Subhradeep Roy
Systems 2025, 13(7), 564; https://doi.org/10.3390/systems13070564 - 10 Jul 2025
Abstract
This paper presents a multi-participant driving simulation framework designed to support traffic experiments involving the simultaneous collection of vehicle telemetry and cognitive data. The system integrates motion-enabled driving cockpits, high-fidelity steering and pedal systems, immersive visual displays (monitor or virtual reality), and the Assetto Corsa simulation engine. To capture cognitive states, dry-electrode EEG headsets are used alongside a custom-built software tool that synchronizes EEG signals with vehicle telemetry across multiple drivers. The primary contribution of this work is the development of a modular, scalable, and customizable experimental platform with robust data synchronization, enabling the coordinated collection of neural and telemetry data in multi-driver scenarios. The synchronization software developed through this study is freely available to the research community. This architecture supports the study of human–human interactions by linking driver actions with corresponding neural activity across a range of driving contexts. It provides researchers with a powerful tool to investigate perception, decision-making, and coordination in dynamic, multi-participant traffic environments.
(This article belongs to the Special Issue Modelling and Simulation of Transportation Systems)
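The paper's synchronization tool is a custom application; purely as an illustration of the underlying idea of aligning asynchronously sampled streams by timestamp, the hedged sketch below attaches the nearest preceding telemetry sample to each EEG sample with pandas.

```python
# Hedged sketch of timestamp-based alignment of EEG samples with vehicle
# telemetry; the data and sampling rates are illustrative, not the paper's.
import pandas as pd

telemetry = pd.DataFrame({
    "t": [0.00, 0.02, 0.04, 0.06],                     # seconds, 50 Hz telemetry
    "speed_kph": [42.1, 42.3, 42.6, 42.8],
})
eeg = pd.DataFrame({
    "t": [0.000, 0.004, 0.008, 0.012, 0.016, 0.020],   # 250 Hz EEG samples
    "cz_uV": [3.1, 2.8, 2.5, 2.9, 3.3, 3.0],
})

# Attach the nearest preceding telemetry sample to each EEG sample.
aligned = pd.merge_asof(eeg.sort_values("t"), telemetry.sort_values("t"),
                        on="t", direction="backward")
print(aligned)
```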

26 pages, 3079 KiB  
Article
Implementing CAD API Automated Processes in Engineering Design: A Case Study Approach
by Konstantinos Sofias, Zoe Kanetaki, Constantinos Stergiou, Antreas Kantaros, Sébastien Jacques and Theodore Ganetsos
Appl. Sci. 2025, 15(14), 7692; https://doi.org/10.3390/app15147692 - 9 Jul 2025
Abstract
Increasing mechanical design complexity and volume, particularly in component-based manufacturing, require scalable, traceable, and efficient design processes. In this research, a modular in-house automation platform using Autodesk Inventor’s Application Programming Interface (API) and Visual Basic for Applications (VBA) is developed to automate recurrent tasks such as CAD file generation, drawing production, structured archiving, and cost estimation. The proposed framework was implemented and tested on three real-world case studies in a turbocharger reconditioning unit with varying degrees of automation. Findings indicate time savings of up to 90% in certain documentation tasks, together with improved consistency and traceability and reduced manual intervention. Moreover, the system facilitated the automatic generation of metadata-rich Excel and Word documents, allowing centralized documentation and access to data. In comparison with commercial automation software, the solution is flexible, cost-effective, and responsive to project changes, and is thus suitable for small and medium-sized enterprises. Although automation reduced the workload and made the system more reliable, some limitations remain, especially in fully removing the need for engineering judgment in complex design scenarios. Overall, this study investigates how API-based automation can significantly increase productivity and data integrity in CAD-intensive environments and explores future integration opportunities using AI and other CAD software.
(This article belongs to the Section Mechanical Engineering)
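The platform itself is built on the Inventor API and VBA; purely as a language-neutral illustration of the "metadata-rich Excel" output step it describes, the sketch below writes a hypothetical part register with openpyxl. The part records and column names are invented.

```python
# Hedged sketch: batch export of part metadata to Excel, standing in for the
# Inventor-API/VBA documentation step described in the paper.
from openpyxl import Workbook

parts = [  # hypothetical records a CAD traversal might collect
    {"part_no": "TC-001", "description": "Compressor housing", "material": "A356", "mass_kg": 2.41},
    {"part_no": "TC-014", "description": "Shaft", "material": "42CrMo4", "mass_kg": 0.87},
]

wb = Workbook()
ws = wb.active
ws.title = "Part register"
ws.append(["Part no.", "Description", "Material", "Mass [kg]"])
for p in parts:
    ws.append([p["part_no"], p["description"], p["material"], p["mass_kg"]])

wb.save("part_register.xlsx")
```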

16 pages, 3150 KiB  
Article
Predictive ANN Modeling and Optimization of Injection Molding Parameters to Minimize Warpage in Polypropylene Rectangular Parts
by Juan Luis Gámez, Amparo Jordá-Vilaplana, Miguel Angel Peydro, Miguel Angel Selles and Samuel Sanchez-Caballero
J. Manuf. Mater. Process. 2025, 9(7), 236; https://doi.org/10.3390/jmmp9070236 - 9 Jul 2025
Abstract
Injection molding is a fundamental process for transforming plastics into various industrial components. Among the critical aspects studied in this process, volumetric contraction and warpage of plastic parts are of particular importance. Achieving precise control over warpage is crucial for ensuring the production of high-quality components. This research explores optimizing injection process parameters to minimize volumetric contraction and warpage in rectangular polypropylene (PP) parts. The study employs experimental analysis, MoldFlow simulation, and Artificial Neural Network (ANN) modeling. MoldFlow simulation software provides valuable data on warpage, serving as input for the ANN model. Based on the Backpropagation Neural Network algorithm, the optimized ANN model accurately predicts warpage by considering factors such as part thickness, flow path distance, and flow path tangent. The study highlights the importance of accurately setting injection parameters to achieve optimal warpage results. The BPNN-based approach offers a faster and more efficient alternative to computer-aided engineering (CAE) processes for studying warpage.
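A minimal sketch of the modeling idea, a backpropagation-trained network mapping part thickness, flow-path distance, and flow-path tangent to warpage, is given below using scikit-learn's MLPRegressor; the training data is synthetic, standing in for the MoldFlow results used in the study.

```python
# Hedged sketch: a small backpropagation-trained regressor for warpage using
# the three inputs named in the abstract; data and response are synthetic.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)
# Columns: thickness [mm], flow-path distance [mm], flow-path tangent [-].
X = rng.uniform([1.0, 10.0, 0.1], [4.0, 200.0, 1.5], size=(300, 3))
# Synthetic warpage response with noise, purely for illustration.
y = 0.02 * X[:, 1] / X[:, 0] + 0.5 * X[:, 2] + rng.normal(0, 0.05, 300)

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(8, 8), max_iter=5000,
                                   random_state=0))
model.fit(X, y)
print("predicted warpage [mm]:", model.predict([[2.0, 120.0, 0.8]]))
```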

34 pages, 3185 KiB  
Article
A Student-Centric Evaluation Survey to Explore the Impact of LLMs on UML Modeling
by Bilal Al-Ahmad, Anas Alsobeh, Omar Meqdadi and Nazimuddin Shaikh
Information 2025, 16(7), 565; https://doi.org/10.3390/info16070565 - 1 Jul 2025
Abstract
Unified Modeling Language (UML) diagrams serve as essential tools for visualizing system structure and behavior in software design. With the emergence of Large Language Models (LLMs) that automate various phases of software development, there is growing interest in leveraging these models for UML diagram generation. This study presents a comprehensive empirical investigation into the effectiveness of GPT-4-turbo in generating four fundamental UML diagram types: Class, Deployment, Use Case, and Sequence diagrams. We developed a novel rule-based prompt-engineering framework that transforms domain scenarios into optimized prompts for LLM processing. The generated diagrams were then rendered using PlantUML and evaluated through a rigorous survey involving 121 computer science and software engineering students across three U.S. universities. Participants assessed both the completeness and correctness of LLM-assisted and human-created diagrams by examining specific elements within each diagram type. Statistical analyses, including paired t-tests, Wilcoxon signed-rank tests, and effect size calculations, validate the significance of our findings. The results reveal that while LLM-assisted diagrams achieve meaningful levels of completeness and correctness (ranging from 61.1% to 67.7%), they consistently underperform compared to human-created diagrams. The performance gap varies by diagram type, with Sequence diagrams showing the closest alignment to human quality and Use Case diagrams exhibiting the largest discrepancy. This research contributes a validated framework for evaluating LLM-generated UML diagrams and provides empirically grounded insights into the current capabilities and limitations of LLMs in software modeling education.
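The statistical comparison named in the abstract (paired t-test, Wilcoxon signed-rank test, and an effect size) can be reproduced in a few lines; the sketch below does so on invented per-participant scores, not the survey data.

```python
# Hedged sketch: paired comparison of LLM-assisted vs. human-created diagram
# scores on placeholder data, mirroring the tests named in the abstract.
import numpy as np
from scipy import stats

llm_scores = np.array([0.62, 0.58, 0.71, 0.65, 0.60, 0.68, 0.63, 0.66])
human_scores = np.array([0.74, 0.70, 0.78, 0.72, 0.69, 0.80, 0.75, 0.77])

t_stat, t_p = stats.ttest_rel(llm_scores, human_scores)
w_stat, w_p = stats.wilcoxon(llm_scores, human_scores)

# Paired Cohen's d as a simple effect-size measure.
diff = llm_scores - human_scores
cohens_d = diff.mean() / diff.std(ddof=1)

print(f"paired t: p={t_p:.4f}  Wilcoxon: p={w_p:.4f}  d={cohens_d:.2f}")
```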

23 pages, 2350 KiB  
Article
Comparative Evaluation of the Effects of Variable Spark Timing and Ethanol-Supplemented Fuel Use on the Performance and Emission Characteristics of an Aircraft Piston Engine
by Roussos Papagiannakis and Nikolaos Lytras
Energies 2025, 18(13), 3440; https://doi.org/10.3390/en18133440 - 30 Jun 2025
Abstract
Many studies have been conducted to reduce the emissions of modern reciprocating engines without, at the same time, having a negative impact on their performance characteristics. One method to accomplish this is to use ethanol-supplemented fuels instead of conventional gasoline. On the other side of the spectrum, spark timing is one of the most important parameters affecting the combustion mechanism inside a reciprocating engine and is controlled by the engine’s ignition advance. Therefore, the main purpose of this study is to investigate the effect of spark timing alteration on the performance characteristics and emissions of a modern reciprocating, naturally aspirated, aircraft SI engine (i.e., ROTAX 912s), operated under four different engine operating points (i.e., combinations of engine speed and throttle opening), using ethanol-supplemented fuel. The method is implemented with advanced simulation software (i.e., GT-POWER), which allows the user to fully model and parameterize a piston engine, using a comprehensive single-zone phenomenological model, for any operating conditions across its entire range of operating points. The predictive ability of the engine model is evaluated by comparing its results with the experimental values given in the engine’s technical manuals. For all test cases examined in the present work, the results concern important performance characteristics, i.e., brake power, brake torque, and brake-specific fuel consumption, as well as specific NO and CO concentrations. Thus, the primary objective of this study was to examine and evaluate the combined effects of using ethanol-supplemented fuel instead of gasoline and of altering the spark timing on the basic performance characteristics and emissions of this type of engine. The results reveal that increasing the ethanol concentration in the gasoline–ethanol fuel blend, combined with increasing the ignition advance, may be a promising way to improve both the performance and the environmental behavior of a naturally aspirated SI aircraft piston engine. In summary, the results of this research show that the combination of the two methods examined may be a valuable solution if applied to existing reciprocating SI engines.
(This article belongs to the Special Issue Internal Combustion Engine Performance 2025)
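For reference, the brake quantities evaluated in the study relate through standard definitions: brake power follows from speed and torque, and brake-specific fuel consumption is fuel mass flow divided by brake power. The sketch below applies these definitions to an invented operating point, not to ROTAX 912s or GT-POWER data.

```python
# Hedged sketch: standard brake-performance definitions on an illustrative
# operating point (values are hypothetical, not from the paper).
import math

engine_speed_rpm = 5000.0
brake_torque_nm = 120.0            # hypothetical brake torque
fuel_mass_flow_kg_per_h = 22.0     # hypothetical fuel mass flow

# P_brake = 2*pi*N*T, with N in rev/s and T in N*m.
brake_power_kw = 2 * math.pi * (engine_speed_rpm / 60.0) * brake_torque_nm / 1000.0
# BSFC = fuel mass flow / brake power.
bsfc_g_per_kwh = fuel_mass_flow_kg_per_h * 1000.0 / brake_power_kw

print(f"brake power ≈ {brake_power_kw:.1f} kW, BSFC ≈ {bsfc_g_per_kwh:.0f} g/kWh")
```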

56 pages, 1008 KiB  
Review
Machine Learning Techniques for Requirements Engineering: A Comprehensive Literature Review
by António Miguel Rosado da Cruz and Estrela Ferreira Cruz
Software 2025, 4(3), 14; https://doi.org/10.3390/software4030014 - 28 Jun 2025
Abstract
Software requirements engineering is one of the most critical and time-consuming phases of the software-development process. The lack of communication with stakeholders and the use of natural language for communication lead to misunderstood, misidentified, or ambiguous requirements, which can jeopardize all subsequent steps in the software-development process and compromise the quality of the final software product. Natural Language Processing (NLP) is a long-established research area that is currently being strongly and positively reshaped by recent advances in Machine Learning (ML), namely the emergence of Deep Learning and, more recently, the so-called transformer models such as BERT and GPT. Software requirements engineering is likewise being strongly affected by this evolution of ML and other areas of Artificial Intelligence (AI). In this article we conduct a systematic review of how AI, ML and NLP are being used in the various stages of requirements engineering, including requirements elicitation, specification, classification, prioritization, management, and traceability. Furthermore, we identify which algorithms are most used in each of these stages, uncover challenges and open problems, and suggest future research directions.
(This article belongs to the Topic Applications of NLP, AI, and ML in Software Engineering)
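One recurring task in the surveyed literature is automatic requirement classification. Purely as an orientation, the sketch below shows a classic TF-IDF plus logistic-regression baseline on invented requirement sentences; the transformer-based approaches discussed in the review would replace this simple pipeline.

```python
# Hedged sketch: a minimal functional vs. non-functional requirement classifier;
# the requirement sentences and labels are invented examples.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

requirements = [
    "The system shall export reports as PDF.",
    "The user shall be able to reset the password via email.",
    "The application shall respond to queries within 2 seconds.",
    "The service shall be available 99.9% of the time.",
]
labels = ["functional", "functional", "non-functional", "non-functional"]

clf = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
clf.fit(requirements, labels)
print(clf.predict(["The system shall encrypt all stored data."]))
```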

23 pages, 20961 KiB  
Article
Bridging In Situ Testing and Constitutive Modelling: An Automated Approach to Soil Parameter Identification
by Islam Marzouk and Franz Tschuchnigg
Appl. Sci. 2025, 15(13), 7224; https://doi.org/10.3390/app15137224 - 26 Jun 2025
Abstract
In situ testing is essential in geotechnical engineering, providing valuable insights into both soil stratification and material behaviour. This paper presents an automated framework for deriving constitutive model parameters from in situ test data. The framework employs a graph-based approach that enhances both transparency and adaptability: transparency by explicitly tracing the computation of each parameter, and adaptability by allowing users to incorporate their own expertise. The study applies this framework to a marine clay test site, demonstrating its ability to determine soil parameters efficiently. Additionally, the framework is directly integrated into finite element software, enabling seamless parameter transfer for numerical modelling. A case study is presented in which a shallow foundation is simulated to illustrate the practical application of this approach. The framework is particularly valuable in the early stages of geotechnical projects, providing detailed soil characterisation when site data are limited. Validating the accuracy of the derived parameters and incorporating additional in situ test methods are part of ongoing research.
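The graph-based idea, each parameter as a node computed from its predecessors so that its derivation can be traced, can be sketched in a few lines; the correlations and input values below are hypothetical placeholders, not the ones implemented in the paper's framework.

```python
# Hedged sketch: parameters as a dependency graph evaluated in topological
# order, so every derived value can be traced back to its inputs.
from graphlib import TopologicalSorter

# node -> (inputs, function); the measured values and correlations are invented.
graph = {
    "qc": ((), lambda: 1.8e6),                       # cone resistance [Pa], measured
    "sigma_v": ((), lambda: 80e3),                   # vertical stress [Pa], measured
    "qnet": (("qc", "sigma_v"), lambda qc, sv: qc - sv),
    "su": (("qnet",), lambda qn: qn / 14.0),         # hypothetical cone factor Nkt = 14
    "E_oed": (("su",), lambda su: 300.0 * su),       # hypothetical stiffness correlation
}

order = TopologicalSorter({k: set(v[0]) for k, v in graph.items()}).static_order()
values = {}
for node in order:
    inputs, func = graph[node]
    values[node] = func(*(values[i] for i in inputs))
    print(node, "<-", inputs, "=", f"{values[node]:.3g}")
```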