Search Results (553)

Search Parameters:
Keywords = technical codes

24 pages, 386 KB  
Article
AI as Co-Creator: Fostering Social Equity Towards Social Sustainability in Entrepreneurial Development for Women and Minority Entrepreneurs
by Joanne Scillitoe, Deone Zell, Latha Poonamallee and Kene Turner
Sustainability 2025, 17(21), 9613; https://doi.org/10.3390/su17219613 (registering DOI) - 29 Oct 2025
Abstract
This paper examines how artificial intelligence (AI) can act as a co-creation partner to foster social equity leading to social sustainability by addressing persistent barriers faced by women and minority entrepreneurs. We develop a theoretical framework integrating social capital theory and the resource-based view to analyze how AI can systematically address resource gaps across structural, relational, and cognitive dimensions while serving as a strategic capability that enables competitive advantage. Modern AI systems, including ChatGPT, Claude, Gemini, and Perplexity, represent practical technologies already operational for everyday entrepreneurs through accessible platforms, low-cost subscriptions, and no-code tools enabling workflow automation with minimal technical skill. While prior work has explored how social capital creates competitive advantages, little research explains how AI technologies specifically enhance both social capital development and resource-based competitive advantage simultaneously for ventures of underrepresented entrepreneurs. This study explicitly identifies the entrepreneurial venture as the unit of analysis and articulates five testable propositions on AI’s influence across structural, relational, and cognitive capital, clarifying the mechanisms by which AI functions as a technological mediator that democratizes access to both network resources and strategic capabilities for underrepresented founders. Using AI-generated hypotheticals from Los Angeles that demonstrate replicable processes with current technologies such as retrieval-augmented generation and cloud AI workspaces, we show that AI-enhanced social capital can reduce venture development disparities while generating distinctive advantages for entrepreneurs who adopt it strategically. The framework requires empirical validation through longitudinal studies and acknowledges dependencies on infrastructure, ecosystem support, and cultural context. Ultimately, it reconceptualizes AI as an active partner, illustrating that equity and competitive excellence are complementary and achievable through deliberate AI-enabled social capital development.
(This article belongs to the Special Issue Sustainability Management Strategies and Practices—2nd Edition)

12 pages, 1202 KB  
Data Descriptor
Toward Responsible AI in High-Stakes Domains: A Dataset for Building Static Analysis with LLMs in Structural Engineering
by Carlos Avila, Daniel Ilbay, Paola Tapia and David Rivera
Data 2025, 10(11), 169; https://doi.org/10.3390/data10110169 - 24 Oct 2025
Viewed by 221
Abstract
Modern engineering increasingly operates within socio-technical networks, such as the interdependence of energy grids, transport systems, and building codes, where decisions must be reliable and transparent. Large language models (LLMs) such as GPT promise efficiency by interpreting domain-specific queries and generating outputs, yet their predictive nature can introduce biases or fabricated values—risks that are unacceptable in structural engineering, where safety and compliance are paramount. This work presents a dataset that embeds generative AI into validated computational workflows through the Model Context Protocol (MCP). MCP enables API-based integration between ChatGPT (GPT-4o) and numerical solvers by converting natural-language prompts into structured solver commands. This creates context-aware exchanges—for example, transforming a query on seismic drift limits into an OpenSees analysis—whose results are benchmarked against manually generated ETABS models. This architecture ensures traceability, reproducibility, and alignment with seismic design standards. The dataset contains prompts, GPT outputs, solver-based analyses, and comparative error metrics for four reinforced concrete frame models designed under Ecuadorian (NEC-15) and U.S. (ASCE 7-22) codes. The end-to-end runtime for these scenarios, including LLM prompting, MCP orchestration, and solver execution, ranged between 6 and 12 s, demonstrating feasibility for design and verification workflows. Beyond providing records, the dataset establishes a reproducible methodology for integrating LLMs into engineering practice, with three goals: enabling independent verification, fostering collaboration across AI and civil engineering, and setting benchmarks for responsible AI use in high-stakes domains.
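Because the abstract above centres on turning free-text engineering queries into structured solver commands, a minimal sketch of that step may help readers picture the workflow. Everything here is hypothetical: the parsing rules, the command schema, and the run_opensees_analysis placeholder stand in for the authors' MCP-based pipeline rather than reproducing it.

```python
# Hypothetical sketch of the prompt-to-solver-command step; not the dataset's code.
import json
import re

def parse_drift_query(prompt: str) -> dict:
    """Map a free-text seismic-drift question onto a structured solver command (assumed schema)."""
    storeys = re.search(r"(\d+)\s*-?\s*(storey|story)", prompt, re.IGNORECASE)
    return {
        "analysis": "seismic_drift_check",
        "design_code": "ASCE 7-22" if "ASCE" in prompt.upper() else "NEC-15",
        "storeys": int(storeys.group(1)) if storeys else None,
        "outputs": ["max_interstorey_drift", "drift_limit_ratio"],
    }

def run_opensees_analysis(command: dict) -> dict:
    """Placeholder for the solver call; a real bridge would build and run an OpenSees
    model here and return results that can be benchmarked against ETABS."""
    raise NotImplementedError("solver integration is project-specific")

if __name__ == "__main__":
    query = "Check interstorey drift limits for a 4-storey RC frame under NEC-15."
    print(json.dumps(parse_drift_query(query), indent=2))  # traceable, reproducible command
```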

33 pages, 4858 KB  
Article
Multi-Criteria Assessment: A Case Study Integrating Eco-design Principles in Sustainable Manufacturing
by Khadija Sarquah, Caitlin Walls, Marta Revello, Maja Jelić and Gesa Beck
Information 2025, 16(11), 925; https://doi.org/10.3390/info16110925 - 22 Oct 2025
Viewed by 242
Abstract
This study integrates Eco-design principles and the Life Cycle approach in a multi-criteria assessment (MCA) to evaluate the sustainability performance of manufacturing routes. The assessment is applied to conventional production across five use cases involving complex geometry parts. The aim is to evaluate areas of material criticality, environmental impacts, chemical risks, as well as social aspects, including gender dimensions (C-MET-ESG). Outcomes are synthesised into colour-coded hotspot tables and Eco-design recommendations. Key findings highlight opportunities such as substituting high-criticality alloys, increasing material efficiency, and promoting gender-inclusive workplace practices. Technological transitions from CNC machining and hazardous post-processing to laser and additive manufacturing further enhance safety, resource efficiency, and resilience. The novelty of this study lies in the integration of LCA principles, the C-MET-ESG matrix, and CRA-SSbD guidelines within an MCA, establishing a hazard-aware, socially inclusive, and technically robust framework. This approach provides life-cycle-linked evidence that connects early design choices to sustainability outcomes. Furthermore, the study offers a transferable methodology for sustainable manufacturing in both established and emerging technologies.
(This article belongs to the Special Issue New Applications in Multiple Criteria Decision Analysis, 3rd Edition)
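As a rough illustration of the colour-coded hotspot tables mentioned in the abstract, the sketch below bins normalised criterion scores into traffic-light categories; the thresholds, criteria, and scores are invented, not taken from the study.

```python
# Illustrative only: traffic-light binning for a hotspot table (assumed thresholds).
def hotspot_colour(score: float) -> str:
    """Map a normalised 0-1 impact score to a hotspot category."""
    if score < 0.33:
        return "green"
    if score < 0.66:
        return "amber"
    return "red"

# Hypothetical scores for one use case.
use_case = {"material criticality": 0.72, "chemical risk": 0.41, "social aspects": 0.18}
print({criterion: hotspot_colour(s) for criterion, s in use_case.items()})
```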

23 pages, 3113 KB  
Review
Integrated Building Retrofit for Seismic Resilience and Environmental Sustainability: A Critical Review
by Ghada Karaki and Rami A. Hawileh
Buildings 2025, 15(20), 3800; https://doi.org/10.3390/buildings15203800 - 21 Oct 2025
Viewed by 238
Abstract
Integrated seismic–environmental retrofit is gaining attention in research and practice, as it combines resilience and sustainability objectives in building retrofits. However, current research and practice remain fragmented. This paper presents a systematic literature review to analyse how retrofit is addressed across four key dimensions: structural, environmental, social, and governance. A thematic analysis in NVivo was combined with Python-based quantitative analysis of code frequency and co-occurrence. The integrated retrofit literature reframes environmental assessment, shifting towards whole-building lifecycle assessment with seismic environmental impacts and energy efficiency treated as embedded components. Retrofit practices are mainly discussed in technical and compliance terms, but are not properly examined using unified quantitative metrics; the broad use of metrics and indicators limits comparability and replication. Social and governance dimensions remain peripheral, with weak connections to structural and environmental dimensions, which constrains cross-domain integration and challenges the scaling up of retrofit interventions. These gaps encompass the barriers facing integrated retrofit; potential pathways to overcome them include aligned standards and datasets, capacity building, community engagement, and coordinated regulatory frameworks.
(This article belongs to the Special Issue Challenges in Structural Repairs and Renovations)
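The review combines NVivo coding with Python-based counts of code frequency and co-occurrence; the short sketch below shows one plain-Python way such counts can be produced. The coded documents are invented examples, and the counting approach is an assumption rather than the authors' script.

```python
# Minimal sketch of code-frequency and co-occurrence counting (illustrative data).
from collections import Counter
from itertools import combinations

# Hypothetical thematic codes assigned to three reviewed papers.
coded_docs = [
    {"structural", "environmental", "lifecycle assessment"},
    {"structural", "governance"},
    {"environmental", "lifecycle assessment", "social"},
]

frequency = Counter(code for doc in coded_docs for code in doc)
co_occurrence = Counter(
    pair for doc in coded_docs for pair in combinations(sorted(doc), 2)
)

print(frequency.most_common())       # how often each code appears across papers
print(co_occurrence.most_common(3))  # the most frequently co-assigned code pairs
```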

24 pages, 468 KB  
Article
Mining User Perspectives: Multi Case Study Analysis of Data Quality Characteristics
by Minnu Malieckal and Anjula Gurtoo
Information 2025, 16(10), 920; https://doi.org/10.3390/info16100920 - 21 Oct 2025
Viewed by 241
Abstract
With the growth of digital economies, data quality forms a key factor in enabling use and delivering value. Existing research defines quality through technical benchmarks or provider-led frameworks. Our study shifts the focus to actual users. Thirty-seven distinct data quality dimensions identified through a comprehensive review of the literature provide limited applicability for practitioners seeking actionable guidance. To address the gap, in-depth interviews with senior professionals from 25 organizations were conducted, representing sectors such as computer science and technology, finance, environmental, social, and governance, and urban infrastructure. Data were analysed using a content analysis methodology with two-level coding, supported by NVivo R1 software. Several newer perspectives emerged. Firstly, data quality is not simply about accuracy or completeness; rather, it depends on suitability for real-world tasks. Secondly, trust grows with data transparency: knowing where the data comes from and the nature of data processing matters as much as the data per se. Thirdly, users are open to paying for data, provided the data is clean, reliable, and ready to use. These and other results suggest data users focus on a narrower, more practical set of priorities considered essential in actual workflows. Rethinking quality from a consumer’s perspective offers a practical path to building credible and accessible data ecosystems. This study is particularly useful for data platform designers, policymakers, and organisations aiming to strengthen data quality and trust in data exchange ecosystems.

25 pages, 1171 KB  
Article
A Framework for Holistic Assessment of Professional Competencies in Environmental Health WIL at a University of Technology
by Louella M. Daries, Lizel S. Hudson and Lalini Reddy
Educ. Sci. 2025, 15(10), 1387; https://doi.org/10.3390/educsci15101387 - 17 Oct 2025
Viewed by 342
Abstract
The benefits of work-integrated learning (WIL) in higher education are well documented. However, its assessment across disciplines remains challenging. WIL is embedded in the environmental health (EH) degree at a University of Technology (UoT) in South Africa (SA), with similar challenges. The absence of explicit competency indicators and associated assessment criteria in the current curriculum necessitates an understanding of the full range of professional competencies necessary for achieving environmental health goals. Concomitantly, research relating to EH WIL and its assessment in EH programmes is sparse. The aim of this study is to present a holistic assessment framework for competencies developed through WIL. Using a qualitative design, data collection occurred through in-depth interviews, document analysis and focus group discussions. Data analysis was guided by Legitimation Code Theory’s (LCT) Specialization dimension. The results foreground competencies beyond mere knowledge integration and technical skill development in WIL. It is thus recommended that current assessment be altered to adopt holistic assessment of EH WIL and include the full range of professional competencies essential for EH practice success. Further research may explore where additional opportunities exist for authentic assessment of the foregrounded competencies throughout the programme, in addition to the WIL assessments.
(This article belongs to the Collection Trends and Challenges in Higher Education)

53 pages, 3157 KB  
Article
Large Language Models for Machine Learning Design Assistance: Prompt-Driven Algorithm Selection and Optimization in Diverse Supervised Learning Tasks
by Fidan Kaya Gülağız
Appl. Sci. 2025, 15(20), 10968; https://doi.org/10.3390/app152010968 - 13 Oct 2025
Viewed by 613
Abstract
Large language models (LLMs) are playing an increasingly important role in data science applications. In this study, the performance of LLMs in generating code and designing solutions for data science tasks is systematically evaluated based on different real-world tasks from the Kaggle platform. Models from different LLM families were tested under both default settings and configurations with hyperparameter tuning (HPT) applied. In addition, the effects of few-shot prompting (FSP) and Tree of Thought (ToT) strategies on code generation were compared. Alongside technical metrics such as accuracy, F1 score, Root Mean Squared Error (RMSE), execution time, and peak memory consumption, LLM outputs were also evaluated against Kaggle user-submitted solutions, leaderboard scores, and two established AutoML frameworks (auto-sklearn and AutoGluon). The findings suggest that, with effective prompting strategies and HPT, models can deliver competitive results on certain tasks. The ability of some LLMs to suggest appropriate algorithms reveals that LLMs can be seen not only as code generators, but also as systems capable of designing machine learning (ML) solutions. This study presents a comprehensive analysis of how strategic decisions such as prompting methods, tuning approaches, and algorithm selection affect the design of LLM-based data science systems, offering insights for future hybrid human–LLM systems.
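To make the evaluation setup concrete, here is a small sketch of scoring a candidate model with the accuracy and F1 metrics named in the abstract, using scikit-learn. The dataset and hyperparameters are stand-ins; the study's actual tasks come from Kaggle and its pipelines from LLM prompts.

```python
# Illustrative scoring of an "LLM-suggested" classifier with accuracy and macro F1.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score, f1_score
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=42)

# Hypothetical hyperparameters, as if proposed by a prompted LLM.
model = RandomForestClassifier(n_estimators=200, max_depth=8, random_state=42)
model.fit(X_tr, y_tr)
pred = model.predict(X_te)

print(f"accuracy = {accuracy_score(y_te, pred):.3f}")
print(f"macro F1 = {f1_score(y_te, pred, average='macro'):.3f}")
```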

24 pages, 1813 KB  
Article
Research on Multi-Level Monitoring Architecture Pattern of Cloud-Based Safety Computing Platform
by Lei Yuan, Bokai Zhang, Yu Liu, Qiang Fu and Yixiong Wu
Symmetry 2025, 17(10), 1706; https://doi.org/10.3390/sym17101706 - 11 Oct 2025
Viewed by 221
Abstract
As rail transit systems advance toward greater automation and intelligence, cloud computing technology, with its remarkable scalability and robust data processing capabilities, has been steadily expanding its footprint in this domain. However, the adoption of cloud computing also introduces new safety challenges for train control systems. Traditional safety computers in train control systems rely on heterogeneous redundancy with symmetry to enhance safety. Nevertheless, the software in cloud computing environments, even if heterogeneous, may share the same source code, thereby introducing the risk of common-cause failures in the software. To address these issues, this study proposes a multi-level monitoring architecture system tailored to the characteristics of cloud-based safety computing platforms. This architecture innovatively integrates the three-level monitoring architecture pattern from the automotive field, the secure channel pattern, and the distributed safety mechanism architecture. It monitors the levels of common-cause software failures that cannot be eliminated through heterogeneity. The introduction of multi-level active monitoring for risk control reduces the impact of common-cause software failures on system security. By constructing a formal security model, quantitative evaluations are conducted separately on the single-channel L2 and L3, the dual-channel L4 without degradation or monitoring, and the dual-channel L4 monitoring architecture with complete functions. This verifies the effectiveness of the proposed monitoring architecture in reducing the risk of common-cause software failures in the virtualization layer. This study provides a robust theoretical foundation and technical support for the security-oriented design and development of next-generation intelligent rail transit systems.
(This article belongs to the Section Computer)

24 pages, 1149 KB  
Review
Shaping Architecture with Generative Artificial Intelligence: Deep Learning Models in Architectural Design Workflow
by Socrates Yiannoudes
Architecture 2025, 5(4), 94; https://doi.org/10.3390/architecture5040094 - 10 Oct 2025
Viewed by 1675
Abstract
Deep-learning generative AI promises to transform architectural design, yet its readiness for use in professional workflows remains unclear. This study presents a systematic review conducted in accordance with PRISMA 2020 guidelines, synthesizing peer-reviewed work from 2015 to 2025 to assess how GenAI methods align with architectural practice. A total of 1566 records were initially retrieved across databases, of which 42 studies met eligibility criteria after structured screening and selection. Each was evaluated using five indicators with a three-tier rubric: Output Representation Type, Pipeline Integration, Workflow Standardization, Tool Readiness, and Technical Skillset. Results show that most outputs are raster images or non-editable objects, with only a minority producing CAD/BIM-ready geometry. Workflow pipelines are often fragmented by manual hand-offs, and most GenAI methods map only onto the early conceptual design stage. Prototypes frequently require bespoke coding and advanced expertise. These findings indicate a persistent gap between experimentation with ideation-oriented GenAI and the pragmatism of CAD/BIM-centered delivery. By framing the proposed rubric as a workflow maturity model, this review contributes a replicable benchmark for assessing practice readiness and identifying pathways toward mainstream adoption. For GenAI to move from prototypes to mainstream architectural design practice, it is essential to address not only technical barriers, but also cultural issues such as professional skepticism and reliability concerns, as well as ecosystem challenges of data sharing, authorship, and liability.
(This article belongs to the Special Issue Shaping Architecture with Computation)

19 pages, 5676 KB  
Article
Combustion and Emission Trade-Offs in Tier-Regulated EGR Modes: Comparative Insights from Shop and Sea Operation Data of a CPP Marine Diesel Engine
by Jaesung Moon
J. Mar. Sci. Eng. 2025, 13(10), 1935; https://doi.org/10.3390/jmse13101935 - 9 Oct 2025
Viewed by 309
Abstract
This study presents a comparative investigation of combustion and emission characteristics in a two-stroke MAN 5S35ME-B9.5 marine diesel engine equipped with a Controllable Pitch Propeller and an Exhaust Gas Recirculation system. Experimental data were obtained from both factory shop tests conducted under the IMO NOx Technical Code 2008 E2 cycle and sea trials performed onboard the T/S Baek-Kyung. Engine performance was evaluated under Tier II-FB, ecoEGR, and Tier III modes, focusing on specific fuel oil consumption, peak cylinder pressure, exhaust gas temperature, and regulated emissions. Results indicate that Tier III achieved the greatest NOx abatement, reducing emissions by up to 76.4% (1464 to 346 ppm), but with penalties of 16.8% higher SFOC and 45.2% higher CO2 concentration. EcoEGR provided a more favorable compromise, reducing NOx by 52.3% while limiting SFOC increases to ≤15.4% and CO2 increases to ≤30.9%. Strong correlations were observed between NOx, Pmax, and exhaust gas temperature, reaffirming fundamental trade-offs, while O2 and CO correlations showed greater variability under sea operation. Despite operational scatter, sea trial results reproduced the key patterns observed in shop tests, confirming robustness across conditions. Overall, this correlation-based analysis provides quantified evidence of performance–emission trade-offs and offers a practical foundation for optimizing CPP-equipped two-stroke engines under varying EGR strategies.
(This article belongs to the Special Issue Ship Performance and Emission Prediction)
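The headline NOx figure can be checked directly from the ppm values quoted in the abstract; the snippet below reproduces the reported 76.4% reduction.

```python
# Arithmetic check of the reported Tier III NOx reduction (values from the abstract).
nox_tier2_ppm, nox_tier3_ppm = 1464, 346
print(f"NOx reduction: {(nox_tier2_ppm - nox_tier3_ppm) / nox_tier2_ppm:.1%}")  # 76.4%
```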

27 pages, 32087 KB  
Article
A Label-Free Panel Recognition Method Based on Close-Range Photogrammetry and Feature Fusion
by Enshun Lu, Zhe Guo, Xiaofeng Li, Daode Zhang and Rui Lu
Appl. Sci. 2025, 15(19), 10835; https://doi.org/10.3390/app151910835 - 9 Oct 2025
Viewed by 205
Abstract
In the interior decoration panel industry, automated production lines have become the standard configuration for large-scale enterprises. However, during panel processing procedures such as sanding and painting, the loss of traditional identification markers like QR codes or barcodes is inevitable. This creates a critical technical bottleneck in the assembly stage of customized or multi-model parallel production lines, where the difficulty of identifying individual panels significantly limits production efficiency. To address this issue, this paper proposes a high-precision measurement method based on close-range photogrammetry for capturing panel dimensions and hole position features, enabling accurate extraction of identification markers. Building on this foundation, an identity discrimination method that integrates weighted dimension and hole position IDs has been developed, making it feasible to efficiently and automatically identify panels without physical identification markers. Experimental results demonstrate that the proposed method exhibits significant advantages in both recognition accuracy and production adaptability, providing an effective solution for intelligent manufacturing in the home decoration panel industry.
(This article belongs to the Section Computing and Artificial Intelligence)
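The identity-discrimination idea above, fusing weighted dimension and hole-position IDs, can be pictured with a toy matcher like the one below. The catalogue, measurements, and weights are all hypothetical; the paper's actual features come from close-range photogrammetry.

```python
# Toy sketch of weighted dimension + hole-position matching (all values invented).
catalogue = {
    "panel_A": {"dims_mm": (600.0, 400.0), "holes_mm": [(50.0, 50.0), (550.0, 50.0)]},
    "panel_B": {"dims_mm": (600.0, 420.0), "holes_mm": [(60.0, 60.0), (540.0, 60.0)]},
}

def match_score(measured: dict, candidate: dict, w_dim: float = 0.6, w_hole: float = 0.4) -> float:
    """Lower is better: weighted sum of dimension error and mean hole-position error."""
    dim_err = sum(abs(m - c) for m, c in zip(measured["dims_mm"], candidate["dims_mm"]))
    hole_err = sum(
        abs(mx - cx) + abs(my - cy)
        for (mx, my), (cx, cy) in zip(measured["holes_mm"], candidate["holes_mm"])
    ) / len(candidate["holes_mm"])
    return w_dim * dim_err + w_hole * hole_err

measured = {"dims_mm": (600.2, 400.3), "holes_mm": [(50.1, 49.8), (549.7, 50.2)]}
best = min(catalogue, key=lambda name: match_score(measured, catalogue[name]))
print(best)  # "panel_A" for these measurements
```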

25 pages, 4961 KB  
Article
Automation and Genetic Algorithm Optimization for Seismic Modeling and Analysis of Tall RC Buildings
by Piero A. Cabrera, Gianella M. Medina and Rick M. Delgadillo
Buildings 2025, 15(19), 3618; https://doi.org/10.3390/buildings15193618 - 9 Oct 2025
Viewed by 363
Abstract
This article presents an innovative approach to optimizing the seismic modeling and analysis of high-rise buildings by automating the process with Python 3.13 and the ETABS 22.1.0 API. The process begins with the collection of information on the base building, a structure of seventeen regular levels, which includes data on structural elements, material properties, geometric configuration, and seismic and gravitational loads. These data are organized in an Excel file for further processing. From this information, a Python script is developed that automates the structural modeling in ETABS through its API. This script defines the sections, materials, boundary conditions, and loads, and models the elements according to their coordinates. The resulting base model is used as a starting point to generate an optimal solution using a genetic algorithm. The genetic algorithm adjusts column and beam sections using an approach that includes crossover and controlled mutation operations. Each solution is evaluated by the maximum displacement of the structure, calculating the fitness as the inverse of this displacement, favoring solutions with less deformation. The process is repeated across generations, selecting and crossing the best solutions. Finally, the model that produces the smallest displacement is saved as the optimal solution. Once the optimal solution has been obtained, a second Python script is implemented to perform static and dynamic seismic analysis. The key results, such as displacements, drifts, internal forces, and base shear, are processed and verified in accordance with the Peruvian Technical Standard E.030. The automated model with the API shows a significant improvement in accuracy and efficiency compared to traditional methods, achieving an R² = 0.995 in the static analysis, indicating an almost perfect fit, and an RMSE = 1.93261 × 10⁻⁵, reflecting a near-zero error. In the dynamic drift analysis, the automated model reaches an R² = 0.9385 and an RMSE = 5.21742 × 10⁻⁵, demonstrating its high precision. As for the lead time, the automated model completed the process in 13.2 min, a 99.5% reduction in comparison with the traditional method, which takes 3 h. On the other hand, the genetic algorithm had a run time of 191 min due to its stochastic nature and iterative process. The performance of the genetic algorithm shows that, although the improvement between Generation 1 and Generation 2 is significant, it stabilizes in the following generations, with a slight decrease in Generation 5, suggesting that the algorithm has reached a point of convergence.
(This article belongs to the Special Issue Building Safety Assessment and Structural Analysis)
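The genetic-algorithm loop described above, with fitness taken as the inverse of the maximum displacement, is sketched below in plain Python. The displacement function is a toy surrogate and the section list is invented; in the paper that value comes from an automated ETABS analysis through its API.

```python
# Sketch of the GA loop with inverse-displacement fitness; the analysis is a toy surrogate.
import random

random.seed(0)
SECTIONS = [0.40, 0.45, 0.50, 0.55, 0.60]  # hypothetical square column sizes, m
N_STOREYS = 17                             # the base building has seventeen levels

def max_displacement(genome):
    """Toy surrogate: larger (stiffer) sections give a smaller roof displacement."""
    return 0.02 / (sum(s ** 3 for s in genome) / len(genome))

def fitness(genome):
    return 1.0 / max_displacement(genome)  # as in the abstract: inverse of displacement

def crossover(a, b):
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

def mutate(genome, rate=0.1):
    return [random.choice(SECTIONS) if random.random() < rate else g for g in genome]

population = [[random.choice(SECTIONS) for _ in range(N_STOREYS)] for _ in range(20)]
for generation in range(5):
    population.sort(key=fitness, reverse=True)
    parents = population[:10]                       # keep the fittest half
    children = [mutate(crossover(*random.sample(parents, 2))) for _ in range(10)]
    population = parents + children

best = max(population, key=fitness)
print(f"best max displacement ≈ {max_displacement(best):.4f} m")
```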

30 pages, 1769 KB  
Review
Decarbonizing the Cement Industry: Technological, Economic, and Policy Barriers to CO2 Mitigation Adoption
by Oluwafemi Ezekiel Ige and Musasa Kabeya
Clean Technol. 2025, 7(4), 85; https://doi.org/10.3390/cleantechnol7040085 - 9 Oct 2025
Viewed by 1254
Abstract
The cement industry accounts for approximately 7–8% of global CO2 emissions, primarily due to energy-intensive clinker production and limestone calcination. With cement demand continuing to rise, particularly in emerging economies, decarbonization has become an urgent global challenge. The objective of this study is to systematically map and synthesize existing evidence on technological pathways, policy measures, and economic barriers to four core decarbonization strategies in the cement sector: clinker substitution, energy efficiency, alternative fuels, and carbon capture, utilization, and storage (CCUS), with the goal of identifying practical strategies that can align industry practice with long-term climate goals. A scoping review methodology was adopted, drawing on peer-reviewed journal articles, technical reports, and policy documents to ensure a comprehensive perspective. The results demonstrate that each mitigation pathway is technically feasible but faces substantial real-world constraints. Clinker substitution delivers immediate reductions but is limited by SCM availability and quality, durability qualification, and conservative codes; LC3 is promising where clay logistics allow. Energy-efficiency measures such as waste-heat recovery and advanced controls reduce fuel use but face high capital expenditure, downtime, and diminishing returns in modern plants. Alternative fuels can reduce combustion-related emissions but face challenges in supply chains, technical integration, and fuel quality, as well as weak waste-management systems and limited regulatory acceptance. CCUS, which holds the greatest long-term potential, addresses process CO2 and enables deep reductions, but remains commercially unviable due to current economics, high costs, limited policy support, lack of large-scale deployment, and limited access to transport and storage. Cross-cutting economic challenges, regulatory gaps, skill shortages, and social resistance, including NIMBYism, further slow adoption, particularly in low-income regions. This study concludes that a single pathway is insufficient. An integrated portfolio supported by modernized standards, targeted policy incentives, expanded access to SCMs and waste fuels, scaled CCUS investment, and international collaboration is essential to bridge the gap between climate ambition and industrial implementation. Key recommendations include modernizing cement standards to support higher clinker replacement, providing incentives for energy-efficient upgrades, scaling CCUS through joint investment and carbon pricing, and expanding access to biomass and waste-derived fuels.

17 pages, 2322 KB  
Article
Design of an Embedded Simulation Controller for a Model-Based Diesel Engine Parallel Power Unit
by Huan Liu, Pan Su, Jiechang Wu and Guanghui Chang
Processes 2025, 13(10), 3196; https://doi.org/10.3390/pr13103196 - 8 Oct 2025
Viewed by 332
Abstract
To address the limitations inherent in traditional simulation control schemes for dual-engine parallel operation systems in diesel engines—such as protracted development cycles, suboptimal interface compatibility, insufficient real-time performance, and inadequate support for dynamic condition simulation in applications like marine power systems—this paper proposes an embedded real-time controller based on model-based design. This methodology facilitates efficient development and high-precision real-time control of parallel operation systems. A multi-domain coupled simulation model integrating diesel power and parallel control algorithms is built in MATLAB/Simulink, with optimized C code auto-generated via Embedded Coder. The hardware centers on an STM32F407VE microcontroller, enabling 4–20 mA speed acquisition, CAN communication, and Ethernet transmission. Experimental results indicate that the architecture shortens development cycles from 8 to 3 weeks, with 895-microsecond simulation steps meeting the 1-millisecond real-time requirement. Vessel tests achieve ±1.8 r/min synchronization error and ±1.2% load distribution error at low cost. The controller adapts to varied diesel power configurations via modular substitution and supports RS485/CAN-FD. In conclusion, the controller effectively handles real-time simulation of diesel engine parallel systems and excels in efficiency, compatibility, and cost, offering a viable technical pathway for modernizing parallel power systems in applications such as marine vessels and power generation.
(This article belongs to the Section Manufacturing Processes and Systems)
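As a small aside on the 4–20 mA speed acquisition mentioned above, the snippet below shows a generic linear scaling from loop current to engine speed. The 0–1500 r/min span is an assumed example, not a figure from the paper, and the real controller performs this on the STM32 in generated C code.

```python
# Generic 4-20 mA scaling example (assumed speed range; not the paper's implementation).
def current_to_speed(ma: float, lo: float = 4.0, hi: float = 20.0, speed_max: float = 1500.0) -> float:
    """Linearly map a clamped 4-20 mA loop current to engine speed in r/min."""
    ma = min(max(ma, lo), hi)  # clamp to the valid loop range
    return (ma - lo) / (hi - lo) * speed_max

print(current_to_speed(12.0))  # mid-scale current -> 750.0 r/min
```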

32 pages, 2305 KB  
Article
SCEditor-Web: Bridging Model-Driven Engineering and Generative AI for Smart Contract Development
by Yassine Ait Hsain, Naziha Laaz and Samir Mbarki
Information 2025, 16(10), 870; https://doi.org/10.3390/info16100870 - 7 Oct 2025
Viewed by 381
Abstract
Smart contracts are central to blockchain ecosystems, yet their development remains technically demanding, error-prone, and tied to platform-specific programming languages. This paper introduces SCEditor-Web, a web-based modeling environment that combines model-driven engineering (MDE) with generative artificial intelligence (Gen-AI) to simplify contract design and code generation. Developers specify the structural and behavioral aspects of smart contracts through a domain-specific visual language grounded in a formal metamodel. The resulting contract model is exported as structured JSON and transformed into executable, platform-specific code using large language models (LLMs) guided by a tailored prompt engineering process. A prototype implementation was evaluated on Solidity contracts as a proof of concept, using representative use cases. Experiments with state-of-the-art LLMs assessed the generated contracts for compilability, semantic alignment with the contract model, and overall code quality. Results indicate that the visual-to-code workflow reduces manual effort, mitigates common programming errors, and supports developers with varying levels of expertise. The contributions include an abstract smart contract metamodel, a structured prompt generation pipeline, and a web-based platform that bridges high-level modeling with practical multi-language code synthesis. Together, these elements advance the integration of MDE and LLMs, demonstrating a step toward more accessible and reliable smart contract engineering.
(This article belongs to the Special Issue Using Generative Artificial Intelligence Within Software Engineering)
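Since the abstract describes exporting the visual contract model as structured JSON and feeding it to an LLM through a tailored prompt, a hedged sketch of that hand-off is given below. The model schema and prompt wording are invented for illustration and are much simpler than the paper's metamodel and prompt pipeline.

```python
# Illustrative model-to-prompt hand-off; schema and wording are hypothetical.
import json

contract_model = {
    "name": "Escrow",
    "state": [{"name": "buyer", "type": "address"}, {"name": "amount", "type": "uint256"}],
    "functions": [{"name": "deposit", "payable": True}, {"name": "release", "onlyOwner": True}],
}

def build_prompt(model: dict, target: str = "Solidity") -> str:
    """Fold the structured contract model into a code-generation prompt for an LLM."""
    return (
        f"Generate a {target} smart contract that implements this model exactly. "
        "Do not add state variables or functions beyond those listed.\n"
        + json.dumps(model, indent=2)
    )

print(build_prompt(contract_model))  # the LLM's output is then compiled and checked
```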
