Search Results (2,008)

Search Parameters:
Keywords = cloud control

30 pages, 1034 KiB  
Article
A Privacy-Preserving Polymorphic Heterogeneous Security Architecture for Cloud–Edge Collaboration Industrial Control Systems
by Yukun Niu, Xiaopeng Han, Chuan He, Yunfan Wang, Zhigang Cao and Ding Zhou
Appl. Sci. 2025, 15(14), 8032; https://doi.org/10.3390/app15148032 - 18 Jul 2025
Abstract
Cloud–edge collaboration industrial control systems (ICSs) face critical security and privacy challenges that existing dynamic heterogeneous redundancy (DHR) architectures inadequately address due to two fundamental limitations: event-triggered scheduling approaches that amplify common-mode escape impacts in resource-constrained environments, and insufficient privacy-preserving arbitration mechanisms for sensitive industrial data processing. In contrast to existing work that treats scheduling and privacy as separate concerns, this paper proposes a unified polymorphic heterogeneous security architecture that integrates hybrid event–time triggered scheduling with adaptive privacy-preserving arbitration, specifically designed to address the unique challenges of cloud–edge collaboration ICSs where both security resilience and privacy preservation are paramount requirements. The architecture introduces three key innovations: (1) a hybrid event–time triggered scheduling algorithm with credibility assessment and heterogeneity metrics to mitigate common-mode escape scenarios, (2) an adaptive privacy budget allocation mechanism that balances privacy protection effectiveness with system availability based on attack activity levels, and (3) a unified framework that organically integrates privacy-preserving arbitration with heterogeneous redundancy management. Comprehensive evaluations using natural gas pipeline pressure control and smart grid voltage control systems demonstrate superior performance: the proposed method achieves 100% system availability compared to 62.57% for static redundancy and 86.53% for moving target defense, maintains 99.98% availability even under common-mode attacks (10⁻² probability), and consistently outperforms moving target defense methods integrated with state-of-the-art detection mechanisms (99.7790% and 99.6735% average availability when false data deviations from true values are 5% and 3%, respectively) across different attack detection scenarios, validating its effectiveness in defending against availability attacks and privacy leakage threats in cloud–edge collaboration environments. Full article
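
The abstract does not specify how the adaptive privacy budget allocation works internally; a minimal sketch of one plausible reading, in which a differential-privacy budget shrinks as attack activity rises and arbitration adds Laplace noise calibrated to that budget, might look as follows (all function names, the scaling rule, and the sensitivity assumption are illustrative, not the authors' design):

```python
import random

def allocate_privacy_budget(base_epsilon: float, attack_activity: float,
                            min_epsilon: float = 0.05) -> float:
    """Shrink the per-round differential-privacy budget as attack activity rises.

    attack_activity is assumed normalized to [0, 1]; higher activity means
    stronger privacy (smaller epsilon) at some cost to availability.
    """
    epsilon = base_epsilon * (1.0 - attack_activity)
    return max(epsilon, min_epsilon)  # never drop below a usable floor

def arbitrate(values: list[float], epsilon: float) -> float:
    """Median arbitration over redundant executor outputs, with Laplace noise
    calibrated to the allocated budget (sensitivity assumed to be 1)."""
    consensus = sorted(values)[len(values) // 2]
    # Difference of two iid Exponential(epsilon) draws is Laplace(0, 1/epsilon).
    noise = random.expovariate(epsilon) - random.expovariate(epsilon)
    return consensus + noise

# Example: quiet period vs. active attack.
print(allocate_privacy_budget(1.0, 0.1))  # ~0.9, favors availability
print(allocate_privacy_budget(1.0, 0.9))  # ~0.1, favors privacy
```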
33 pages, 534 KiB  
Review
Local AI Governance: Addressing Model Safety and Policy Challenges Posed by Decentralized AI
by Bahrad A. Sokhansanj
AI 2025, 6(7), 159; https://doi.org/10.3390/ai6070159 - 17 Jul 2025
Abstract
Policies and technical safeguards for artificial intelligence (AI) governance have implicitly assumed that AI systems will continue to operate via massive power-hungry data centers operated by large companies like Google and OpenAI. However, the present cloud-based AI paradigm is being challenged by rapidly advancing software and hardware technologies. Open-source AI models now run on personal computers and devices, invisible to regulators and stripped of safety constraints. The capabilities of local-scale AI models now lag just months behind those of state-of-the-art proprietary models. Wider adoption of local AI promises significant benefits, such as ensuring privacy and autonomy. However, adopting local AI also threatens to undermine the current approach to AI safety. In this paper, we review how technical safeguards fail when users control the code, and regulatory frameworks cannot address decentralized systems as deployment becomes invisible. We further propose ways to harness local AI’s democratizing potential while managing its risks, aimed at guiding responsible technical development and informing community-led policy: (1) adapting technical safeguards for local AI, including content provenance tracking, configurable safe computing environments, and distributed open-source oversight; and (2) shaping AI policy for a decentralized ecosystem, including polycentric governance mechanisms, integrating community participation, and tailored safe harbors for liability. Full article
(This article belongs to the Section AI Systems: Theory and Applications)
22 pages, 5031 KiB  
Article
Numerical Simulation and Analysis of Micropile-Raft Joint Jacking Technology for Rectifying Inclined Buildings Due to Uneven Settlement
by Ming Xie, Li’e Yin, Zhangdong Wang, Fangbo Xu, Xiangdong Wu and Mengqi Xu
Buildings 2025, 15(14), 2485; https://doi.org/10.3390/buildings15142485 - 15 Jul 2025
Abstract
To address the issue of structural tilting caused by uneven foundation settlement in soft soil areas, this study combined a specific engineering case to conduct numerical simulations of the rectification process for an inclined reinforced concrete building using ABAQUS finite element software. Micropile-raft combined jacking technology was employed, applying staged jacking forces (2400 kN for Axis A, 2200 kN for Axis B, and 1700 kN for Axis C) with precise control through 20 incremental steps. The results demonstrate that this technology effectively halted structural tilting, reducing the maximum inclination rate from 0.51% to 0.05%, significantly below the standard limit. Post-rectification, the peak structural stress decreased by 42%, and displacements were markedly reduced. However, the jacking process led to a notable increase in the column axial forces and directional changes in beam bending moments, reflecting the dynamic redistribution of internal forces. The study confirms that micropile-raft combined jacking technology offers both controllability and safety, while optimized counterforce pile layouts enhance the long-term stability of the rectification system. Based on stress and displacement cloud analysis, a monitoring scheme is proposed, forming an integrated “rectification-monitoring-reinforcement” solution, which provides a technical framework for building rectification in soft soil regions. Full article
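
As a rough illustration of the staged loading described above, the sketch below divides each axis's target jacking force into 20 equal increments (the equal-step schedule is an assumption; the paper may stage loads non-uniformly):

```python
# Hypothetical staged loading schedule: each axis target force is applied
# in 20 equal increments, matching the step count named in the abstract.
TARGET_FORCES_KN = {"Axis A": 2400, "Axis B": 2200, "Axis C": 1700}
STEPS = 20

schedule = {
    axis: [round(total * step / STEPS, 1) for step in range(1, STEPS + 1)]
    for axis, total in TARGET_FORCES_KN.items()
}

print(schedule["Axis A"][:3])  # [120.0, 240.0, 360.0] kN after steps 1-3
print(schedule["Axis C"][-1])  # 1700.0 kN at the final step
```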
21 pages, 1118 KiB  
Review
Integrating Large Language Models into Robotic Autonomy: A Review of Motion, Voice, and Training Pipelines
by Yutong Liu, Qingquan Sun and Dhruvi Rajeshkumar Kapadia
AI 2025, 6(7), 158; https://doi.org/10.3390/ai6070158 - 15 Jul 2025
Abstract
This survey provides a comprehensive review of the integration of large language models (LLMs) into autonomous robotic systems, organized around four key pillars: locomotion, navigation, manipulation, and voice-based interaction. We examine how LLMs enhance robotic autonomy by translating high-level natural language commands into low-level control signals, supporting semantic planning and enabling adaptive execution. Systems like SayTap improve gait stability through LLM-generated contact patterns, while TrustNavGPT achieves a 5.7% word error rate (WER) under noisy voice-guided conditions by modeling user uncertainty. Frameworks such as MapGPT, LLM-Planner, and 3D-LOTUS++ integrate multi-modal data—including vision, speech, and proprioception—for robust planning and real-time recovery. We also highlight the use of physics-informed neural networks (PINNs) to model object deformation and support precision in contact-rich manipulation tasks. To bridge the gap between simulation and real-world deployment, we synthesize best practices from benchmark datasets (e.g., RH20T, Open X-Embodiment) and training pipelines designed for one-shot imitation learning and cross-embodiment generalization. Additionally, we analyze deployment trade-offs across cloud, edge, and hybrid architectures, emphasizing latency, scalability, and privacy. The survey concludes with a multi-dimensional taxonomy and cross-domain synthesis, offering design insights and future directions for building intelligent, human-aligned robotic systems powered by LLMs. Full article
20 pages, 1753 KiB  
Article
Hybrid Cloud-Based Information and Control System Using LSTM-DNN Neural Networks for Optimization of Metallurgical Production
by Kuldashbay Avazov, Jasur Sevinov, Barnokhon Temerbekova, Gulnora Bekimbetova, Ulugbek Mamanazarov, Akmalbek Abdusalomov and Young Im Cho
Processes 2025, 13(7), 2237; https://doi.org/10.3390/pr13072237 - 13 Jul 2025
Abstract
A methodology for detecting systematic errors in sets of equally accurate, uncorrelated, aggregate measurements is proposed and applied within the automatic real-time dispatch control system of a copper concentrator plant (CCP) to refine the technical and economic performance indicators (EPIs) computed by the system. This work addresses and solves the problem of selecting and obtaining reliable measurement data by exploiting the redundant measurements of process streams together with the balance equations linking those streams. This study formulates an approach for integrating cloud technologies, machine learning methods, and forecasting into information control systems (ICSs) via predictive analytics to optimize CCP production processes. A method for combining the hybrid cloud infrastructure with an LSTM-DNN neural network model has been developed, yielding a marked improvement in the EPIs of copper concentration operations. The forecasting accuracy for the key process parameters rose from 75% to 95%. Predictive control reduced energy consumption by 10% through more efficient resource use, while the copper losses to tailings fell by 15–20% thanks to optimized reagent dosing and the stabilization of the flotation process. Equipment failure prediction cut the amount of unplanned downtime by 30%. As a result, the control system became adaptive, automatically correcting the parameters in real time and lessening the reliance on operator decisions. The architectural model of an ICS for metallurgical production based on the hybrid cloud and the LSTM-DNN model was devised to enhance forecasting accuracy and optimize the EPIs of the CCP. The proposed model was experimentally evaluated against alternative neural network architectures (DNN, GRU, Transformer, and Hybrid_NN_TD_AIST). The results demonstrated the superiority of the LSTM-DNN in forecasting accuracy (92.4%), noise robustness (0.89), and a minimal root-mean-square error (RMSE = 0.079). The model shows a strong capability to handle multidimensional, non-stationary time series and to perform adaptive measurement correction in real time. Full article
(This article belongs to the Section AI-Enabled Process Engineering)
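
The paper's exact LSTM-DNN topology is not given in the abstract; a minimal PyTorch sketch of the general pattern, an LSTM encoder over a window of process measurements feeding a dense forecasting head, could look like this (layer sizes, window length, and channel count are assumptions):

```python
import torch
import torch.nn as nn

class LSTMDNNForecaster(nn.Module):
    """LSTM encoder over a multivariate process window, DNN head for the
    next-step forecast. Layer sizes are illustrative, not the paper's."""
    def __init__(self, n_features: int, hidden: int = 64, horizon: int = 1):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, num_layers=2, batch_first=True)
        self.head = nn.Sequential(
            nn.Linear(hidden, 32), nn.ReLU(),
            nn.Linear(32, horizon),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time, n_features); use the last hidden state as summary
        _, (h_n, _) = self.lstm(x)
        return self.head(h_n[-1])

# Example: forecast one step of a key process parameter from a 60-sample
# window of 8 sensor channels (shapes are assumptions).
model = LSTMDNNForecaster(n_features=8)
window = torch.randn(16, 60, 8)
print(model(window).shape)  # torch.Size([16, 1])
```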
26 pages, 1271 KiB  
Article
The Effects of Interventions Using Support Tools to Reduce Household Food Waste: A Study Using a Cloud-Based Automatic Weighing System
by Yasuko Seta, Hajime Yamakawa, Tomoko Okayama, Kohei Watanabe and Maki Nonomura
Sustainability 2025, 17(14), 6392; https://doi.org/10.3390/su17146392 - 12 Jul 2025
Abstract
Food waste is a global sustainability issue, and in Japan, approximately half of all food waste is generated in households. This study focused on refrigerator management behaviors aimed at using up the food inventory in the home. An intervention study was conducted with 119 households of two or more members across Japan, comprising a two-week baseline period and a two-week intervention period. Target behaviors were set as “search for food that should be eaten quickly,” “move it to a visible place,” and “use the foods that should be eaten quickly,” and tools to support these behaviors were selected, including an organizer for the refrigerator, photos, and food management apps. Each tool was assigned to approximately 30 households, and a control group was established. Food waste was measured using a cloud-based automatic weighing system, and all participants were asked to separate avoidable food waste at home and dispose of it in the designated waste bin. During the intervention period, the average weekly food waste per household decreased by 29% to 51% in the intervention group, while there was little change in the control group. An analysis using a two-way mixed ANOVA revealed a marginally significant interaction (p < 0.10), indicating moderate effectiveness. Among the behaviors contributing to reduced food waste, three actions (“having trouble recalling food inventory at home during shopping,” “moving foods that should be used sooner,” and “organizing the refrigerator”) showed significant interaction effects (p < 0.05) in a two-way mixed ANOVA, indicating the effectiveness of the intervention. Full article
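
For readers unfamiliar with the two-way mixed ANOVA used here, the sketch below shows how such a test is typically run in Python with the pingouin package, treating the measurement period as the within-subject factor and the tool group as the between-subject factor (the data frame and column names are hypothetical, not the study's):

```python
import pandas as pd
import pingouin as pg  # pip install pingouin

# Hypothetical long-format data: one row per household per period, with the
# weekly avoidable food waste (column names are assumptions, not the study's).
df = pd.DataFrame({
    "household": [1, 1, 2, 2, 3, 3, 4, 4, 5, 5, 6, 6],
    "period":    ["baseline", "intervention"] * 6,
    "group":     ["organizer"] * 6 + ["control"] * 6,
    "waste_g":   [820, 540, 760, 410, 690, 450, 700, 720, 640, 705, 710, 650],
})

# Two-way mixed ANOVA: 'period' is within-subject, 'group' between-subject.
aov = pg.mixed_anova(data=df, dv="waste_g", within="period",
                     subject="household", between="group")
print(aov[["Source", "F", "p-unc"]])  # the interaction row tests the effect
```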
21 pages, 5918 KiB  
Article
Development of a Real-Time Online Automatic Measurement System for Propeller Manufacturing Quality Control
by Yuan-Ming Cheng and Kuan-Yu Hsu
Appl. Sci. 2025, 15(14), 7750; https://doi.org/10.3390/app15147750 - 10 Jul 2025
Abstract
The quality of machined marine propellers plays a critical role in underwater propulsion performance. Precision casting is the predominant manufacturing technique; however, deformation of wax models and rough blanks during manufacturing frequently causes deviations in the dimensions of final products and thus affects propellers’ performance and service life. Current inspection methods primarily involve using coordinate measuring machines and sampling. This approach is time-consuming, has high labor costs, and cannot monitor manufacturing quality in real time. This study developed a real-time online automated measurement system containing a high-resolution CITIZEN displacement sensor, a four-degree-of-freedom measurement platform, and programmable logic controller-based motion control technology to enable rapid, automated measurement of blade deformation across the wax model, rough blank, and final product processing stages. The measurement data are transmitted in real time to a cloud database. Tests conducted on a standardized platform and real propeller blades confirmed that the system consistently achieved measurement accuracy to the second decimal place under the continuous measurement mode. The system also demonstrated excellent repeatability and stability. Furthermore, the continuous measurement mode outperformed the single-point measurement mode. Overall, the developed system effectively reduces labor requirements, shortens measurement times, and enables real-time monitoring of process variation. These capabilities underscore its strong potential for application in the smart manufacturing and quality control of marine propellers. Full article
32 pages, 2917 KiB  
Article
Self-Adapting CPU Scheduling for Mixed Database Workloads via Hierarchical Deep Reinforcement Learning
by Suchuan Xing, Yihan Wang and Wenhe Liu
Symmetry 2025, 17(7), 1109; https://doi.org/10.3390/sym17071109 - 10 Jul 2025
Abstract
Modern database systems require autonomous CPU scheduling frameworks that dynamically optimize resource allocation across heterogeneous workloads while maintaining strict performance guarantees. We present a novel hierarchical deep reinforcement learning framework augmented with graph neural networks to address CPU scheduling challenges in mixed database environments comprising Online Transaction Processing (OLTP), Online Analytical Processing (OLAP), vector processing, and background maintenance workloads. Our approach introduces three key innovations: first, a symmetric two-tier control architecture where a meta-controller allocates CPU budgets across workload categories using policy gradient methods while specialized sub-controllers optimize process-level resource allocation through continuous action spaces; second, graph neural network-based dependency modeling that captures complex inter-process relationships and communication patterns while preserving inherent symmetries in database architectures; and third, meta-learning integration with curiosity-driven exploration enabling rapid adaptation to previously unseen workload patterns without extensive retraining. The framework incorporates a multi-objective reward function balancing Service Level Objective (SLO) adherence, resource efficiency, symmetric fairness metrics, and system stability. Experimental evaluation through high-fidelity digital twin simulation and production deployment demonstrates substantial performance improvements: 43.5% reduction in p99 latency violations for OLTP workloads and 27.6% improvement in overall CPU utilization, with successful scaling to 10,000 concurrent processes maintaining sub-3% scheduling overhead. This work represents a significant advancement toward truly autonomous database resource management, establishing a foundation for next-generation self-optimizing database systems with implications extending to broader orchestration challenges in cloud-native architectures. Full article
(This article belongs to the Section Computer)
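
The abstract describes a two-tier control architecture; a heavily simplified skeleton of that idea, with a meta-controller turning per-category scores into CPU budget shares and sub-controllers splitting each budget across processes, is sketched below (the real system learns these policies with policy gradients and GNN-based dependency models, which this sketch omits; all names are illustrative):

```python
import math

def softmax(xs: list[float]) -> list[float]:
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

class MetaController:
    """Top tier: turn per-category preference scores (in the paper, the
    output of a trained policy) into CPU budget shares."""
    def allocate(self, scores: dict[str, float]) -> dict[str, float]:
        names = list(scores)
        shares = softmax([scores[n] for n in names])
        return dict(zip(names, shares))

class SubController:
    """Bottom tier: split a category's budget across its processes in
    proportion to demand (a stand-in for the continuous-action policy)."""
    def allocate(self, budget: float, demands: dict[int, float]) -> dict[int, float]:
        total = sum(demands.values()) or 1.0
        return {pid: budget * d / total for pid, d in demands.items()}

meta = MetaController()
budgets = meta.allocate({"OLTP": 2.0, "OLAP": 1.0, "vector": 0.5, "maintenance": -1.0})
oltp_shares = SubController().allocate(budgets["OLTP"], {101: 3.0, 102: 1.0})
print(budgets)      # OLTP receives the largest share
print(oltp_shares)  # process 101 gets 3x process 102's slice
```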
37 pages, 10760 KiB  
Article
AI-Based Vehicle State Estimation Using Multi-Sensor Perception and Real-World Data
by Julian Ruggaber, Daniel Pölzleitner and Jonathan Brembeck
Sensors 2025, 25(14), 4253; https://doi.org/10.3390/s25144253 - 8 Jul 2025
Abstract
With the rise of vehicle automation, accurate estimation of driving dynamics has become crucial for ensuring safe and efficient operation. Vehicle dynamics control systems rely on these estimates to provide necessary control variables for stabilizing vehicles in various scenarios. Traditional model-based methods use wheel-related measurements, such as steering angle or wheel speed, as inputs. However, under low-traction conditions, e.g., on icy surfaces, these measurements often fail to deliver trustworthy information about the vehicle states. In such critical situations, precise estimation is essential for effective system intervention. This work introduces an AI-based approach that leverages perception sensor data, specifically camera images and lidar point clouds. By using relative kinematic relationships, it bypasses the complexities of vehicle and tire dynamics and enables robust estimation across all scenarios. Optical and scene flow are extracted from the sensor data and processed by a recurrent neural network to infer vehicle states. The proposed method is vehicle-agnostic, allowing trained models to be deployed across different platforms without additional calibration. Experimental results based on real-world data demonstrate that the AI-based estimator presented in this work achieves accurate and robust results under various conditions. Particularly in low-friction scenarios, it significantly outperforms conventional model-based approaches. Full article
(This article belongs to the Section Vehicular Sensing)
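
The estimator's internals are not detailed in the abstract; a minimal PyTorch sketch of the general recipe, a recurrent network regressing vehicle states from per-frame optical/scene-flow feature vectors, might look like this (the feature dimensionality, the GRU choice, and the three output states are assumptions):

```python
import torch
import torch.nn as nn

class FlowStateEstimator(nn.Module):
    """GRU over per-frame optical/scene-flow feature vectors, regressing
    vehicle states (here: vx, vy, yaw rate). Dimensions are assumptions."""
    def __init__(self, flow_dim: int = 128, hidden: int = 96, n_states: int = 3):
        super().__init__()
        self.rnn = nn.GRU(flow_dim, hidden, batch_first=True)
        self.out = nn.Linear(hidden, n_states)

    def forward(self, flow_seq: torch.Tensor) -> torch.Tensor:
        # flow_seq: (batch, frames, flow_dim) -> one state estimate per frame
        h, _ = self.rnn(flow_seq)
        return self.out(h)

est = FlowStateEstimator()
states = est(torch.randn(4, 50, 128))  # 4 clips, 50 frames each
print(states.shape)                    # torch.Size([4, 50, 3])
```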
26 pages, 5672 KiB  
Review
Development Status and Trend of Mine Intelligent Mining Technology
by Zhuo Wang, Lin Bi, Jinbo Li, Zhaohao Wu and Ziyu Zhao
Mathematics 2025, 13(13), 2217; https://doi.org/10.3390/math13132217 - 7 Jul 2025
Abstract
Intelligent mining technology, as the core driving force for the digital transformation of the mining industry, integrates cyber-physical systems, artificial intelligence, and industrial internet technologies to establish a “cloud–edge–end” collaborative system. This paper systematically reviews the development trajectory of intelligent mining technology, which has progressed through four stages: stand-alone automation, integrated automation and informatization, initial digitalization and intelligence, and comprehensive intelligence. The current development status of “cloud–edge–end” technologies is then reviewed: (i) The end layer achieves environmental state monitoring and precise control through a multi-source sensing network and intelligent equipment. (ii) The edge layer leverages 5G and edge computing to accomplish real-time data processing, 3D dynamic modeling, and safety early warning. (iii) The cloud layer realizes digital planning and intelligent decision-making based on the industrial Internet platform. The three layers collaborate to form a “perception–analysis–decision–execution” closed loop. Many challenges remain in the development of the technology, including the lack of a standardization system, bottlenecks in multi-source heterogeneous data fusion, insufficient cross-process coordination of equipment, and a shortage of interdisciplinary talent. Accordingly, this paper discusses future development trends from four aspects, providing systematic solutions for safe, efficient, and sustainable mining operations. Technological evolution will accelerate the formation of an intelligent ecosystem characterized by “standard-driven, data-empowered, equipment-autonomous, and human–machine collaboration”. Full article
(This article belongs to the Special Issue Mathematical Modeling and Analysis in Mining Engineering)
22 pages, 2953 KiB  
Article
Risk Assessment Model for Railway Track Maintenance Operations Based on Combined Weights and Nonlinear FCE
by Rui Luan and Rengkui Liu
Appl. Sci. 2025, 15(13), 7614; https://doi.org/10.3390/app15137614 - 7 Jul 2025
Abstract
Current risk assessment in railway track maintenance operations faces several challenges: low spatiotemporal accuracy, limited adaptability to varied operational scenarios, and a tendency of linear fuzzy comprehensive evaluation (FCE) methods to underestimate high-risk factors. To address these, this study proposes a novel risk assessment model that integrates subjective–objective weighting techniques with a nonlinear FCE approach. By incorporating spatiotemporal information, the model enables precise localization of risk occurrence in individual maintenance operations. A comprehensive risk index system is constructed across four dimensions: human, equipment, environment, and management. A game-theoretic combined weighting method, integrating the G1 method and the entropy weight method, is employed to balance expert judgment with data-driven analysis. A cloud model is introduced to generate risk membership matrices, accounting for the fuzziness and randomness of risk data. The nonlinear FCE framework enhances the influence of high-risk factors. Risk levels are determined using the combined weights, membership matrices, and the maximum membership principle. A case study on the Lanzhou–Xinjiang Railway demonstrates that the proposed model achieves higher consistency with actual risk conditions than conventional methods, improving assessment accuracy and reliability. This model offers a practical and effective tool for risk prevention and control in railway maintenance operations. Full article
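
As a sketch of the combined weighting step, the code below implements the standard entropy weight method and the usual game-theoretic combination of a subjective (G1) weight vector with the objective entropy weights (the indicator data and G1 weights are hypothetical; the paper's exact formulation may differ):

```python
import numpy as np

def entropy_weights(X: np.ndarray) -> np.ndarray:
    """Objective weights via the entropy weight method.
    X: (n_samples, n_indicators), all entries positive."""
    P = X / X.sum(axis=0)                        # column-wise proportions
    E = -(P * np.log(P + 1e-12)).sum(axis=0) / np.log(len(X))
    d = 1.0 - E                                  # divergence of each indicator
    return d / d.sum()

def game_theory_combine(w_subj: np.ndarray, w_obj: np.ndarray) -> np.ndarray:
    """Combine subjective (G1) and objective (entropy) weights; coefficients
    solve the linear system of the standard game-theoretic weighting model."""
    W = np.vstack([w_subj, w_obj])               # 2 x n weight matrix
    A = W @ W.T                                  # Gram matrix
    b = np.diag(A)
    a = np.linalg.solve(A, b)
    a = np.abs(a) / np.abs(a).sum()              # normalize coefficients
    return a @ W

# Hypothetical: 5 samples of 4 risk indicators plus assumed G1 expert weights.
X = np.abs(np.random.default_rng(0).normal(1.0, 0.3, size=(5, 4)))
w_g1 = np.array([0.35, 0.30, 0.20, 0.15])
print(game_theory_combine(w_g1, entropy_weights(X)))
```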
46 pages, 1709 KiB  
Article
Federated Learning-Driven IoT Request Scheduling for Fault Tolerance in Cloud Data Centers
by Sheeja Rani S and Raafat Aburukba
Mathematics 2025, 13(13), 2198; https://doi.org/10.3390/math13132198 - 5 Jul 2025
Abstract
Cloud computing is a virtualized and distributed computing model that provides resources and services based on demand and self-service. Resource failure is one of the major challenges in cloud computing, and there is a need for fault tolerance mechanisms. This paper addresses the issue by proposing a multi-objective radial kernelized federated learning-based fault-tolerant scheduling (MRKFL-FTS) technique for allocating multiple IoT requests or user tasks to virtual machines in cloud IoT-based environments. The MRKFL-FTS technique includes Cloud RAN (C-RAN) and Virtual RAN (V-RAN). The proposed MRKFL-FTS technique comprises four entities, namely, IoT devices, cloud servers, task assigners, and virtual machines. Each IoT device generates several service requests and sends them to the cloud server. First, radial kernelized support vector regression is applied in the local training model to identify resource-efficient virtual machines. After that, locally trained models are combined, and the resulting model is fed into the global aggregation model. Finally, using a weighted round-robin method, the task assigner allocates incoming IoT service requests to virtual machines. This approach improves resource awareness and fault tolerance in scheduling. Quantitative analysis shows that the MRKFL-FTS technique achieved an 8% improvement in task scheduling efficiency and fault prediction accuracy, a 36% improvement in throughput, and a 14% reduction in makespan and time complexity. In addition, the MRKFL-FTS technique resulted in a 13% reduction in response time. It also reduces energy consumption by 17% and increases scalability by 8% compared to conventional scheduling techniques. Full article
(This article belongs to the Special Issue Advanced Information and Signal Processing: Models and Algorithms)
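
The final dispatch step is a weighted round-robin; a minimal sketch of that mechanism, with weights standing in for the regression-predicted resource efficiency of each virtual machine, is shown below (VM names and weights are illustrative):

```python
from itertools import cycle

def build_wrr_ring(vm_weights: dict[str, int]) -> cycle:
    """Expand each VM into the ring as many times as its weight, so VMs with
    a higher predicted efficiency score receive proportionally more requests."""
    ring = [vm for vm, w in sorted(vm_weights.items()) for _ in range(w)]
    return cycle(ring)

# Weights here stand in for the regression-predicted efficiency scores.
ring = build_wrr_ring({"vm-1": 3, "vm-2": 2, "vm-3": 1})

requests = [f"req-{i}" for i in range(8)]
assignment = {req: next(ring) for req in requests}
print(assignment)  # vm-1 holds 3 of the 6 ring slots, vm-3 only 1
```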
17 pages, 3285 KiB  
Article
CF-mMIMO-Based Computational Offloading for UAV Swarms: System Design and Experimental Results
by Jian Sun, Hongxin Lin, Wei Shi, Wei Xu and Dongming Wang
Electronics 2025, 14(13), 2708; https://doi.org/10.3390/electronics14132708 - 4 Jul 2025
Abstract
Swarm-based unmanned aerial vehicle (UAV) systems offer enhanced spatial coverage, collaborative intelligence, and mission scalability for various applications, including environmental monitoring and emergency response. However, their onboard processing is limited by stringent size, weight, and power constraints, posing challenges for real-time computation and autonomous control. This paper presents an integrated communication and computation framework that combines cloud–edge–end collaboration with cell-free massive multiple-input multiple-output (CF-mMIMO) to enable scalable and efficient task offloading in UAV swarms. Furthermore, we implement a prototype system testbed with nine UAVs and validate the proposed framework through real-time object detection tasks. Results demonstrate over 30% reduction in onboard computation and significant improvements in communication reliability, highlighting the framework’s potential for enabling intelligent, cooperative aerial systems. Full article
(This article belongs to the Section Circuit and Signal Processing)
27 pages, 4826 KiB  
Article
IoT-Driven Intelligent Curing of Face Slab Concrete in Rockfill Dams Based on Integrated Multi-Source Monitoring
by Yihong Zhou, Yuanyuan Fang, Zhipeng Liang, Dongfeng Li, Chunju Zhao, Huawei Zhou, Fang Wang, Lei Lei, Rui Wang, Dehang Kong, Tianbai Pei and Luyao Zhou
Buildings 2025, 15(13), 2344; https://doi.org/10.3390/buildings15132344 - 3 Jul 2025
Abstract
To better understand the temperature changes in face slab concrete and address challenges such as delayed curing and outdated methods in complex and variable environments, this study investigates the use of visualization and real-time feedback control in concrete construction. The study systematically develops an intelligent curing control system for face slab concrete based on multi-source measured data. A tailored multi-source data acquisition scheme was proposed, supported by an IoT-based transmission framework. Cloud-based data analysis and feedback control mechanisms were implemented, along with a decoupled front-end and back-end system platform. This platform integrates essential functions such as two-way communication with gateway devices, data processing and analysis, system visualization, and intelligent curing control. In conjunction with the ongoing Maerdang concrete face rockfill dam (CFRD) project, located in a high-altitude, cold-climate region, an intelligent curing system platform for face slab concrete was developed. The platform enables three core visualization functions: (1) monitoring the pouring progress of face slab concrete, (2) early warning and prediction of temperature exceedance, and (3) dynamic feedback and adjustment of curing measures. The research outcomes were successfully applied to the intelligent curing of the Maerdang face slab concrete, providing both theoretical insight and practical support for achieving scientific and precise curing control. Full article
(This article belongs to the Section Building Structures)
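
The abstract does not describe the exceedance-prediction logic; one simple reading, extrapolating a short linear trend over recent temperature readings and warning when it crosses a curing limit, is sketched below (the threshold, window, and horizon are placeholders, not values from the paper):

```python
import numpy as np

def exceedance_warning(temps_c: list[float], limit_c: float = 28.0,
                       lookahead_steps: int = 6) -> bool:
    """Fit a linear trend to the recent sensor window and warn if the
    extrapolated temperature crosses the curing limit. The 28 degC limit
    and the horizon are placeholders, not values from the paper."""
    t = np.arange(len(temps_c))
    slope, intercept = np.polyfit(t, temps_c, deg=1)
    predicted = intercept + slope * (len(temps_c) - 1 + lookahead_steps)
    return predicted > limit_c

window = [22.1, 22.6, 23.4, 24.0, 24.9, 25.5]  # hypothetical readings
print(exceedance_warning(window))  # True: trend crosses 28 degC in horizon
```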
15 pages, 3444 KiB  
Article
A LiDAR-Driven Approach for Crop Row Detection and Navigation Line Extraction in Soybean–Maize Intercropping Systems
by Mingxiong Ou, Rui Ye, Yunfei Wang, Yaoyao Gu, Ming Wang, Xiang Dong and Weidong Jia
Appl. Sci. 2025, 15(13), 7439; https://doi.org/10.3390/app15137439 - 2 Jul 2025
Abstract
Crop row identification and navigation line extraction are essential components for enabling autonomous operations of agricultural machinery. Aiming at the soybean–maize strip intercropping system, this study proposes a LiDAR-based algorithm for crop row detection and navigation line extraction. The proposed method consists of four primary stages: point cloud preprocessing, crop row region identification, feature point clustering, and navigation line extraction. Specifically, a combination of K-means and Euclidean clustering algorithms is employed to extract feature points representing crop rows. The central lines of the crop rows are then fitted using the least squares method, and a stable navigation path is constructed based on angle bisector principles. Field experiments were conducted under three representative scenarios: broken rows with missing plants, low occlusion, and high occlusion. The results demonstrate that the proposed method exhibits strong adaptability and robustness across various environments, achieving over 80% accuracy in navigation line extraction, with up to 90% in low-occlusion settings. The average navigation angle was controlled within 0.28°, with the minimum reaching 0.17°, and the average processing time remained below 75.62 ms. Moreover, lateral deviation tests confirmed the method’s high precision and consistency in path tracking, validating its feasibility and practicality for application in strip intercropping systems. Full article
(This article belongs to the Section Agricultural Science and Technology)
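
A compact sketch of the geometric core, least-squares fitting of the two row center lines followed by an angle-bisector navigation direction, is given below; it assumes the clustering stages have already separated the point cloud into left and right row points (all coordinates and noise levels are made up):

```python
import numpy as np

def fit_row_line(points: np.ndarray):
    """Least-squares fit x = a*y + b for one crop row (rows run along y).
    points: (n, 2) array of [x, y] from a clustered row region."""
    a, b = np.polyfit(points[:, 1], points[:, 0], deg=1)
    return a, b

def navigation_direction(a_left: float, a_right: float) -> np.ndarray:
    """Unit vector along the angle bisector of the two fitted row lines;
    the sum of the two unit direction vectors bisects the angle between them."""
    d_left = np.array([a_left, 1.0]) / np.hypot(a_left, 1.0)
    d_right = np.array([a_right, 1.0]) / np.hypot(a_right, 1.0)
    bisector = d_left + d_right
    return bisector / np.linalg.norm(bisector)

# Hypothetical clustered rows: slightly converging left and right rows.
y = np.linspace(0.0, 4.0, 25)
left = np.column_stack([-0.5 + 0.02 * y + 0.01 * np.random.randn(25), y])
right = np.column_stack([0.5 - 0.03 * y + 0.01 * np.random.randn(25), y])

a_l, _ = fit_row_line(left)
a_r, _ = fit_row_line(right)
print(navigation_direction(a_l, a_r))  # approx [0, 1]: drive straight ahead
```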