
Search Results (1,157)

Search Parameters:
Keywords = cloud adoption

21 pages, 1800 KiB  
Article
GAPSO: Cloud-Edge-End Collaborative Task Offloading Based on Genetic Particle Swarm Optimization
by Wu Wen, Yibin Huang, Zhong Xiao, Lizhuang Tan and Peiying Zhang
Symmetry 2025, 17(8), 1225; https://doi.org/10.3390/sym17081225 - 3 Aug 2025
Viewed by 89
Abstract
In the 6G era, the proliferation of smart devices has led to explosive growth in data volume. Traditional cloud computing can no longer meet the demand for efficient processing of large amounts of data. Edge computing can solve the energy loss caused by transmission delay and multi-level forwarding in cloud computing by processing data close to the data source. In this paper, we propose a cloud–edge–end collaborative task offloading strategy with task response time and execution energy consumption as the optimization targets under a limited resource environment. The tasks generated by smart devices can be processed by three kinds of computing nodes: user devices, edge servers, and cloud servers. The computing nodes are constrained by bandwidth and computing resources. For the target optimization problem, a genetic particle swarm optimization algorithm spanning all three layers of computing nodes is designed. Task offloading is optimized by introducing (1) an opposition-based learning algorithm, (2) adaptive inertia weights, and (3) adjustable acceleration coefficients. All metaheuristic algorithms adopt a symmetric training method to ensure fairness and consistency in evaluation. In experimental simulations, our algorithm reduces the objective function value by about 6–12% compared with classic evolutionary algorithms and converges faster, more accurately, and more stably.
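As a concrete illustration of the swarm-update mechanics described above, the sketch below implements PSO with a linearly decreasing inertia weight and time-varying acceleration coefficients, two of the three refinements the abstract lists. The schedules, bounds, and test objective are illustrative assumptions, not the authors' exact GAPSO formulation, which additionally layers in genetic operators and opposition-based learning.

```python
import numpy as np

def pso(objective, dim=10, n_particles=30, iters=200,
        w_max=0.9, w_min=0.4, c_init=2.5, c_final=0.5):
    # Illustrative bounds and schedules; not the paper's exact settings.
    pos = np.random.uniform(-10, 10, (n_particles, dim))
    vel = np.zeros((n_particles, dim))
    pbest = pos.copy()
    pbest_val = np.apply_along_axis(objective, 1, pos)
    gbest = pbest[pbest_val.argmin()].copy()
    for t in range(iters):
        w = w_max - (w_max - w_min) * t / iters        # adaptive inertia weight
        c1 = c_init - (c_init - c_final) * t / iters   # cognitive term shrinks
        c2 = c_final + (c_init - c_final) * t / iters  # social term grows
        r1, r2 = np.random.rand(2, n_particles, dim)
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = pos + vel
        vals = np.apply_along_axis(objective, 1, pos)
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest, pbest_val.min()

best_x, best_f = pso(lambda x: np.sum(x ** 2))  # sphere test function demo
print(f"best objective value: {best_f:.6f}")
```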

21 pages, 2077 KiB  
Article
Quantitative Risk Assessment of Liquefied Natural Gas Bunkering Hoses in Maritime Operations: A Case of Shenzhen Port
by Yimiao Gu, Yanmin Zeng and Hui Shan Loh
J. Mar. Sci. Eng. 2025, 13(8), 1494; https://doi.org/10.3390/jmse13081494 - 2 Aug 2025
Viewed by 215
Abstract
The widespread adoption of liquefied natural gas (LNG) as a marine fuel has driven the development of LNG bunkering operations in global ports. Major international hubs, such as Shenzhen Port, have implemented ship-to-ship (STS) bunkering practices. However, this process entails unique safety risks, particularly hazards associated with vapor cloud dispersion caused by bunkering hose releases. This study employs the Phast software developed by DNV to systematically simulate LNG release scenarios during STS operations, integrating real-world meteorological data and storage conditions. The dynamic effects of transfer flow rates, release heights, and release directions on vapor cloud dispersion are quantitatively analyzed under daytime and nighttime conditions. The results demonstrate that transfer flow rate significantly regulates dispersion range; it is recommended to limit the rate to below 1500 m³/h and to prioritize daytime operations to mitigate risks. Release heights exceeding 10 m significantly amplify dispersion effects, particularly at night (the nighttime dispersion area at a height of 20 m is 3.5 times larger than during the daytime). Optimizing release direction effectively suppresses dispersion, with vertically downward releases exhibiting minimal impact. Horizontal releases should avoid downwind alignment, and daytime operations should be prioritized to reduce lateral dispersion risks.
(This article belongs to the Section Ocean Engineering)

23 pages, 2029 KiB  
Systematic Review
Exploring the Role of Industry 4.0 Technologies in Smart City Evolution: A Literature-Based Study
by Nataliia Boichuk, Iwona Pisz, Anna Bruska, Sabina Kauf and Sabina Wyrwich-Płotka
Sustainability 2025, 17(15), 7024; https://doi.org/10.3390/su17157024 - 2 Aug 2025
Viewed by 206
Abstract
Smart cities are technologically advanced urban environments where interconnected systems and data-driven technologies enhance public service delivery and quality of life. These cities rely on information and communication technologies, the Internet of Things, big data, cloud computing, and other Industry 4.0 tools to support efficient city management and foster citizen engagement. Often referred to as digital cities, they integrate intelligent infrastructures and real-time data analytics to improve mobility, security, and sustainability. Ubiquitous sensors, paired with Artificial Intelligence, enable cities to monitor infrastructure, respond to residents' needs, and optimize urban conditions dynamically. Given the increasing significance of Industry 4.0 in urban development, this study adopts a bibliometric approach to systematically review the application of these technologies within smart cities. Utilizing major academic databases such as Scopus and Web of Science, the research aims to identify the primary Industry 4.0 technologies implemented in smart cities, assess their impact on infrastructure, economic systems, and urban communities, and explore the challenges and benefits associated with their integration. The bibliometric analysis covered publications from 2016 to 2023, the period since urban researchers' interest in the technologies of the new industrial revolution first emerged. The aim is to contribute to a deeper understanding of how smart cities evolve through the adoption of advanced technological frameworks. The research indicates that IoT and AI are the most commonly used tools in urban spaces, particularly in smart mobility and smart environments.

32 pages, 2702 KiB  
Article
Research on Safety Vulnerability Assessment of Subway Station Construction Based on Evolutionary Resilience Perspective
by Leian Zhang, Junwu Wang, Miaomiao Zhang and Jingyi Guo
Buildings 2025, 15(15), 2732; https://doi.org/10.3390/buildings15152732 - 2 Aug 2025
Viewed by 259
Abstract
With the continuous increase in urban population, the subway has become the main way to alleviate traffic congestion. However, the construction environment of subway stations is complex, and the safety risks are extremely high. It is therefore of great practical significance to evaluate the safety vulnerability of subway station construction scientifically and systematically. Taking the Chengdu subway project as an example, this paper establishes a subway station construction safety vulnerability evaluation index system based on driving forces–pressures–state–impacts–responses (DPSIR) theory, with 5 first-level and 23 second-level indexes. The fuzzy analytic hierarchy process (FAHP) is adopted to calculate subjective weights, an improved Harris Hawks optimization–projection pursuit method (HHO-PPM) determines objective weights, and game theory combines the two into comprehensive indicator weights. Finally, an improved cloud model with Bayesian feedback determines the vulnerability level of subway station construction safety. The study found that the combined weighting–improved cloud model assessment method is reliable. The case study verifies that the vulnerability level of the project is "very low risk" and that investigations of safety hazards and the pressure of surrounding traffic are the key influencing factors, allowing more scientific and effective management strategies to be proposed for subway station construction.
(This article belongs to the Section Construction Management, and Computers & Digitization)
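For the game-theoretic combination step mentioned above, a common formulation solves for coefficients that minimise the combined vector's deviation from both the subjective (FAHP) and objective (HHO-PPM) weightings. The numpy sketch below follows that standard formulation; the example weight vectors are hypothetical, and the paper's exact variant may differ.

```python
import numpy as np

def game_theory_weights(w_subj, w_obj):
    # Stack the two weightings and solve (W W^T) a = diag(W W^T) for the
    # combination coefficients, then normalise and blend.
    W = np.vstack([w_subj, w_obj])
    A = W @ W.T                      # Gram matrix of the two weight vectors
    b = np.diag(A)                   # right-hand side: each w_k . w_k
    a = np.linalg.solve(A, b)
    a = np.abs(a) / np.abs(a).sum()  # normalised combination coefficients
    return a @ W                     # comprehensive weight vector

w_fahp = np.array([0.30, 0.25, 0.20, 0.15, 0.10])  # hypothetical subjective weights
w_hho = np.array([0.22, 0.28, 0.18, 0.20, 0.12])   # hypothetical objective weights
print(game_theory_weights(w_fahp, w_hho))
```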

26 pages, 1033 KiB  
Article
Internet of Things Platform for Assessment and Research on Cybersecurity of Smart Rural Environments
by Daniel Sernández-Iglesias, Llanos Tobarra, Rafael Pastor-Vargas, Antonio Robles-Gómez, Pedro Vidal-Balboa and João Sarraipa
Future Internet 2025, 17(8), 351; https://doi.org/10.3390/fi17080351 - 1 Aug 2025
Viewed by 148
Abstract
Rural regions face significant barriers to adopting IoT technologies due to limited connectivity, energy constraints, and poor technical infrastructure. While urban environments benefit from advanced digital systems and cloud services, rural areas often lack the conditions necessary to deploy and evaluate secure and autonomous IoT solutions. To help overcome this gap, this paper presents the Smart Rural IoT Lab, a modular and reproducible testbed designed to replicate deployment conditions in rural areas using open-source tools and affordable hardware. The laboratory integrates long-range and short-range communication technologies in six experimental scenarios, implementing protocols such as MQTT, HTTP, UDP, and CoAP. These scenarios simulate realistic rural use cases, including environmental monitoring, livestock tracking, infrastructure access control, and heritage site protection. Local data processing is achieved through containerized services such as Node-RED, InfluxDB, MongoDB, and Grafana, ensuring complete autonomy without dependence on cloud services. A key contribution of the laboratory is the generation of structured datasets from real network traffic captured with Tcpdump and preprocessed using Zeek. Unlike simulated datasets, the collected data reflect communication patterns generated by real devices. Although the current dataset includes only benign traffic, the platform is prepared for the future incorporation of adversarial scenarios (spoofing, DoS) to support AI-based cybersecurity research. While the experiments were conducted in a controlled indoor environment, the testbed architecture is portable and suitable for future outdoor deployment. The Smart Rural IoT Lab addresses a critical gap in current research infrastructure, providing a realistic and flexible foundation for developing secure, cloud-independent IoT solutions and contributing to the digital transformation of rural regions.
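To make the scenario concrete, here is a minimal sketch of the kind of MQTT telemetry publisher the testbed's environmental-monitoring scenario implies, written against the widely used paho-mqtt client. The broker address, topic layout, and payload fields are assumptions, not details from the paper.

```python
import json
import time
import paho.mqtt.client as mqtt

BROKER = "192.168.1.10"    # hypothetical on-premises broker (no cloud dependency)
TOPIC = "farm/field1/env"  # hypothetical topic for the environmental scenario

client = mqtt.Client()     # paho-mqtt 1.x style constructor
client.connect(BROKER, 1883, keepalive=60)
client.loop_start()

# One environmental reading, published with QoS 1 so delivery is acknowledged.
reading = {"ts": int(time.time()), "temperature_c": 21.4, "humidity_pct": 63.0}
info = client.publish(TOPIC, json.dumps(reading), qos=1)
info.wait_for_publish()    # block until the QoS 1 handshake completes

client.loop_stop()
client.disconnect()
```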

29 pages, 540 KiB  
Systematic Review
Digital Transformation in International Trade: Opportunities, Challenges, and Policy Implications
by Sina Mirzaye and Muhammad Mohiuddin
J. Risk Financial Manag. 2025, 18(8), 421; https://doi.org/10.3390/jrfm18080421 - 1 Aug 2025
Viewed by 370
Abstract
This study synthesizes the rapidly expanding evidence on how digital technologies reshape international trade, with a particular focus on small and medium-sized enterprises (SMEs). Guided by two research questions—(RQ1) How do digital tools influence the volume and composition of cross-border trade? and (RQ2) How do these effects vary by countries' development level and firm size?—we conducted a PRISMA-compliant systematic literature review covering 2010–2024. Searches across eight major databases yielded 1857 records; after duplicate removal, title/abstract screening, full-text assessment, and Mixed Methods Appraisal Tool (MMAT 2018) quality checks, 86 peer-reviewed English-language studies were retained. Findings reveal three dominant technology clusters: (1) e-commerce platforms and cloud services, (2) IoT-enabled supply chain solutions, and (3) emerging AI analytics. E-commerce and cloud adoption consistently raise export intensity—doubling it for digitally mature SMEs—while AI applications are the fastest-growing research strand, particularly in East Asia and Northern Europe. However, benefits are uneven: firms in low-infrastructure settings face higher fixed digital costs, and cybersecurity and regulatory fragmentation remain pervasive obstacles. By integrating trade economics with development and SME internationalization studies, this review offers the first holistic framework that links national digital infrastructure and policy support to firm-level export performance. It shows that the trade-enhancing effects of digitalization are contingent on robust broadband penetration, affordable cloud access, and harmonized data-governance regimes. Policymakers should, therefore, prioritize inclusive digital-readiness programs, while business leaders should invest in complementary capabilities—data analytics, cyber-risk management, and cross-border e-logistics—to fully capture digital trade gains. This balanced perspective advances theory and practice on building resilient, equitable digital trade ecosystems.
(This article belongs to the Special Issue Modern Enterprises/E-Commerce Logistics and Supply Chain Management)

16 pages, 5301 KiB  
Article
TSINet: A Semantic and Instance Segmentation Network for 3D Tomato Plant Point Clouds
by Shanshan Ma, Xu Lu and Liang Zhang
Appl. Sci. 2025, 15(15), 8406; https://doi.org/10.3390/app15158406 - 29 Jul 2025
Viewed by 142
Abstract
Accurate organ-level segmentation is essential for achieving high-throughput, non-destructive, and automated plant phenotyping. To address the challenge of intelligent acquisition of phenotypic parameters in tomato plants, we propose TSINet, an end-to-end dual-task segmentation network designed for effective and precise semantic labeling and instance recognition of tomato point clouds, based on the Pheno4D dataset. TSINet adopts an encoder–decoder architecture, where a shared encoder incorporates four Geometry-Aware Adaptive Feature Extraction Blocks (GAFEBs) to effectively capture local structures and geometric relationships in raw point clouds. Two parallel decoder branches are employed to independently decode shared high-level features for the respective segmentation tasks. Additionally, a Dual Attention-Based Feature Enhancement Module (DAFEM) is introduced to further enrich feature representations. The experimental results demonstrate that TSINet achieves superior performance in both semantic and instance segmentation, particularly excelling in challenging categories such as stems and large-scale instances. Specifically, TSINet achieves 97.00% mean precision, 96.17% recall, 96.57% F1-score, and 93.43% IoU in semantic segmentation and 81.54% mPrec, 81.69% mRec, 81.60% mCov, and 86.40% mWCov in instance segmentation. Compared with state-of-the-art methods, TSINet achieves balanced improvements across all metrics, significantly reducing false positives and false negatives while enhancing spatial completeness and segmentation accuracy. Furthermore, we conducted ablation studies and generalization tests to systematically validate the effectiveness of each TSINet component and the overall robustness of the model. This study provides an effective technological approach for high-throughput automated phenotyping of tomato plants, contributing to the advancement of intelligent agricultural management.
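For reference, the per-class IoU and mean-IoU figures reported above are conventionally computed as in the sketch below; the label arrays here are stand-ins, not Pheno4D data.

```python
import numpy as np

def mean_iou(y_true, y_pred, n_classes):
    # Per-class intersection-over-union, averaged over classes present in
    # either the prediction or the ground truth.
    ious = []
    for c in range(n_classes):
        inter = np.sum((y_true == c) & (y_pred == c))
        union = np.sum((y_true == c) | (y_pred == c))
        if union > 0:
            ious.append(inter / union)
    return np.mean(ious)

y_true = np.random.randint(0, 3, 10000)  # e.g. 0=soil, 1=stem, 2=leaf
y_pred = y_true.copy()
y_pred[:500] = 0                          # inject some misclassified points
print(f"mIoU: {mean_iou(y_true, y_pred, 3):.4f}")
```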

19 pages, 650 KiB  
Article
LEMAD: LLM-Empowered Multi-Agent System for Anomaly Detection in Power Grid Services
by Xin Ji, Le Zhang, Wenya Zhang, Fang Peng, Yifan Mao, Xingchuang Liao and Kui Zhang
Electronics 2025, 14(15), 3008; https://doi.org/10.3390/electronics14153008 - 28 Jul 2025
Viewed by 364
Abstract
With the accelerated digital transformation of the power industry, critical infrastructures such as power grids are increasingly migrating to cloud-native architectures, leading to unprecedented growth in service scale and complexity. Traditional operation and maintenance (O&M) methods struggle to meet the demands for real-time monitoring, accuracy, and scalability in such environments. This paper proposes a novel service performance anomaly detection system based on large language models (LLMs) and multi-agent systems (MAS). By integrating the semantic understanding capabilities of LLMs with the distributed collaboration advantages of MAS, we construct a high-precision and robust anomaly detection framework. The system adopts a hierarchical architecture: lower-layer agents handle tasks such as log parsing and metric monitoring, while an upper-layer coordinating agent performs multimodal feature fusion and global anomaly decision-making. The LLM additionally enhances semantic analysis and causal reasoning over logs. Experiments on real-world data from the State Grid Corporation of China (SGCC), covering 1289 service combinations, demonstrate that the proposed system significantly outperforms traditional methods in F1-score across four platforms, including customer services and grid resources, with up to a 10.3% improvement over five baseline methods and a maximum F1-score of 88.78% (precision 92.16%, recall 85.63%). Notably, the system excels in composite anomaly detection and root cause analysis. This study provides an industrial-grade, scalable, and interpretable solution for intelligent power grid O&M, offering a valuable reference for the practical implementation of AIOps in critical infrastructures.
(This article belongs to the Special Issue Advanced Techniques for Multi-Agent Systems)
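As a quick consistency check, the reported maximum F1-score follows from the stated precision and recall via the harmonic mean:

```python
# With the stated precision and recall, the harmonic mean reproduces the
# reported maximum F1-score.
precision, recall = 0.9216, 0.8563
f1 = 2 * precision * recall / (precision + recall)
print(f"F1 = {f1:.4f}")  # ~0.8878, matching the reported 88.78%
```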

25 pages, 3791 KiB  
Article
Optimizing Multitenancy: Adaptive Resource Allocation in Serverless Cloud Environments Using Reinforcement Learning
by Mohammed Naif Alatawi
Electronics 2025, 14(15), 3004; https://doi.org/10.3390/electronics14153004 - 28 Jul 2025
Viewed by 153
Abstract
The growing adoption of serverless computing has highlighted critical challenges in resource allocation, policy fairness, and energy efficiency within multitenancy cloud environments. This research proposes a reinforcement learning (RL)-based adaptive resource allocation framework to address these issues. The framework models resource allocation as a Markov Decision Process (MDP) with dynamic states that include latency, resource utilization, and energy consumption. A reward function is designed to optimize the throughput, latency, and energy efficiency while ensuring fairness among tenants. The proposed model demonstrates significant improvements over heuristic approaches, achieving a 50% reduction in latency (from 250 ms to 120 ms), a 38.9% increase in throughput (from 180 tasks/s to 250 tasks/s), and a 35% improvement in energy efficiency. Additionally, the model reduces operational costs by 40%, achieves SLA compliance rates above 98%, and enhances fairness by lowering the Gini coefficient from 0.25 to 0.10. Under burst loads, the system maintains a service level objective success rate of 94% with a time to scale of 6 s. These results underscore the potential of RL-based solutions for dynamic workload management, paving the way for more scalable, cost-effective, and sustainable serverless multitenancy systems.
(This article belongs to the Special Issue New Advances in Cloud Computing and Its Latest Applications)
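The fairness result can be made concrete with the standard Gini computation over per-tenant allocations; the sketch below uses made-up allocation vectors, not the paper's experimental data.

```python
import numpy as np

def gini(x):
    # Standard formula over sorted values:
    # G = 2 * sum(i * x_i) / (n * sum(x)) - (n + 1) / n, with i = 1..n
    x = np.sort(np.asarray(x, dtype=float))
    n = x.size
    return 2 * np.sum(np.arange(1, n + 1) * x) / (n * x.sum()) - (n + 1) / n

before = [10, 40, 15, 60, 25]  # uneven heuristic allocation (illustrative)
after = [28, 32, 29, 31, 30]   # more even RL-driven allocation (illustrative)
print(f"Gini before: {gini(before):.2f}, after: {gini(after):.2f}")
```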

15 pages, 753 KiB  
Article
A Novel Cloud Energy Consumption Heuristic Based on a Network Slicing–Ring Fencing Ratio
by Vinay Sriram Iyer, Yasantha Samarawickrama and Giovani Estrada
Network 2025, 5(3), 27; https://doi.org/10.3390/network5030027 - 25 Jul 2025
Viewed by 202
Abstract
The widespread adoption of cloud computing has amplified the demand for electric power, and addressing the limited availability of reliable, sustainable power is strategically important. Research and investment in data centres and power infrastructure are therefore critically important for our digital economy. A novel heuristic for minimising energy consumption in cloud computing is presented. It draws on the concept of "network slices", in which an orchestrator multiplexes workloads to reduce the network "churn" often associated with significant energy losses. The novel network slicing–ring fencing ratio is a heuristic calculated through an iterative procedure to reduce cloud energy consumption. Simulation results show how the non-convex formulation optimises power, reducing energy from 10,680 kJ to 912 kJ, a 91.46% efficiency gain. In comparison, the Heuristic AUGMENT Non-Convex algorithm (HA-NC, by Hossain and Ansari) showed a 312.74% increase in energy consumption, from 2464 kJ to 10,168 kJ, while the Priority Selection Offloading algorithm (PSO, by Anajemba et al.) showed a 150% increase, from 10,738 kJ to 26,845 kJ. The proposed network slicing–ring fencing ratio successfully balances energy consumption and computing performance. We therefore think the novel approach could be of interest to network architects and cloud operators.
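The headline figures are straightforward percent changes on the quoted kJ values, as this quick check shows:

```python
def pct_change(old, new):
    return (new - old) / old * 100

print(f"Proposed ratio: {pct_change(10680, 912):.2f}%")   # ~-91.46% (a 91.46% reduction)
print(f"HA-NC:          {pct_change(2464, 10168):.2f}%")  # ~+312.7% increase
print(f"PSO:            {pct_change(10738, 26845):.2f}%") # +150.0% increase
```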

32 pages, 5164 KiB  
Article
Decentralized Distributed Sequential Neural Networks Inference on Low-Power Microcontrollers in Wireless Sensor Networks: A Predictive Maintenance Case Study
by Yernazar Bolat, Iain Murray, Yifei Ren and Nasim Ferdosian
Sensors 2025, 25(15), 4595; https://doi.org/10.3390/s25154595 - 24 Jul 2025
Viewed by 370
Abstract
The growing adoption of IoT applications has led to increased use of low-power microcontroller units (MCUs) for energy-efficient, local data processing. However, deploying deep neural networks (DNNs) on these constrained devices is challenging due to limitations in memory, computational power, and energy. Traditional methods like cloud-based inference and model compression often incur bandwidth, privacy, and accuracy trade-offs. This paper introduces a novel Decentralized Distributed Sequential Neural Network (DDSNN) designed for low-power MCUs in Tiny Machine Learning (TinyML) applications. Unlike existing methods that rely on centralized cluster-based approaches, DDSNN partitions a pre-trained LeNet across multiple MCUs, enabling fully decentralized inference in wireless sensor networks (WSNs). We validate DDSNN in a real-world predictive maintenance scenario in which vibration data from an industrial pump is analyzed in real time. The experimental results demonstrate that DDSNN achieves 99.01% accuracy, matching the accuracy of the non-distributed baseline model while reducing inference latency by approximately 50%, a significant improvement over traditional, non-distributed approaches that demonstrates its practical feasibility under realistic operating conditions.
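The core idea, splitting one sequential network into contiguous segments that run on different nodes, can be sketched in a few lines of PyTorch. The LeNet-style layer stack and the two-way split point are assumptions for illustration; on real MCUs the segments would be quantised and the intermediate tensor would travel over the radio link.

```python
import torch
import torch.nn as nn

# A LeNet-style stack standing in for the paper's pre-trained model.
lenet = nn.Sequential(
    nn.Conv2d(1, 6, 5), nn.ReLU(), nn.MaxPool2d(2),
    nn.Conv2d(6, 16, 5), nn.ReLU(), nn.MaxPool2d(2),
    nn.Flatten(), nn.Linear(16 * 4 * 4, 120), nn.ReLU(),
    nn.Linear(120, 84), nn.ReLU(), nn.Linear(84, 10),
)

split = 6                      # hypothetical cut point between the two nodes
node_a = lenet[:split]         # segment running on the first sensor node
node_b = lenet[split:]         # segment running on the downstream node

x = torch.randn(1, 1, 28, 28)  # stand-in for a preprocessed vibration frame
intermediate = node_a(x)       # this tensor would be transmitted over the WSN
out = node_b(intermediate)
# Partitioned inference is numerically identical to the monolithic model.
assert torch.allclose(out, lenet(x))
```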

27 pages, 4152 KiB  
Article
Recent Advances in the EAGLE Concept—Monitoring the Earth’s Surface Based on a New Land Characterisation Approach
by Stephan Arnold, Geoffrey Smith, Geir-Harald Strand, Gerard Hazeu, Michael Bock, Barbara Kosztra, Christoph Perger, Gebhard Banko, Tomas Soukup, Nuria Valcarcel Sanz, Stefan Kleeschulte, Julián Delgado Hernández and Emanuele Mancosu
Land 2025, 14(8), 1525; https://doi.org/10.3390/land14081525 - 24 Jul 2025
Viewed by 280
Abstract
The demand for land monitoring information continues to increase, but the range and diversity of the available products to date have made their integrated use challenging and, at times, counterproductive. There has therefore been a growing need to enhance and harmonise the practice of land monitoring on a pan-European level with the formulation of a more consistent and standardised set of modelling criteria. The outcome has been a paradigm shift away from a "paper map"-based world, where features are given a single, fixed label, to one where features have a rich characterisation that is more informative, flexible, and powerful. The approach allows the characteristics to be dynamic, so that over time a feature may change only part of its description (e.g., a forest can be felled but may remain forestry if replanted), or it can have multiple descriptors (e.g., a forest may be used for both timber production and recreation). The concept proposed by the authors has evolved since 2008 from first drafts to a comprehensive and powerful tool adopted by the European Union's Copernicus programme. It provides for the semantic decomposition of existing nomenclatures and supports a descriptive approach to the mapping of all landscape features in a flexible and object-oriented manner. In this way, the key move away from classification towards the characterisation of the Earth's surface represents a novel and innovative approach to handling complex land surface information, better suited to the age of distributed databases, cloud computing, and object-oriented data modelling. In this paper, the motivation for and technical approach of the EAGLE concept, with its matrix and UML model implementation, are explained. This is followed by an update on the latest developments and the presentation of a number of experimental and operational use cases at national and European levels, concluding with thoughts on the future outlook.
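The shift from a single fixed label to a rich, multi-part characterisation can be illustrated with a toy data structure; the field names below are illustrative and not the EAGLE matrix's actual terminology.

```python
from dataclasses import dataclass, field

@dataclass
class LandscapeFeature:
    # Separable descriptors that can change independently over time,
    # instead of one fixed class label.
    cover_components: dict = field(default_factory=dict)  # e.g. {"trees": 0.8}
    land_uses: list = field(default_factory=list)          # may hold several uses
    characteristics: dict = field(default_factory=dict)

parcel = LandscapeFeature(
    cover_components={"woody vegetation": 0.85, "bare soil": 0.15},
    land_uses=["timber production", "recreation"],  # multiple descriptors at once
    characteristics={"leaf type": "coniferous"},
)
# After felling, only part of the description changes; the forestry use remains.
parcel.cover_components = {"bare soil": 0.9, "woody vegetation": 0.1}
```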

31 pages, 528 KiB  
Article
An Exploratory Factor Analysis Approach on Challenging Factors for Government Cloud Service Adoption Intention
by Ndukwe Ukeje, Jairo A. Gutierrez, Krassie Petrova and Ugochukwu Chinonso Okolie
Future Internet 2025, 17(8), 326; https://doi.org/10.3390/fi17080326 - 23 Jul 2025
Viewed by 308
Abstract
This study explores the challenges hindering government adoption of cloud computing despite its benefits in improving services, reducing costs, and enhancing collaboration. Key barriers include information security, privacy, compliance, and perceived risks. Using the Unified Theory of Acceptance and Use of Technology (UTAUT) model, the study conceptualises a model incorporating privacy, governance framework, performance expectancy, and information security as independent variables, with perceived risk as a moderator and government intention as the dependent variable. The study employs exploratory factor analysis (EFA) based on survey data from 71 participants in Nigerian government organisations to validate the measurement scale for these factors. The analysis evaluates variable validity, factor relationships, and measurement reliability. Cronbach's alpha values range from 0.807 to 0.950, confirming high reliability. Measurement items with a common variance above 0.40 were retained, together explaining 70.079% of the total variance, which demonstrates reliability and accuracy in evaluating the challenging factors. These findings establish a validated scale for assessing government cloud adoption challenges and highlight complex relationships among the influencing factors. The study provides a reliable measurement scale and model for future research and for policymakers considering government adoption of cloud services.
(This article belongs to the Special Issue Privacy and Security in Computing Continuum and Data-Driven Workflows)
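For reference, the reliability statistic cited above is Cronbach's alpha, computed per construct as in this sketch; the response matrix is random stand-in data, not the 71-participant survey.

```python
import numpy as np

def cronbach_alpha(items):
    # alpha = k/(k-1) * (1 - sum of item variances / variance of total score)
    items = np.asarray(items, dtype=float)   # respondents x items
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

rng = np.random.default_rng(0)
base = rng.integers(1, 6, size=(71, 1))      # shared latent tendency per respondent
responses = np.clip(base + rng.integers(-1, 2, size=(71, 5)), 1, 5)
print(f"alpha = {cronbach_alpha(responses):.3f}")
```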

25 pages, 6462 KiB  
Article
Phenotypic Trait Acquisition Method for Tomato Plants Based on RGB-D SLAM
by Penggang Wang, Yuejun He, Jiguang Zhang, Jiandong Liu, Ran Chen and Xiang Zhuang
Agriculture 2025, 15(15), 1574; https://doi.org/10.3390/agriculture15151574 - 22 Jul 2025
Viewed by 202
Abstract
The acquisition of plant phenotypic traits is essential for selecting superior varieties, improving crop yield, and supporting precision agriculture and agricultural decision-making. It therefore plays a significant role in modern agriculture and plant science research. Traditional manual measurement of phenotypic traits is labor-intensive and inefficient. In contrast, combining 3D reconstruction technologies with autonomous vehicles enables more intuitive and efficient trait acquisition. This study proposes a 3D semantic reconstruction system based on an improved ORB-SLAM3 framework, mounted on an unmanned vehicle to acquire phenotypic traits in tomato cultivation scenarios. The vehicle also uses the A* algorithm for autonomous navigation. To enhance the semantic representation of the point cloud map, we integrate the BiSeNetV2 network into the ORB-SLAM3 system as a semantic segmentation module. Furthermore, a two-stage filtering strategy is employed to remove outliers and improve map accuracy, and OctoMap is adopted to store the point cloud data, significantly reducing memory consumption. A sphere-fitting method is applied to estimate the number of tomato fruits. The experimental results demonstrate that BiSeNetV2 achieves a mean intersection over union (mIoU) of 95.37% and a frame rate of 61.98 FPS on the tomato dataset, enabling real-time segmentation. The use of OctoMap reduces memory consumption by an average of 96.70%. The relative errors when predicting plant height, canopy width, and volume are 3.86%, 14.34%, and 27.14%, respectively, while the errors for fruit count and fruit volume are 14.36% and 14.25%. Localization experiments on a field dataset show that the proposed system achieves a mean absolute trajectory error (mATE) of 0.16 m and a root mean square error (RMSE) of 0.21 m, indicating high localization accuracy. The proposed system can therefore accurately acquire the phenotypic traits of tomato plants, providing data support for precision agriculture and agricultural decision-making.
(This article belongs to the Section Artificial Intelligence and Digital Agriculture)
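The sphere-fitting primitive behind the fruit-count and fruit-volume estimates can be sketched as a linear least-squares problem; the synthetic point cluster below stands in for a segmented fruit cluster from the real pipeline.

```python
import numpy as np

def fit_sphere(points):
    # |p|^2 = 2 c . p + (r^2 - |c|^2): linear in the center c and d = r^2 - |c|^2.
    A = np.hstack([2 * points, np.ones((len(points), 1))])
    b = (points ** 2).sum(axis=1)
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    center, d = sol[:3], sol[3]
    radius = np.sqrt(d + center @ center)
    return center, radius

rng = np.random.default_rng(1)
dirs = rng.normal(size=(500, 3))
dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
# Noisy points on a 3 cm-radius sphere around a hypothetical fruit center.
cloud = np.array([0.4, 0.1, 1.2]) + 0.03 * dirs + rng.normal(0, 0.002, (500, 3))
center, radius = fit_sphere(cloud)
volume = 4 / 3 * np.pi * radius ** 3   # per-fruit volume estimate
print(center, radius, volume)
```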

32 pages, 2529 KiB  
Article
Cloud Adoption in the Digital Era: An Interpretable Machine Learning Analysis of National Readiness and Structural Disparities Across the EU
by Cristiana Tudor, Margareta Florescu, Persefoni Polychronidou, Pavlos Stamatiou, Vasileios Vlachos and Konstadina Kasabali
Appl. Sci. 2025, 15(14), 8019; https://doi.org/10.3390/app15148019 - 18 Jul 2025
Viewed by 289
Abstract
As digital transformation accelerates across Europe, cloud computing plays an increasingly central role in modernizing public services and private enterprises. Yet adoption rates vary markedly among EU member states, reflecting deeper structural differences in digital capacity. This study employs explainable machine learning to uncover the drivers of national cloud adoption across 27 EU countries using harmonized panel datasets spanning 2014–2021 and 2014–2024. A methodological pipeline combining Random Forests (RF), XGBoost, Support Vector Machines (SVM), and Elastic Net regression is implemented, with model tuning conducted via nested cross-validation. Among individual models, Elastic Net and SVM delivered superior predictive performance, while a stacked ensemble achieved the best overall accuracy (MAE = 0.214, R² = 0.948). The most interpretable model, a standardized RF with country fixed effects, attained MAE = 0.321 and R² = 0.864, making it well-suited for policy analysis. Variable importance analysis reveals that the density of ICT specialists is the strongest predictor of adoption, followed by broadband access and higher education. Fixed-effect modeling confirms significant national heterogeneity, with countries like Finland and Luxembourg consistently leading adoption, while Bulgaria and Romania exhibit structural barriers. Partial dependence and SHAP analyses reveal nonlinear complementarities between digital skills and infrastructure. A hierarchical clustering of countries reveals three distinct digital maturity profiles, offering tailored policy pathways. These results directly support the EU Digital Decade's strategic targets and provide actionable insights for advancing inclusive and resilient digital transformation across the Union.
(This article belongs to the Special Issue Advanced Technologies Applied in Digital Media Era)
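The stacked-ensemble setup described above maps naturally onto scikit-learn's StackingRegressor, as in this hedged sketch; XGBoost is omitted to keep dependencies minimal, hyperparameters are defaults rather than the paper's nested-CV-tuned values, and the feature matrix is assumed to hold the harmonized panel indicators.

```python
from sklearn.ensemble import RandomForestRegressor, StackingRegressor
from sklearn.linear_model import ElasticNet
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

# Base learners mirror three of the four model families the study combines;
# scale-sensitive models get a standardization step.
stack = StackingRegressor(
    estimators=[
        ("rf", RandomForestRegressor(n_estimators=300, random_state=0)),
        ("svm", make_pipeline(StandardScaler(), SVR())),
        ("enet", make_pipeline(StandardScaler(), ElasticNet())),
    ],
    final_estimator=ElasticNet(),
    cv=5,  # out-of-fold base predictions feed the meta-learner
)
# stack.fit(X_train, y_train); stack.predict(X_test)  # with the panel features
```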
