Article

Privacy Protection Optimization Method for Cloud Platforms Based on Federated Learning and Homomorphic Encryption

School of Computer Science and Engineering, Southeast University, Nanjing 211189, China
* Author to whom correspondence should be addressed.
Sensors 2026, 26(3), 890; https://doi.org/10.3390/s26030890
Submission received: 25 November 2025 / Revised: 31 December 2025 / Accepted: 15 January 2026 / Published: 29 January 2026
(This article belongs to the Section Sensor Networks)

Abstract

With the wide adoption of cloud computing in multi-tenant, heterogeneous-node, and high-concurrency environments, model parameters are exchanged frequently during distributed training, which easily leads to privacy leakage, communication redundancy, and reduced aggregation efficiency. To jointly optimize privacy protection and computing performance, this study proposes the Heterogeneous Federated Homomorphic Encryption Cloud (HFHE-Cloud) model, which integrates federated learning (FL) with homomorphic encryption to construct a secure and efficient collaborative learning framework for cloud platforms. Without exposing the original data, the model mitigates the performance bottlenecks caused by encryption computation and communication delay through hierarchical key mapping and a dynamic scheduling mechanism for heterogeneous nodes. Experimental results show that HFHE-Cloud significantly outperforms the five baseline models Federated Averaging (FedAvg), Federated Proximal (FedProx), Federated Personalization (FedPer), Federated Normalized Averaging (FedNova), and Homomorphically Encrypted Federated Averaging (HE-FedAvg) in overall performance. In terms of privacy protection, global accuracy reaches up to 94.25% and the loss remains stable within 0.09. In terms of computing performance, encryption and decryption time is shortened by about one third, and the encryption overhead is kept at 13%. In terms of distributed training efficiency, the number of communication rounds is reduced by about one fifth, and the node participation rate remains above 90%. These results verify the model's ability to achieve high security and high scalability in multi-tenant environments. This study aims to provide cloud service providers and enterprise data holders with a deployable solution for strong privacy protection and efficient collaborative training on real cloud platforms.
Keywords: cloud computing; federated learning; homomorphic encryption; privacy protection; distributed training
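The core idea the abstract describes, aggregating clients' model updates under additively homomorphic encryption so the server never sees plaintext parameters, can be sketched with a toy Paillier cryptosystem. This is a minimal illustration only, not the paper's actual HFHE-Cloud scheme: it uses tiny fixed primes and assumes client updates are quantized to small integers, whereas a real deployment would use a vetted library with moduli of 2048 bits or more.

```python
from math import gcd
import random

# Toy Paillier cryptosystem (additively homomorphic) -- for illustration only.
# Hypothetical parameters: small demo primes, integer-quantized updates.
P, Q = 293, 433
N = P * Q                                      # public modulus n
N2 = N * N
LAM = (P - 1) * (Q - 1) // gcd(P - 1, Q - 1)   # lambda = lcm(p-1, q-1)
MU = pow(LAM, -1, N)                           # mu = lambda^-1 mod n (g = n+1 shortcut)

def encrypt(m: int) -> int:
    """E(m) = (1+n)^m * r^n mod n^2, with random r coprime to n."""
    r = random.randrange(2, N)
    while gcd(r, N) != 1:
        r = random.randrange(2, N)
    return (pow(1 + N, m, N2) * pow(r, N, N2)) % N2

def decrypt(c: int) -> int:
    """D(c) = L(c^lambda mod n^2) * mu mod n, where L(x) = (x-1)//n."""
    return (((pow(c, LAM, N2) - 1) // N) * MU) % N

# Server-side aggregation WITHOUT decrypting individual updates:
# multiplying ciphertexts adds the underlying plaintexts.
client_updates = [12, 7, 30]                   # hypothetical quantized gradients
encrypted = [encrypt(u) for u in client_updates]
agg = 1
for c in encrypted:
    agg = (agg * c) % N2                       # homomorphic sum
print(decrypt(agg))                            # -> 49 (= 12 + 7 + 30)
```

Only the key holder (in an FL setting, typically the clients or a trusted key authority, never the aggregating server) can decrypt the summed update, which is the property that prevents parameter leakage during the frequent exchanges the abstract describes.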

Share and Cite

MDPI and ACS Style

Wang, J.; Wang, Y. Privacy Protection Optimization Method for Cloud Platforms Based on Federated Learning and Homomorphic Encryption. Sensors 2026, 26, 890. https://doi.org/10.3390/s26030890

AMA Style

Wang J, Wang Y. Privacy Protection Optimization Method for Cloud Platforms Based on Federated Learning and Homomorphic Encryption. Sensors. 2026; 26(3):890. https://doi.org/10.3390/s26030890

Chicago/Turabian Style

Wang, Jing, and Yun Wang. 2026. "Privacy Protection Optimization Method for Cloud Platforms Based on Federated Learning and Homomorphic Encryption" Sensors 26, no. 3: 890. https://doi.org/10.3390/s26030890

APA Style

Wang, J., & Wang, Y. (2026). Privacy Protection Optimization Method for Cloud Platforms Based on Federated Learning and Homomorphic Encryption. Sensors, 26(3), 890. https://doi.org/10.3390/s26030890

Note that from the first issue of 2016, this journal uses article numbers instead of page numbers.
