Privacy Protection Optimization Method for Cloud Platforms Based on Federated Learning and Homomorphic Encryption
Abstract
1. Introduction
2. Related Work
2.1. Research Progress of Privacy Protection and Encryption Technology on Cloud Platforms
2.2. Privacy Protection Method Based on FL and Homomorphic Encryption
2.3. Existing Research and Analysis
3. Realization Principle and Analysis Process of HFHE-Cloud Model
3.1. Realization Principle of HFHE-Cloud Model
3.1.1. Data Access and Local Computing Layer
3.1.2. Encryption and Key Management Layer
3.1.3. Aggregation and Collaborative Optimization Layer
3.1.4. Global Control and Audit Layer
3.2. Analysis Process of HFHE-Cloud Model
3.2.1. Experimental Data Collection Process
3.2.2. Experimental Environment
3.2.3. Threshold Determination of HFHE-Cloud Equilibrium Coefficient
3.2.4. Verification Index System of HFHE-Cloud
4. Analysis of the Privacy Protection Optimization Effect of the HFHE-Cloud Model on Cloud Platforms
4.1. Analysis of Privacy Protection Effect
4.2. Performance Analysis of Distributed Training
4.3. Scalability Ablation Analysis of HFHE-Cloud
4.4. Discussion
5. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References




| Logical Layer | Main Input/Output | How to Interact with Other Layers |
|---|---|---|
| Data access and local computing layer | Input: local tenant data; Output: local model parameter updates. | Submits locally trained parameter updates to the encryption and key management layer and receives scheduling and weight feedback from the control layer. |
| Encryption and key management layer | Input: model parameter updates; Output: encrypted parameters. | Parameters from the local computing layer are encrypted in layers and mapped to the aggregation key space via KeySwitch, providing unified ciphertext for the aggregation layer. |
| Aggregation and collaborative optimization layer | Input: multi-node ciphertext parameters; Output: global ciphertext model. | Completes homomorphic aggregation and weight distribution in ciphertext space and submits the results to the control layer for verification and decryption. |
| Global control and audit layer | Input: aggregate results and behavior records; Output: control and trust feedback. | Audit and verify the aggregation and training process, update the trust of nodes, and feed back the scheduling and constraint information to the local computing layer. |
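The four-layer interaction above can be sketched as a single federated round. The additive masking below is a deliberately simplified stand-in for the paper's RLWE-based homomorphic encryption, and all function names (`encrypt`, `aggregate`, `decrypt_sum`) are illustrative, not the authors' implementation:

```python
import random

Q = 2 ** 18  # toy modulus; stands in for the RLWE ciphertext modulus

def encrypt(value: int, key: int) -> int:
    # Encryption and key management layer: additive mask modulo Q
    # (toy stand-in for real homomorphic encryption).
    return (value + key) % Q

def aggregate(ciphertexts) -> int:
    # Aggregation layer: sum in masked space without seeing any plaintext.
    total = 0
    for c in ciphertexts:
        total = (total + c) % Q
    return total

def decrypt_sum(agg: int, keys) -> int:
    # Global control and audit layer: remove all masks to recover
    # the aggregated model update.
    return (agg - sum(keys)) % Q

# Data access and local computing layer: each tenant produces a
# local parameter update (toy integers here).
updates = [12, 7, 30]
keys = [random.randrange(Q) for _ in updates]
cts = [encrypt(u, k) for u, k in zip(updates, keys)]
assert decrypt_sum(aggregate(cts), keys) == sum(updates) % Q
```

The point of the sketch is only the data flow: the aggregator never handles an unmasked update, and only the control layer, holding all keys, recovers the sum.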
| Dataset | Data Content Characteristics |
|---|---|
| Cloud Workload Dataset for Scheduling Analysis, accessed on 6 April 2025 (https://www.kaggle.com/datasets/zoya77/cloud-workload-dataset-for-scheduling-analysis) | It contains execution records of cloud tasks; the fields include task ID, CPU and memory usage, execution duration, task priority, arrival time, and completion status, reflecting the characteristics of cloud scheduling and resource allocation. |
| Multi-Tenant Cloud Dataset, accessed on 15 March 2025 (https://www.kaggle.com/datasets/ziya07/multi-tenant-cloud-dataset) | It contains multi-tenant virtualization and container operation information, including tenant ID, task ID, CPU and memory allocation ratio, delay, task type, and tenant attributes, reflecting the differences in multi-tenant resource occupation and isolation. |
| Alibaba Cluster Trace, accessed on 13 December 2018 (v2018) (https://github.com/alibaba/clusterdata) | It contains 8-day job and task logs of 4000 servers, recording information such as scheduling status, task duration, and failure rate, reflecting the real running load of large-scale cloud platforms. |
| Loghub-A Large Collection of System Log Datasets, accessed on 27 November 2018 (https://github.com/logpai/loghub) | It collects system logs such as HDFS, Spark, Hadoop, and OpenStack, including system calls, error reports, event timestamps, and running status, reflecting the characteristics of multi-system operation and security events. |
| Type | Item | Configuration |
|---|---|---|
| Hardware | CPU | Intel Xeon Gold 6348 × 2, 2.60 GHz, 32 cores |
| | GPU | NVIDIA A100 40 GB × 2 |
| | RAM | 256 GB DDR4 |
| | Storage device | NVMe SSD 2 TB |
| Software | Operating system | Ubuntu 22.04 LTS (64-bit) |
| | Programming language | Python 3.10 |
| | Deep learning framework | PyTorch 2.1 |
| | FL framework | Flower 1.8 |
| | HE library | Pyfhel 3.4 |
| | Scheduling and control tool | Kubernetes 1.30 |
| Configuration Category | Parameter Term | Setup Description |
|---|---|---|
| Homomorphic encryption configuration | Encryption scheme type | Homomorphic encryption based on RLWE |
| | Plaintext encoding mode | Fixed-point encoding |
| | Modulus scale | q = 2^18 |
| | Supported homomorphic operations | Addition, multiplication, KeySwitch |
| | Key management mode | Node-independent keys + aggregate key mapping |
| Federated learning training hyperparameters | Number of local training rounds | 5 |
| | Learning rate | 0.01 (uniform across all clients) |
| | Batch size | 64 |
| | Optimizer | SGD |
| | Upper limit of global communication rounds | 100 |
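Fixed-point plaintext encoding under a modulus on the order of q = 2^18 can be illustrated as follows. The scale factor and the `encode`/`decode` helpers are assumptions for illustration, not the paper's exact parameters:

```python
Q = 2 ** 18     # modulus scale from the configuration table
SCALE = 2 ** 8  # assumed fixed-point scale factor (illustrative only)

def encode(x: float) -> int:
    # Map a real-valued model parameter to a fixed-point integer mod Q.
    return round(x * SCALE) % Q

def decode(v: int) -> float:
    # Centre the residue so negative parameters decode correctly.
    if v >= Q // 2:
        v -= Q
    return v / SCALE

# Addition of encodings mod Q matches addition of the underlying
# plaintexts, which is what additive homomorphic aggregation relies on.
a, b = 0.5, -0.25
s = (encode(a) + encode(b)) % Q
assert abs(decode(s) - (a + b)) < 1 / SCALE
```

A larger SCALE gives finer precision but consumes more of the modulus budget, which is the usual trade-off in fixed-point HE encodings.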
| Fold | Coefficient 1 | Coefficient 2 | Coefficient 3 | Coefficient 4 | Verification Loss | Global Accuracy |
|---|---|---|---|---|---|---|
| First Fold | 0.60 | 0.25 | 0.10 | 0.05 | 0.081 | 94.30% |
| Second Fold | 0.55 | 0.30 | 0.10 | 0.05 | 0.078 | 94.80% |
| Third Fold | 0.50 | 0.35 | 0.10 | 0.05 | 0.076 | 95.10% |
| Fourth Fold | 0.55 | 0.30 | 0.10 | 0.05 | 0.077 | 94.90% |
| Fifth Fold | 0.50 | 0.30 | 0.15 | 0.05 | 0.079 | 94.60% |
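Selecting the equilibrium-coefficient set from the cross-validation table above amounts to checking that each candidate set is a convex combination (the four coefficients sum to 1) and taking the fold with minimum verification loss. A minimal sketch using the table's values:

```python
# (coefficients, verification loss, global accuracy %) per fold,
# taken from the cross-validation table above.
folds = [
    ((0.60, 0.25, 0.10, 0.05), 0.081, 94.3),
    ((0.55, 0.30, 0.10, 0.05), 0.078, 94.8),
    ((0.50, 0.35, 0.10, 0.05), 0.076, 95.1),
    ((0.55, 0.30, 0.10, 0.05), 0.077, 94.9),
    ((0.50, 0.30, 0.15, 0.05), 0.079, 94.6),
]

# Each coefficient set is a convex combination: weights sum to 1.
assert all(abs(sum(w) - 1.0) < 1e-9 for w, _, _ in folds)

# Pick the fold minimising verification loss (the third fold here,
# with loss 0.076 and 95.1% global accuracy).
best_weights, best_loss, best_acc = min(folds, key=lambda f: f[1])
print(best_weights, best_loss, best_acc)
```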
| Dimension | Index | Explanation |
|---|---|---|
| Privacy protection effect | Accuracy | The classification or prediction accuracy of the model on the verification or test set is used to measure the overall learning effect. |
| | Loss | Measures the error between the model's predictions and the ground truth, reflecting the degree of convergence and optimization. | |
| | Encryption–Decryption Time (s) | The average time spent encrypting and decrypting model parameters per training round, used to evaluate the computational efficiency of the privacy protection mechanism. | |
| | Proportion of Encryption Overhead | The additional training time introduced by the encryption mechanism relative to unencrypted training, used to measure the balance between security and efficiency. | |
| Distributed training performance | Communication Rounds | The number of communication rounds required for the model to achieve stable convergence reflects the training efficiency. |
| | Upload Latency (s) | The average time taken by a client to upload local encrypted parameters to the server, reflecting the influence of network and node differences. |
| | Aggregation Time (s) | The average time for the server to complete one homomorphic aggregation operation, used to measure the efficiency of the aggregation algorithm. |
| | Client Participation Rate | The proportion of nodes uploading in each training round, reflecting the parallelism and scalability of the system. |
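The "Proportion of Encryption Overhead" index above can be computed as the extra wall-clock time relative to unencrypted training; the exact formula below is an assumed definition consistent with the table's description, not one stated in the source:

```python
def encryption_overhead_ratio(encrypted_time_s: float,
                              plain_time_s: float) -> float:
    # Extra training time attributable to the encryption mechanism,
    # expressed as a fraction of the unencrypted baseline
    # (assumed definition, for illustration only).
    return (encrypted_time_s - plain_time_s) / plain_time_s

# e.g. a round taking 11.5 s with encryption vs 10.0 s without
print(f"{encryption_overhead_ratio(11.5, 10.0):.0%}")  # → 15%
```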
| Value | Encryption–Decryption Time (s) | Client Participation Rate (%) | Communication Rounds |
|---|---|---|---|
| 0.9 | 1.56 | 91.9 | 47 |
| 1.0 | 1.59 | 92.3 | 46 |
| 1.1 | 1.66 | 91.2 | 48 |
| Client Number | Model Configuration | Accuracy (%) | Encryption–Decryption Time (s) | Upload Latency (s) | Peak Memory Usage (GB) | Communication Rounds |
|---|---|---|---|---|---|---|
| 20 | HFHE-Cloud-1 | 94.6 | 1.61 | 0.82 | 6.3 | 52 |
| 20 | HFHE-Cloud-2 | 95.0 | 2.04 | 0.79 | 7.8 | 47 |
| 20 | HFHE-Cloud-3 | 94.4 | 1.58 | 0.88 | 6.1 | 54 |
| 20 | HFHE-Cloud | 95.1 | 1.59 | 0.8 | 6.5 | 46 |
| 40 | HFHE-Cloud-1 | 94.3 | 1.91 | 1.05 | 6.9 | 56 |
| 40 | HFHE-Cloud-2 | 94.9 | 2.42 | 1.01 | 8.6 | 50 |
| 40 | HFHE-Cloud-3 | 94.1 | 1.86 | 1.12 | 6.8 | 58 |
| 40 | HFHE-Cloud | 95.0 | 1.88 | 1.03 | 7.1 | 49 |
| 60 | HFHE-Cloud-1 | 94.0 | 2.20 | 1.31 | 7.6 | 60 |
| 60 | HFHE-Cloud-2 | 94.7 | 2.78 | 1.26 | 9.4 | 53 |
| 60 | HFHE-Cloud-3 | 93.8 | 2.15 | 1.38 | 7.5 | 62 |
| 60 | HFHE-Cloud | 94.9 | 2.17 | 1.29 | 7.9 | 52 |
© 2026 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license.
Wang, J.; Wang, Y. Privacy Protection Optimization Method for Cloud Platforms Based on Federated Learning and Homomorphic Encryption. Sensors 2026, 26, 890. https://doi.org/10.3390/s26030890

