Review

Analysis of Data Privacy Breaches Using Deep Learning in Cloud Environments: A Review

by Abdulqawi Mohammed Almosti * and M. M. Hafizur Rahman *
Department of Computer Networks and Communications, College of Computer Sciences and Information Technology, King Faisal University, Al-Ahsa 31982, Saudi Arabia
*
Authors to whom correspondence should be addressed.
Electronics 2025, 14(13), 2727; https://doi.org/10.3390/electronics14132727
Submission received: 4 May 2025 / Revised: 17 June 2025 / Accepted: 19 June 2025 / Published: 7 July 2025
(This article belongs to the Special Issue Security and Privacy for AI)

Abstract

Despite the advantages of cloud computing, data breaches and security challenges remain, especially when dealing with sensitive information. Integrating deep learning (DL) techniques into cloud environments can strengthen privacy preservation. This review analyzes 38 papers published from 2020 to 2025, focusing on privacy-preserving techniques in DL for cloud environments. Combining different privacy preservation technologies with DL yields better utility for privacy protection and stronger security against data breaches than applying any single technique, such as differential privacy, homomorphic encryption, or federated learning, on its own. We also discuss the technical limitations of applying DL with various privacy preservation techniques, including large communication overhead, lower model accuracy, and high computational cost. Finally, this review presents the latest research in a comprehensive manner and provides directions for the future research needed to develop privacy-preserving DL models.

1. Introduction

The popularity of cloud computing has grown rapidly over the past few decades, driven by its numerous advantages and wide range of real-world applications.
Cloud computing offers several advantages, including improved safety, cost savings, enhanced system performance, higher productivity, operational optimization, and faster response times. These advantages have driven major changes in software design and distribution, development approaches, and network infrastructure [1].
In cloud computing, security and privacy protection remain among the most pressing challenges. The diversity and evolving nature of modern cloud environments introduce significant security risks, including insecure APIs, data breaches, and threats to virtualized resources.
Due to the sensitivity of personal information in a cloud environment, organizations have aimed to achieve security objectives that include data confidentiality, integrity, and availability [2].
Moreover, these threats arise from various sources within the cloud environment, including misconfigurations, unauthorized access, and data breaches. In 2023, statistics showed that over 60% of packaging enterprises suffered cloud breaches, with ransomware attacks increasing by 32% compared to 2022. These findings highlight the urgent need for an effective and robust security framework [3].
Deep learning (DL) has demonstrated its potential to reduce privacy risks in the healthcare sector by 25%, helping to ensure that patient data remains secure during model training [4].
Additionally, the economic impact of data breaches is significant, with the average cost per incident for an organization in Western Europe estimated at USD 4.35 million. These statistics emphasize the necessity of deploying advanced solutions, such as DL, to address data breaches effectively. DL has been applied for real-time anomaly detection in IoT networks, offering crucial capabilities for identifying threats in modern cloud environments [5]. Using DL models, organizations can detect and mitigate threats such as distributed denial-of-service (DDoS) attacks and insider threats by enabling secure model training. Nevertheless, these benefits come with technical and operational challenges, including communication overhead, data heterogeneity, and vulnerability to adversarial attacks [6].
Traditional encryption algorithms for enhancing privacy in cloud computing are increasingly ineffective due to the cloud's inherently diverse and distributed architecture [7]. To address this complexity, various DL paradigms, such as convolutional neural networks (CNNs), recurrent neural networks (RNNs), long short-term memory networks (LSTMs), generative adversarial networks (GANs), and federated learning (FL), have been introduced to improve privacy preservation in cloud environments. Furthermore, DL applications such as monitoring systems, detection and prevention systems, and log analysis tools are being used to detect and automatically identify threats in the cloud [8].
The complexity of cloud components and their interconnections amplifies the security risks and weaknesses inherent in each part when deploying DL applications. In particular, data training in cloud environments poses a significant challenge, as it heavily depends on the specific characteristics and capabilities of the underlying cloud infrastructure [9].
Data protection and DL model defense are implemented through privacy-enhancing technologies. Several privacy preservation mechanisms are applied in cloud environments, which consist of homomorphic encryption, federated learning, differential privacy, secure multiparty computation, and hybrid privacy-preserving techniques [10].
Users face numerous limitations when trying to gain full access to system resources in a cloud environment [11]. DL applications combined with multiple privacy preservation algorithms are among the most effective techniques for securing cloud services [12]. The default structure of DL comprises multiple interconnected nodes and sophisticated algorithms. To achieve optimal privacy protection in cloud computing, it is crucial to design customized DL architectures tailored to specific problems. These architectures should integrate various cloud security mechanisms, such as authentication, data encryption, intrusion detection, and intrusion prevention, to enhance privacy preservation [13]. The integration of DL with various privacy-preserving technologies provides an effective solution that enhances both security and privacy standards while maintaining operational efficiency in cloud environments [14].
Table 1 compares the types of DL applications that enhance privacy in cloud environments.
The frequency of data breaches in cloud environments continues to rise, driven by increasing threats and the high cost of breaches, which can reach millions of dollars for organizations. Therefore, it is essential to develop robust privacy-preservation techniques to protect the sensitive personal and organizational data stored in the cloud. DL has become an advanced technique for detecting these threats through real-time anomaly detection and secure model training [15].
Data breaches in cloud environments still occur even when using DL with privacy-preserving technologies, due to factors such as high communication and computational overhead and centralized data storage. Therefore, it is crucial to enhance data privacy and security by addressing these technical and operational challenges, as they directly impact the effectiveness of DL integrated with privacy-preserving technologies [16].
In this review paper, we present an overview of deep learning (DL) applications aimed at enhancing privacy preservation in cloud environments. Various privacy-preserving techniques are integrated with DL, such as homomorphic encryption, differential privacy, and federated learning. Modern DL architectures help organizations address the complexity of cyberattacks in cloud-based systems. In the Methodology Section, the PRISMA model outlines the selection process of the papers reviewed in the related work. The Related Work Section discusses the applications of various technologies integrated with deep learning (DL) for privacy preservation. This review highlights recent approaches that combine DL with privacy-preserving techniques to protect sensitive information in cloud environments. The objectives of this review paper are summarized as follows:
  • To review current privacy-preserving technologies integrated with DL.
  • To provide an overview of opportunities for future directions in privacy preservation within cloud environments.
  • To identify challenges related to the application of DL in cloud computing, such as transaction overhead, cloud infrastructure complexity, data processing, and data loss.
  • To examine existing DL-based privacy preservation approaches in cloud computing and identify research gaps and future directions.
The organization of this review paper is as follows: the background is presented in Section 2. The methodology used to organize and collect relevant papers is described in Section 3. An overview of existing work on privacy preservation technologies in the field of deep learning (DL) is provided in Section 4. Section 5 discusses current solutions, while future directions are outlined in Section 6. The proposed work is detailed in Section 7. Finally, Section 8 presents the conclusions based on the findings of this review.

2. Background

2.1. Privacy Preservation Technologies

Confidentiality and privacy are major concerns in cloud environments, particularly when applying deep learning (DL) applications. Several privacy-preserving technologies have been developed to prevent data breaches, such as the following:
  • Data Encryption: Data encryption is a fundamental component of privacy preservation in DL systems, as it enhances privacy at points where sensitive information is stored and processed in cloud environments. While cloud computing enables easy access for users, it also introduces risks related to privacy and security. The implementation of encryption safeguards sensitive data by maintaining confidentiality and ensuring data integrity when transmitting information to the cloud. Encryption mechanisms embedded within DL systems have demonstrated strong privacy capabilities through the use of advanced encryption algorithms and secure key generation techniques in cloud environments [17].
  • Differential privacy: Differential privacy protects personal data stored in the cloud through mathematical mechanisms. It keeps individual data points in a dataset indistinguishable by adding noise to each record (a minimal sketch of this noise-addition mechanism follows this list). Because noise generation is built into the training process, differential privacy helps prevent attackers from obtaining details about individual records even when they possess the complete model. However, data records may still be exposed to possible leakage through reverse-engineering efforts during model training [18].
  • Federated learning (FL): Federated learning allows multiple machines to collaborate in training a shared model while keeping the training data local, thus ensuring the privacy and security of sensitive data. FL, applied as a DL technique, allows model training on decentralized servers without sharing the actual training data. Cloud environments benefit from this approach because it enables learning across different locations while upholding privacy and security standards [19].
  • A hybrid approach: A hybrid approach combines multiple algorithms and technologies, such as homomorphic encryption, federated learning, and differential privacy, to enhance privacy while utilizing DL applications. For example, while the training data remains in the cloud, DL first performs feature extraction using privacy-filtering techniques; the training data is then encrypted with HE, and noise injection is applied to the iterative updates of FL to improve security. Such hybrid approaches have been shown to prevent gradient attacks and data breaches in cloud environments [20].
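To make the noise-addition principle behind differential privacy concrete, the following minimal Python sketch applies the Laplace mechanism to a simple counting query. The dataset, sensitivity, and privacy budget shown here are illustrative assumptions for this review, not parameters taken from any of the cited works.

```python
import numpy as np

def laplace_mechanism(true_value: float, sensitivity: float, epsilon: float) -> float:
    """Release a numeric query answer with epsilon-differential privacy.

    Noise is drawn from Lap(sensitivity / epsilon): for a query whose output
    changes by at most `sensitivity` when one record is added or removed,
    this perturbation hides any individual record's contribution.
    """
    scale = sensitivity / epsilon
    return true_value + np.random.laplace(loc=0.0, scale=scale)

# Illustrative use: privately release how many records a dataset contains.
# A counting query has sensitivity 1, since one record changes the count by 1.
records = np.random.rand(1000)  # placeholder dataset
private_count = laplace_mechanism(float(len(records)), sensitivity=1.0, epsilon=0.5)
print(f"true count: {len(records)}, private count: {private_count:.1f}")
```

Smaller values of epsilon add more noise and therefore give stronger privacy at the cost of accuracy, mirroring the utility trade-off discussed throughout this review.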

2.2. Challenges of Preserving Privacy on Cloud Platforms

Multiple applications require high-performance computing and big data analysis when employing DL mechanisms through cloud-based systems. Many applications take advantage of cloud platforms to develop DL models because they provide enormous computational capabilities and powerful storage resources. Pattern processing and recognition depend on the capabilities of the DL classifiers. DL models in cloud environments face problems including high energy costs, slow execution times, long training periods, and security threats related to data storage.

2.2.1. Energy Efficiency and Training Cost

During DL model building in various cloud environments, a specific training duration must be determined for each training stage. Research has applied green IT solutions to improve data center energy efficiency and reduce the carbon emissions of power production, making real energy optimization achievable [15]. There is an immediate need for innovative solutions in smart infrastructure, such as renewable energy sources, together with data storage technologies that keep data centers energy efficient [16]. The high power consumption of data center servers is minimized through various methods that simultaneously enhance their computing capabilities, memory, and storage capacity. The network infrastructure is a focal point for developing methods that improve data traffic efficiency between nodes in the data center. Studies and improvements in hardware technologies must remain a priority for green data centers [21].

2.2.2. Scalability

DL models incur long inference delays as a result of their processing complexity; deep learning techniques exhibit higher latency than other ML algorithms. Applications without strict delay requirements function properly as long as the number of model requests stays below the threshold that keeps system performance stable [22]. DL applications requiring rapid cloud-based responses, however, cannot tolerate delays at any time, and this approach provides insufficient capability for distributing resources across systems. A DL model cannot easily operate in the cloud without prior determination of the inference environment. In traditional cloud environments, containerized deployments of DL models demonstrate the highest efficiency and fastest processing [17].

2.2.3. Data Privacy

Privacy concerns arise when using DL in cloud computing frameworks because these systems handle private information. Hybrid models capable of handling such information are produced by combining personal data obtained from online sources with DNNs and uploading the result to cloud storage [18]. Dynamic models force DL solutions to operate on up-to-date information, which requires continuous data transfer operations. Addressing data privacy problems through cloud storage alone has proven incomplete [19]. Organizations implement blockchain technology to handle data security issues because it provides strong protection of data privacy and integrity. Blockchain enables cloud-based IoT systems participating in DL to transfer private data securely between cloud nodes and external networks while maintaining control over data storage. For example, a blockchain application may store patient identity information in separate blocks: encryption followed by validation creates a unique identifier for each patient record, and wearable sensors and mobile devices track patient records until they are preserved on the blockchain. The cloud-based database containing patient records then allows DL systems to analyze data in accordance with privacy preservation requirements. Research into blockchain technology aims to establish techniques for maintaining the privacy of personal information in cloud environments [20].
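As a rough illustration of the patient-record scenario above, the following Python sketch chains records into hash-linked blocks so that tampering with any stored record is detectable. The field names and values are hypothetical, and a real deployment would add consensus, access control, and encryption as described above.

```python
import hashlib
import json
import time

def make_block(record: dict, prev_hash: str) -> dict:
    """Wrap a record in a block whose hash chains to the previous block."""
    payload = {"record": record, "prev_hash": prev_hash, "timestamp": time.time()}
    digest = hashlib.sha256(json.dumps(payload, sort_keys=True).encode()).hexdigest()
    return {**payload, "hash": digest}

def verify(chain: list) -> bool:
    """Recompute every hash and check each link to detect tampering."""
    prev = "0" * 64
    for block in chain:
        payload = {"record": block["record"], "prev_hash": block["prev_hash"],
                   "timestamp": block["timestamp"]}
        digest = hashlib.sha256(json.dumps(payload, sort_keys=True).encode()).hexdigest()
        if block["prev_hash"] != prev or block["hash"] != digest:
            return False
        prev = block["hash"]
    return True

# Hypothetical wearable-sensor readings appended as a small chain.
chain = [make_block({"patient_id": "P-001", "sensor": "ecg", "reading": 0.82}, "0" * 64)]
chain.append(make_block({"patient_id": "P-002", "sensor": "ecg", "reading": 0.77},
                        prev_hash=chain[-1]["hash"]))

assert verify(chain)
chain[0]["record"]["reading"] = 9.99  # tampering with a stored record
assert not verify(chain)              # breaks verification downstream
```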

2.2.4. Cloud Interoperability

Currently, DL algorithms provide the most efficient approach to processing and analyzing extensive datasets. Distributing DL models over extensive processors in the cloud is one approach to attaining scalability when training and running classifier models [23]. Multi-cloud computing resources support the transmission, storage, and examination of real-time, heterogeneous datasets, since large and diverse training datasets contain various big data elements [24]. The development of multi-cloud computing as a solution for big data requirements shows significant advancement in training datasets in cloud environments. However, the uneven maturity of diverse cloud services produces integration problems that create operational and technical issues [25]. The combination of massive data growth and the urgent need for real-time data processing makes interoperability even more challenging in the cloud.

2.3. Security Threats in Cloud Environment

The widespread adoption of cloud computing demands immediate resolution of privacy-related issues in the cloud environment. Protection of privacy in cloud systems requires keeping sensitive data safe from unauthorized access so that it remains confidential and available and that its integrity is ensured [26,27]. Multiple complicated attack methods consistently challenge the protection of sensitive information in a cloud environment [28]. Multiple attacks occur in cloud computing that aim to compromise privacy preservation, such as data breaches, insider threats, insecure application programming interfaces (APIs), side-channel attacks, and cryptographic attacks, as detailed below:
  • Insider threats: Insider threats occur when legitimate access is abused to exploit privileges to specific services in the cloud environment. They are often executed via social engineering attacks that steal administrator credentials. Insider threat detection is difficult because cloud environments combine complex monitoring requirements with extensive privileged-access infrastructure [29].
  • Application Programming Interfaces (APIs): APIs provide programmatic connectivity to cloud resources and make cloud services accessible to external systems. Malicious actors exploit weak or insecurely designed APIs to compromise cloud systems. Security vulnerabilities in these APIs may allow attackers to bypass authentication procedures and access unauthorized data. Modern cloud environments therefore require higher security standards to protect their API connectivity [30].
  • Side-channel attacks: Cloud-based applications and services experience physical and logical leaks when side-channel attacks are used to steal confidential data. Attackers detect sensitive information by monitoring actions in cloud services, such as performance timing variations, power usage, and electromagnetic signal signatures on shared cloud resources [31].
  • Cryptographic algorithms: Secure data handling in cloud computing systems requires cryptographic algorithms for both encryption and protected network protocols. Cloud data confidentiality becomes vulnerable when attackers target weak cryptographic mechanisms or improper implementations. Attacks on encryption succeed either through brute-force methods using only ciphertext or by targeting weak points in cryptographic algorithms, allowing attackers to obtain protected information. The protection of data privacy within cloud systems depends on strong cryptographic methodologies, as cloud systems rely primarily on encryption algorithms to protect cloud customers [32].
  • Data breaches: Unauthorized access to cloud-stored confidential information occurs due to cloud infrastructure vulnerabilities, weaknesses in user authentication practices, and user configuration errors. The disclosure of sensitive data during a breach results in severe privacy violations, financial damage, and a damaged organizational reputation [33].
The continuing expansion of cloud computing industries produces new security concerns that require proactive attention. These problems need to be evaluated and resolved before data breaches occur. Dependence on third-party providers is a fundamental cause of data breaches in cloud environments [34]. Table 2 lists the top 10 data breaches of the last decade across various cloud environments:

3. Methodology

We have used the PRISMA methodology, which provides a structured approach to conducting research that ensures a comprehensive understanding and valuable outcomes, as shown in Figure 1. The current body of literature on DL leverages various technologies to enhance privacy preservation in cloud environments and protect against data breaches. The research methodology follows a systematic review process, which includes identifying research questions, selecting papers from databases, defining a search string, and applying inclusion and exclusion criteria to choose relevant papers aligned with our topic.
The research methodology is summarized in Table 3 and organized as follows: selecting the research questions, applying the identification stage, screening, and finally including papers.

3.1. Survey Questions

  • What are the current strategies used to improve privacy when deploying various DL applications in a cloud environment?
  • What are the limitations of each paper that deployed privacy preservation in a cloud environment?
  • What are the major vulnerabilities to data privacy introduced when DL is applied on cloud platforms?
  • How can DL applications be designed to provide strong and guaranteed privacy preservation by different techniques, such as differential privacy, homomorphic encryption, or federated learning?
  • What are the future research directions for applying DL to privacy preservation in cloud computing?

3.2. Identification Stage

Selecting appropriate papers is a key step in quickly gaining knowledge about the research topic. Papers matching our subject were selected from various online data resources, including IEEE Xplore, ACM Digital Library, SpringerLink, Saudi Digital Library, and Google Scholar. The total number of papers collected during the identification stage was 365. The search string was (Privacy OR Confidentiality OR Data Breaches) AND (DL OR Deep Learning) AND (CC OR Cloud Computing) AND (Differential Privacy OR DP OR Homomorphic Encryption OR HE OR Federated Learning OR FL).

3.3. Scanning Stage

The first scanning process identifies duplicate papers. A screening is then conducted to select papers that match our topic and provide valuable, relevant information. This second stage is essential to our goal: during it, we select papers whose attributes fulfill the requirements of our research area. The total number of papers remaining after applying the research criteria was 64.
The investigation relied on five scientific online data resources (IEEE Xplore, ACM Digital Library, SpringerLink, Saudi Digital Library, and Google Scholar) to retrieve papers matching the search terms. The general keywords used in the search engines included deep learning (DL), federated learning (FL), homomorphic encryption, differential privacy, cloud computing (CC), neural networks, privacy, data breaches, and confidentiality. This research contributes to hybrid approaches for privacy-preserving techniques deployed in cloud environments.

3.4. Inclusion and Exclusion Stage

  • This review paper focuses on hybrid approaches used to enhance privacy in cloud environments, such as DL combined with FL, HE, or DP.
  • It excludes any papers that use a single technology alone, such as FL, DL, or HE, for privacy preservation in cloud environments, without incorporating DL.
  • The search process filters out non-English language papers and excludes studies published before 2020.
Figure 2 shows the distribution of selected papers published between 2020 and 2025; most of the selected papers were published in 2024. The total number of papers included in this review is 38, as shown in Figure 1.

4. Related Work

In this section, we review several published papers addressing methods to prevent data breaches in cloud environments. We analyze existing research solutions that enhance privacy during data training in the cloud using various privacy-preserving techniques integrated with DL. The objective of this investigation is to identify gaps and limitations in each study and explore potential improvements in performance or accuracy that can help prevent data breaches.

4.1. Federated Learning

Wang et al. [14] proposed a privacy-preserving method called Verifiable Privacy Preserving Federated Learning (VPPFL), which enables users to collaboratively train models while maintaining the confidentiality of their individual data, ensuring that no sensitive data is shared between users. The main contributions include an authenticated federated learning (FL) system integrated with multiple threshold encryption keys for enhanced security, along with mechanisms to handle user dropout during the training process. VPPFL utilizes threshold multi-key homomorphic encryption to protect local gradients while allowing users to independently verify the results through a one-way function. The approach was tested using a fully connected neural network (FCNN) on the MNIST dataset in a MATLAB 2016 simulation, achieving an accuracy of 98%, thereby demonstrating its effectiveness in privacy protection.
Li et al. [27] addressed the challenges of applying federated learning (FL) in mobile edge computing (MEC), particularly in terms of privacy leakage and limited computational resources. The authors proposed a model called DynamicNet, which incorporates two key components: Privacy Budget Regulation (PBR) and Dynamic Aggregation Weights (DAW). PBR enhances privacy by offering stronger protection against inference attacks, while DAW adjusts aggregation weights dynamically to accommodate heterogeneous edge environments. This approach maintained high model accuracy even when training on lower-quality datasets, such as MNIST and CIFAR-10, across varying edge device capabilities. The model achieved a recorded accuracy of 92.9%.
Su et al. [35] proposed a federated learning (FL) model for smart grids that incorporates edge–cloud cooperation and employs a two-layer Deep Reinforcement Learning (DRL) algorithm. The edge–cloud architecture enables efficient data processing through distributed systems. The FL model integrates DRL techniques to help both users and energy service providers (ESPs) determine optimal strategies for resource management and participation. The approach demonstrated strong performance in simulations, showing improvements in communication efficiency and modeling accuracy. The model proved to be a viable solution for AIoT-powered energy systems, offering both safety and sustainability. A convolutional neural network (CNN) was used within the FL framework and evaluated using the MNIST dataset, achieving an accuracy of 95.8%.
Zhou et al. [36] applied a federated learning architecture on the edge cloud for AIoT systems to improve Quality of Service (QoS). The models were trained on edge devices while selectively transmitting updates to the cloud, which led to reduced latency and preserved data privacy. The model was evaluated on the MNIST and Fashion-MNIST datasets, achieving an accuracy of 96.3%. The results demonstrated that the model improved communication efficiency, reduced latency, and enhanced privacy compared to centralized learning methods. This approach is suitable for use in bandwidth-limited environments and offers significant performance benefits in terms of latency and data protection for applications in smart cities, healthcare, and industrial IoT.
Makkar et al. [37] developed FedLearnSP, a secure federated learning framework designed to detect interfering attacks without compromising user data privacy. Edge computing enables the model to train on various devices that detect spam images, ensuring that user data remains in its original location. The model demonstrates strong scalability and low communication overhead during testing under different data conditions. FedLearnSP achieves reliable threat detection results, making it particularly suitable for applications that require real-time protection against threats and effective management of privacy-sensitive data.
Liu et al. [38] presented a Hierarchical Federated Learning (HFL) framework that optimizes training among clients, edge servers, and the cloud. The authors leveraged intermediate edge nodes to aggregate data processing locally before sending updates to the cloud, thereby reducing training time and energy consumption. The HFL model demonstrated reduced communication overhead and faster convergence compared to flat FL architectures. The highest recorded accuracy was 90% on the CIFAR-10 dataset.
Zhang et al. [39] proposed a lightweight and secure federated learning scheme (LSFL) designed for edge computing. The LSFL model focused on protection against privacy leakage, Byzantine attacks, and frequent edge node downtime. It incorporates a "Two-Server Secure Aggregation Protocol," which enables secure and resilient model aggregation without relying on a single server or introducing significant cryptographic overhead. The accuracy of the LSFL model is consistent with that of FedAvg while demonstrating better computational efficiency and security, achieving a peak accuracy of 98.92%. This methodology addresses the gap in robust federated learning, making LSFL suitable for deployment in lower-resource edge environments. As a result, the proposed approach is both scalable and deployable in real-world smart systems.

4.2. Hybrid Approach

Qayyum et al. [40] introduced Clustered Federated Learning (CFL) as a model for diagnosing COVID-19 using multimodal X-ray and ultrasound data in a cloud environment. CFL distributes training operations to independent edge processing units that perform image diagnosis locally, keeping the data on the devices. The model achieved improvements in F1 scores of 16% for X-ray analysis and 11% for ultrasound diagnosis compared to conventional centralized approaches. The authors highlighted the significance of applying AI technologies with FL in real-time healthcare diagnostics while adhering to privacy regulations and ethical standards.
Rahulamathavan et al. [41] proposed a privacy-enhanced speaker verification system using the Cheon–Kim–Kim–Song (CKKS) fully homomorphic encryption scheme to process encrypted speaker features. The system ensures privacy by encrypting speech features during all transmission phases between devices and server systems. The method first extracts i-Vector feature vectors from speech data and then encrypts them using the CKKS scheme. The proposed system demonstrated strong authentication performance and achieved a 2.8% reduction in the Equal Error Rate (EER).
Parra-Ullauri et al. [42] introduced the KubeFlower model, a Kubernetes-native framework that enables privacy-preserving federated learning (FL) in cloud infrastructures. The authors implemented the KubeFlower model in two stages: first, the isolation of storage during cloud design to ensure secure resource partitioning, and second, the integration of differential privacy through Privacy-Preserving Persistent Volume Claims (P3-VC), which adds controlled noise during data processing. The framework supports FL systems distributed across cloud and edge nodes in different geographic locations while maintaining performance under privacy constraints. According to experimental results, the deployment phase of KubeFlower is completed 70% faster than KubeFATE but runs slightly slower than Helm charts.
Vizitiu et al. [43] examined two framework designs aimed at protecting the personal data privacy of wearable medical devices. The ECG signal model ensured raw data privacy by integrating homomorphic encryption algorithms. The authors employed CipherML in combination with homomorphic encryption to achieve higher training speeds while processing encrypted data.
Aziz et al. [44] proposed a new approach integrating homomorphic encryption (HE) and differential privacy (DP) within federated learning (FL) frameworks. The model analyzed potential privacy attacks and mitigated model leakage while maintaining acceptable performance levels using the Multi-Krum algorithm. The authors evaluated the model using deep learning frameworks on the MNIST dataset and achieved a reduction in communication overhead.
Korkmaz et al. [45] proposed a hybrid approach that combines the homomorphic HE technique with a bitwise scrambling algorithm to reduce computation time while enhancing privacy in FL. The model was evaluated on the MNIST and CIFAR-10 datasets using CNNs, achieving an accuracy of over 90%. The approach employed layer-wise encryption, which enables efficient encrypted training without significantly impacting the model’s performance.
Choi et al. [46] introduced a novel approach to enhancing privacy in cloud environments by combining federated learning (FL) with homomorphic encryption (HE). The model was designed using SmartNICs to store encryption keys and encrypt model weights within the FL process. The training results demonstrated reduced resource overhead without impacting accuracy, although training time increased by approximately 46% when evaluated on the MNIST and FEMNIST datasets.
Qiang et al. [47] utilized leveled homomorphic encryption (HE) and binary neural networks (BNNs) to reduce data loss between the training and test datasets. The training stage employed VGG19 and AlexNet architectures within a cloud environment. The framework applied an encryption scheme prior to transferring data to the cloud to enhance privacy preservation. The model was evaluated using the CIFAR-100 dataset, and the results demonstrated reduced memory usage and effective protection of both training and test data privacy.
He et al. [48] employed Paillier Homomorphic Encryption (PL-FedIPEC) to protect user privacy in edge computing environments using an enhanced version of federated learning. The novelty of the work is in the optimization of Paillier encryption through a key generation mechanism that introduces intermediate values and an additional hyperparameter to reduce encryption latency. The model demonstrated reduced training time while preserving accuracy when combined with FedAVG, making it suitable for latency-sensitive applications and reducing computational overhead in real-time systems.
Lin et al. [49] proposed a new federated learning framework designed for medical applications in cloud environments. The model employed Paillier encryption, random number generation, additive secret sharing, and verification via discrete logarithms to ensure integrity and gradient privacy on the server side. The framework, named PPVerifier, was evaluated using the MNIST dataset and achieved an accuracy of 97%. The model maintained this accuracy with minimal loss and demonstrated a reduction in overhead of more than 50% compared to bilinear signature schemes.
Zhang et al. [50] utilized homomorphic encryption and cryptographic masking in a privacy-preserving federated learning model for IoT healthcare systems. The authors emphasized that the evaluation focused on dataset quality rather than size. The proposed framework’s privacy guarantees were validated through experiments using the HAM10000 skin lesion dataset, where it maintained classification accuracy while preserving data confidentiality against inference and reconstruction attacks. This approach offered a scalable, privacy-preserving solution for medical diagnostics, ensuring both data integrity and clinical utility, and achieved over 76.9% accuracy in detecting lesion cell types.
Zhou et al. [51] proposed a hybrid fog-based federated learning framework to protect against various attacks that compromise privacy preservation by integrating differential privacy and Paillier encryption. The framework employs differential privacy and secure aggregation techniques to ensure user confidentiality. It was evaluated on the Fashion-MNIST dataset and achieved high accuracy in IoT applications. The model is particularly well suited for deploying latency-sensitive and data-intensive applications at the cloud edge.
Wang et al. [52] addressed data privacy challenges in electric power systems by proposing a decentralized, distributed solution that combines federated learning with edge computing. Differential privacy techniques were applied at the cloud edge to ensure privacy preservation. The model enables secure electric power load forecasting using existing data. The system is structured into a three-layer cloud–edge architecture to achieve coordination and balanced data distribution. Each layer collects information based on scheduling control tasks to enhance scheduling effectiveness. The proposed scheme demonstrated a relative error of approximately 1.580% in electricity load forecasting and exhibited low memory usage.
Nguyen et al. [53] designed a new framework called FedGAN, which merges federated learning with differential privacy protocols to protect privacy. The FedGAN framework serves as a blockchain platform that automates X-ray image training to identify cases of COVID-19. FedGAN implemented a Proof-of-Reputation (PoR) algorithm as its lightweight protocol for fast verification of COVID-19 cases.
Dong et al. [54] proposed a novel approach to secure medical data analysis by combining federated learning with homomorphic encryption in a distributed data fabric, offering strong privacy guarantees and high model performance. The model employed Homomorphic Adversarial Networks (HANs) integrated with CNNs, and when evaluated on the MNIST and CIFAR-10 datasets, it achieved over 92% accuracy and demonstrated robustness against adversarial privacy breaches.

4.3. Homomorphic Encryption

Hayati et al. [55] proposed a framework for designing coding mechanisms that preserve privacy during data sharing and processing in cloud computing using homomorphic encryption. The model demonstrates a level of differential privacy without compromising algorithm performance and ensures robustness through the training and testing of learning algorithms, achieving an accuracy of approximately 97.75%.
Naresh et al. [56] proposed a privacy-preserving deep neural network model for credit risk prediction in the cloud (PPDNN-CRP). The model is based on homomorphic encryption integrated with deep neural network processing to protect sensitive loan application data during both the training and inference phases. The authors conducted experiments using TensorFlow for DNN operations and TenSEAL for HE. The PPDNN-CRP model was evaluated using the Kaggle loan dataset and achieved an accuracy of 80.48%, which was competitive with Privacy-Preserving Logistic Regression (PPLR), which achieved 77.23%.
Akram et al. [57] addressed the challenge of preserving user privacy when outsourcing deep learning inference tasks to the cloud. The authors proposed a privacy-preserving classification model based on HE. The model enables users to send encrypted queries to the cloud and receive encrypted inferences in return. It utilized the Microsoft SEAL cryptographic library and applied a CNN trained on unencrypted data with real activation functions. The model achieved an accuracy of 98.25% using both the sigmoid and ReLU activation functions when evaluated on the MNIST dataset.
Lam et al. [58] combined two cryptographic techniques, Learning with Errors (LWE) and Ring Learning with Errors (RLWE), to enable high-accuracy encrypted data processing by implementing Fully Homomorphic Encryption (FHE) for enhanced privacy in training data. This hybrid approach was integrated with a convolutional neural network (CNN) architecture by splitting characteristics during training to support efficient performance of both linear and non-linear functions within the CNN. The model was tested on the MNIST dataset and achieved a peak accuracy of 94.80%, along with the shortest testing time, three times faster than other models.
Song et al. [59] ensured that sensitive user data, such as classified images and medical diagnoses, remains confidential by utilizing CKKS Homomorphic Encryption in a cloud computing environment. The proposed framework adds noise to the data during the training phase. The approach applies a homomorphic encryption scheme to image inference, requiring 27.15 s to process 141 images (0.19 s per image) using a batch method. The system maintained an accuracy of 99.05% while preserving data privacy.
Prabhu et al. [60] presented a mutual authentication mechanism using RSA cryptography and Schnorr’s signature scheme to enhance data privacy in cloud computing. The model employed a CNN to strengthen the privacy schema during data transmission. The proposed system maintained consistently high accuracy levels of 99.97% on the WSN-DS dataset and 99.93% on the CIC-IDS2017 dataset.
Li et al. [61] implemented an efficient method for multi-biometric recognition using a neural network-based feature combination of face and voiceprint information while preserving user privacy. The system combined face and voiceprint feature extraction through CNNs, followed by a fully connected layer and ArcFace loss to create a fusion model. The MK-CKKS cryptosystem served as the privacy mechanism, applying multi-key homomorphic encryption to securely process encrypted biometric templates without the need for a trusted third party. The system recorded an Equal Error Rate (EER) of 0.66%, outperforming individual biometric approaches, which recorded 2.42% for face and 11.21% for voiceprint.
Choi et al. [62] presented Blind-Touch, an encryption system for fingerprint authentication that performs secure neural network inference on encrypted data using homomorphic encryption. The model divided the authentication operations into two stages: first, the client decrypts the fully connected layer (FC-16) in plaintext and then transmits it to the server for processing the second encrypted layer (FC-1). The model achieved an F1 score of 93.6% and 98.2% accuracy using the PolyU and SOKOTO datasets, with authentication requiring 0.65 seconds to match among 5000 fingerprints.
Owusu-Agyemang et al. [63] analyzed privacy issues via the MSCryptoNet framework embedded with DL through an examination of current privacy-preserving techniques. MSCryptoNet preserved data privacy by applying homomorphic encryption together with secret-sharing methods. This method provided an open-source implementation of a multi-scheme framework that enables different encryption methods during operations. Compared to other models, MSCryptoNet demonstrated a shorter operation time and lower cost. The module successfully classified instances with 8192 MB or greater capacity in each training iteration. The model operated at a low cost and maintained efficient communication in a cloud environment.
Zhao et al. [64] utilized Multi-Scheme Differential Privacy (MSDP) together with a DNN to protect both the accuracy and privacy aspects of training data. The model applied NTRU encryption along with Multi-Key Fully Homomorphic Encryption (MK-FHE), embedded with dynamic differential privacy. The model consists of four parties: data providers (DP), computational evaluator (CE), data analyst (DA), and crypto service provider (CSP). As part of their proposal, the authors outlined procedures enabling data providers to create cryptographic keys before encrypting data, followed by the integration of Laplace noise. The model interacted with Secure Multi-Party Computation (SMC) using ReLU functions, allowing the module to achieve operational cost reduction.
Yang et al. [65] provided a novel technology called Pio to secure outsourced data services by creating two non-colluding servers based on homomorphic encryption and implementing the inference phases layer by layer to preserve the privacy of sensitive data. The model overcomes the challenges of maintaining confidentiality during the input of training data into the model. The evaluation results showed that Pio was 2 to 4 times faster than existing modules.

4.4. Differential Privacy

Sharma et al. [66] proposed a differential privacy-based framework using a fuzzy convolutional neural network (DP-FCNN) with a Laplace mechanism to enhance user data privacy in mobile edge computing. The framework addressed privacy leakage at the edge layer and unauthorized data access by injecting noise and encrypting data before uploading it to the cloud. The key contributions of the proposed model included a two-step process for data uploading and query serving. To maintain privacy preservation, the model applied an FCNN and the Laplace mechanism. The data was encrypted using the lightweight Piccolo algorithm, and secure searching was enabled via a Merkle hash tree. The model authenticated users using the lightweight BLAKE2s algorithm and achieved an accuracy of 97–98%.
Gayathri et al. [67] proposed a new methodology, a Pseudo-Predictive Deep Denoising Network (PPDDN), for enhancing the reliability of connections between different parties and maintaining data privacy in the cloud. The system ensures that medical images remain invisible to malicious users through the use of Gaussian noise data. The hybrid model between PPDDN and Gaussian noise enhanced privacy and addressed the challenges of medical image confidentiality in cloud computing. The performance of the PPDDN model was measured as the signal-to-noise ratio (SNR) = 24.3165, similarity index (SI) = 0.973, Error Rate (ER) = 0.001, and contrast-to-noise ratio (CNR) = 42.2%.
Bukhari et al. [68] applied a novel approach to enhancing privacy within Wireless Sensor Networks (WSNs) by integrating federated learning (FL) with a hybrid model of a Stacked Convolutional Neural Network and Bidirectional Long Short-Term Memory (SCNN-Bi-LSTM). The model addressed potential legal and privacy concerns associated with data sharing during training on cloud computing platforms. The proposed model achieved consistent accuracy outcomes of 99.97% on the WSN-DS dataset and 99.93% on the CIC-IDS2017 dataset.
Feng et al. [69] presented a novel approach to enhancing privacy and efficiency by integrating chaotic image encryption with convolutional neural networks (CNNs). The integration of chaotic mapping with CNNs was applied for local block encryption, in which images were divided into smaller segments to improve security during the encryption process. The model was evaluated using the Peak Signal-to-Noise Ratio (PSNR) and Structural Similarity Index (SSIM), maintaining high visual quality while ensuring strong encryption.
Zhang et al. [70] proposed a novel model designed for facial expression detection while ensuring user data privacy. The model adopts an edge computing approach by integrating a Time-of-Flight (ToF) depth sensor, specifically the VL53L5CX model, into a wearable cap for unobtrusive monitoring, chosen for its low cost and low power consumption. The algorithm achieved notable accuracy, with CNN delivering the highest performance at 89.20% accuracy, a frame rate of 15 frames per second (fps), and a maximum latency of 2 milliseconds (ms).
Sharma et al. [71] proposed a differential privacy framework based on a fuzzy convolutional neural network (DP-FCNN). DP-FCNN addressed privacy violations by integrating the Laplace mechanism, which adds noise to the training data to protect it from unauthorized access. Before transferring data to the cloud, the module performs encryption using the Piccolo protocol. To facilitate secure and efficient access to cloud-stored data, the model employs the BLAKE2s hashing algorithm. The proposed framework achieved an accuracy of 90%.
A summary of privacy-enhancing technologies for cloud-based deep learning (DL) algorithms is presented in Table 4. This section evaluates privacy preservation techniques with DL, highlighting their individual limitations, the applied mechanisms, and the gaps identified in each research paper.

5. Discussion

Integrating deep learning (DL) into cloud environments for the privacy preservation of sensitive data represents a transformational step toward addressing data security challenges. The analysis of related works reveals several key trends and technical solutions. Federated learning is one of the most prominent techniques for preserving privacy, as it allows sensitive data to remain local while enabling collaborative model training. However, despite its effectiveness, federated learning faces challenges such as communication overhead and vulnerability to model poisoning attacks, which necessitate additional solutions to enhance performance. Meanwhile, homomorphic encryption, though secure, proves less efficient at a large scale in cloud environments due to its significant computational overhead.
Recently, many researchers have focused on integrating DL with privacy-preserving technologies in cloud environments, resulting in multiple solutions based on customized models. These customized models have demonstrated robust privacy-preserving capabilities for protecting user data during model training but often incur high computational costs. On the other hand, privacy preservation in mobile edge computing has shown higher accuracy in preserving data privacy; however, it faces limitations due to constrained resources, particularly when handling large-scale data. Several studies have integrated homomorphic encryption with federated learning to reduce resource overhead, although this often results in increased training times and persistent challenges with computational complexity. Similarly, while differential privacy frameworks demonstrate high accuracy, they continue to face limitations in scalability and demand substantial computational resources.
Hybrid approaches combining multiple privacy-preserving mechanisms often have improved accuracy but introduce greater complexity in implementation and maintenance. For instance, integrating differential privacy with federated learning enhances the protection of sensitive information but requires careful parameter tuning to preserve model accuracy. Despite their benefits, hybrid architectures often face challenges such as increased system complexity, communication overhead, and exposure to certain security threats. These limitations have motivated the development of optimized and customized privacy-preserving models, including architectures that integrate blockchain with federated learning.
Figure 3 presents the distribution of privacy-preserving technologies utilized in the reviewed studies. Federated learning emerged as the most widely adopted technique, appearing in 48.1% of the papers, due to its effectiveness in maintaining data privacy on cloud platforms. Homomorphic encryption was employed in 35.2% of the studies to enhance data confidentiality during model training. Differential privacy was used in 16.7% of the works, often in combination with DL, to mitigate the risk of individual data leakage. These statistics highlight federated learning as the dominant research trend for privacy preservation in cloud environments, while the combination of HE and DP is increasingly explored for building more robust and scalable solutions. As demonstrated by the reviewed studies, the primary research gaps in privacy preservation for cloud computing involve high computational overhead and communication inefficiencies. Furthermore, developing models that protect sensitive information in heterogeneous cloud environments presents scalability challenges, particularly for deep learning frameworks. Addressing these limitations requires the design of lightweight and adaptive privacy-preserving solutions. Future work should focus on developing models that ensure privacy without compromising efficiency, scalability, or user experience.
Despite the significant progress achieved by deep learning techniques in enhancing data privacy within cloud environments, several key directions remain for future research. Developing more robust, efficient, and scalable privacy-preserving systems is essential. In particular, optimizing the computational efficiency of techniques such as homomorphic encryption and differential privacy has emerged as a critical area of ongoing investigation. When developing deep learning models integrated with cryptographic algorithms and federated learning protocols, it is essential to prioritize energy efficiency and low latency. These characteristics are critical to enabling privacy-preserving models suitable for large-scale cloud deployments. An adaptive approach can enhance the resilience of cloud systems against cyber threats, such as adversarial attacks, while also improving model accuracy, system performance, and privacy preservation. Finally, the integration of blockchain technology with DL for decentralized trust management represents a promising research direction in the context of privacy preservation in cloud computing. There remains a critical need to standardize evaluation frameworks for privacy-preserving DL models deployed in cloud environments. Furthermore, the development of robust security metrics and compliance frameworks is essential to ensure consistent and reliable privacy protection across diverse cloud infrastructures.

6. Future Directions

Despite the strong potential of deep learning (DL) in enhancing data privacy within cloud environments, current privacy-preserving mechanisms, particularly those combining federated learning, homomorphic encryption, and differential privacy, still face notable technical challenges. These include high computational complexity, significant communication overhead, and performance degradation in heterogeneous cloud systems. To address these limitations, future research should focus on developing more specialized, efficient, and scalable solutions that leverage emerging technological advancements.
One promising direction for future research is the integration of quantum computing into privacy-preserving frameworks. Although quantum computing threatens classical cryptographic systems, it also has the potential to enhance secure computation through quantum-safe encryption and quantum key distribution (QKD). Combining privacy-preserving DL models with quantum-enhanced encryption can offer secure computation with lower latency and stronger guarantees against cyber threats. Additionally, incorporating post-quantum cryptographic techniques into DL-based cloud infrastructures can help ensure long-term data confidentiality and system robustness.
Investigating blockchain technology is another attractive direction for future work on privacy preservation. Blockchain provides a decentralized, tamper-evident ledger for recording updates to DL models, ensuring access control and trust in collaborative learning environments. With the aid of smart contracts, model transactions and data usage policies can be automated in compliance with privacy requirements and operate independently of a centralized authority. Developing lightweight blockchain frameworks would complement high-frequency DL training, and consensus mechanisms can be tailored to asynchronous cloud environments to overcome performance bottlenecks.
More importantly, ensuring scalability will require the development of adaptive privacy-preserving architectures capable of monitoring resource consumption, system states, and user demand in real time. Future research should prioritize enhancing DL models to make them suitable for deployment on resource-constrained cloud nodes. Techniques such as neural architecture search (NAS) and model pruning can be employed to optimize model efficiency while maintaining robust privacy guarantees.
In addition, there is a critical need for standardized measurement metrics and compliance frameworks for privacy-preserving DL models. Defining quantifiable privacy metrics, establishing best practices, and aligning cloud-based DL systems with domain-specific regulatory standards such as the General Data Protection Regulation (GDPR) and the Health Insurance Portability and Accountability Act (HIPAA) would enable fair benchmarking, promote transparency, and support practical and trustworthy deployment.

7. Proposed Work

Deep learning models can be deployed flexibly and scalably in cloud computing, but privacy issues arise when sensitive data and model updates are transmitted to or processed in the cloud. FL provides a decentralized model in which local data remain on devices and only gradients are shared as model updates.
However, information can still leak through the shared gradients, so additional privacy preservation mechanisms are necessary.
Newly proposed privacy-preserving deep learning methods for the cloud should therefore be able to securely pool and average encrypted gradients while still satisfying DP requirements. The authors propose to use an LSTM deep learning model, train on a benchmark dataset such as MNIST, apply federated learning combined with homomorphic encryption (using the key pair described in Section 7.4), add Laplace noise for differential privacy, and assess accuracy and F1 score, execution speed, and the number of privacy breaches. Comparable baselines without encryption, or with relaxed privacy requirements, should also be included. This protocol provides a standard for evaluating privacy-preserving systems, since it operates under common cloud conditions and communication patterns, which helps create secure and reliable deep learning applications.

7.1. Fully Federated Learning with LSTM Models

Federated learning involves several clients collaboratively learning a shared global model without any direct exchange of raw data. Each client trains an LSTM neural network on its local portion of the MNIST dataset and uploads an encrypted gradient to the server.
  • LSTM Model: LSTM models are selected because they can process MNIST images as sequences (for example, one 28-pixel row per time step); their memory cells and gating mechanisms capture long-term dependencies.
  • Local Training: Each client $i$ computes gradients ($g_i$) from its local data.
  • Global Model Update: The server combines these gradients to update the global model; a sketch of the client-side step follows this list.
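The client-side step can be sketched as follows. This is a minimal illustration, assuming PyTorch/torchvision, a single-batch gradient computation, and MNIST rows treated as a 28-step sequence; the hyperparameters are placeholders, not values prescribed by the protocol.

```python
# Hedged sketch: each client trains the shared LSTM on its local MNIST shard
# and extracts the flattened gradient vector g_i to be encrypted and uploaded.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

class LSTMClassifier(nn.Module):
    def __init__(self):
        super().__init__()
        self.lstm = nn.LSTM(input_size=28, hidden_size=128, batch_first=True)
        self.fc = nn.Linear(128, 10)

    def forward(self, x):                        # x: (batch, 28, 28)
        out, _ = self.lstm(x)
        return self.fc(out[:, -1, :])

def local_gradient(model, shard):
    """One local pass: return the flattened gradient g_i on the client shard."""
    loss_fn = nn.CrossEntropyLoss()
    model.zero_grad()
    for images, labels in DataLoader(shard, batch_size=64):
        logits = model(images.squeeze(1))        # (b, 1, 28, 28) -> (b, 28, 28)
        loss_fn(logits, labels).backward()
        break                                    # one batch keeps the sketch short
    return torch.cat([p.grad.flatten() for p in model.parameters()])

mnist = datasets.MNIST("data", train=True, download=True,
                       transform=transforms.ToTensor())
g_i = local_gradient(LSTMClassifier(), mnist)
print(g_i.shape)    # one gradient vector per client, ready for encryption
```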

7.2. Homomorphic Encryption for Secure Gradient Aggregation

Homomorphic encryption enables computation on encrypted data without decryption, thereby preserving privacy. This prevents individual gradients from being accessed by the server or by malicious clients and mitigates the threat of model inversion and membership inference attacks.
  • Encryption Scheme: Each local gradient $g_i$ is encrypted as $\mathrm{Enc}(g_i)$ using a fully homomorphic encryption scheme.
  • Key Management: The public key is distributed to clients for encryption, while the secret key is kept by the server or shared only with trusted parties.
  • Gradient Aggregation: The server computes the sum of the encrypted gradients, $\sum_i \mathrm{Enc}(g_i)$, without access to any individual gradient.
  • Decryption: The server decrypts only the aggregated gradient to update the global model; a runnable sketch of this aggregation follows this list.
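A minimal runnable sketch of encrypted aggregation is shown below. The proposal calls for a fully homomorphic scheme; for illustration we use the additively homomorphic Paillier scheme from the python-paillier package (`pip install phe`), which suffices for summing gradients. The key size and client count are assumptions.

```python
# Hedged sketch: the server adds ciphertexts without ever seeing an
# individual client's gradient; only the key holder can decrypt the sum.
from phe import paillier

public_key, private_key = paillier.generate_paillier_keypair(n_length=2048)

# Three clients, each holding a small gradient vector (toy values).
client_gradients = [[0.12, -0.40, 0.05],
                    [0.08, -0.35, 0.02],
                    [0.10, -0.42, 0.07]]

# Clients encrypt element-wise with the public key.
encrypted = [[public_key.encrypt(g) for g in grad] for grad in client_gradients]

# Homomorphic aggregation on the server: ciphertext + ciphertext.
agg = encrypted[0]
for enc_grad in encrypted[1:]:
    agg = [a + b for a, b in zip(agg, enc_grad)]

# Decrypt only the aggregate.
total = [private_key.decrypt(c) for c in agg]
print(total)    # element-wise sum of the three client gradients
```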

7.3. Differential Privacy with Laplace Noise

Differential privacy injects controlled noise into the gradients in order to offer formal privacy guarantees.
  • Laplace Mechanism: Adds noise sampled from the Laplace distribution $\mathrm{Lap}(\Delta f / \epsilon)$ to the aggregated gradients, where $\Delta f$ is the sensitivity and $\epsilon$ is the privacy budget.
  • Privacy Budget $\epsilon$: Regulates the trade-off between accuracy and privacy; a smaller $\epsilon$ gives stronger privacy but may lower accuracy.
  • Noise Addition: Adding the noise after homomorphic aggregation avoids injecting excessive noise into individual gradients, improving the utility of the model.
  • Privacy Accounting: The total privacy loss must be tracked across training rounds so that the DP guarantee continues to hold; see the sketch after this list.
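The mechanism and its accounting can be sketched as follows. The L1 clipping bound that fixes the sensitivity at $\Delta f = 1$ is an illustrative assumption, not part of the original proposal.

```python
# Hedged sketch of the Laplace mechanism on the decrypted aggregate,
# with basic sequential composition for privacy accounting.
import numpy as np

def laplace_mechanism(aggregated, sensitivity, epsilon, rng):
    """Add Lap(sensitivity/epsilon) noise to each coordinate of the aggregate."""
    scale = sensitivity / epsilon
    return aggregated + rng.laplace(loc=0.0, scale=scale, size=aggregated.shape)

rng = np.random.default_rng(0)
epsilon_per_round, rounds = 0.5, 20
G_t = np.array([0.30, -1.17, 0.14])     # decrypted aggregate from Section 7.2
G_hat = laplace_mechanism(G_t, sensitivity=1.0,
                          epsilon=epsilon_per_round, rng=rng)

# Basic (sequential) composition: total privacy loss grows linearly in rounds.
print(f"noisy aggregate: {G_hat}")
print(f"total epsilon after {rounds} rounds: {epsilon_per_round * rounds}")
```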

7.4. Algorithmic Description (Privacy-Preserving Federated Learning with HE and DP)

1. Initialization:
  • The server initializes the global model parameters $\theta^0$.
  • The server generates a homomorphic encryption key pair $(pk, sk)$.
  • The server distributes $pk$ to all clients.
2. Training rounds. For each round $t = 1, 2, \dots, T$:
Client side (each client $i = 1, \dots, n$, in parallel):
  • Client $i$ computes the local gradient: $g_i^t = \nabla_{\theta} L(\theta^{t-1}, D_i)$.
  • Client $i$ encrypts $g_i^t$ using the public key: $\tilde{g}_i^t = \mathrm{Enc}_{pk}(g_i^t)$.
  • Client $i$ sends $\tilde{g}_i^t$ to the server.
Server side:
  • The server aggregates the encrypted gradients homomorphically: $\tilde{G}^t = \sum_{i=1}^{n} \tilde{g}_i^t$.
  • The server decrypts the aggregate: $G^t = \mathrm{Dec}_{sk}(\tilde{G}^t)$.
  • The server adds Laplace noise for differential privacy: $\hat{G}^t = G^t + \mathrm{Lap}(\Delta f / \epsilon)$.
  • The server updates the global model parameters: $\theta^t = \theta^{t-1} - \eta \hat{G}^t$.
  • The server broadcasts the updated model $\theta^t$ to all clients.
3. Training is repeated until the maximum number of rounds $T$ is reached.
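To make the control flow concrete, the following self-contained simulation runs a few rounds of the algorithm with encryption abstracted to identity functions and gradients drawn at random; it illustrates the message pattern of one round, not a real deployment.

```python
# Hedged end-to-end simulation of the training round above. Enc/Dec are
# stand-ins for the homomorphic scheme of Section 7.2; gradients are random
# placeholders for the clients' true LSTM gradients.
import numpy as np

rng = np.random.default_rng(0)
n_clients, dim, eta, eps, sens = 5, 8, 0.1, 0.5, 1.0
theta = np.zeros(dim)                      # step 1: initialize theta^0

Enc = Dec = lambda x: x                    # stand-ins for Enc_pk / Dec_sk

def clip_l1(g, bound=1.0):
    """Clip to L1 norm <= bound so the Laplace sensitivity is bounded."""
    norm = np.abs(g).sum()
    return g if norm <= bound else g * bound / norm

for t in range(1, 4):                      # a few rounds, T = 3
    # Client side: local gradient, clip, encrypt, send.
    enc_grads = [Enc(clip_l1(rng.normal(size=dim))) for _ in range(n_clients)]
    # Server side: homomorphic sum, decrypt, add Laplace noise, update.
    G = Dec(sum(enc_grads))
    G_hat = G + rng.laplace(scale=sens / eps, size=dim)
    theta = theta - eta * G_hat            # broadcast theta to all clients
print(theta)
```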

7.5. Experimental Setup and Evaluation

  • Dataset: The MNIST dataset.
  • Model: An LSTM network for sequence modeling.
  • Metrics (see the sketch after this list):
    Accuracy and F1 Score: Measure the classification performance of the LSTM model.
    Execution Speed: Measure the time per training round, including encryption and decryption overhead.
    Privacy Leakage: Measure privacy preservation empirically, e.g., by mounting membership inference attacks.
  • Baselines: The authors will compare the LSTM model under different privacy preservation configurations, such as the following:
    Federated learning with DP and HE.
    Federated learning without encryption or DP.
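A minimal sketch of the evaluation loop is given below, assuming scikit-learn for the metrics; the training and prediction callables are placeholders for the actual LSTM pipeline under each privacy configuration.

```python
# Hedged sketch: per-round accuracy, macro F1, and wall-clock time, so that
# the HE/DP configurations can be compared against the plain-FL baselines.
import time
from sklearn.metrics import accuracy_score, f1_score

def evaluate_round(train_fn, y_true, predict_fn):
    start = time.perf_counter()
    train_fn()                              # one round incl. Enc/Dec overhead
    elapsed = time.perf_counter() - start
    y_pred = predict_fn()
    return {"accuracy": accuracy_score(y_true, y_pred),
            "f1": f1_score(y_true, y_pred, average="macro"),
            "seconds_per_round": elapsed}

# Toy stand-ins so the sketch runs end to end.
y_true = [0, 1, 2, 1, 0, 2]
metrics = evaluate_round(lambda: time.sleep(0.01), y_true,
                         lambda: [0, 1, 2, 2, 0, 2])
print(metrics)
```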

8. Conclusions

In conclusion, this review paper highlights the critical importance of enhancing privacy preservation through the integration of deep learning (DL) with privacy-preserving technologies to address the challenges of sensitive data breaches in cloud environments. While cloud computing offers numerous advantages, its inherent reliance on distributed data storage introduces various threats, including unauthorized access, insider attacks, and data leakage. The vast volume of data stored in the cloud necessitates robust and scalable protection mechanisms, which has led to the adoption of distributed approaches such as federated learning, homomorphic encryption, and differential privacy. However, these techniques, when deployed individually, exhibit notable limitations, including high computational overhead, communication inefficiencies, and decreased model accuracy. This study demonstrates that standalone DL techniques are insufficient to fully safeguard sensitive information in the cloud. Instead, hybrid approaches that combine multiple privacy-preserving mechanisms with DL have shown greater potential in effectively securing data. As a direction for future research, continuous innovation is essential to strike a balance between privacy, security, and operational efficiency. This includes minimizing computational and communication overheads while developing optimized hybrid models for privacy preservation in cloud-based environments.

Author Contributions

Both authors have equally contributed. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by the Deanship of Scientific Research, Vice Presidency for Graduate Studies and Scientific Research, King Faisal University, Saudi Arabia [Grant No. KFU252332].

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The original contributions presented in this study are included in the article. Further inquiries can be directed to the corresponding author.

Acknowledgments

The authors wish to express their gratitude to the Deanship of Scientific Research, Vice Presidency of Graduate Studies and Scientific Research, King Faisal University, Saudi Arabia. We would also like to acknowledge the anonymous reviewers, whose insightful scholarly comments and recommendations greatly improved the quality of the paper.

Conflicts of Interest

The authors declare no conflicts of interest.

Abbreviations

The following abbreviations were used in this review paper:
CNN: Convolutional neural network
DL: Deep learning
FL: Federated learning
DP: Differential privacy
CC: Cloud computing
PPDL: Privacy-preserving deep learning
RNN: Recurrent neural network
LSTM: Long short-term memory
GAN: Generative adversarial network
HE: Homomorphic encryption
DP-DL: Differentially private deep learning
API: Application programming interface
DDoS: Distributed denial of service
VPPFL: Verifiable privacy-preserving federated learning
FCNN: Fully connected neural network
MEC: Mobile edge computing
PBR: Privacy budget regulation
DAW: Dynamic aggregation weights
ESPs: Energy service providers
QoS: Quality of service
HFL: Hierarchical federated learning
LSFL: Lightweight secure federated learning
CKKS: Cheon–Kim–Kim–Song
P3-VC: Privacy-preserving persistent volume claims
BNN: Binary neural networks
PL-FedIPEC: Paillier homomorphic encryption
CFL: Clustered federated learning
CRP: Credit risk prediction
RLWE: Ring learning with errors
HANs: Homomorphic adversarial networks
PPDDN: Pseudo-predictive deep denoising network
WSNs: Wireless sensor networks
MSDP: Multi-scheme differential privacy
MK-FHE: Multi-key fully homomorphic encryption
GDPR: General Data Protection Regulation
HIPAA: Health Insurance Portability and Accountability Act

References

  1. Zhang, Z.; Ning, H.; Shi, F.; Farha, F.; Xu, Y.; Xu, J.; Zhang, F.; Choo, K.K.R. Artificial intelligence in cyber security: Research advances, challenges, and opportunities. Artif. Intell. Rev. 2022, 55, 1029–1053. [Google Scholar] [CrossRef]
  2. Kaur, A.; Luthra, M.P. A review on load balancing in cloud environment. Int. J. Comput. Technol. 2018, 17, 7120–7125. [Google Scholar] [CrossRef]
  3. Wang, S.; Tuor, T.; Salonidis, T.; Leung, K.K.; Makaya, C.; He, T.; Chan, K. Adaptive federated learning in resource constrained edge computing systems. IEEE J. Sel. Areas Commun. 2019, 37, 1205–1221. [Google Scholar] [CrossRef]
  4. Rahman, A.; Hasan, K.; Kundu, D.; Islam, M.J.; Debnath, T.; Band, S.S.; Kumar, N. On the ICN-IoT with federated learning integration of communication: Concepts, security-privacy issues, applications, and future perspectives. Future Gener. Comput. Syst. 2023, 138, 61–88. [Google Scholar] [CrossRef]
  5. Mothukuri, V.; Parizi, R.M.; Pouriyeh, S.; Huang, Y.; Dehghantanha, A.; Srivastava, G. A survey on security and privacy of federated learning. Future Gener. Comput. Syst. 2021, 115, 619–640. [Google Scholar] [CrossRef]
  6. Li, L.; Li, X.; Jiang, L.; Su, X.; Chen, F. A review on deep learning techniques for cloud detection methodologies and challenges. Signal Image Video Process. 2021, 15, 1527–1535. [Google Scholar] [CrossRef]
  7. Bishukarma, R. Privacy-preserving based encryption techniques for securing data in cloud computing environments. Int. J. Sci. Res. Arch. 2023, 9, 1014–1025. [Google Scholar] [CrossRef]
  8. Teerapittayanon, S.; McDanel, B.; Kung, H.T. Distributed deep neural networks over the cloud, the edge and end devices. In Proceedings of the 2017 IEEE 37th International Conference on Distributed Computing Systems (ICDCS), Atlanta, GA, USA, 5–8 June 2017; pp. 328–339. [Google Scholar]
  9. Nita, S.L.; Mihailescu, M.I. On artificial neural network used in cloud computing security-a survey. In Proceedings of the 2018 10th International Conference on Electronics, Computers and Artificial Intelligence (ECAI), Iasi, Romania, 28–30 June 2018; pp. 1–6. [Google Scholar]
  10. De Azambuja, A.J.G.; Plesker, C.; Schützer, K.; Anderl, R.; Schleich, B.; Almeida, V.R. Artificial intelligence-based cyber security in the context of industry 4.0—A survey. Electronics 2023, 12, 1920. [Google Scholar] [CrossRef]
  11. Agarwal, A.; Khari, M.; Singh, R. Detection of DDoS attack using deep learning model in cloud storage application. Wirel. Pers. Commun. 2021. Available online: https://link.springer.com/article/10.1007/s11277-021-08271-z (accessed on 4 December 2023).
  12. Abadi, M.; Chu, A.; Goodfellow, I.; McMahan, H.B.; Mironov, I.; Talwar, K.; Zhang, L. Deep learning with differential privacy. In Proceedings of the 2016 ACM SIGSAC Conference on Computer and Communications Security, Vienna, Austria, 24–28 October 2016; pp. 308–318. [Google Scholar]
  13. Liu, B.; Li, Y.; Liu, Y.; Guo, Y.; Chen, X. Pmc: A privacy-preserving deep learning model customization framework for edge computing. Proc. ACM Interact. Mobile Wearable Ubiquitous Technol. 2020, 4, 1–25. [Google Scholar] [CrossRef]
  14. Wang, H.; Yang, T.; Ding, Y.; Tang, S.; Wang, Y. VPPFL: Verifiable Privacy-Preserving Federated Learning in Cloud Environment. IEEE Access 2024, 12, 151998–152008. [Google Scholar] [CrossRef]
  15. Suganya, M.; Prabha, T. A Comprehensive Analysis of Data Breaches and Data Security Challenges in Cloud Environment. In Proceedings of the 7th International Conference on Innovations and Research in Technology and Engineering (ICIRTE-2022), Organized by VPPCOE & VA, Mumbai, India, 9–10 April 2022. [Google Scholar]
  16. Barona, R.; Anita, E.M. A survey on data breach challenges in cloud computing security: Issues and threats. In Proceedings of the 2017 International Conference on Circuit, Power and Computing Technologies (ICCPCT), Kollam, India, 20–21 April 2017; pp. 1–8. [Google Scholar]
  17. Gilad-Bachrach, R.; Dowlin, N.; Laine, K.; Lauter, K.; Naehrig, M.; Wernsing, J. Cryptonets: Applying neural networks to encrypted data with high throughput and accuracy. In Proceedings of the International Conference on Machine Learning, New York, NY, USA, 19–24 June 2016; pp. 201–210, PMLR. [Google Scholar]
  18. Choraś, M.; Pawlicki, M. Intrusion detection approach based on optimised artificial neural network. Neurocomputing 2021, 452, 705–715. [Google Scholar] [CrossRef]
  19. McMahan, B.; Moore, E.; Ramage, D.; Hampson, S.; y Arcas, B.A. Communication-efficient learning of deep networks from decentralized data. In Proceedings of the Artificial Intelligence and Statistics, PMLR, Fort Lauderdale, FL, USA, 20–22 April 2017; pp. 1273–1282. [Google Scholar]
  20. Yang, Q.; Liu, Y.; Chen, T.; Tong, Y. Federated machine learning: Concept and applications. ACM Trans. Intell. Syst. Technol. (TIST) 2019, 10, 1–19. [Google Scholar] [CrossRef]
  21. Innocent, A. Cloud infrastructure service management—A review. arXiv 2012, arXiv:1206.6016. [Google Scholar]
  22. Mushtaq, M.F.; Akram, U.; Khan, I.; Khan, S.N.; Shahzad, A.; Ullah, A. Cloud computing environment and security challenges: A review. Int. J. Adv. Comput. Sci. Appl. 2017, 8, 183–195. [Google Scholar]
  23. Sinnott, R.O.; Cui, S. Benchmarking sentiment analysis approaches on the cloud. In Proceedings of the 2016 IEEE 22nd International Conference on Parallel and Distributed Systems (ICPADS), Wuhan, China, 13–16 December 2016; pp. 695–704. [Google Scholar]
  24. Katal, A.; Dahiya, S.; Choudhury, T. Energy efficiency in cloud computing data centers: A survey on software technologies. Clust. Comput. 2023, 26, 1845–1875. [Google Scholar] [CrossRef]
  25. Chan, K.Y.; Abu-Salih, B.; Qaddoura, R.; Ala’M, A.Z.; Palade, V.; Pham, D.S.; Del Ser, J.; Muhammad, K. Deep Neural Networks in the Cloud: Review, Applications, Challenges and Research Directions. Neurocomputing 2023, 545, 126327. [Google Scholar] [CrossRef]
  26. Rashid, A.; Dar, R.D. Survey on scalability in cloud environment. Int. J. Adv. Res. Comput. Eng. Technol. (IJARCET) 2016, 5, 2124–2128. [Google Scholar]
  27. Li, Z.; Duan, M.; Yu, S.; Yang, W. DynamicNet: Efficient federated learning for mobile edge computing with dynamic privacy budget and aggregation weights. IEEE Trans. Consum. Electron. 2024; early access. [Google Scholar]
  28. Achar, S. An overview of environmental scalability and security in hybrid cloud infrastructure designs. Asia Pac. J. Energy Environ. 2021, 8, 39–46. [Google Scholar] [CrossRef]
  29. Vadisetty, R. Privacy-Preserving Machine Learning Techniques for Data in Multi Cloud Environments. Corros. Manag. 2020, 30, 57–74. [Google Scholar]
  30. Abbas, Z. Securing Cloud-Based AI and Machine Learning Models: Privacy and Ethical Concerns. OSF, 2023; preprint. [Google Scholar] [CrossRef]
  31. Gupta, R.; Gupta, I.; Saxena, D.; Singh, A.K. A differential approach and deep neural network based data privacy-preserving model in cloud environment. J. Ambient. Intell. Humaniz. Comput. 2023, 14, 4659–4674. [Google Scholar] [CrossRef]
  32. Verma, G. Blockchain-based privacy preservation framework for healthcare data in cloud environment. J. Exp. Theor. Artif. Intell. 2024, 36, 147–160. [Google Scholar] [CrossRef]
  33. Dixit, P.; Silakari, S. Deep learning algorithms for cybersecurity applications: A technological and status review. Comput. Sci. Rev. 2021, 39, 100317. [Google Scholar] [CrossRef]
  34. Dean, J.; Corrado, G.; Monga, R.; Chen, K.; Devin, M.; Mao, M.; Ranzato, M.; Senior, A.; Tucker, P.; Yang, K.; et al. A Large scale distributed deep networks. In Proceedings of the Advances in Neural Information Processing Systems, Lake Tahoe, NV, USA, 3–6 December 2012; Volume 25. [Google Scholar]
  35. Zheng, K.; Jiang, G.; Liu, X.; Chi, K.; Yao, X.; Liu, J. DRL-based offloading for computation delay minimization in wireless-powered multi-access edge computing. IEEE Trans. Commun. 2023, 71, 1755–1770. [Google Scholar] [CrossRef]
  36. Su, Z.; Wang, Y.; Luan, T.H.; Zhang, N.; Li, F.; Chen, T.; Cao, H. Secure and efficient federated learning for smart grid with edge-cloud collaboration. IEEE Trans. Ind. Inform. 2021, 18, 1333–1344. [Google Scholar] [CrossRef]
  37. Zhou, J.; Pal, S.; Dong, C.; Wang, K. Enhancing quality of service through federated learning in edge-cloud architecture. Ad Hoc Networks 2024, 156, 103430. [Google Scholar] [CrossRef]
  38. Makkar, A.; Ghosh, U.; Rawat, D.B.; Abawajy, J.H. Fedlearnsp: Preserving privacy and security using federated learning and edge computing. IEEE Consum. Electron. Mag. 2021, 11, 21–27. [Google Scholar] [CrossRef]
  39. Liu, L.; Zhang, J.; Song, S.H.; Letaief, K.B. Client-edge-cloud hierarchical federated learning. In Proceedings of the ICC 2020-2020 IEEE International Conference on Communications (ICC), Virtual, 7–11 June 2020; pp. 1–6. [Google Scholar]
  40. Zhang, Z.; Wu, L.; Ma, C.; Li, J.; Wang, J.; Wang, Q.; Yu, S. LSFL: A lightweight and secure federated learning scheme for edge computing. IEEE Trans. Inf. Forensics Secur. 2022, 18, 365–379. [Google Scholar] [CrossRef]
  41. Qayyum, A.; Ahmad, K.; Ahsan, M.A.; Al-Fuqaha, A.; Qadir, J. Collaborative federated learning for healthcare: Multi-modal covid-19 diagnosis at the edge. IEEE Open J. Comput. Soc. 2022, 3, 172–184. [Google Scholar] [CrossRef]
  42. Rahulamathavan, Y. Privacy-preserving similarity calculation of speaker features using fully homomorphic encryption. arXiv 2022, arXiv:2202.07994. [Google Scholar]
  43. Parra-Ullauri, J.M.; Madhukumar, H.; Nicolaescu, A.C.; Zhang, X.; Bravalheri, A.; Hussain, R.; Simeonidou, D. kubeFlower: A privacy-preserving framework for Kubernetes-based federated learning in cloud–edge environments. Future Gener. Comput. Syst. 2024, 157, 558–572. [Google Scholar] [CrossRef]
  44. Vizitiu, A.; Nita, C.I.; Toev, R.M.; Suditu, T.; Suciu, C.; Itu, L.M. Framework for Privacy-Preserving Wearable Health Data Analysis: Proof-of-Concept Study for Atrial Fibrillation Detection. Appl. Sci. 2021, 11, 9049. [Google Scholar] [CrossRef]
  45. Aziz, R.; Banerjee, S.; Bouzefrane, S.; Le Vinh, T. Exploring homomorphic encryption and differential privacy techniques towards secure federated learning paradigm. Future Internet 2023, 15, 310. [Google Scholar] [CrossRef]
  46. Korkmaz, A.; Rao, P. A Selective Homomorphic Encryption Approach for Faster Privacy-Preserving Federated Learning. arXiv 2025, arXiv:2501.12911. [Google Scholar]
  47. Choi, S.; Patel, D.; Zad Tootaghaj, D.; Cao, L.; Ahmed, F.; Sharma, P. FedNIC: Enhancing privacy-preserving federated learning via homomorphic encryption offload on SmartNIC. Front. Comput. Sci. 2024, 6, 1465352. [Google Scholar] [CrossRef]
  48. Qiang, W.; Liu, R.; Jin, H. Defending CNN against privacy leakage in edge computing via binary neural networks. Future Gener. Comput. Syst. 2021, 125, 460–470. [Google Scholar] [CrossRef]
  49. He, C.; Liu, G.; Guo, S.; Yang, Y. Privacy-preserving and low-latency federated learning in edge computing. IEEE Internet Things J. 2022, 9, 20149–20159. [Google Scholar] [CrossRef]
  50. Lin, L.; Zhang, X. PPVerifier: A privacy-preserving and verifiable federated learning method in cloud-edge collaborative computing environment. IEEE Internet Things J. 2022, 10, 8878–8892. [Google Scholar] [CrossRef]
  51. Zhang, L.; Xu, J.; Vijayakumar, P.; Sharma, P.K.; Ghosh, U. Homomorphic encryption-based privacy-preserving federated learning in IoT-enabled healthcare system. IEEE Trans. Netw. Sci. Eng. 2022, 10, 2864–2880. [Google Scholar] [CrossRef]
  52. Zhou, C.; Fu, A.; Yu, S.; Yang, W.; Wang, H.; Zhang, Y. Privacy-preserving federated learning in fog computing. IEEE Internet Things J. 2020, 7, 10782–10793. [Google Scholar] [CrossRef]
  53. Wang, H.; Zhao, Y.; He, S.; Xiao, Y.; Tang, J.; Cai, Z. Federated learning-based privacy-preserving electricity load forecasting scheme in edge computing scenario. Int. J. Commun. Syst. 2024, 37, e5670. [Google Scholar] [CrossRef]
  54. Nguyen, D.C.; Ding, M.; Pathirana, P.N.; Seneviratne, A.; Zomaya, A.Y. Federated learning for COVID-19 detection with generative adversarial networks in edge cloud computing. IEEE Internet Things J. 2021, 9, 10257–10271. [Google Scholar] [CrossRef]
  55. Dong, W.; Lin, C.; He, X.; Huang, X.; Xu, S. Privacy-Preserving Federated Learning via Homomorphic Adversarial Networks. arXiv 2024, arXiv:2412.01650. [Google Scholar]
  56. Hayati, H.; van de Wouw, N.; Murguia, C. Privacy in Cloud Computing through Immersion-based Coding. arXiv 2024, arXiv:2403.04485. [Google Scholar]
  57. Zhao, L.; Wang, Q.; Zou, Q.; Zhang, Y.; Chen, Y. Privacy-preserving collaborative deep learning with unreliable participants. IEEE Trans. Inf. Forensics Secur. 2020, 15, 1486–1500. [Google Scholar] [CrossRef]
  58. Naresh, V.S. PPDNN-CRP: Privacy-preserving deep neural network processing for credit risk prediction in cloud: A homomorphic encryption-based approach. J. Cloud Comput. 2024, 13, 149. [Google Scholar] [CrossRef]
  59. Akram, A.; Khan, F.; Tahir, S.; Iqbal, A.; Shah, S.A.; Baz, A. Privacy preserving inference for deep neural networks: Optimizing homomorphic encryption for efficient and secure classification. IEEE Access 2024, 12, 15684–15695. [Google Scholar] [CrossRef]
  60. Yang, X.; Chen, J.; He, K.; Bai, H.; Wu, C.; Du, R. Efficient privacy-preserving inference outsourcing for convolutional neural networks. IEEE Trans. Inf. Forensics Secur. 2023, 18, 4815–4829. [Google Scholar] [CrossRef]
  61. Lam, K.Y.; Lu, X.; Zhang, L.; Wang, X.; Wang, H.; Goh, S.Q. Efficient FHE-based privacy-enhanced neural network for trustworthy AI-as-a-service. IEEE Trans. Dependable Secur. Comput. 2024, 21, 4451–4468. [Google Scholar] [CrossRef]
  62. Song, C.; Huang, R. Secure convolution neural network inference based on homomorphic encryption. Appl. Sci. 2023, 13, 6117. [Google Scholar] [CrossRef]
  63. Prabhu, M.; Revathy, G.; Kumar, R.R. Deep learning based authentication secure data storing in cloud computing. Int. J. Comput. Eng. Optim. 2023, 1, 10–14. [Google Scholar]
  64. Li, L.; Zhu, H.; Zheng, Y.; Wang, F.; Lu, R.; Li, H. Efficient and privacy-preserving fusion based multi-biometric recognition. In Proceedings of the GLOBECOM 2022-2022 IEEE Global Communications Conference, Rio de Janeiro, Brazil, 4–8 December 2022; pp. 4860–4865. [Google Scholar]
  65. Choi, H.; Woo, S.S.; Kim, H. Blind-touch: Homomorphic encryption-based distributed neural network inference for privacy-preserving fingerprint authentication. In Proceedings of the AAAI Conference on Artificial Intelligence, Stanford, CA, USA, 25–27 March 2024; Volume 38, pp. 21976–21985. [Google Scholar]
  66. Owusu-Agyemeng, K.; Qin, Z.; Xiong, H.; Liu, Y.; Zhuang, T.; Qin, Z. MSDP: Multi-scheme privacy-preserving deep learning via differential privacy. Pers. Ubiquitous Comput. 2023, 27, 221–233. [Google Scholar] [CrossRef]
  67. Sharma, J.; Kim, D.; Lee, A.; Seo, D. On differential privacy-based framework for enhancing user data privacy in mobile edge computing environment. IEEE Access 2021, 9, 38107–38118. [Google Scholar] [CrossRef]
  68. Gayathri, S.; Gowri, S. Securing medical image privacy in cloud using deep learning network. J. Cloud Comput. 2023, 12, 40. [Google Scholar]
  69. Bukhari, S.M.S.; Zafar, M.H.; Abou Houran, M.; Moosavi, S.K.R.; Mansoor, M.; Muaaz, M.; Sanfilippo, F. Secure and privacy-preserving intrusion detection in wireless sensor networks: Federated learning with SCNN-Bi-LSTM for enhanced reliability. Ad Hoc Netw. 2024, 155, 103407. [Google Scholar] [CrossRef]
  70. Feng, L.; Du, J.; Fu, C.; Song, W. Image encryption algorithm combining chaotic image encryption and convolutional neural network. Electronics 2023, 12, 3455. [Google Scholar] [CrossRef]
  71. Sharma, J.; Kim, D.; Lee, A.; Seo, D. Differential privacy using fuzzy convolution neural network (DP-FCNN) with Laplace mechanism and authenticated access in edge computing. In Proceedings of the 2021 15th International Conference on Ubiquitous Information Management and Communication (IMCOM), Seoul, Republic of Korea, 4–6 January 2021; pp. 1–5. [Google Scholar]
Figure 1. PRISMA model for literature review.
Figure 2. Distribution of papers by year.
Figure 3. Statistics of privacy preservation technology studies.
Table 1. Comparison of deep learning (DL) techniques for privacy preservation in cloud environments.
DL Technique | Main Concept | Application in Privacy Preservation | Advantages | Challenges
Federated Learning (FL) | Distributed training without centralizing data | Protects raw user data by keeping them on local devices during model training | Enhances privacy and reduces data-transfer risks | Communication overhead and poisoning attacks on the model
Differentially Private Deep Learning (DP-DL) | Adds calibrated noise to the model during training | Prevents the leakage of individuals' data from trained models | Strong mathematical privacy guarantees | Trade-off between the model's accuracy and the level of data privacy
Homomorphic Encryption with DL | Enables computation on encrypted data | Allows DL models to process encrypted user data without decryption | Strong end-to-end privacy preservation | High computational overhead and slower inference times
Hybrid Approaches | Combine distributed training, noise injection, and encrypted computation | Ensure multi-layered privacy through local training, data perturbation, and encrypted processing | Maximize privacy protection while enabling collaborative learning | Increased complexity, higher resource demands, and integration challenges
Table 2. Top 10 cloud data breaches from 2013 to 2024 and DL mitigation strategies.
Company (Year) | Reason for Data Breach | DL-Based Cybersecurity Mitigation
Yahoo (2013) | Outdated encryption algorithm | DL model for auditing and monitoring the system.
Equifax (2017) | Exploited vulnerability in Apache server | DL model to prioritize critical updates.
Facebook (2019) | Misconfigured cloud storage led to breaches affecting around 540 million user records | DL model to detect misconfigurations and enforce best practices in cloud environments.
Capital One (2019) | Misconfigured AWS firewall allowed unauthorized access | DL-based intrusion detection system (IDS) to monitor and detect anomalous access attempts.
Marriott International (2018) | Compromised credentials led to unauthorized database access | DL model to distinguish attacker activities from normal user behavior.
Microsoft (2023) | Weakness in key management led to stolen cryptographic key | DL model to improve key management and predict needed patches.
AT&T (2023) | Data breach caused by weakness in third-party provider | DL model for automatic system assessment.
Hewlett Packard Enterprise (2023) | Large-scale phishing attack led to email system compromise | DL model to detect and block sophisticated phishing attempts.
Atlassian (2019) | Unpatched browser extension led to sensitive data leaks | DL model for behavior monitoring and anomaly detection in browser usage.
River City Media (2017) | Misconfigured database exposed 1.4 billion records | DL model to identify and correct cloud misconfigurations.
Table 3. Summary of research methodology.
Category | Details
Search Strings | ("Privacy" OR "Confidentiality" OR "Data Breaches") AND ("DL" OR "Deep Learning") AND ("CC" OR "Cloud Computing") AND ("Differential Privacy" OR "DP" OR "Homomorphic Encryption" OR "HE" OR "Federated Learning" OR "FL").
Time Span | January 2020–April 2025.
Databases Searched | IEEE Xplore, ACM Digital Library, SpringerLink, Saudi Digital Library, and Google Scholar.
Inclusion Criteria | Published between 2020 and 2025; English language only; studies integrating deep learning with at least one privacy-preserving technique (FL, HE, DP); relevance to privacy preservation in cloud computing.
Exclusion Criteria | Non-English publications; published before 2020; studies using machine learning rather than deep learning; reviews, magazines, eBooks, and non-peer-reviewed works.
Quality Assessment Tools | Manual evaluation based on PRISMA guidelines, focusing on study relevance, methodological clarity, and reporting completeness.
Table 4. Comparison of related work.
Author | Methodology | Key Findings | Dataset | Key Length | Limitation | Suggested Mitigation
Wang et al. [14] | Federated learning and deep learning | Proposed VPPFL with CNN achieving 98% accuracy | MNIST dataset | 1024 bits | Higher computational overhead compared to simpler FL approaches. | Optimize computational complexity via model compression.
Li et al. [27] | Federated learning and deep learning | Introduced DynamicNet model with FL and CNN, achieved 92.9% accuracy | MNIST and CIFAR-10 datasets | 1024 bits | Accuracy was lower than expected for enhanced privacy. | Integrate differential privacy algorithm with performance-aware tuning.
Su et al. [35] | Federated learning and deep learning | Introduced edge–cloud collaborative FL with DRL incentives to boost participation and model quality in smart grid AIoT. | Local dataset | AES-128 | Communication inefficiencies with non-IID data. | Apply DRL-based incentive mechanisms and edge–cloud orchestration to manage non-IID data and participant reliability.
Zhou et al. [36] | Federated learning and deep learning | The model operated in bandwidth-limited environments with an accuracy of 96.3%. | Fashion-MNIST dataset | 2048 bits | The model lost reliability because of heavy noise addition. | Apply hierarchical FL with adaptive client selection.
Makkar et al. [37] | Federated learning and deep learning | FedLearnSP and CNN model enhanced privacy with distillation for mobile device efficiency. | 900 mobile images | N/A | Model not evaluated on multiple kinds of noise data under diverse data distributions. | Combine model distillation with personalization layers in FL.
Liu et al. [38] | Federated learning and deep learning | FL and blockchain used for cross-border cloud collaboration. | N/A | 2048 bits | Hierarchical architecture consumes more time to provide privacy preservation in the cloud. | Use layer-2 solutions or off-chain verification.
Zhang et al. [39] | Federated learning and deep learning | Applied FL with secure multiparty computation for sensitive education data. | N/A | 2048 bits | Protocol complexity caused latency and dependence on key generation. | Simplify protocol layers and reduce interaction rounds.
Qayyum et al. [40] | Federated learning and homomorphic encryption | HE integrated with Clustered FL and CNN for smart grid protection | COVID-19 dataset | N/A | Training time increased substantially under cloud resource constraints. | Compress encrypted model gradients and use partial updates.
Rahulamathavan et al. [41] | Federated learning, fully homomorphic encryption, and deep learning | FL with CKKS and DNN-UBM applied to real-time cloud analytics with CNN models. | Real-time dataset | 128 bits | Implementation complexity and deployment issues. | Develop modular FL pipelines with deployment toolkits.
Parra-Ullauri et al. [42] | Federated learning and differential privacy | Proposed kubeFlower K8s operator for FL, ensuring privacy via differential privacy | CIFAR-10 | N/A | Deployment is slower than helm chart methods and lacks control of network policies by default. | Employ P3-VC and SDN overlays to isolate network flows and manage privacy budgets dynamically.
Vizitiu et al. [43] | Federated learning and differential privacy | Enhanced IoT healthcare by combining a DNN-based model (LSTM) with FL and DP | ECG dataset | 64 bits | Temporal data noise affected prediction. | Optimize noise calibration and apply denoising autoencoders.
Aziz et al. [44] | Federated learning, homomorphic encryption, and differential privacy | Applied DP and HE to mitigate these vulnerabilities with tradeoffs in convergence and computation. | N/A | 2048 bits | Combining DP and HE introduces higher computational costs and complexity in parameter tuning. | Balance privacy and performance using hybrid models with adjustable noise and selective encryption.
Korkmaz et al. [45] | Federated learning, homomorphic encryption, and differential privacy | Proposed FAS approach combining selective HE, DP, and bitwise scrambling for faster, secure FL in healthcare applications. | Healthcare dataset | N/A | Limited to specific FL tasks and not generalized to all DL classifiers. | Optimize FAS for broader FL scenarios and evaluate with diverse datasets.
Choi et al. [46] | Federated learning and homomorphic encryption | Integrated FL with HE and evaluated via CNN model that reduced training time by 46%. | FEMNIST | 128 bits | Accuracy still needs improvement for reliability. | Enhance model architecture and increase training rounds.
Qiang et al. [47] | Federated learning and homomorphic encryption | Combined FL with BNN and HE scheme using federated clustering that achieved 99.34% | CIFAR-100 | 128 bits | Scalability affected by large amount of client data. | Apply hierarchical FL with adaptive client selection.
He et al. [48] | Federated learning and homomorphic encryption | Proposed PL-FedIPEC using improved Paillier encryption for privacy in FL, reducing latency without accuracy loss. | N/A | 768 bits | High computational overhead of traditional Paillier HE remains an implementation complexity. | Optimize encryption via pre-computed random values and introduce simpler operations to reduce latency.
Lin et al. [49] | Federated learning and homomorphic encryption | FL applied with Paillier encryption for real-time cloud analytics with CNN models, which preserved accuracy above 90% | MNIST dataset | N/A | Implementation complexity and high communication overhead. | Optimize and customize the FL model.
Zhang et al. [50] | Federated learning and homomorphic encryption | FL and HE combined with cryptographic masking for privacy protection in IoMT environment | HAM10000 dataset | 128 bits | Temporal data noise affected model prediction stability. | Optimize noise calibration and apply denoising autoencoders.
Zhou et al. [51] | Federated learning and deep learning | Adaptive FL scheme for industrial cloud environments using federated clustering and Paillier encryption | Fashion-MNIST dataset | 1024 bits | Model lost reliability because of heavy noise addition. | Apply hierarchical FL with adaptive client selection.
Wang et al. [52] | Federated learning, differential privacy, and deep learning | Combined FL, DP, and RNN, which reduced relative error by 1.58%. | N/A | 2048 bits | Massive number of edge devices impacted accuracy. | Apply hierarchical FL with cluster-based aggregation.
Nguyen et al. [53] | Federated learning and differential privacy | Developed FedGAN model with Proof-of-Reputation (PoR) algorithm. | X-ray images | N/A | FedGAN framework produced more overhead than normal blockchain schemes. | Enhance X-ray image preprocessing.
Dong et al. [54] | Federated learning and homomorphic encryption | Developed HANs with aggregatable hybrid encryption to prevent key sharing and collaborative decryption. | Local dataset | 128 bits | Communication overhead is 29.2× higher than traditional MK-HE schemes. | Employ lightweight encryption components and reduce redundancy in data sharing.
Hayati et al. [55] | Homomorphic encryption and deep learning | Implemented HE and CNN on custom dataset with 97.75% accuracy. | Custom dataset | N/A | High computational cost and training time. | Apply lightweight encryption and parallel model training.
Naresh et al. [56] | Homomorphic encryption and deep learning | Proposed PPDNN-CRP model, which combined FL and HE, achieving 80.48% and 77.23% accuracy. | Kaggle loan dataset | N/A | Accuracy was lower than expected for enhanced privacy. | Integrate homomorphic encryption algorithm with low-overhead key generation.
Akram et al. [57] | Homomorphic encryption and deep learning | Applied HE-CNN on MNIST dataset, which showed 97.25% accuracy. | MNIST dataset | N/A | Longer computation time. | Use efficient homomorphic libraries and model pruning.
Lam et al. [58] | Homomorphic encryption and deep learning | Combination approach (FHE-PE-NN) achieved 94.8% | MNIST dataset | 2048 bits | Training latency under real-time constraints. | Optimize model layers and apply fast encryption techniques.
Song et al. [59] | Homomorphic encryption and deep learning | Model included CKKS, FHE, and CNN, achieving 99.05% | MNIST dataset | 2048 bits | Random noise addition impacted model effectiveness. | Use controlled noise injection and calibration.
Prabhu et al. [60] | Homomorphic encryption and deep learning | Applied combination model (RSA-CNN) and achieved accuracy up to 99.97% | WSN-DS and CIC-IDS2017 datasets | N/A | High energy and training time due to cryptographic operations. | Apply energy-efficient RSA variants and lightweight CNNs.
Li et al. [61] | Homomorphic encryption and deep learning | Designed decentralized multi-biometric recognition using MK-CKKS with neural fusion of face and voice data. | N/A | 128, 192, and 256 bits | Resource-constrained devices limit local encryption and model computation. | Offload training to centralized servers and minimize the feature set through fused vectors to reduce computation.
Choi et al. [62] | Homomorphic encryption and deep learning | Blind-Touch enables privacy-preserving fingerprint authentication using distributed neural inference and HE | SOKOTO dataset | 192 bits | Computational overhead of HE remains high, especially in deep layers. | Split inference into a client–server model and use optimized compression and cluster processing for scalability.
Owusu-Agyemang et al. [63] | Homomorphic encryption and deep learning | Integrated MSCryptoNet model with DNN that achieved 86% | Real-time datasets | 128 bits | Lack of trust assurance and privacy guarantees. | Integrate trust anchors and privacy-preserving modules.
Zhao et al. [64] | Homomorphic encryption and deep learning | Integrated MK-FHE model with DNN, which reduced operational cost | N/A | 1024 bits | Multiple parties may reduce model accuracy. | Build the encryption model in a dedicated space in the cloud.
Yang et al. [65] | Homomorphic encryption and deep learning | Model overcomes the challenge of maintaining confidentiality of training data. | N/A | 2048 bits | Computation overhead during the encryption phase. | Build a redundancy model based on the encryption model during the inference phase in the cloud.
Sharma et al. [66] | Differential privacy and deep learning | DP-FCNN achieved 97–98% | Adult and Heart Disease datasets | 80 bits and 128 bits | Long training duration and high cost. | Use gradient clipping and batch optimization.
Gayathri et al. [67] | Differential privacy and deep learning | Applied PPDDN to medical images with high performance metrics. | Real-time CT images | 64 bits | Model evaluation required more computation time. | Implement parallel computing and model quantization.
Bukhari et al. [68] | Differential privacy and deep learning | ToF sensor embedded with CNN model on edge cloud that achieved 89.2% on facial expression detection | VL53L5CX ToF depth images | 64 bits | Low-resolution depth images affected result quality. | Enhance image preprocessing and upsampling techniques.
Feng et al. [69] | Differential privacy and deep learning | Integrated SCNN-BiLSTM model with high-resolution images, which obtained 95% accuracy | AI-TOD aerial images | N/A | High-resolution images led to low performance when training data in the cloud. | Offload training to edge devices or preprocess images.
Zhang et al. [70] | Differential privacy and deep learning | Combined VerifyNet model with AES-GCM, which maintained 94.4% accuracy. | N/A | 256 bits | Cryptographic overhead during DL application. | Use lightweight cryptographic primitives and hybrid models.
Sharma et al. [71] | Differential privacy and deep learning | Combined multiple privacy preservation methods, including LATENT, LDP, and DP-FCNN, which showed 91–96% accuracy | MNIST dataset | 1024 bits | Lack of clear privacy assurance and performance breakdown. | Enhance documentation and validation with benchmarks.
