Security and Privacy in Emerging Technologies

A special issue of Electronics (ISSN 2079-9292). This special issue belongs to the section "Computer Science & Engineering".

Deadline for manuscript submissions: closed (15 January 2025) | Viewed by 3254

Special Issue Editors


Guest Editor
Institute of Technology, Xidian University, Guangzhou 510555, China
Interests: cybersecurity

Guest Editor
Cyberspace Institute of Advanced Technology, Guangzhou University, Guangzhou 510006, China
Interests: clustering algorithm in resource constrained network; inference learning and intelligent decision-making in wireless networks; distributed learning and federated learning with multi-modal data; privacy and security protection in distributed learning

Guest Editor
College of Information Science and Technology, Nanjing Forestry University, Nanjing 210037, China
Interests: data stream classification; anomaly detection; cloud computing; Internet of Things

Special Issue Information

Dear Colleagues,

In recent years, emerging technologies such as edge computing, crowd intelligence, federated learning, smart grids, privacy computing, 5G/6G, satellite networks, and large models have been developing rapidly. On the one hand, these technologies either require massive amounts of data to compute or collect such data, intentionally or unintentionally. These data are highly likely to involve sensitive information, which raises unprecedented privacy concerns. On the other hand, these technologies typically rely on networks, third parties, or remote servers to provide intelligent information or services, which leaves them severely vulnerable to security attacks such as denial-of-service attacks, poisoning attacks, membership inference attacks, cryptographic attacks, and imitation attacks. It is therefore high time to investigate privacy and security in emerging technologies. This Special Issue aims to bring together researchers from different backgrounds, such as artificial intelligence, cryptography, and cybersecurity, to discuss the latest experiences, research ideas, collaborative research, and developments related to these fundamental issues.

The topics of this Special Issue include but are not limited to the following:

  • Formulation of privacy and security in data computation fusion;
  • Privacy-preserving crowd intelligence;
  • Attack and defense for federated learning/edge computing/smart grid/large models;
  • Security attacks in 5G/6G/satellite networks;
  • Efficient, fair, or verifiable privacy computing;
  • Privacy-preserving training and inference for large models;
  • Hybrid paradigm between software and hardware for privacy computing;
  • Privacy computing-empowered emerging technologies for tackling privacy concerns;
  • Privacy-preserving and secure artificial intelligence;
  • Blockchain-empowered emerging technologies in security and privacy.

Dr. Bowen Zhao
Dr. Cheng Qiao
Dr. Jun Jiang
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Electronics is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2400 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • privacy protection
  • network security
  • emerging intelligence

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • Reprint: MDPI Books provides the opportunity to republish successful Special Issues in book format, both online and in print.

Further information on MDPI's Special Issue policies can be found here.

Published Papers (2 papers)


Research

18 pages, 3530 KiB  
Article
PPRD-FL: Privacy-Preserving Federated Learning Based on Randomized Parameter Selection and Dynamic Local Differential Privacy
by Jianlong Feng, Rongxin Guo and Jianqing Zhu
Electronics 2025, 14(5), 990; https://doi.org/10.3390/electronics14050990 - 28 Feb 2025
Viewed by 905
Abstract
As traditional federated learning algorithms often fall short in providing privacy protection, a growing body of research integrates local differential privacy methods into federated learning to strengthen privacy guarantees. However, under a fixed privacy budget, as the dimensionality of the model parameters increases, the privacy budget allocated per parameter diminishes, so a larger amount of noise is required to meet the privacy requirement. This escalation in noise may adversely affect the final model's performance. To address this, we propose a privacy-preserving federated learning approach (PPRD-FL). First, we design a randomized parameter selection strategy that combines randomization with importance-based filtering, effectively addressing the privacy budget dilution problem by selecting only the most crucial parameters for global aggregation. Second, we develop a dynamic perturbation mechanism based on local differential privacy, which adjusts noise levels according to the training phase, not only providing robustness and security but also optimizing the dynamic allocation of the privacy budget. Finally, our experiments demonstrate that the proposed approach maintains high performance while ensuring strong privacy guarantees.
(This article belongs to the Special Issue Security and Privacy in Emerging Technologies)
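The two mechanisms summarized in the abstract can be sketched roughly as follows. This is an illustrative sketch only: the top-k selection fraction, the Laplace noise, and the linear budget schedule are assumptions made for the example, not the authors' actual PPRD-FL algorithm.

```python
import numpy as np

def select_important(params, frac=0.1, rng=None):
    """Randomized importance-based selection: pick a random subset of the
    largest-magnitude parameters (a stand-in for the paper's randomized
    parameter selection strategy)."""
    rng = rng or np.random.default_rng()
    k = max(1, int(frac * params.size))
    pool = np.argsort(np.abs(params))[-2 * k:]        # candidate pool: 2k largest
    chosen = rng.choice(pool, size=k, replace=False)  # randomize within the pool
    mask = np.zeros(params.size, dtype=bool)
    mask[chosen] = True
    return mask

def perturb(params, mask, epsilon_total, round_idx, total_rounds, sensitivity=1.0):
    """Dynamic LDP perturbation: spend more of the total budget in later
    rounds (a simple linear schedule; the paper's allocation rule may differ).
    Only the selected parameters receive Laplace noise."""
    # Linear weights 1..T, normalized so the per-round budgets sum to epsilon_total.
    share = 2.0 * (round_idx + 1) / (total_rounds * (total_rounds + 1))
    eps_round = epsilon_total * share
    eps_per_param = eps_round / mask.sum()            # budget split over selected params
    noise = np.random.default_rng().laplace(0.0, sensitivity / eps_per_param, params.size)
    out = params.copy()
    out[mask] += noise[mask]
    return out
```

Only the selected (and noised) coordinates would be sent for global aggregation, which is how selection counteracts the per-parameter budget dilution the abstract describes.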

22 pages, 994 KiB  
Article
Masking and Homomorphic Encryption-Combined Secure Aggregation for Privacy-Preserving Federated Learning
by Soyoung Park, Junyoung Lee, Kaho Harada and Jeonghee Chi
Electronics 2025, 14(1), 177; https://doi.org/10.3390/electronics14010177 - 3 Jan 2025
Cited by 1 | Viewed by 1356
Abstract
Secure aggregation of local learning model parameters is crucial for achieving privacy-preserving federated learning. This paper presents a novel and practical aggregation method that effectively combines the advantages of masking-based aggregation with those of homomorphic encryption-based techniques. Each node conceals its local parameters using an independently chosen random mask, thereby eliminating the need for additional computation to generate or exchange mask values with other nodes. Instead, each node homomorphically encrypts its random mask using its own encryption key. During each federated learning round, nodes send their masked parameters and the homomorphically encrypted mask to the federated learning server. The server then aggregates these updates in an encrypted state, directly calculating the average of the actual local parameters across all nodes without needing to decrypt the aggregated result separately. To facilitate this, we introduce a new multi-key homomorphic encryption technique tailored for secure aggregation in federated learning environments. Each node uses a different encryption key to encrypt its mask value. Importantly, the ciphertext of each mask includes a partial decryption component from the node, allowing the collective sum of encrypted masks to be decrypted automatically once all are aggregated. Consequently, the server computes the average of the actual local parameters by simply subtracting the decrypted total sum of mask values from the cumulative sum of the masked local parameters. Our approach eliminates the need for interactions between nodes and the server for mask generation and sharing, while addressing the limitations of single-key homomorphic encryption. Moreover, the proposed aggregation process completes the global model update in just two interactions (in the absence of dropouts), significantly simplifying the aggregation procedure. Utilizing the CKKS (Cheon-Kim-Kim-Song) homomorphic encryption scheme, our method ensures efficient aggregation without compromising security or accuracy. We demonstrate the accuracy and efficiency of the proposed method through varied experiments on MNIST data.
(This article belongs to the Special Issue Security and Privacy in Emerging Technologies)
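The core aggregation arithmetic described above can be sketched with a toy additive stand-in, in which plain modular integer addition plays the role of homomorphic ciphertext addition. The function names, the modulus, and the integer-encoded parameters are assumptions made for illustration; this models only the mask-cancellation idea, not the paper's actual multi-key CKKS construction.

```python
import random

MODULUS = 2**32  # toy working modulus for integer-encoded parameters

def node_update(params):
    """Each node masks every parameter with one random value. In the real
    protocol the mask would be homomorphically encrypted under the node's
    own key; here it is returned in the clear to model ciphertext addition."""
    mask = random.randrange(MODULUS)
    masked = [(p + mask) % MODULUS for p in params]
    return masked, mask  # (masked parameters, stand-in "encrypted" mask)

def server_aggregate(updates):
    """Server sums the masked parameters, sums the masks (homomorphically,
    in the real scheme), and subtracts the latter to recover the true sum
    of local parameters -- without ever seeing an individual node's values."""
    n = len(updates[0][0])
    masked_sum = [sum(u[0][i] for u in updates) % MODULUS for i in range(n)]
    mask_sum = sum(u[1] for u in updates) % MODULUS
    return [(masked_sum[i] - mask_sum) % MODULUS for i in range(n)]
```

Dividing the recovered sum by the number of nodes yields the average the server needs for the global model update; the two-interaction round trip in the paper corresponds to sending the update and receiving the aggregate.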
