Clustered Federated Learning with Adaptive Similarity for Non-IID Data
Abstract
1. Introduction
2. Related Work
2.1. Federated Learning (FL)
2.2. Clustered Federated Learning (CFL)
- (1) its superior scalability via low-rank matrix factorization for similarity computation, compared to standard K-Means approaches;
- (2) its unique positive incentive mechanism (Section 3.2.1), which adds a layer of performance-based quality control to cluster formation that distance-based methods lack.
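Since the scalability claim in (1) rests on the low-rank similarity computation, a minimal sketch may help. The snippet below approximates the pairwise cosine-similarity matrix over client updates with a truncated SVD; the function name, the default rank, and the use of NumPy's SVD are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def lowrank_cosine_similarity(updates, rank=8):
    """Approximate the N x N cosine-similarity matrix of client model
    updates via a truncated SVD, avoiding explicit pairwise dot
    products over the full d-dimensional vectors.
    (Illustrative sketch; not the paper's exact procedure.)"""
    # updates: (N, d) array, one flattened model update per client
    X = updates / np.linalg.norm(updates, axis=1, keepdims=True)
    U, s, _ = np.linalg.svd(X, full_matrices=False)
    Ur = U[:, :rank] * s[:rank]   # (N, rank) low-rank factor
    return Ur @ Ur.T              # approximates X @ X.T

# With rank = min(N, d) the approximation is exact, so diagonal entries are 1.
rng = np.random.default_rng(0)
M = lowrank_cosine_similarity(rng.normal(size=(20, 1000)), rank=20)
```

Working with the (N, rank) factor instead of the full similarity matrix is what keeps the cost linear in the number of clients per comparison, in contrast to K-Means over full d-dimensional vectors.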
2.3. Communication Efficiency in Federated Learning
2.4. Client Selection and Contribution Evaluation
3. Methodology
3.1. Problem Formulation
3.2. Initial Clustering with Adaptive Similarity
3.2.1. Similarity Matrix and Incentive Mechanism
3.2.2. Low-Rank Approximation and Greedy Clustering
- (i) a representativeness score is computed for each unassigned client (data-size-weighted centrality);
- (ii) the highest-scoring client is selected as the seed and expanded with the greedy incentive test until no further positives are found; this process repeats until all clients are assigned.
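The two steps above can be sketched as follows. The incentive test is stood in for by a simple similarity threshold against the seed, since the paper's performance-based test (Section 3.2.1) requires client-side evaluation; the function name, threshold value, and tie-breaking rule are assumptions.

```python
def greedy_cluster(sim, data_sizes, tau=0.5):
    """Greedy seed-and-expand clustering sketch: (i) score each
    unassigned client by data-size-weighted centrality, (ii) seed a
    cluster with the top scorer and absorb clients while a stand-in
    incentive test (similarity >= tau) passes."""
    unassigned = set(range(len(data_sizes)))
    clusters = []
    while unassigned:
        # (i) representativeness: similarity to the other unassigned
        # clients, weighted by their data sizes
        scores = {i: sum(sim[i][j] * data_sizes[j]
                         for j in unassigned if j != i)
                  for i in unassigned}
        # deterministic tie-break: smallest index wins
        seed = max(sorted(unassigned), key=lambda i: scores[i])
        cluster = [seed]
        unassigned.remove(seed)
        # (ii) expand greedily until no candidate passes the test
        for j in sorted(unassigned, key=lambda j: -sim[seed][j]):
            if sim[seed][j] >= tau:
                cluster.append(j)
                unassigned.remove(j)
        clusters.append(sorted(cluster))
    return clusters
```

On a toy 4-client similarity matrix with two obvious groups, this recovers the two groups as separate clusters.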
3.2.3. Data Imbalance Adjustment
Algorithm 1: AS-CFL

```
Input:  initial global model w^0; set of all clients C; total rounds T; batch size B.
Output: a set of personalized cluster models {w_k^T}.

 1. Server executes:
 2.   Initialize cluster models for the first round: w_k^0 ← w^0 for all k
 3.   for each communication round t = 1, …, T do
 4.     Broadcast each cluster model w_k^t to the clients in its cluster C_k
 5.     for each client i ∈ C in parallel do
 6.       Receive its cluster model w_k^t
 7.       Compute local epochs E_i
 8.       Perform local training for E_i epochs (batch size B) to get local model w_i^t
 9.       Compute model update Δ_i^t = w_i^t − w_k^t
10.       Send Δ_i^t to the server
11.     end for
12.     Server executes:
13.       // --- Stage 1: Initial Clustering ---
14.       Receive all model updates {Δ_i^t}
15.       Compute similarity matrix M using cosine similarity on {Δ_i^t} with low-rank factorization
16.       Perform initial greedy clustering using the positive incentive mechanism (threshold τ) to form initial clusters {C_k}
17.       // --- Stage 2: Dynamic Cluster Adjustment ---
18.       Let final clusters ← {C_k}
19.       for each cluster C_k do
20.         Compute the dispersion-to-separation ratio ρ_k
21.         if ρ_k > ρ_split then split C_k and update the clusters
22.         else if ρ_k < ρ_merge then merge C_k and update the clusters
23.         end if
24.       end for
25.       // --- Stage 3: Intra-Cluster Aggregation ---
26.       for each final cluster C_k do
27.         // Add noise to updates before aggregation
28.         let S = {}
29.         for each client i ∈ C_k do
30.           Δ̃_i^t = Δ_i^t + N(0, σ²I)
31.           add Δ̃_i^t to S
32.         end for
33.         // Aggregate noisy updates
34.         w_k^{t+1} = w_k^t + Σ_{i∈C_k} (|D_i| / Σ_{j∈C_k} |D_j|) · Δ̃_i^t
35.       end for
36.     end for
37.   Return final personalized models {w_k^T}
```
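Stage 3 of the algorithm (noisy intra-cluster aggregation) can be sketched as below. The data-size weighting follows the imbalance adjustment of Section 3.2.3, while the function name and default noise scale are assumptions.

```python
import numpy as np

def aggregate_cluster(updates, data_sizes, w_prev, sigma=0.01, rng=None):
    """Perturb each client update with Gaussian noise N(0, sigma^2 I),
    then apply a data-size-weighted average within the cluster and add
    it to the previous cluster model. (Sketch; names assumed.)"""
    rng = rng or np.random.default_rng()
    noisy = [u + rng.normal(0.0, sigma, size=u.shape) for u in updates]
    weights = np.asarray(data_sizes, dtype=float)
    weights /= weights.sum()               # normalize by total cluster data
    delta = sum(w * u for w, u in zip(weights, noisy))
    return w_prev + delta                  # cluster model for the next round
```

Setting sigma to zero recovers a plain data-size-weighted FedAvg step within the cluster.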
3.3. Dynamic Cluster Adjustment
3.4. Intra-Cluster Aggregation
3.5. AS-CFL Algorithm Workflow
3.6. Complexity Analysis
4. Experiments
4.1. Experimental Setup
4.2. Results and Analysis
4.2.1. Convergence Speed
4.2.2. Classification Accuracy
4.2.3. Communication Efficiency and Robustness
4.3. Ablation Study
- w/o Positive Incentive: The positive incentive mechanism is removed, and clients are assigned solely based on similarity without performance validation.
- w/o Dynamic Adjustment: Dynamic cluster adjustment using the EM algorithm and the dispersion-to-separation ratio is disabled, fixing the number of clusters at its initial value.
- w/o Low-Rank Factorization: The full similarity matrix is computed without low-rank approximation, increasing the computational overhead.
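For concreteness, the dynamic-adjustment rule ablated above can be sketched as a threshold test on the dispersion-to-separation ratio; the threshold values below are purely illustrative and not the paper's settings.

```python
def split_or_merge(dispersion, separation, rho_split=2.0, rho_merge=0.5):
    """Decide a cluster's fate from rho_k = dispersion / separation.
    (Illustrative thresholds; the paper's values are not reproduced.)"""
    rho = dispersion / separation
    if rho > rho_split:
        return "split"   # too internally diverse: split the cluster
    if rho < rho_merge:
        return "merge"   # candidate for merging with its nearest cluster
    return "keep"
```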
4.4. Hyperparameter Sensitivity Analysis
4.4.1. Impact of Dirichlet Concentration Parameter
4.4.2. Positive Incentive Threshold
5. Conclusions and Future Work
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
Abbreviations
| AS-CFL | Adaptive Similarity Clustered Federated Learning |
| CFL | Clustered Federated Learning |
| EM | Expectation-Maximization |
| FL | Federated Learning |
| IFCA | Iterative Federated Clustering Algorithm |
| IID | Independent and Identically Distributed |
| Non-IID | Non-Independent and Identically Distributed |
References
- McMahan, B.; Moore, E.; Ramage, D.; Hampson, S.; y Arcas, B.A. Communication-efficient learning of deep networks from decentralized data. In Proceedings of the Artificial Intelligence and Statistics, Fort Lauderdale, FL, USA, 20–22 April 2017; PMLR: Fort Lauderdale, FL, USA, 2017; pp. 1273–1282. [Google Scholar]
- Liang, W.; Chen, X.; Huang, S.; Xiong, G.; Yan, K.; Zhou, X. Federal learning edge network based sentiment analysis combating global COVID-19. Comput. Commun. 2023, 204, 33–42. [Google Scholar] [CrossRef]
- Liu, J.; Mi, Y.; Zhang, X.; Li, X. Task graph offloading via deep reinforcement learning in mobile edge computing. Future Gener. Comput. Syst. 2024, 158, 545–555. [Google Scholar] [CrossRef]
- Fei, F.; Li, S.; Dai, H.; Hu, C.; Dou, W.; Ni, Q. A K-anonymity based schema for location privacy preservation. IEEE Trans. Sustain. Comput. 2019, 4, 156–167. [Google Scholar] [CrossRef]
- Qi, L.; Wang, R.; Hu, C.; Li, S.; He, Q.; Xu, X. Time-aware distributed service recommendation with privacy-preservation. Inf. Sci. 2019, 480, 354–364. [Google Scholar] [CrossRef]
- Zhang, C.; Ni, Z.; Xu, Y.; Luo, E.; Chen, L.; Zhang, Y. A trustworthy industrial data management scheme based on redactable blockchain. J. Parallel Distrib. Comput. 2021, 152, 167–176. [Google Scholar] [CrossRef]
- Zhu, H.; Xu, J.; Liu, S.; Jin, Y. Federated learning on non-IID data: A survey. Neurocomputing 2021, 465, 371–390. [Google Scholar] [CrossRef]
- Kairouz, P.; McMahan, H.B.; Avent, B.; Bellet, A.; Bennis, M.; Bhagoji, A.N.; Bonawitz, K.; Charles, Z.; Cormode, G.; Cummings, R.; et al. Advances and open problems in federated learning. Found. Trends® Mach. Learn. 2021, 14, 1–210. [Google Scholar] [CrossRef]
- Imran, M.; Yin, H.; Chen, T.; Nguyen, Q.V.H.; Zhou, A.; Zheng, K. ReFRS: Resource-efficient federated recommender system for dynamic and diversified user preferences. ACM Trans. Inf. Syst. 2023, 41, 1–30. [Google Scholar] [CrossRef]
- Liu, J.; Zhang, X. Truthful resource trading for dependent task offloading in heterogeneous edge computing. Future Gener. Comput. Syst. 2022, 133, 228–239. [Google Scholar] [CrossRef]
- Liu, P.; Ji, H. Dual Channel Residual Learning for Denoising Path Tracing. Int. J. Image Graph. 2025, 25, 2550003. [Google Scholar] [CrossRef]
- Sattler, F.; Wiedemann, S.; Müller, K.R.; Samek, W. Robust and communication-efficient federated learning from non-iid data. IEEE Trans. Neural Netw. Learn. Syst. 2019, 31, 3400–3413. [Google Scholar] [CrossRef] [PubMed]
- Zhao, Y.; Li, M.; Lai, L.; Suda, N.; Civin, D.; Chandra, V. Federated learning with non-iid data. arXiv 2018, arXiv:1806.00582. [Google Scholar] [CrossRef]
- Jing, X.-Y.; Zhang, X.; Zhu, X.; Wu, F.; You, X.; Gao, Y.; Shan, S.; Yang, J.Y. Multiset Feature Learning for Highly Imbalanced Data Classification. IEEE Trans. Pattern Anal. Mach. Intell. 2019, 43, 139–156. [Google Scholar] [CrossRef] [PubMed]
- Li, X.; He, D.; Zhang, X. Efficient Algorithms for Approximate k-Radius Coverage Query on Large-Scale Road Networks. IEEE Trans. Intell. Transp. Syst. 2025, 26, 1631–1644. [Google Scholar] [CrossRef]
- Liu, C.; Wen, J.; Xu, Y.; Zhang, B.; Nie, L.; Zhang, M. Reliable Representation Learning for Incomplete Multi-View Missing Multi-Label Classification. IEEE Trans. Pattern Anal. Mach. Intell. 2025, 47, 4940–4956. [Google Scholar] [CrossRef]
- Zhou, X.; Xu, X.; Liang, W.; Zeng, Z.; Yan, Z. Deep-learning-enhanced multitarget detection for end–edge–cloud surveillance in smart IoT. IEEE Internet Things J. 2021, 8, 12588–12596. [Google Scholar] [CrossRef]
- Pan, K.; Chi, H. Research on Printmaking Image Classification and Creation Based on Convolutional Neural Network. Int. J. Image Graph. 2025, 25, 2550019. [Google Scholar] [CrossRef]
- Fallah, A.; Mokhtari, A.; Ozdaglar, A. Personalized federated learning with theoretical guarantees: A model-agnostic meta-learning approach. Adv. Neural Inf. Process. Syst. 2020, 33, 3557–3568. [Google Scholar]
- Islam, M.S.; Javaherian, S.; Xu, F.; Yuan, X.; Chen, L.; Tzeng, N.F. FedClust: Optimizing federated learning on non-IID data through weight-driven client clustering. In Proceedings of the 2024 IEEE International Parallel and Distributed Processing Symposium Workshops (IPDPSW), San Francisco, CA, USA, 27–31 May 2024; IEEE: San Francisco, CA, USA, 2024; pp. 1184–1186. [Google Scholar]
- Ghosh, A.; Chung, J.; Yin, D.; Ramchandran, K. An efficient framework for clustered federated learning. Adv. Neural Inf. Process. Syst. 2020, 33, 19586–19597. [Google Scholar] [CrossRef]
- Luo, G.; Chen, N.; He, J.; Jin, B.; Zhang, Z.; Li, Y. Privacy-preserving clustering federated learning for non-IID data. Future Gener. Comput. Syst. 2024, 154, 384–395. [Google Scholar] [CrossRef]
- Briggs, C.; Fan, Z.; Andras, P. Federated learning with hierarchical clustering of local updates to improve training on non-IID data. In Proceedings of the International Joint Conference on Neural Networks (IJCNN), Glasgow, UK, 19–24 July 2020; IEEE: Glasgow, UK, 2020; pp. 1–9. [Google Scholar]
- Cho, Y.J.; Wang, J.; Joshi, G. Client selection in federated learning: Convergence analysis and power-of-choice selection strategies. arXiv 2020, arXiv:2010.01243. [Google Scholar]
- Lai, F.; Zhu, X.; Madhyastha, H.V.; Chowdhury, M. Oort: Efficient federated learning via guided participant selection. In Proceedings of the 15th {USENIX} Symposium on Operating Systems Design and Implementation ({OSDI} 21), Online, 14–16 July 2021; pp. 19–35. [Google Scholar]
- Qi, L.; Zhang, X.; Dou, W.; Hu, C.; Yang, C.; Chen, J. A two-stage locality-sensitive hashing based approach for privacy-preserving mobile service recommendation in cross-platform edge environment. Future Gener. Comput. Syst. 2018, 88, 636–643. [Google Scholar] [CrossRef]
- Wu, Z.; Wen, J.; Xu, Y.; Yang, J.; Li, X.; Zhang, D. Enhanced Spatial Feature Learning for Weakly Supervised Object Detection. IEEE Trans. Neural Netw. Learn. Syst. 2024, 35, 961–972. [Google Scholar] [CrossRef]
- Zhou, X.; Liang, W.; Kevin, I.; Wang, K.; Yan, Z.; Yang, L.T.; Jin, Q. Decentralized P2P federated learning for privacy-preserving and resilient mobile robotic systems. IEEE Wirel. Commun. 2023, 30, 82–89. [Google Scholar] [CrossRef]
- Li, T.; Sahu, A.K.; Zaheer, M.; Sanjabi, M.; Talwalkar, A.; Smith, V. Federated optimization in heterogeneous networks. Proc. Mach. Learn. Syst. 2020, 2, 429–450. [Google Scholar]
- Karimireddy, S.P.; Kale, S.; Mohri, M.; Reddi, S.; Stich, S.; Suresh, A. Scaffold: Stochastic controlled averaging for federated learning. In Proceedings of the International Conference on Machine Learning, PMLR, Virtual Event, 13–18 July 2020; pp. 5132–5143. [Google Scholar]
- Acar, D.A.E.; Zhao, Y.; Navarro, R.M.; Mattina, M.; Whatmough, P.N.; Saligrama, V. Federated learning based on dynamic regularization. arXiv 2021, arXiv:2111.04263. [Google Scholar] [CrossRef]
- Liang, X.; Tang, H.; Zhao, T.; Chen, X.; Huang, Z. PyCFL: A python library for clustered federated learning. In Proceedings of the 31st International Joint Conference on Artificial Intelligence (IJCAI), Vienna, Austria, 23–29 July 2022. [Google Scholar]
- Duan, M.; Liu, D.; Ji, X.; Liu, R.; Liang, L.; Chen, X.; Tan, Y. Fedgroup: Efficient federated learning via decomposed similarity-based clustering. In Proceedings of the 2021 IEEE Intl Conf on Parallel & Distributed Processing with Applications, Big Data & Cloud Computing, Sustainable Computing & Communications, Social Computing & Networking (ISPA/BDCloud/SocialCom/SustainCom), New York, NY, USA, 30 September–3 October 2021; IEEE: New York City, NY, USA, 2021; pp. 228–237. [Google Scholar]
- Kim, Y.; Al Hakim, E.; Haraldson, J.; Eriksson, H.; da Silva, J.M.B.; Fischione, C. Dynamic clustering in federated learning. In Proceedings of the ICC 2021-IEEE International Conference on Communications, Montreal, QC, Canada, 14–23 June 2021; IEEE: Montreal, QC, Canada, 2021; pp. 1–6. [Google Scholar]
- Gong, B.; Xing, T.; Liu, Z.; Wang, J.; Liu, X. Adaptive clustered federated learning for heterogeneous data in edge computing. Mob. Netw. Appl. 2022, 27, 1520–1530. [Google Scholar] [CrossRef]
- Gong, B.; Xing, T.; Liu, Z.; Xi, W.; Chen, X. Adaptive client clustering for efficient federated learning over non-iid and imbalanced data. IEEE Trans. Big Data 2022, 10, 1051–1065. [Google Scholar] [CrossRef]
- Wang, J.; Zhao, Z.; Hong, W.; Quek, T.Q.; Ding, Z. Clustered federated learning with model integration for non-iid data in wireless networks. In Proceedings of the 2022 IEEE Globecom Workshops (GC Wkshps), Rio de Janeiro, Brazil, 4–8 December 2022; IEEE: Rio de Janeiro, Brazil, 2022; pp. 1634–1639. [Google Scholar]
- Long, G.; Xie, M.; Shen, T.; Zhou, T.; Wang, X.; Jiang, J. Multi-center federated learning: Clients clustering for better personalization. World Wide Web 2023, 26, 481–500. [Google Scholar] [CrossRef]
- Lee, H.; Seo, D. FedLC: Optimizing federated learning in non-IID data via label-wise clustering. IEEE Access 2023, 11, 42082–42095. [Google Scholar] [CrossRef]
- Gong, B.; Li, H.; Liu, Z.; Xing, T.; Hou, R.; Chen, X. FedAC: An Adaptive Clustered Federated Learning Framework for Heterogeneous Data. arXiv 2024, arXiv:2403.16460. [Google Scholar] [CrossRef]
- Zhang, Y.; Li, T.; Wang, Z.; Wu, Z.; Shariff, M.H.B.M.; Dick, R.P.; Mao, Z.M. Efficient Distribution Similarity Identification in Clustered Federated Learning via Principal Angles Between Client Data Subspaces. In Proceedings of the Eleventh International Conference on Learning Representations (ICLR), Kigali, Rwanda, 1–5 May 2023. [Google Scholar]
- Wen, J.; Deng, S.; Fei, L.; Zhang, Z.; Zhang, B.; Zhang, Z.; Xu, Y. Discriminative Regression With Adaptive Graph Diffusion. IEEE Trans. Neural Netw. Learn. Syst. 2024, 35, 1797–1809. [Google Scholar] [CrossRef]
- Xu, C.; Ren, J.; She, L.; Zhang, Y.; Qin, Z.; Ren, K. EdgeSanitizer: Locally differentially private deep inference at the edge for mobile data analytics. IEEE Internet Things J. 2019, 6, 5140–5151. [Google Scholar] [CrossRef]
- Sharma, M.; Saripalli, S.R.; Gupta, A.K.; Palta, P.; Pandey, D. Image Processing-Based Method of Evaluation of Stress from Grain Structures of Through Silicon Via (TSV). Int. J. Image Graph. 2025, 25, 2550008. [Google Scholar] [CrossRef]
- Alistarh, D.; Grubic, D.; Li, J.; Tomioka, R.; Vojnovic, M. QSGD: Communication-efficient SGD via gradient quantization and encoding. Adv. Neural Inf. Process. Syst. 2017, 30. [Google Scholar]
- Lin, Y.; Han, S.; Mao, H.; Wang, Y.; Dally, W.J. Deep gradient compression: Reducing the communication bandwidth for distributed training. arXiv 2017, arXiv:1712.01887. [Google Scholar]
- He, C.; Annavaram, M.; Avestimehr, S. Group knowledge transfer: Federated learning of large cnns at the edge. Adv. Neural Inf. Process. Syst. 2020, 33, 14068–14080. [Google Scholar]
- Zhou, X.; Ye, X.; Kevin, I.; Wang, K.; Liang, W.; Nair, N.K.C.; Jin, Q. Hierarchical federated learning with social context clustering-based participant selection for internet of medical things applications. IEEE Trans. Comput. Soc. Syst. 2023, 10, 1742–1751. [Google Scholar] [CrossRef]
- Li, T.; Sanjabi, M.; Beirami, A.; Smith, V. Fair resource allocation in federated learning. arXiv 2019, arXiv:1905.10497. [Google Scholar]
- Goetz, J.; Malik, K.; Bui, D.; Moon, S.; Liu, H.; Kumar, A. Active federated learning. arXiv 2019, arXiv:1909.12641. [Google Scholar] [CrossRef]
- Wu, Z.; Liu, C.; Wen, J.; Xu, Y.; Yang, J.; Li, X. Spatial Continuity and Nonequal Importance in Salient Object Detection with Image-Category Supervision. IEEE Trans. Neural Netw. Learn. Syst. 2025, 36, 8565–8576. [Google Scholar] [CrossRef]
- Ghosh, A.; Hong, J.; Yin, D.; Ramchandran, K. Robust federated learning in a heterogeneous environment. arXiv 2019, arXiv:1906.06629. [Google Scholar] [CrossRef]
- Wang, Z.; Chai, Y.; Sun, C.; Rui, X.; Mi, H.; Zhang, X.; Yu, P.S. A Weighted Symmetric Graph Embedding Approach for Link Prediction in Undirected Graphs. IEEE Trans. Cybern. 2024, 54, 1037–1047. [Google Scholar] [CrossRef] [PubMed]
- Jiang, F.; Dong, L.; Wang, K.; Yang, K.; Pan, C. Distributed resource scheduling for large-scale MEC systems: A multiagent ensemble deep reinforcement learning with imitation acceleration. IEEE Internet Things J. 2021, 9, 6597–6610. [Google Scholar] [CrossRef]
- Eshwarappa, L.; Rajput, G.G. Optimal Classification Model for Text Detection and Recognition in Video Frames. Int. J. Image Graph. 2025, 25, 2550014. [Google Scholar] [CrossRef]
- Gao, W.; Zhou, J.; Lin, Y.; Wei, J. Compressed sensing-based privacy preserving in labeled dynamic social networks. IEEE Syst. J. 2022, 17, 2201–2212. [Google Scholar] [CrossRef]





| Ref. | Year | Objective | Technique | Limitation |
|---|---|---|---|---|
| Ghosh et al. [21] | 2020 | Cluster heterogeneous clients to improve accuracy. | Iterative Federated Clustering Algorithm (IFCA) | Requires the number of clusters to be pre-defined; less adaptable to dynamic client populations. |
| Sattler et al. [12] | 2020 | Improve robustness on non-IID data through similarity. | Model Update Similarity-based Clustering | Not a fully dynamic method; primarily used to show the benefit of grouping similar clients. |
| Briggs et al. [23] | 2020 | Dynamic clustering without pre-specifying cluster count. | FL with Hierarchical Clustering (FL + HC) | O(N²) complexity of hierarchical clustering is not scalable to a large number of clients. |
| Duan et al. [33] | 2021 | Group clients based on model parameter similarity. | FedGroup: Decomposed Similarity-Based Clustering | Less effective if key distributional differences are only in final layers; computationally expensive. |
| Kim et al. [34] | 2021 | Dynamic clustering for time-series data using GANs. | FedGAN-based Clustering | The GAN training process on the client-side introduces significant computational and communication overhead. |
| Gong et al. [35] | 2022 | Increase accuracy and reduce communication. | AdaCFL: Pre-clustering + Hierarchical Clustering | Partial model weight selection may not align with the true underlying data distribution patterns. |
| Gong et al. [36] | 2022 | Adaptive clustering with local training adjustment. | AutoCFL: Weighted Voting Adaptive Clustering | The three-phase clustering and adjustment strategy increases communication overhead. |
| Wang et al. [37] | 2022 | Improve performance in data imbalance scenarios. | Weighted Clustered Federated Learning | Focuses on imbalance but may not generalize well to other types of heterogeneity. |
| Long et al. [38] | 2023 | Cluster clients based on local updates similarity. | Multi-center Federated Learning (K-Means based) | K-Means is inefficient for high-dimensional data, sensitive to outliers, and requires full client participation. |
| Lee et al. [39] | 2023 | Capture fine-grained similarity for better clusters. | FedLC: Layer-wised Clustered Federated Learning | Increases complexity by requiring layer-by-layer similarity calculations. |
| Luo et al. [22] | 2024 | CFL with formal privacy guarantees. | Privacy-Preserving Clustering Federated Learning | The addition of privacy mechanisms (e.g., DP) often leads to a slight degradation in model accuracy. |
| Gong et al. [40] | 2024 | Adaptive clustering for heterogeneous data. | FedAC: K-Means based Split/Merge | Relies on K-Means, limiting scalability; lacks a performance-based quality control mechanism. |
| Zhang et al. [41] | 2023 | Efficient static identification of distribution similarity. | Clustering via Principal Angles Between Subspaces | A static, pre-training method; cannot adapt to data distributions that change during training. |
| Symbol | Description |
|---|---|
| C | Set of all clients. |
| K | Number of client clusters. |
| T | Total communication rounds. |
| t | Current communication round index. |
| d | Model parameter dimension. |
| B | Local training batch size. |
| i, j | Client indices. |
| D_i | Client i’s local dataset. |
| w_i^t | Client i’s local model after round t. |
| E_i | Client i’s local training epochs. |
| k | Cluster index. |
| C_k | Set of clients in cluster k. |
| w_k^t | Cluster k’s model (centroid) at start of round t. |
| s_{i,k} | Indicator: client i in cluster k. |
| ρ_k | Cluster k’s dispersion-to-separation ratio. |
| disp(k) | Cluster k’s intra-cluster dispersion. |
| sep(k) | Cluster k’s inter-cluster separation. |
| M | Client similarity matrix. |
| U, V | Components of low-rank factorization of M. |
| r | Rank used for low-rank approximation. |
| τ | Tunable threshold for positive incentive mechanism. |
| ρ_split | Predefined threshold for splitting a cluster. |
| ρ_merge | Predefined threshold for merging clusters. |
| σ | Standard deviation of Gaussian noise for privacy. |
| Dataset | Algorithm | 6 | 4 | |
|---|---|---|---|---|
| MNIST | AS-CFL | 0.917 | 0.915 | 0.910 |
| | CFL | 0.918 | 0.912 | 0.905 |
| | IFCA | 0.890 | 0.882 | 0.865 |
| | FedAvg | 0.875 | 0.850 | 0.805 |
| EMNIST | AS-CFL | 0.693 | 0.688 | 0.682 |
| | CFL | 0.695 | 0.685 | 0.675 |
| | IFCA | 0.620 | 0.605 | 0.580 |
| | FedAvg | 0.580 | 0.545 | 0.500 |
| Component | MNIST Accuracy | Convergence Rounds | EMNIST Accuracy | Convergence Rounds |
|---|---|---|---|---|
| Full AS-CFL | 0.915 | 40 | 0.688 | 45 |
| w/o Positive Incentive | 0.885 | 55 | 0.652 | 62 |
| w/o Dynamic Adjustment | 0.902 | 48 | 0.670 | 53 |
| w/o Low-Rank Factorization | 0.921 | 52 | 0.689 | 58 |
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Yi, G.; Wu, Z.; Zhang, X.; Li, X. Clustered Federated Learning with Adaptive Similarity for Non-IID Data. Electronics 2025, 14, 4454. https://doi.org/10.3390/electronics14224454

