Optimizing Client Participation in Communication-Constrained Federated LLM Adaptation with LoRA
Abstract
1. Introduction
- The proposed framework enables communication-aware client selection for federated adaptation of LLMs using LoRA, specifically designed for constrained wireless environments.
- To capture the tradeoff between performance and cost, the problem is formulated as a max-min optimization that jointly maximizes model accuracy while minimizing communication overhead.
- To efficiently solve the resulting non-convex problem, a genetic-algorithm (GA)-based method is introduced, allowing dynamic determination of the optimal number of clients per round.
- In addition, a structured peer-to-peer collaboration protocol is incorporated, supporting scalable update sharing without requiring full connectivity.
- Finally, extensive simulations under varying bandwidth budgets confirm that the framework maintains competitive accuracy while substantially reducing communication cost compared to baselines.
| Algorithm 1 LoRaC-GA: GA for client selection in federated LLM adaptation |
| Input: maximum clients K_max, budget B, payload S, rounds R, population size P, generations G, crossover rate p_c, mutation rate p_m, elitism ratio ρ |
| Output: optimal number of clients K* |
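Algorithm 1 can be rendered as a short program. The sketch below is illustrative only, not the authors' implementation: the empirical accuracy curve A(K) is replaced by a synthetic saturating placeholder, and the fitness form (accuracy with an infeasibility penalty when C(K) exceeds B) is an assumption consistent with the stated budget constraint.

```python
import random

# Simulation-setup constants from the paper; accuracy() below is a hypothetical stand-in.
S = 0.0833   # per-client LoRA payload (MB)
R = 10       # communication rounds
B = 100.0    # bandwidth budget (MB)
K_MAX = 100  # maximum number of clients

def comm_cost(k):
    """Total uplink cost C(K) = K * S * R."""
    return k * S * R

def accuracy(k):
    """Placeholder saturating accuracy curve; the paper profiles A(K) empirically."""
    return 100.0 * k / (k + 3.0)

def fitness(k):
    """Assumed fitness: accuracy, with an infeasibility penalty when C(K) > B."""
    return accuracy(k) if comm_cost(k) <= B else float("-inf")

def lorac_ga(pop_size=20, generations=30, p_c=0.5, p_m=0.2, elitism=0.1, seed=0):
    rng = random.Random(seed)
    pop = [rng.randint(1, K_MAX) for _ in range(pop_size)]
    n_elite = max(1, int(elitism * pop_size))
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        nxt = pop[:n_elite]                       # elitism: carry the best individuals over
        while len(nxt) < pop_size:
            a = max(rng.sample(pop, 3), key=fitness)   # tournament selection
            b = max(rng.sample(pop, 3), key=fitness)
            child = (a + b) // 2 if rng.random() < p_c else a   # arithmetic crossover
            if rng.random() < p_m:                              # mutation: integer jitter
                child = min(K_MAX, max(1, child + rng.randint(-5, 5)))
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

best_k = lorac_ga()
print(best_k, comm_cost(best_k))
```

Because the placeholder accuracy is monotone in K, the GA drifts toward the largest budget-feasible K; with the paper's non-convex empirical A(K), the search is less trivial, which is the motivation for a population-based method.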
2. Related Work
2.1. Communication-Efficient FL and Client Selection
2.2. Parameter-Efficient Fine-Tuning in FL and NLP
2.3. Generative and Multi-Criteria Client Selection
2.4. GA-Based Optimization in FL
3. System Model
3.1. Communication and Model Update
3.2. Accuracy–Communication Tradeoff
3.3. Optimization Problem
3.4. Client Collaboration Protocol
4. Proposed Method: LoRaC-GA
4.1. Client-Collaboration Protocol
4.2. Problem-Specific Formulation
- A(K) denotes the model accuracy achieved when K clients participate per round.
- C(K) = K · S · R is the total communication cost.
- B is the communication budget.
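With these definitions, the cost side of the formulation is plain arithmetic. The snippet below uses the simulation values from the paper (S = 0.0833 MB, R = 10) to check the feasibility of a candidate K against a budget.

```python
# Cost model from the formulation: C(K) = K * S * R.
S = 0.0833   # per-client LoRA adapter payload (MB)
R = 10       # communication rounds

def comm_cost(k: int) -> float:
    return k * S * R

def feasible(k: int, budget: float) -> bool:
    """Budget constraint C(K) <= B."""
    return comm_cost(k) <= budget

# Example: K = 5 costs ~4.17 MB in total, well within a 100 MB budget.
print(round(comm_cost(5), 2), feasible(5, 100.0))
```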
4.3. GA Framework
4.4. Theoretical Insight
- Pipeline (step-by-step).
- (i) Profile the empirical accuracy A(K) for each candidate K under the non-IID FL setup with LoRA.
- (ii) Compute the communication cost C(K) = K · S · R and the corresponding efficiency term.
- (iii) Evaluate the fitness of each candidate K.
- (iv) Run the GA for G generations with population size P (tournament selection, crossover rate p_c, mutation rate p_m, elitism ratio ρ).
- (v) Return the optimal K* subject to the budget constraint C(K*) ≤ B.
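Because K is a single small integer, steps (i)–(iii) and (v) can be checked end to end by exhaustive search over a candidate set; the GA in step (iv) matters when profiling A(K) for every candidate is too expensive. In the sketch below the profiled accuracies for K = 8 and K = 16 are hypothetical stand-ins (the values for K = 2 and K = 4 are taken from the paper's results table).

```python
S, R, B = 0.0833, 10, 10.0   # payload (MB), rounds, and a tight 10 MB budget

# (i) Profiled accuracies A(K) for a few candidate K (partly hypothetical).
profiled = {1: 78.0, 2: 85.1, 4: 91.7, 8: 94.0, 16: 95.5}

# (ii) Communication cost C(K) = K * S * R for each candidate.
cost = {k: k * S * R for k in profiled}

# (iii) + (v): among budget-feasible candidates, return the most accurate one.
feasible = {k: acc for k, acc in profiled.items() if cost[k] <= B}
k_star = max(feasible, key=feasible.get)
print(k_star, round(cost[k_star], 2))
```

Under this tight budget, K = 16 is excluded (C(16) ≈ 13.3 MB > 10 MB) and the search settles on the largest feasible candidate with the best profiled accuracy.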
5. Simulation Results
5.1. Simulation Setup
5.2. Comparison with Baselines
5.3. Fitness Function Analysis
5.4. Optimal K Under Varying Budgets
5.5. Convergence Analysis of LoRaC-GA
5.6. Convergence Speed
6. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
Abbreviations
| Abbreviation | Description |
| FL | federated learning |
| LLM | large language model |
| PEFT | parameter-efficient fine-tuning |
| LoRA | low-rank adaptation |
| P2P | peer-to-peer (structured sparse overlay for update sharing) |
| GA | genetic algorithm |
| K | number of selected clients per round (decision variable) |
| R | number of communication rounds |
| S | per-client uplink payload (adapter update size) |
| B | total bandwidth/communication budget |
| A(K) | empirical model accuracy when K clients participate |
| C(K) | total communication cost, C(K) = K · S · R |
References
- McMahan, B.; Moore, E.; Ramage, D.; Hampson, S.; y Arcas, B.A. Communication-Efficient Learning of Deep Networks from Decentralized Data. In Proceedings of the 20th International Conference on Artificial Intelligence and Statistics, Fort Lauderdale, FL, USA, 20–22 April 2017. [Google Scholar]
- Nishio, T.; Yonetani, R. Client Selection for Federated Learning with Heterogeneous Resources in Mobile Edge. In Proceedings of the ICC 2019—2019 IEEE International Conference on Communications (ICC), Shanghai, China, 20–24 May 2019. [Google Scholar]
- Cho, Y.J.; Wang, J.; Joshi, G. Client Selection in Federated Learning: Convergence Analysis and Power-of-Choice Strategies. In Proceedings of The 25th International Conference on Artificial Intelligence and Statistics, Virtual, 13–18 April 2021. [Google Scholar]
- Maciel, F.; Souza, A.M.D.; Bittencourt, L.F.; Villas, L.A. Resource Aware Client Selection for Federated Learning in IoT Scenarios. In Proceedings of the 2023 19th International Conference on Distributed Computing in Smart Systems and the Internet of Things (DCOSS-IoT), Pafos, Cyprus, 19–21 June 2023. [Google Scholar]
- Yan, Y.; Yang, Q.; Tang, S.; Shi, Z. FeDeRA: Efficient Fine-tuning of Language Models in Federated Learning Leveraging Weight Decomposition. arXiv 2024, arXiv:2404.18848. [Google Scholar] [CrossRef]
- Kalra, S.; Wen, J.; Cresswell, J.C.; Volkovs, M.; Tizhoosh, H.R. Decentralized Federated Learning Through Proxy Model Sharing. Nat. Commun. 2023, 14, 2899. [Google Scholar] [CrossRef] [PubMed]
- Chai, Z.; Ali, A.; Truex, S.; Anwar, A.; Baracaldo, N.; Zhou, Y.; Ludwig, H.; Yan, F.; Cheng, Y. TiFL: A Tier-based Federated Learning System. arXiv 2020, arXiv:2001.09249. [Google Scholar] [CrossRef]
- Lai, F.; Zhu, X.; Madhyastha, H.V.; Chowdhury, M. Oort: Efficient Federated Learning via Guided Participant Selection. In Proceedings of the 15th USENIX Symposium on Operating Systems Design and Implementation (OSDI), Virtual, 14–16 July 2021; USENIX Association: Berkeley, CA, USA, 2021; pp. 19–35. [Google Scholar]
- Dakhia, Z.; Merenda, M. Client Selection in Federated Learning on Resource-Constrained Devices: A Game Theory Approach. Appl. Sci. 2025, 15, 7556. [Google Scholar] [CrossRef]
- Cai, D.; Wu, Y.; Wang, S.; Lin, F.X.; Xu, M. FedAdapter: Efficient Federated Learning for Modern NLP. arXiv 2022, arXiv:2205.10162. [Google Scholar] [CrossRef]
- Zhao, H.; Du, W.; Li, F.; Li, P.; Liu, G. FedPrompt: Communication-Efficient and Privacy-Preserving Prompt Tuning in Federated Learning. arXiv 2022, arXiv:2208.12268. [Google Scholar] [CrossRef]
- Che, T.; Liu, J.; Zhou, Y.; Ren, J.; Zhou, J.; Sheng, V.S.; Dai, H.; Dou, D. FedPepTAO: Parameter-Efficient Prompt Tuning with Adaptive Optimization for Federated Learning of LLMs. In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing (EMNLP), Singapore, 6–10 December 2023; Association for Computational Linguistics: Stroudsburg, PA, USA, 2023; pp. 8760–8774. [Google Scholar]
- Ning, Z.; Tian, C.; Xiao, M.; Fan, W.; Wang, P.; Li, L.; Wang, P.; Zhou, Y. FedGCS: A Generative Framework for Efficient Client Selection in Federated Learning. In Proceedings of the 33rd International Joint Conference on Artificial Intelligence (IJCAI), Jeju, Republic of Korea, 3–9 August 2024; IJCAI Organization: California City, CA, USA, 2024; pp. 4743–4750. [Google Scholar]
- Tahir, M.; Ali, M.I. FedPROM: Multi-Criterion Client Selection for Efficient Federated Learning. Proc. AAAI Spring Symp. Ser. 2024, 3, 318–322. [Google Scholar]
- Gen, M.; Altiparmak, F.; Lin, L. A genetic algorithm for two-stage transportation problem using priority-based encoding. OR Spectr. 2006, 28, 337–354. [Google Scholar] [CrossRef]
- Lee, S.; Lee, J.; Park, H.S.; Choi, J.K. A novel fair and scalable relay control scheme for Internet of Things in LoRa-based low-power wide-area networks. IEEE Internet Things J. 2020, 8, 5985–6001. [Google Scholar] [CrossRef]
- Solat, F.; Kim, T.Y.; Lee, J. A novel group management scheme of clustered federated learning for mobile traffic prediction in mobile edge computing systems. J. Commun. Netw. 2023, 25, 480–490. [Google Scholar] [CrossRef]
- Yang, Z.; Chen, M.; Saad, W.; Hong, C.S.; Shikh-Bahaei, M. Energy Efficient Federated Learning Over Wireless Communication Networks. IEEE Trans. Wirel. Commun. 2020, 20, 1935–1949. [Google Scholar] [CrossRef]
- Qin, Z.; Li, G.Y.; Ye, H. Federated Learning and Wireless Communications. IEEE Wirel. Commun. 2021, 28, 134–140. [Google Scholar] [CrossRef]
- Li, T.; Sahu, A.K.; Talwalkar, A.; Smith, V. Federated Optimization in Heterogeneous Networks. In Proceedings of the 2020 3rd MLSys Conference, Austin, TX, USA, 2–4 March 2020; MLSys: Santa Clara, CA, USA, 2020; pp. 429–450. [Google Scholar]
- Kairouz, P.; McMahan, H.B.; Avent, B.; Bellet, A.; Bennis, M.; Bhagoji, A.N.; Bonawitz, K.; Charles, Z.; Cormode, G.; Cummings, R.; et al. Advances and Open Problems in Federated Learning. Found. Trends Mach. Learn. 2021, 14, 1–210. [Google Scholar] [CrossRef]
- Gen, M.; Cheng, R. Genetic Algorithms and Engineering Optimization; John Wiley & Sons: Hoboken, NJ, USA, 2000. [Google Scholar]
- Cohen, G.; Afshar, S.; Tapson, J.; Schaik, A.V. EMNIST: Extending MNIST to handwritten letters. In Proceedings of the 2017 International Joint Conference on Neural Networks (IJCNN), Anchorage, AK, USA, 14–19 May 2017. [Google Scholar]





| Work | Objective | Technique | Communication Efficiency | Limitations/Gap |
|---|---|---|---|---|
| FedAvg [1] | Communication-efficient FL baseline | Local training + model averaging | Reduces communication vs. centralized training | Struggles with non-IID data; fixed client count |
| TiFL [7] | Handle stragglers in heterogeneous FL | Tier-based client grouping | Up to 6× faster convergence | Does not address bandwidth budget explicitly |
| Oort [8] | Guided client selection | Utility + system-aware sampling | 14× faster time-to-accuracy | Optimizes which clients, not how many |
| RBPS [9] | Multi-objective client payoff | Game-theoretic participation | Balances accuracy, energy, privacy | Requires client self-evaluation; no explicit LLM focus |
| FedAdapter [10] | Efficient FL for NLP | Adapter modules in Transformers | Up to 155× faster convergence | Fixed client participation, no budget awareness |
| FedPrompt [11] | Prompt-tuning in FL | Soft prompt aggregation | Uses only 0.01% of parameters | Robustness issues under backdoors |
| FedPepTAO [12] | Handle non-IID data in PEFT | Adaptive optimization for prompts | Efficient under client drift | Still assumes fixed client count |
| FedGCS [13] | Flexible client selection | Generative encoding/decoding | Scalable decision making | Focuses on selection, not PEFT or bandwidth |
| FedPROM [14] | Multi-criteria selection | Optimization across accuracy, latency, resources | Balances multiple objectives | No explicit consideration of LoRA or LLM PEFT |
| FedGM [17] | Group management in clustered FL | Optimization across accuracy, latency, resources | Reduces idle time and grouping cost in MEC | Focuses on MEC clustering, not LLM PEFT |
| This Work (LoRaC-GA) | Optimizing client count under bandwidth | GA-based max-min optimization + LoRA | Structured overlay with reduced message complexity | First to integrate bandwidth-aware client count with PEFT for LLMs |
| Parameter (Symbol) | Value/Description |
|---|---|
| Max clients (K_max) | 100 |
| Rounds (R) | 10 |
| LoRA size (S) | 0.0833 MB |
| Bandwidth (B) | {10, 50, 100, 400, 1000, 4000} MB |
| Accuracy (A(K)) | Empirical, non-convex |
| Comm. cost (C(K)) | K · S · R |
| Population (P) | 20 |
| Generations (G) | 30 |
| Selection rate | {0.1, 0.2, 0.3, 0.4} |
| Crossover prob. (p_c) | 0.5 |
| Mutation prob. (p_m) | 0.2 |
| Elitism ratio (ρ) | 10% (top individuals preserved) |
| Budget (MB) | Optimal K | Accuracy (%) | Comm. Cost (MB) |
|---|---|---|---|
| 10 | 2 | 85.1 | 1.67 |
| 50 | 4 | 91.7 | 3.33 |
| 100 | 5 | 93.1 | 4.17 |
| 400 | 5 | 93.1 | 4.17 |
| 1000 | 12 | 96.6 | 10.00 |
| 4000 | 10 | 96.6 | 8.33 |
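As a sanity check, the communication costs in the table above are consistent with C(K) = K · S · R for the setup's S = 0.0833 MB and R = 10:

```python
S, R = 0.0833, 10
reported = {2: 1.67, 4: 3.33, 5: 4.17, 12: 10.00, 10: 8.33}  # K -> reported cost (MB)

# Each reported cost matches K * S * R to within rounding (0.01 MB).
for k, mb in reported.items():
    assert abs(k * S * R - mb) < 0.01, (k, mb)
print("all rows match C(K) = K*S*R")
```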
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Solat, F.; Lee, J. Optimizing Client Participation in Communication-Constrained Federated LLM Adaptation with LoRA. Sensors 2025, 25, 6538. https://doi.org/10.3390/s25216538

