Multi-Source Unsupervised Domain Adaptation with Prototype Aggregation
Abstract
1. Introduction
- We propose a prototype aggregation method to address three issues: class-level discrepancy quantification, the unreliability of noisy pseudo-labels, and source transferability discrimination. Furthermore, we are the first to propose a complementary mechanism in which class-level and domain-level alignment methods work together.
- We propose a similarity score-based strategy to assess the transferability of source domains. Additionally, we design two prototype aggregation discrepancy metrics to quantify the cross-domain discrepancy at the class and domain levels.
- Extensive experiments on three widely used benchmark datasets demonstrate that our method achieves superior performance compared to state-of-the-art methods.
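The first two contributions can be illustrated with a minimal sketch (the function names, cosine-similarity choice, and softmax normalization here are illustrative assumptions, not the paper's exact formulation): class prototypes are computed as per-class mean features, and each source domain receives a transferability score from the similarity between its class prototypes and the target's.

```python
import numpy as np

def class_prototypes(features, labels, num_classes):
    """Per-class mean feature vectors (one prototype per class)."""
    return np.stack([features[labels == c].mean(axis=0)
                     for c in range(num_classes)])

def source_similarity_scores(source_protos, target_protos):
    """Cosine similarity between matching class prototypes, averaged
    over classes, then softmax-normalized across source domains."""
    scores = []
    for protos in source_protos:  # one prototype matrix per source domain
        num = (protos * target_protos).sum(axis=1)
        den = (np.linalg.norm(protos, axis=1)
               * np.linalg.norm(target_protos, axis=1))
        scores.append((num / (den + 1e-8)).mean())
    weights = np.exp(scores)
    return weights / weights.sum()
```

A source whose prototypes lie closer to the target prototypes receives a larger weight, which is the intuition behind discriminating source transferability.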
2. Related Work
2.1. Single-Source Domain Adaptation
2.2. Multi-Source Domain Adaptation
3. Method
3.1. Problem Description
3.2. Overall Scheme
3.3. Prototype Generation
3.4. Prototype Aggregation
3.5. Objective Construction
Algorithm 1: The PAMDA algorithm
3.6. Theoretical Error Analysis
4. Experiments
4.1. Datasets
4.2. Comparison with Other Algorithms
4.3. Experimental Setups
4.4. Results
4.5. Ablation Analysis
4.6. Component Comparison
4.7. Hyperparameter Analysis
4.8. Visualization
4.9. Discussion
- The PAMDA algorithm demonstrated superior performance on three benchmarks, validating its efficacy.
- Ablation results show that each component of the PAMDA model contributes positively to performance.
- Component comparison experimental results reveal two points. First, class–prototype aggregation explores the correlation of class-specific semantic features in greater depth than domain–prototype aggregation. However, the reliance on high-confidence pseudo labels leads to the misalignment of low-confidence pseudo-annotated target features. Second, class–prototype aggregation in synergy with domain–prototype aggregation can align both class features of high-confidence pseudo-labeled target samples and domain features of low-confidence pseudo-labeled target samples.
- Hyperparameter analysis reveals the resilience and stability of PAMDA under hyperparameter variation.
- The visualization results demonstrate that our PAMDA model establishes a better distributional alignment compared to the Source Only model and that our similarity score-based strategy delves deeply into the class structural features.
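The synergy described in the component comparison can be sketched as a combined objective (a hypothetical simplification; the confidence threshold, squared-distance surrogate, and function names are assumptions, not the paper's exact losses): high-confidence pseudo-labeled target samples are aligned class-wise to source class prototypes, while low-confidence samples fall back to domain-level alignment.

```python
import numpy as np

def combined_alignment_loss(target_feats, target_probs,
                            src_class_protos, src_domain_proto,
                            conf_threshold=0.9):
    """Squared-distance surrogate for the two alignment paths:
    confident samples -> their pseudo-class prototype (class level),
    unconfident samples -> the domain prototype (domain level)."""
    conf = target_probs.max(axis=1)
    pseudo = target_probs.argmax(axis=1)
    hi = conf >= conf_threshold
    loss = 0.0
    if hi.any():  # class-level term for high-confidence samples
        diffs = target_feats[hi] - src_class_protos[pseudo[hi]]
        loss += (diffs ** 2).sum(axis=1).mean()
    if (~hi).any():  # domain-level term for low-confidence samples
        diffs = target_feats[~hi] - src_domain_proto
        loss += (diffs ** 2).sum(axis=1).mean()
    return loss
```

Routing by pseudo-label confidence is what lets class-level alignment exploit semantic structure without forcing misalignment on unreliable pseudo-labels, which are still pulled toward the source distribution at the domain level.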
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
Abbreviations
Abbreviation | Meaning
---|---
MSDA | Multi-Source Domain Adaptation
SDA | Single-Source Domain Adaptation
MMD | Maximum Mean Discrepancy
KL | Kullback–Leibler
JS | Jensen–Shannon
MRF-MSDA | Markov Random Field for Multi-Source Domain Adaptation
References
- Yosinski, J.; Clune, J.; Bengio, Y.; Lipson, H. How transferable are features in deep neural networks? Adv. Neural Inf. Process. Syst. 2014, 27, 3320–3328. [Google Scholar]
- Quinonero-Candela, J.; Sugiyama, M.; Schwaighofer, A.; Lawrence, N.D. Dataset Shift in Machine Learning; MIT Press: Cambridge, MA, USA, 2008. [Google Scholar]
- Saenko, K.; Kulis, B.; Fritz, M.; Darrell, T. Adapting visual category models to new domains. In Proceedings of the Computer Vision–ECCV 2010: 11th European Conference on Computer Vision, Heraklion, Greece, 5–11 September 2010; Proceedings, Part IV 11. Springer: Berlin/Heidelberg, Germany, 2010; pp. 213–226. [Google Scholar]
- Ong, J.; Waisberg, E.; Masalkhi, M.; Kamran, S.A.; Lowry, K.; Sarker, P.; Zaman, N.; Paladugu, P.; Tavakkoli, A.; Lee, A.G. Artificial intelligence frameworks to detect and investigate the pathophysiology of spaceflight associated neuro-ocular syndrome (SANS). Brain Sci. 2023, 13, 1148. [Google Scholar] [CrossRef]
- Kumari, S.; Singh, P. Deep learning for unsupervised domain adaptation in medical imaging: Recent advancements and future perspectives. Comput. Biol. Med. 2024, 170, 107912. [Google Scholar] [CrossRef] [PubMed]
- Buonocore, T.M.; Crema, C.; Redolfi, A.; Bellazzi, R.; Parimbelli, E. Localizing in-domain adaptation of transformer-based biomedical language models. J. Biomed. Inform. 2023, 144, 104431. [Google Scholar] [CrossRef] [PubMed]
- Yang, L.; Balaji, Y.; Lim, S.N.; Shrivastava, A. Curriculum manager for source selection in multi-source domain adaptation. In Proceedings of the Computer Vision–ECCV 2020: 16th European Conference, Glasgow, UK, 23–28 August 2020; Proceedings, Part XIV 16. Springer: Berlin/Heidelberg, Germany, 2020; pp. 608–624. [Google Scholar]
- Xu, Y.; Kan, M.; Shan, S.; Chen, X. Mutual learning of joint and separate domain alignments for multi-source domain adaptation. In Proceedings of the IEEE/CVF Winter Conference on Applications of Computer Vision, Waikoloa, HI, USA, 3–8 January 2022; pp. 1890–1899. [Google Scholar]
- Peng, X.; Bai, Q.; Xia, X.; Huang, Z.; Saenko, K.; Wang, B. Moment matching for multi-source domain adaptation. In Proceedings of the IEEE/CVF International Conference on Computer Vision, Seoul, Republic of Korea, 27 October–2 November 2019; pp. 1406–1415. [Google Scholar]
- Zhao, S.; Wang, G.; Zhang, S.; Gu, Y.; Li, Y.; Song, Z.; Xu, P.; Hu, R.; Chai, H.; Keutzer, K. Multi-source distilling domain adaptation. In Proceedings of the AAAI Conference on Artificial Intelligence, New York, NY, USA, 7–8 February 2020; Volume 34, pp. 12975–12983. [Google Scholar]
- Zhou, L.; Li, N.; Ye, M.; Zhu, X.; Tang, S. Source-free domain adaptation with class prototype discovery. Pattern Recognit. 2024, 145, 109974. [Google Scholar] [CrossRef]
- Zhou, C.; Wang, Z.; Du, B.; Luo, Y. Cycle Self-Refinement for Multi-Source Domain Adaptation. In Proceedings of the AAAI Conference on Artificial Intelligence, Vancouver, BC, Canada, 20–27 February 2024; Volume 38, pp. 17096–17104. [Google Scholar]
- Pan, S.J.; Yang, Q. A survey on transfer learning. IEEE Trans. Knowl. Data Eng. 2009, 22, 1345–1359. [Google Scholar] [CrossRef]
- Cheng, Z.; Wang, S.; Yang, D.; Qi, J.; Xiao, M.; Yan, C. Deep joint semantic adaptation network for multi-source unsupervised domain adaptation. Pattern Recognit. 2024, 151, 110409. [Google Scholar] [CrossRef]
- Li, Z.; Cai, R.; Chen, G.; Sun, B.; Hao, Z.; Zhang, K. Subspace identification for multi-source domain adaptation. Adv. Neural Inf. Process. Syst. 2024, 36. Available online: https://proceedings.neurips.cc/paper_files/paper/2023/hash/6cb7246003d556c4d1cbf9c17c392ee3-Abstract-Conference.html (accessed on 8 January 2025).
- Wang, H.; Xu, M.; Ni, B.; Zhang, W. Learning to combine: Knowledge aggregation for multi-source domain adaptation. In Proceedings of the Computer Vision–ECCV 2020: 16th European Conference, Glasgow, UK, 23–28 August 2020; Proceedings, Part VIII 16. Springer: Berlin/Heidelberg, Germany, 2020; pp. 727–744. [Google Scholar]
- Xu, M.; Wang, H.; Ni, B. Graphical modeling for multi-source domain adaptation. IEEE Trans. Pattern Anal. Mach. Intell. 2022, 46, 1727–1741. [Google Scholar] [CrossRef] [PubMed]
- Zhu, Y.; Zhuang, F.; Wang, D. Aligning domain-specific distribution and classifier for cross-domain classification from multiple sources. In Proceedings of the AAAI Conference on Artificial Intelligence, Honolulu, HI, USA, 27 January–1 February 2019; Volume 33, pp. 5989–5996. [Google Scholar]
- Zhao, H.; Zhang, S.; Wu, G.; Moura, J.M.; Costeira, J.P.; Gordon, G.J. Adversarial multiple source domain adaptation. Adv. Neural Inf. Process. Syst. 2018, 31, 8568–8579. [Google Scholar]
- Qian, Q.; Wang, Y.; Zhang, T.; Qin, Y. Maximum mean square discrepancy: A new discrepancy representation metric for mechanical fault transfer diagnosis. Knowl.-Based Syst. 2023, 276, 110748. [Google Scholar] [CrossRef]
- Simon-Gabriel, C.J.; Barp, A.; Schölkopf, B.; Mackey, L. Metrizing weak convergence with maximum mean discrepancies. J. Mach. Learn. Res. 2023, 24, 1–20. [Google Scholar]
- Ge, P.; Ren, C.X.; Xu, X.L.; Yan, H. Unsupervised domain adaptation via deep conditional adaptation network. Pattern Recognit. 2023, 134, 109088. [Google Scholar] [CrossRef]
- Zhang, Y.; Pan, J.; Li, L.K.; Liu, W.; Chen, Z.; Liu, X.; Wang, J. On the properties of Kullback-Leibler divergence between multivariate Gaussian distributions. Adv. Neural Inf. Process. Syst. 2024, 36, 58152–58165. [Google Scholar]
- Cui, J.; Tian, Z.; Zhong, Z.; Qi, X.; Yu, B.; Zhang, H. Decoupled kullback-leibler divergence loss. arXiv 2023, arXiv:2305.13948. [Google Scholar]
- Chen, L.; Deng, Y.; Cheong, K.H. Permutation Jensen–Shannon divergence for random permutation set. Eng. Appl. Artif. Intell. 2023, 119, 105701. [Google Scholar] [CrossRef]
- Meng, Z.; He, H.; Cao, W.; Li, J.; Cao, L.; Fan, J.; Zhu, M.; Fan, F. A novel generation network using feature fusion and guided adversarial learning for fault diagnosis of rotating machinery. Expert Syst. Appl. 2023, 234, 121058. [Google Scholar] [CrossRef]
- Long, M.; Cao, Z.; Wang, J.; Jordan, M.I. Conditional adversarial domain adaptation. In Proceedings of the 32nd International Conference on Neural Information Processing Systems, Montréal, QC, Canada, 3–8 December 2018; pp. 1647–1657. [Google Scholar]
- Luo, X.; Chen, W.; Liang, Z.; Li, C.; Tan, Y. Adversarial style discrepancy minimization for unsupervised domain adaptation. Neural Netw. 2023, 157, 216–225. [Google Scholar] [CrossRef] [PubMed]
- Dayal, A.; Aishwarya, M.; Abhilash, S.; Mohan, C.K.; Kumar, A.; Cenkeramaddi, L.R. Adversarial unsupervised domain adaptation for hand gesture recognition using thermal images. IEEE Sens. J. 2023, 23, 3493–3504. [Google Scholar] [CrossRef]
- Li, J.; Yu, Z.; Du, Z.; Zhu, L.; Shen, H.T. A comprehensive survey on source-free domain adaptation. IEEE Trans. Pattern Anal. Mach. Intell. 2024, 46, 5743–5762. [Google Scholar] [CrossRef] [PubMed]
- Chen, Z.; Pan, Y.; Xia, Y. Reconstruction-driven dynamic refinement based unsupervised domain adaptation for joint optic disc and cup segmentation. IEEE J. Biomed. Health Inform. 2023, 27, 3537–3548. [Google Scholar] [CrossRef] [PubMed]
- Zhu, W.; Shi, B.; Feng, Z.; Tang, J. An unsupervised domain adaptation method for intelligent bearing fault diagnosis based on signal reconstruction by cycle-consistent adversarial learning. IEEE Sens. J. 2023, 23, 18477–18485. [Google Scholar] [CrossRef]
- Ben-David, S.; Blitzer, J.; Crammer, K.; Kulesza, A.; Pereira, F.; Vaughan, J.W. A theory of learning from different domains. Mach. Learn. 2010, 79, 151–175. [Google Scholar] [CrossRef]
- Crammer, K.; Kearns, M.; Wortman, J. Learning from Multiple Sources. J. Mach. Learn. Res. 2008, 9, 1757–1774. [Google Scholar]
- Mansour, Y.; Mohri, M.; Rostamizadeh, A. Domain adaptation with multiple sources. Adv. Neural Inf. Process. Syst. 2008, 21, 1041–1048. [Google Scholar]
- Wen, J.; Greiner, R.; Schuurmans, D. Domain aggregation networks for multi-source domain adaptation. In Proceedings of the International Conference on Machine Learning, PMLR, Virtual, 13–18 July 2020; pp. 10214–10224. [Google Scholar]
- Shui, C.; Li, Z.; Li, J.; Gagné, C.; Ling, C.X.; Wang, B. Aggregating from multiple target-shifted sources. In Proceedings of the International Conference on Machine Learning, PMLR, Virtual, 18–24 July 2021; pp. 9638–9648. [Google Scholar]
- Chen, Q.; Marchand, M. Algorithm-dependent bounds for representation learning of multi-source domain adaptation. In Proceedings of the International Conference on Artificial Intelligence and Statistics, PMLR, Valencia, Spain, 25–27 April 2023; pp. 10368–10394. [Google Scholar]
- Zhang, W.; Ouyang, W.; Li, W.; Xu, D. Collaborative and adversarial network for unsupervised domain adaptation. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Salt Lake City, UT, USA, 18–22 June 2018; pp. 3801–3809. [Google Scholar]
- LeCun, Y.; Bottou, L.; Bengio, Y.; Haffner, P. Gradient-based learning applied to document recognition. Proc. IEEE 1998, 86, 2278–2324. [Google Scholar] [CrossRef]
- Ganin, Y.; Ustinova, E.; Ajakan, H.; Germain, P.; Larochelle, H.; Laviolette, F.; Marchand, M.; Lempitsky, V. Domain-adversarial training of neural networks. J. Mach. Learn. Res. 2016, 17, 1–35. [Google Scholar]
- Hull, J.J. A database for handwritten text recognition research. IEEE Trans. Pattern Anal. Mach. Intell. 1994, 16, 550–554. [Google Scholar] [CrossRef]
- Netzer, Y.; Wang, T.; Coates, A.; Bissacco, A.; Wu, B.; Ng, A.Y. Reading digits in natural images with unsupervised feature learning. In Proceedings of the NIPS Workshop on Deep Learning and Unsupervised Feature Learning, Granada, Spain, 12–17 December 2011. [Google Scholar]
- Gong, B.; Shi, Y.; Sha, F.; Grauman, K. Geodesic flow kernel for unsupervised domain adaptation. In Proceedings of the 2012 IEEE Conference on Computer Vision and Pattern Recognition, Providence, RI, USA, 16–21 June 2012; IEEE: Piscataway, NJ, USA, 2012; pp. 2066–2073. [Google Scholar]
- Long, M.; Zhu, H.; Wang, J.; Jordan, M.I. Deep transfer learning with joint adaptation networks. In Proceedings of the International Conference on Machine Learning, PMLR, Sydney, Australia, 6–11 August 2017; pp. 2208–2217. [Google Scholar]
- Saito, K.; Watanabe, K.; Ushiku, Y.; Harada, T. Maximum classifier discrepancy for unsupervised domain adaptation. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Salt Lake City, UT, USA, 18–22 June 2018; pp. 3723–3732. [Google Scholar]
- Long, M.; Cao, Y.; Wang, J.; Jordan, M. Learning transferable features with deep adaptation networks. In Proceedings of the International Conference on Machine Learning, PMLR, San Diego, CA, USA, 9–12 May 2015; pp. 97–105. [Google Scholar]
- Tzeng, E.; Hoffman, J.; Saenko, K.; Darrell, T. Adversarial discriminative domain adaptation. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Honolulu, HI, USA, 21–26 July 2017; pp. 7167–7176. [Google Scholar]
- Li, Y.; Yuan, L.; Chen, Y.; Wang, P.; Vasconcelos, N. Dynamic transfer for multi-source domain adaptation. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, Nashville, TN, USA, 19–25 June 2021; pp. 10998–11007. [Google Scholar]
- Xu, R.; Chen, Z.; Zuo, W.; Yan, J.; Lin, L. Deep cocktail network: Multi-source unsupervised domain adaptation with category shift. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Salt Lake City, UT, USA, 18–22 June 2018; pp. 3964–3973. [Google Scholar]
- Paszke, A.; Gross, S.; Chintala, S.; Chanan, G.; Yang, E.; DeVito, Z.; Lin, Z.; Desmaison, A.; Antiga, L.; Lerer, A. Automatic differentiation in pytorch. In Proceedings of the 31st Conference on Neural Information Processing Systems (NIPS 2017), Long Beach, CA, USA, 4–9 December 2017. [Google Scholar]
- He, K.; Zhang, X.; Ren, S.; Sun, J. Deep residual learning for image recognition. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Honolulu, HI, USA, 21–26 July 2016; pp. 770–778. [Google Scholar]
- Krizhevsky, A.; Sutskever, I.; Hinton, G.E. ImageNet classification with deep convolutional neural networks. Commun. ACM 2017, 60, 84–90. [Google Scholar] [CrossRef]
- Ganin, Y.; Lempitsky, V. Unsupervised domain adaptation by backpropagation. In Proceedings of the International Conference on Machine Learning, PMLR, Lille, France, 6–11 July 2015; pp. 1180–1189. [Google Scholar]
- Van der Maaten, L.; Hinton, G. Visualizing data using t-SNE. J. Mach. Learn. Res. 2008, 9, 2579–2605. [Google Scholar]
Standards | Models | | | | | | Avg
---|---|---|---|---|---|---|---
Single Best | Source Only | 97.2 ± 0.6 | 59.1 ± 0.6 | 84.6 ± 0.8 | 77.7 ± 0.7 | 84.7 ± 1.0 | 80.7
 | ADDA [48] | 97.9 ± 0.8 | 71.6 ± 0.5 | 86.5 ± 0.6 | 75.5 ± 0.5 | 92.8 ± 0.7 | 84.8
 | DAN [47] | 96.3 ± 0.5 | 63.8 ± 0.7 | 85.4 ± 0.8 | 62.5 ± 0.7 | 94.2 ± 0.9 | 80.4
Source Combination | Source Only | 90.2 ± 0.8 | 63.4 ± 0.8 | 82.4 ± 0.7 | 62.9 ± 0.9 | 88.8 ± 0.8 | 77.5
 | DAN [47] | 97.5 ± 0.6 | 67.9 ± 0.8 | 86.9 ± 0.5 | 67.8 ± 0.6 | 93.5 ± 0.8 | 82.7
 | MCD [46] | 96.2 ± 0.8 | 72.5 ± 0.7 | 87.5 ± 0.7 | 78.9 ± 0.8 | 95.3 ± 0.7 | 86.1
Multiple Source | DCTN [50] | 96.2 ± 0.8 | 70.5 ± 1.2 | 86.8 ± 0.8 | 77.6 ± 0.4 | 92.8 ± 0.3 | 84.8
 | DRT [49] | 99.3 ± 0.1 | 81.0 ± 0.3 | 93.8 ± 0.3 | 77.6 ± 0.4 | 98.4 ± 0.1 | 91.8
 | [9] | 98.4 ± 0.7 | 72.8 ± 1.1 | 89.6 ± 0.6 | 81.3 ± 0.9 | 96.1 ± 0.8 | 87.7
 | Ltc-MSDA [16] | 99.0 ± 0.4 | 85.6 ± 0.8 | 93.0 ± 0.5 | 83.2 ± 0.6 | 98.3 ± 0.4 | 91.8
 | MLAN [8] | 98.6 ± 0.0 | 86.3 ± 0.3 | 93.0 ± 0.3 | 82.8 ± 0.1 | 97.5 ± 0.2 | 91.6
 | MRF-MSDA [17] | 99.2 ± 0.2 | 90.7 ± 0.7 | 94.7 ± 0.5 | 85.8 ± 0.7 | 98.5 ± 0.4 | 93.7
 | PAMDA (ours) | 99.1 ± 0.0 | 95.2 ± 0.3 | 95.3 ± 0.2 | 82.7 ± 0.4 | 98.8 ± 0.1 | 94.2
Models | | | | | Avg
---|---|---|---|---|---
Source Only | 99.0 | 98.3 | 87.8 | 86.1 | 92.8 |
DCTN [50] | 99.4 | 99.0 | 90.2 | 92.7 | 95.3 |
MCD [46] | 99.5 | 99.1 | 91.5 | 92.1 | 95.6 |
JAN [45] | 99.4 | 99.4 | 91.2 | 91.8 | 95.5 |
[9] | 99.4 | 99.2 | 91.5 | 94.1 | 96.1 |
PAMDA (ours) | 99.3 ± 0.3 | 100.0 ± 0.0 | 94.6 ± 0.1 | 95.2 ± 0.1 | 97.3 |
Standards | Models | | | | Avg
---|---|---|---|---|---
Single Best | Source Only | 95.3 | 99.2 | 50.3 | 81.6
 | ADDA [48] | 95.3 | 99.4 | 54.6 | 83.1
 | DAN [47] | 96.0 | 99.0 | 54.0 | 83.0
Source Combination | Source Only | 93.2 | 97.7 | 51.6 | 80.8
 | DAN [47] | 96.2 | 98.8 | 54.9 | 83.3
 | JAN [45] | 95.9 | 99.4 | 54.6 | 83.3
 | MCD [46] | 96.2 | 99.5 | 54.4 | 83.4
 | ADDA [48] | 96.0 | 99.2 | 55.9 | 83.7
Multiple Source | MDAN [19] | 95.4 | 99.2 | 55.2 | 83.3
 | [9] | 96.2 | 99.4 | 55.4 | 83.7
 | MDDA [10] | 97.1 | 99.2 | 56.2 | 84.2
 | DCTN [50] | 96.9 | 99.6 | 54.9 | 83.8
 | PAMDA (ours) | 97.2 ± 0.1 | 99.6 ± 0.0 | 56.5 ± 0.2 | 84.4
Avg | |||||||||
---|---|---|---|---|---|---|---|---|---|
√ | √ | 99.1 | 65.3 | 83.5 | 71.3 | 96.7 | 83.0 | ||
√ | √ | √ | 99.1 | 94.8 | 95.2 | 79.0 | 98.8 | 93.4 | |
√ | √ | √ | 98.4 | 82.2 | 88.3 | 77.1 | 95.3 | 88.3 | |
√ | √ | √ | 98.8 | 94.3 | 94.5 | 82.7 | 98.2 | 93.7 | |
√ | √ | √ | √ | 99.1 | 95.2 | 95.3 | 82.7 | 98.8 | 94.2 |
Models | | | | | | Avg
---|---|---|---|---|---|---
Class-Only | 99.1 | 94.8 | 95.2 | 79.0 | 98.8 | 93.4 |
Domain-Only | 98.9 | 92.1 | 93.6 | 81.1 | 96.6 | 92.5 |
PAMDA | 99.1 | 95.2 | 95.3 | 82.7 | 98.8 | 94.2 |
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Huang, M.; Xie, Z.; Sun, B.; Wang, N. Multi-Source Unsupervised Domain Adaptation with Prototype Aggregation. Mathematics 2025, 13, 579. https://doi.org/10.3390/math13040579