Article

Gradual Geometry-Guided Knowledge Distillation for Source-Data-Free Domain Adaptation

IMI Group, University of Shanghai for Science and Technology, Shanghai 200093, China
* Author to whom correspondence should be addressed.
Mathematics 2025, 13(9), 1491; https://doi.org/10.3390/math13091491
Submission received: 8 April 2025 / Revised: 19 April 2025 / Accepted: 27 April 2025 / Published: 30 April 2025
(This article belongs to the Special Issue Robust Perception and Control in Prognostic Systems)

Abstract

Because conventional domain adaptation methods require access to the source data during the transfer phase, they have recently raised safety and privacy concerns. Research attention has therefore shifted to a more practical setting known as source-data-free domain adaptation (SFDA). The new challenge is how to obtain reliable semantic supervision in the absence of both source-domain training data and labels on the target domain. To that end, in this work we introduce a novel Gradual Geometry-Guided Knowledge Distillation (G2KD) approach for SFDA. Specifically, to address the lack of supervision, we use the local geometry of the data to construct a more credible probability distribution over the potential categories, termed geometry-guided knowledge. Knowledge distillation is then adopted to integrate this extra information and boost adaptation. More specifically, we first construct a neighborhood geometry for each target sample through a similarity comparison over the whole target dataset. Second, based on semantic estimates pre-obtained by clustering, we mine soft semantic representations that express the geometry-guided knowledge via semantic fusion. Third, using the softened labels, we perform knowledge distillation regulated by a new objective. Considering the unsupervised setting of SFDA, in addition to the distillation loss and the student loss, we introduce a mixed entropy regulator that minimizes the entropy of individual samples and maximizes the mutual entropy with augmented data to exploit neighbor relations. Our contribution is that, through local geometry discovery with semantic representation and self-knowledge distillation, the semantic information hidden in local structures is transformed into effective semantic self-supervision. Moreover, our knowledge distillation works in a gradual manner that helps capture dynamic variations in the local geometry, mitigating the degradation and deviation of previous guidance at the same time. Extensive experiments on five challenging benchmarks confirm the state-of-the-art performance of our method.
Keywords: domain adaptation; source-data-free; geometry-guided; gradual knowledge distillation; object recognition
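
To make the pipeline sketched in the abstract concrete, the following PyTorch-style fragment illustrates the two ingredients it names: fusing the cluster-based semantic estimates of each target sample's nearest neighbors into a geometry-guided soft label, and distilling the student network against those soft labels under an entropy-style regularizer. This is a minimal sketch written for this page, not the authors' implementation; the function names, the use of cosine similarity, the neighborhood size k, and the exact form of the regularizer (per-sample entropy minimization plus a marginal-entropy term over original and augmented predictions) are all illustrative assumptions.

import torch
import torch.nn.functional as F

def geometry_guided_soft_labels(features, cluster_probs, k=5, temperature=0.1):
    # features: (N, D) target features; cluster_probs: (N, C) semantic estimates
    # obtained beforehand, e.g., by clustering. Returns (N, C) soft labels.
    feats = F.normalize(features, dim=1)
    sim = feats @ feats.t()                        # similarity comparison over the whole target set
    _, nn_idx = sim.topk(k + 1, dim=1)             # neighborhood geometry (includes the sample itself)
    weights = F.softmax(sim.gather(1, nn_idx) / temperature, dim=1)
    # semantic fusion: similarity-weighted average of the neighbors' probability vectors
    return torch.einsum('nk,nkc->nc', weights, cluster_probs[nn_idx])

def adaptation_objective(student_logits, aug_logits, soft_labels, tau=2.0):
    # Distillation term against the geometry-guided soft labels plus an
    # entropy-style regularizer over original and augmented predictions
    # (an assumed stand-in for the paper's mixed entropy regulator).
    log_p = F.log_softmax(student_logits / tau, dim=1)
    kd_loss = F.kl_div(log_p, soft_labels, reduction='batchmean') * tau * tau
    probs = F.softmax(student_logits, dim=1)
    ent = -(probs * probs.clamp_min(1e-8).log()).sum(dim=1).mean()    # confident individual predictions
    mean_p = 0.5 * (probs + F.softmax(aug_logits, dim=1)).mean(dim=0)
    div = -(mean_p * mean_p.clamp_min(1e-8).log()).sum()              # diverse predictions overall
    return kd_loss + ent - div

# Example usage with random tensors (a real model would supply these):
# feats = torch.randn(128, 256)
# probs = F.softmax(torch.randn(128, 31), dim=1)
# soft = geometry_guided_soft_labels(feats, probs, k=5)

In the gradual scheme described above, such soft labels would be recomputed as the feature space evolves during adaptation, so the distillation targets track the changing local geometry rather than relying on a fixed initial estimate.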

Share and Cite

MDPI and ACS Style

Zhang, Y.; Tang, S. Gradual Geometry-Guided Knowledge Distillation for Source-Data-Free Domain Adaptation. Mathematics 2025, 13, 1491. https://doi.org/10.3390/math13091491

AMA Style

Zhang Y, Tang S. Gradual Geometry-Guided Knowledge Distillation for Source-Data-Free Domain Adaptation. Mathematics. 2025; 13(9):1491. https://doi.org/10.3390/math13091491

Chicago/Turabian Style

Zhang, Yangkuiyi, and Song Tang. 2025. "Gradual Geometry-Guided Knowledge Distillation for Source-Data-Free Domain Adaptation" Mathematics 13, no. 9: 1491. https://doi.org/10.3390/math13091491

APA Style

Zhang, Y., & Tang, S. (2025). Gradual Geometry-Guided Knowledge Distillation for Source-Data-Free Domain Adaptation. Mathematics, 13(9), 1491. https://doi.org/10.3390/math13091491

Note that from the first issue of 2016, this journal uses article numbers instead of page numbers.
