Article

Exploring Universal Domain Adaptation with CLIP Models: A Calibration Method

School of Electronic and Information Engineering, Wuyi University, Jiangmen 529020, China
Entropy 2025, 27(12), 1213; https://doi.org/10.3390/e27121213 (registering DOI)
Submission received: 7 September 2025 / Revised: 22 November 2025 / Accepted: 27 November 2025 / Published: 28 November 2025
(This article belongs to the Special Issue Entropy in Machine Learning Applications, 2nd Edition)

Abstract

CLIP models have shown impressive learning and transfer capabilities across a wide range of visual tasks. Surprisingly, however, these foundation models have not been fully explored for Universal Domain Adaptation (UniDA). In this paper, we conduct comprehensive empirical studies of state-of-the-art UniDA methods using these foundation models. We first demonstrate that although the foundation models greatly improve the performance of the baseline method (which trains the models on the source data alone), existing UniDA methods struggle to improve over this baseline. This suggests that new research efforts are needed for UniDA with these foundation models. We then observe that calibration of CLIP models plays a key role in UniDA. To this end, we propose a very simple calibration method via automatic temperature scaling, which significantly enhances the baseline’s out-of-class detection capability. We show that a single learned temperature outperforms previous approaches in most benchmark tasks when adapting from CLIP models, excelling in evaluation metrics including the H-score and a newly proposed Universal Classification Rate (UCR) metric. We hope that our investigation and the proposed simple framework can serve as a strong baseline to facilitate future studies in this field.
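The abstract's core idea, learning a single temperature to calibrate a classifier's confidences, can be illustrated with a standard temperature-scaling sketch in the style of post-hoc calibration. This is a generic illustration, not the paper's exact "automatic temperature scaling" procedure (whose details are not given here): a scalar temperature `T` is fit on held-out logits and labels by minimizing the negative log-likelihood, and is then applied at inference to soften the logits before confidence thresholding (e.g., for out-of-class detection). The grid-search fitting strategy and the function names are illustrative choices.

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-scaled softmax; larger T gives softer probabilities."""
    z = logits / T
    z = z - z.max(axis=1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def fit_temperature(logits, labels, grid=np.linspace(0.05, 5.0, 200)):
    """Pick the scalar temperature minimizing held-out NLL.

    A 1-D grid search suffices because only a single parameter is learned;
    gradient-based optimization of the same objective is equally common.
    """
    best_T, best_nll = 1.0, np.inf
    for T in grid:
        p = softmax(logits, T)
        nll = -np.log(p[np.arange(len(labels)), labels] + 1e-12).mean()
        if nll < best_nll:
            best_T, best_nll = T, nll
    return best_T
```

Because the temperature rescales all class logits uniformly, it changes confidence estimates (and hence any out-of-class rejection threshold) without changing the predicted class, which is what makes a single learned scalar such a lightweight calibration knob.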
Keywords: Universal Domain Adaptation; CLIP; model calibration

Share and Cite

MDPI and ACS Style

Deng, B. Exploring Universal Domain Adaptation with CLIP Models: A Calibration Method. Entropy 2025, 27, 1213. https://doi.org/10.3390/e27121213


Note that from the first issue of 2016, this journal uses article numbers instead of page numbers.
