Article

LossTransform: Reformulating the Loss Function for Contrastive Learning

Zheng Li, Jerry Cheng and Huanying Helen Gu
Department of Computer Science, New York Institute of Technology, New York, NY 10023, USA
* Author to whom correspondence should be addressed.
These authors contributed equally to this work.
Information 2025, 16(12), 1068; https://doi.org/10.3390/info16121068
Submission received: 15 September 2025 / Revised: 30 November 2025 / Accepted: 1 December 2025 / Published: 3 December 2025

Abstract

Contrastive learning improves model performance by differentiating between positive and negative sample pairs. Its application, however, has been largely confined to classification tasks; it struggles with complex recognition tasks such as object detection and segmentation because of its limited capacity to capture spatial relationships and fine-grained features. To address this limitation, we propose LossTransform, an approach that redefines positive sample pairs and establishes a new contrastive loss paradigm, advancing contrastive learning from the traditional sample level to the instance level. Empirical evaluations on ImageNet, CIFAR, and object detection benchmarks show that LossTransform improves accuracy by +2.73% on CIFAR and +2.52% on ImageNet, and improves average precision by up to +5.2% on detection tasks, while maintaining efficiency. These results demonstrate that LossTransform is compatible with large-scale training pipelines and performs robustly across diverse and complex datasets. By improving model performance while significantly reducing training time, this work enables more efficient and accessible solutions for societal applications.
Keywords: loss function; ensemble method; repeated augmentation; test-time augmentation
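For context on the objective the abstract refers to, the sketch below shows the standard sample-level InfoNCE (NT-Xent) contrastive loss, in which two augmented views of each sample form a positive pair and all other samples in the batch serve as negatives. This is the conventional baseline that LossTransform reformulates; it is not the paper's LossTransform objective itself, and the function name, temperature value, and tensor shapes are illustrative assumptions.

```python
# Minimal sketch of the standard InfoNCE (NT-Xent) contrastive loss.
# This is the sample-level baseline, NOT the paper's LossTransform objective;
# names, shapes, and the temperature value are illustrative assumptions.
import torch
import torch.nn.functional as F

def info_nce_loss(z1: torch.Tensor, z2: torch.Tensor, temperature: float = 0.5) -> torch.Tensor:
    """z1, z2: (N, D) embeddings of two augmented views of the same N samples."""
    n = z1.size(0)
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)  # (2N, D), unit-norm rows
    sim = z @ z.t() / temperature                       # (2N, 2N) scaled cosine similarities
    sim.fill_diagonal_(float("-inf"))                   # exclude each row's self-similarity
    # For row i, the positive is the other augmented view of the same sample:
    # rows 0..N-1 pair with N..2N-1 and vice versa.
    targets = (torch.arange(2 * n, device=z.device) + n) % (2 * n)
    return F.cross_entropy(sim, targets)

# Usage sketch: embeddings from two augmentations of the same batch.
z1, z2 = torch.randn(8, 128), torch.randn(8, 128)
loss = info_nce_loss(z1, z2)
```

Under this formulation, contrast happens only between whole samples, which is consistent with the abstract's observation that the standard loss captures little spatial or fine-grained structure; the paper's instance-level reformulation is described in the full text.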

Share and Cite

MDPI and ACS Style

Li, Z.; Cheng, J.; Gu, H.H. LossTransform: Reformulating the Loss Function for Contrastive Learning. Information 2025, 16, 1068. https://doi.org/10.3390/info16121068

AMA Style

Li Z, Cheng J, Gu HH. LossTransform: Reformulating the Loss Function for Contrastive Learning. Information. 2025; 16(12):1068. https://doi.org/10.3390/info16121068

Chicago/Turabian Style

Li, Zheng, Jerry Cheng, and Huanying Helen Gu. 2025. "LossTransform: Reformulating the Loss Function for Contrastive Learning" Information 16, no. 12: 1068. https://doi.org/10.3390/info16121068

APA Style

Li, Z., Cheng, J., & Gu, H. H. (2025). LossTransform: Reformulating the Loss Function for Contrastive Learning. Information, 16(12), 1068. https://doi.org/10.3390/info16121068

Note that from the first issue of 2016, this journal uses article numbers instead of page numbers.
