Abstract
Whole slide imaging has transformed the field of pathology by enabling high-resolution digitization of histopathological slides. However, the large image size and the variability in morphology, tissue processing, and imaging pose challenges for robust computational analysis. For specific tasks in digital pathology, conventional feature extractors pretrained on general images may not provide features as relevant as those trained on histopathological images. To address this, foundation models pretrained on histopathological images have been developed, yet their large size and computational demands may limit widespread adoption for specific tasks. To facilitate low-cost adoption of these models, we finetuned a foundation model using low-rank adaptation (LoRA) and developed evolving prototype-based multiple instance learning (EP-MIL). We demonstrate our method on the classification of two histological subtypes of lung cancer. The results show that our approach achieves performance competitive with a state-of-the-art technique (CLAM) while improving efficiency: it requires 8.3 times less training runtime than CLAM, uses less than 200 MB of memory during training, and achieves 73.8 times faster inference. These efficiency gains, combined with competitive performance, suggest that combining evolving prototypes with LoRA-tuned foundation models offers a practical route to broader use of foundation models in resource-constrained clinical settings.
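To make the LoRA component concrete, the following is a minimal illustrative sketch of the low-rank update underlying such finetuning; it is not the paper's implementation, and all names, shapes, and hyperparameter values are assumptions. A frozen pretrained weight W is augmented by a trainable low-rank product B @ A scaled by alpha/r, so only the small A and B matrices are updated during finetuning:

```python
import numpy as np

# Illustrative LoRA-style layer: y = x W^T + (alpha/r) * x (B A)^T.
# W stays frozen; only the low-rank factors A and B would be trained.
rng = np.random.default_rng(0)
d_out, d_in, r, alpha = 4, 6, 2, 8.0      # toy sizes, assumed for illustration

W = rng.standard_normal((d_out, d_in))    # frozen pretrained weight
A = rng.standard_normal((r, d_in)) * 0.01 # trainable, small random init
B = np.zeros((d_out, r))                  # trainable, zero init

def forward(x):
    # Frozen path plus scaled low-rank trainable path.
    return x @ W.T + (x @ A.T @ B.T) * (alpha / r)

x = rng.standard_normal((3, d_in))
# With B initialized to zero, the low-rank path contributes nothing,
# so the adapted layer initially matches the frozen pretrained layer.
assert np.allclose(forward(x), x @ W.T)
```

Because only A and B (r*(d_in + d_out) parameters) are trained while W (d_out*d_in parameters) stays frozen, the trainable parameter count and optimizer state shrink sharply, which is the mechanism behind the low memory footprint reported above.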