AI · Article · Open Access

20 November 2025

AdaLite: A Distilled AdaBins Model for Depth Estimation on Resource-Limited Devices

1. RCAM Laboratory, Department of Electronics, Djillali Liabes University, Sidi Bel Abbes 22000, Algeria
2. RCAM Laboratory, Department of Computer Science, Djillali Liabes University, Sidi Bel Abbes 22000, Algeria
3. Univ Rennes, INSA Rennes, CNRS, IETR-UMR 6164, F-35000 Rennes, France
4. IRECOM Laboratory, Djillali Liabes University, Sidi Bel Abbes 22000, Algeria
AI 2025, 6(11), 298; https://doi.org/10.3390/ai6110298

Abstract

This paper presents AdaLite, a knowledge distillation framework for monocular depth estimation designed for efficient deployment on resource-limited devices, without relying on quantization or pruning. While large-scale depth estimation networks achieve high accuracy, their computational and memory demands hinder real-time use. To address this problem, a large model is adopted as a teacher, and a compact encoder–decoder student with few trainable parameters is trained under a dual-supervision scheme that aligns its predictions with both teacher feature maps and ground-truth depths. AdaLite is evaluated on the NYUv2, SUN RGB-D, and KITTI benchmarks using standard depth metrics and deployment-oriented measures, including inference latency. The distilled model achieves a 94% reduction in model size and reaches 1.02 FPS on a Raspberry Pi 2 (2 GB CPU), while preserving 96.8% of the teacher’s accuracy (δ1) and providing over 11× faster inference. These results demonstrate the effectiveness of distillation-driven compression for real-time depth estimation in resource-limited environments. The code is publicly available.
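The dual-supervision scheme described above combines a feature-alignment term against the teacher with a ground-truth depth term, and evaluation uses the standard δ1 threshold accuracy. The sketch below illustrates both in plain NumPy; the loss weights `alpha`/`beta`, the function names, and the choice of MSE for feature alignment and L1 for depth supervision are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def dual_supervision_loss(student_feat, teacher_feat, pred_depth, gt_depth,
                          alpha=0.5, beta=0.5):
    """Illustrative combined objective: a distillation term aligning
    student features with teacher features, plus a supervised term
    against ground-truth depth. Weights alpha/beta are hypothetical."""
    # Feature distillation: mean-squared error between feature maps
    feat_loss = np.mean((student_feat - teacher_feat) ** 2)
    # Depth supervision: mean absolute error against ground truth
    depth_loss = np.mean(np.abs(pred_depth - gt_depth))
    return alpha * feat_loss + beta * depth_loss

def delta1_accuracy(pred_depth, gt_depth, threshold=1.25):
    """Standard δ1 metric: fraction of pixels whose prediction/ground-truth
    ratio (taken in the larger direction) falls below 1.25."""
    ratio = np.maximum(pred_depth / gt_depth, gt_depth / pred_depth)
    return float(np.mean(ratio < threshold))
```

For example, a prediction that doubles the true depth at one of three pixels yields δ1 = 2/3, since only the two exact pixels satisfy max(pred/gt, gt/pred) < 1.25.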
