Distill Any Depth

Introduction

We present Distill-Any-Depth, a state-of-the-art monocular depth estimation model trained with our proposed knowledge distillation algorithms. It was introduced in the paper Distill Any Depth: Distillation Creates a Stronger Monocular Depth Estimator. Models of various sizes are available in this repository.
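
Individual checkpoints can also be fetched without cloning the whole repository. Below is a minimal sketch using huggingface_hub; the filename passed to hf_hub_download is an assumption, so check the repository's file listing for the actual checkpoint name of each model size.

# Minimal sketch: download one checkpoint from the Hub.
# The filename below is an assumption -- consult the repo's
# file listing for the real checkpoint names per model size.
from huggingface_hub import hf_hub_download

checkpoint_path = hf_hub_download(
    repo_id="xingyang1/Distill-Any-Depth",
    filename="large/model.safetensors",  # hypothetical filename
)
print(checkpoint_path)  # local cache path of the downloaded file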

Installation

git clone https://huggingface.co/xingyang1/Distill-Any-Depth
cd Distill-Any-Depth
pip install -r requirements.txt
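
After installing the dependencies, inference follows the usual pattern for monocular depth models. The sketch below is illustrative only: DepthModel is a hypothetical placeholder for the architecture class shipped in this repository, and the 518x518 input resolution and checkpoint path are assumptions; substitute the actual model class, preprocessing, and checkpoint.

import numpy as np
import torch
from PIL import Image
from safetensors.torch import load_file

# Path to a downloaded checkpoint, e.g. the one returned by
# hf_hub_download above.
checkpoint_path = "model.safetensors"

# DepthModel is a hypothetical placeholder; use the architecture
# class defined in this repository, sized to match the checkpoint.
model = DepthModel()
model.load_state_dict(load_file(checkpoint_path))
model.eval()

# Typical preprocessing: RGB image, resized, scaled to [0, 1], NCHW.
image = Image.open("example.jpg").convert("RGB").resize((518, 518))
x = torch.from_numpy(np.asarray(image)).float().permute(2, 0, 1) / 255.0
x = x.unsqueeze(0)

with torch.no_grad():
    depth = model(x)  # relative depth map, shape (1, H, W)

# Normalize to [0, 255] for a quick visualization.
d = depth.squeeze().cpu().numpy()
d = (d - d.min()) / (d.max() - d.min() + 1e-8) * 255.0
Image.fromarray(d.astype(np.uint8)).save("depth.png")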

BibTeX entry and citation info

If you find this project useful, please consider citing:

@article{he2025distill,
  title   = {Distill Any Depth: Distillation Creates a Stronger Monocular Depth Estimator},
  author  = {Xiankang He and Dongyan Guo and Hongji Li and Ruibo Li and Ying Cui and Chi Zhang},
  year    = {2025},
  journal = {arXiv preprint arXiv:2502.19204}
}