Starred repositories
Official implementation of "MST-Distill: Mixture of Specialized Teachers for Cross-Modal Knowledge Distillation" (ACM MM 2025)
Official PyTorch implementation of Relational Knowledge Distillation, CVPR 2019
Denoising Diffusion Probabilistic Models
Official implementation of paper "Knowledge Distillation from A Stronger Teacher", NeurIPS 2022
The largest collection of PyTorch image encoders / backbones. Including train, eval, inference, export scripts, and pretrained weights -- ResNet, ResNeXT, EfficientNet, NFNet, Vision Transformer (V…
The official project website of "ScaleKD: Strong Vision Transformers Could Be Excellent Teachers" (ScaleKD for short, accepted to NeurIPS 2024).
Convert Audiokinetic Wwise RIFF/RIFX Vorbis to standard Ogg Vorbis