- University of Washington
- https://mitchellnw.github.io/
Stars
A repository for research on medium-sized language models.
Easily create large video datasets from video URLs.
An open-source implementation of Google's PaLM models
Hackable and optimized Transformers building blocks, supporting a composable construction.
An open-source implementation of CLIP.
Simple large-scale training of stable diffusion with multi-node support.
Accessible large language models via k-bit quantization for PyTorch.
Easily turn large sets of image URLs into an image dataset. Can download, resize, and package 100M URLs in 20h on one machine.
Official codebase used to develop Vision Transformer, SigLIP, MLP-Mixer, LiT and more.
An open-source framework for training large multimodal models.
Patching open-vocabulary models by interpolating weights
A machine learning benchmark of in-the-wild distribution shifts, with data loaders, evaluators, and default models.
Compiler and tooling for the Myte programming language.
The largest collection of PyTorch image encoders / backbones. Including train, eval, inference, export scripts, and pretrained weights -- ResNet, ResNeXT, EfficientNet, NFNet, Vision Transformer (V…
CVNets: A library for training computer vision networks
A high-performance Python-based I/O system for large (and small) deep learning problems, with strong support for PyTorch.
arXiv LaTeX Cleaner: Easily clean the LaTeX code of your paper to submit to arXiv
A repository in preparation for open-sourcing lottery ticket hypothesis code.
Flax is a neural network library for JAX that is designed for flexibility.
JavaScript library for visualizing dynamic neural networks across time.
Sparse learning library and sparse momentum resources.
Slimmable Networks, AutoSlim, and Beyond, ICLR 2019, and ICCV 2019