Starred repositories
Production-tested AI infrastructure tools for efficient AGI development and community-driven innovation
Moshi is a speech-text foundation model and full-duplex spoken dialogue framework. It uses Mimi, a state-of-the-art streaming neural audio codec.
A robust web archive analytics toolkit
Helpful tools and examples for working with flex-attention (see the sketch after this list)
A subset of PyTorch's neural network modules, written in Python using OpenAI's Triton.
Repository for Meta Chameleon, a mixed-modal early-fusion foundation model from FAIR.
DeepSeek-Coder-V2: Breaking the Barrier of Closed-Source Models in Code Intelligence
[ICLR 2025] Samba: Simple Hybrid State Space Models for Efficient Unlimited Context Language Modeling
Qwen3 is the large language model series developed by the Qwen team at Alibaba Cloud.
🤗 Transformers: the model-definition framework for state-of-the-art machine learning across text, vision, audio, and multimodal models, for both inference and training (see the pipeline sketch after this list).
Easily embed, cluster and semantically label text datasets
A book for getting started with the Phi family of SLMs. Phi is a family of open-source AI models developed by Microsoft. Phi models are the most capable and cost-effective small language models…
Builder and index for PyTorch packages
[MLSys'25] QServe: W4A8KV4 Quantization and System Co-design for Efficient LLM Serving; [MLSys'25] LServe: Efficient Long-sequence LLM Serving with Unified Sparse Attention
Welcome to the Llama Cookbook! This is your go-to guide for building with Llama: getting started with inference, fine-tuning, and RAG. We also show you how to solve end-to-end problems using Llama models…
USP: Unified (a.k.a. Hybrid, 2D) Sequence Parallel Attention for Long Context Transformers Model Training and Inference
PyTorch native quantization and sparsity for training and inference (see the sketch after this list)
Schedule-Free Optimization in PyTorch (see the sketch after this list)
Scalable data pre-processing and curation toolkit for LLMs
Ring attention implementation with flash attention
News and material links related to GPU programming
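A few of the starred projects are easiest to get a feel for through a short usage sketch. First, flex-attention: a minimal causal-attention example using PyTorch's `flex_attention` entry point. This requires PyTorch >= 2.5; the `score_mod` mirrors the canonical causal example, and the tensor shapes are illustrative.

```python
# Minimal sketch of flex-attention with a causal score_mod.
# Requires PyTorch >= 2.5; shapes below are illustrative.
import torch
from torch.nn.attention.flex_attention import flex_attention

def causal(score, b, h, q_idx, kv_idx):
    # Keep scores where the query position can see the key; mask the rest.
    return torch.where(q_idx >= kv_idx, score, float("-inf"))

q, k, v = (torch.randn(1, 8, 128, 64) for _ in range(3))  # (batch, heads, seq, head_dim)
out = flex_attention(q, k, v, score_mod=causal)
print(out.shape)  # torch.Size([1, 8, 128, 64])
```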
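For the Transformers library, the high-level `pipeline` API is the quickest entry point. A minimal text-generation sketch; the "gpt2" checkpoint is just an illustrative choice.

```python
# Minimal sketch of the Transformers pipeline API for text generation.
# "gpt2" is an illustrative checkpoint, not a recommendation.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")
result = generator("Sequence parallelism lets transformers", max_new_tokens=20)
print(result[0]["generated_text"])
```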
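For torchao (PyTorch native quantization and sparsity), a sketch of weight-only int8 quantization of a small model. The `quantize_` and `int8_weight_only` names are assumptions about the installed release's exports; newer versions expose config-object equivalents.

```python
# Sketch of weight-only int8 quantization with torchao.
# `quantize_` and `int8_weight_only` are assumed to match the installed
# torchao release; newer versions expose config-object equivalents.
import torch
from torchao.quantization import quantize_, int8_weight_only

model = torch.nn.Sequential(
    torch.nn.Linear(64, 64),
    torch.nn.ReLU(),
    torch.nn.Linear(64, 8),
)
quantize_(model, int8_weight_only())  # rewrites Linear weights in place
out = model(torch.randn(2, 64))
```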
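And for Schedule-Free Optimization, a sketch of the training-loop shape; the notable detail is the `optimizer.train()` / `optimizer.eval()` toggling that schedule-free optimizers require. The linear model and random data here are placeholders.

```python
# Sketch of a training loop with schedule-free AdamW (pip install schedulefree).
# The linear model and random data are placeholders.
import torch
import schedulefree

model = torch.nn.Linear(16, 1)
optimizer = schedulefree.AdamWScheduleFree(model.parameters(), lr=1e-3)

optimizer.train()  # schedule-free optimizers track train/eval state
for _ in range(10):
    x = torch.randn(32, 16)
    loss = model(x).pow(2).mean()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

optimizer.eval()  # switch to averaged weights before evaluation or saving
```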