Stars
Efficient Long-context Language Model Training by Core Attention Disaggregation
Reading notes about Multimodal Large Language Models, Large Language Models, and Diffusion Models
PaddlePaddle Developer Community
A compact implementation of SGLang, designed to demystify the complexities of modern LLM serving systems.
Accelerating MoE with IO and Tile-aware Optimizations
An Open Phone Agent Model & Framework. Unlocking the AI Phone for Everyone
An NCCL extension library, designed to efficiently offload GPU memory allocated by the NCCL communication library.
"AI-Trader: Can AI Beat the Market?" Live Trading Bench: https://ai4trade.ai Tech Report Link: https://arxiv.org/abs/2512.10971
PyTorch Distributed native training library for LLMs/VLMs with OOTB Hugging Face support
Anthropic's Interactive Prompt Engineering Tutorial
Claude Code is an agentic coding tool that lives in your terminal, understands your codebase, and helps you code faster by executing routine tasks, explaining complex code, and handling git workflows.
PyMuPDF is a high-performance Python library for data extraction, analysis, conversion & manipulation of PDF (and other) documents.
Domain-specific language designed to streamline the development of high-performance GPU/CPU/accelerator kernels
Sample codes for my CUDA programming book
Tongyi Deep Research, the Leading Open-source Deep Research Agent
Venus Collective Communication Library, supported by SII and Infrawaves.
NVIDIA NVSHMEM is a parallel programming interface for NVIDIA GPUs based on OpenSHMEM. NVSHMEM can significantly reduce multi-process communication and coordination overheads by allowing programmers to perform these operations from within CUDA kernels and on CUDA streams.
A Next-Generation Training Engine Built for Ultra-Large MoE Models
KernelBench: Can LLMs Write GPU Kernels? - Benchmark + Toolkit with Torch -> CUDA (+ more DSLs)
A Chinese-market financial trading framework based on multi-agent LLMs - an enhanced Chinese edition of TradingAgents
AgentScope: Agent-Oriented Programming for Building LLM Applications
The simplest, fastest repository for training/finetuning medium-sized GPTs.
verl: Volcano Engine Reinforcement Learning for LLMs