Hubble is a suite of fully open-source large language models (LLMs) for the scientific study of LLM memorization.
Python implementation of Bayesian online changepoint detection
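As a rough illustration of what such a library computes, here is a minimal sketch of Bayesian online changepoint detection in the style of Adams & MacKay, assuming Gaussian observations with known variance and a conjugate Normal prior on the mean (the function name, parameters, and priors here are illustrative, not the repo's actual API):

```python
import numpy as np

def bocd(data, hazard=1 / 100, mu0=0.0, var0=1.0, varx=1.0):
    """Run-length posterior for Bayesian online changepoint detection.

    Sketch only: Gaussian data with known noise variance `varx` and a
    Normal(mu0, var0) prior on the mean. Returns R, where R[t, r] is the
    posterior probability that the current run length is r after t points.
    """
    T = len(data)
    R = np.zeros((T + 1, T + 1))
    R[0, 0] = 1.0
    # sufficient statistics of the posterior mean, one entry per run length
    mu = np.array([mu0])
    var = np.array([var0])
    for t, x in enumerate(data):
        # predictive probability of x under each run-length hypothesis
        pred_var = var + varx
        pred = np.exp(-0.5 * (x - mu) ** 2 / pred_var) / np.sqrt(2 * np.pi * pred_var)
        # growth: the current run continues (no changepoint)
        R[t + 1, 1 : t + 2] = R[t, : t + 1] * pred * (1 - hazard)
        # changepoint: all run lengths collapse to zero
        R[t + 1, 0] = np.sum(R[t, : t + 1] * pred * hazard)
        R[t + 1] /= R[t + 1].sum()
        # conjugate Normal update of the mean for each surviving run length
        new_mu = (mu * varx + x * var) / (var + varx)
        new_var = var * varx / (var + varx)
        mu = np.concatenate(([mu0], new_mu))
        var = np.concatenate(([var0], new_var))
    return R
```

On data with an abrupt mean shift, the run-length posterior mode resets to near zero at the shift and then grows again, which is how changepoints are read off.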
shangshang-wang / Tora
Forked from meta-pytorch/torchtune. Tora: Torchtune-LoRA for RL
AlgoTune is a NeurIPS 2025 benchmark made up of 154 math, physics, and computer science problems. The goal is to write code that solves each problem faster than existing implementations.
A lightweight library for beautiful game of life embeds.
A command-line tool for building static sites from Observable Notebooks
A comprehensive toolkit for streamlining data editing, search, and inspection for large-scale language model training and interpretability.
Reproduction of the ICLR 2025 paper "Energy-Based Diffusion Language Models for Text Generation"
Implementation of Sharpe-ratio-based active learning strategies for aligning large language models using Direct Preference Optimization (DPO).
Sample-Efficient Preference Alignment in LLMs via Active Exploration
Official implementation for the paper "Toward Scientific Reasoning in LLMs: Training from Expert Discussions via Reinforcement Learning"
A High-Efficiency System of Large Language Model Based Search Agents
This package contains the original 2012 AlexNet code.
You like pytorch? You like micrograd? You love tinygrad! ❤️
(WIP) A small but powerful, homemade PyTorch from scratch.
metagene-ai / gene-mteb
Forked from embeddings-benchmark/mteb. Gene-MTEB: a benchmark for genomic embedding
Code and Data for Compare without Despair: Reliable Preference Evaluation with Generation Separability