- Karolinska Institute
- Stockholm, Sweden
- @AntonOresten
Stars
Protein ribbon plots implemented in Julia using Makie
Implement a ChatGPT-like LLM in PyTorch from scratch, step by step
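A minimal sketch of the causal self-attention step such a from-scratch build centers on, written in plain Julia (to match the rest of this list) rather than PyTorch; all names here are illustrative, not the repo's API:

```julia
using LinearAlgebra

# Single-head causal self-attention over a T×d sequence matrix X.
function causal_attention(X::Matrix{Float32}, Wq, Wk, Wv)
    T, _ = size(X)
    Q, K, V = X * Wq, X * Wk, X * Wv
    scores = (Q * K') ./ sqrt(Float32(size(Wq, 2)))
    # Causal mask: position i may only attend to positions ≤ i.
    for i in 1:T, j in i+1:T
        scores[i, j] = -Inf32
    end
    # Row-wise softmax (stabilized by subtracting the row max).
    A = exp.(scores .- maximum(scores; dims=2))
    A ./= sum(A; dims=2)
    return A * V
end

d, T = 8, 5
X = randn(Float32, T, d)
Wq, Wk, Wv = (randn(Float32, d, d) for _ in 1:3)
Y = causal_attention(X, Wq, Wk, Wv)   # T × d output
```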
JAX codebase for Evolutionary Strategies at the Hyperscale
A Julia package for interacting with the Hugging Face dataset repository.
The most open diffusion language model for code generation — releasing pretraining, evaluation, inference, and checkpoints.
Discrete, Continuous, and Manifold Flow Matching with Splits and Deletions
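The package spans discrete, continuous, and manifold state spaces; as a grounding point, here is a sketch of only the standard continuous case, the flow-matching objective on a straight-line path. The placeholder `vθ` stands in for any learnable velocity model and is not the package's API:

```julia
vθ(x, t) = x .* t                    # placeholder "network" for the sketch

# Flow-matching loss: interpolate between noise x0 and data x1 at a random
# time t, and regress the velocity field onto the constant target x1 - x0.
function fm_loss(x1::Vector{Float32})
    x0 = randn(Float32, length(x1))  # noise sample
    t  = rand(Float32)
    xt = (1 - t) .* x0 .+ t .* x1    # point on the straight-line path
    u  = x1 .- x0                    # target velocity along that path
    return sum(abs2, vθ(xt, t) .- u) / length(x1)
end

fm_loss(randn(Float32, 16))
```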
More informative REPL printing of numbers & arrays
Domain-specific language designed to streamline the development of high-performance GPU/CPU/accelerator kernels
Package for compiling and bundling Julia binaries, especially trimmed ones
A comprehensive collection of 35+ recurrent neural network layers for Flux.jl
A framework for clean, testable, and high-performance CUDA kernels.
Language modeling with linear-cost context
[ICLR 2025] Official PyTorch Implementation of Gated Delta Networks: Improving Mamba2 with Delta Rule
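A naive per-token sketch of the gated delta rule recurrence as I understand it from the paper: a fast-weight memory matrix S is decayed by a data-dependent gate αₜ, corrected toward the association (kₜ, vₜ) by a delta-rule update with strength βₜ, and read out with qₜ. The repo's contribution is a parallelized chunked kernel; this loop only illustrates the math, and the names are illustrative:

```julia
function gated_delta_scan(q, k, v, α, β)
    dk, T = size(k); dv = size(v, 1)
    S = zeros(Float32, dv, dk)              # fast-weight memory
    out = zeros(Float32, dv, T)
    for t in 1:T
        kt, vt, qt = k[:, t], v[:, t], q[:, t]
        S .*= α[t]                          # gated decay of old associations
        S .+= β[t] .* (vt .- S * kt) * kt'  # delta-rule correction (outer product)
        out[:, t] = S * qt                  # read out with the query
    end
    return out
end

T, dk, dv = 6, 4, 4
q, k = randn(Float32, dk, T), randn(Float32, dk, T)
v = randn(Float32, dv, T)
α, β = rand(Float32, T), rand(Float32, T)
gated_delta_scan(q, k, v, α, β)
```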
Qwen3-Omni is a natively end-to-end, omni-modal LLM developed by the Qwen team at Alibaba Cloud, capable of understanding text, audio, images, and video, as well as generating speech in real time.
Julia implementation for the BFloat16 number type
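A quick usage sketch, assuming the package's exported `BFloat16` type: the format keeps Float32's exponent range but stores only 7 significand bits, so conversions round coarsely.

```julia
using BFloat16s

x = BFloat16(1.5)
y = x + BFloat16(2.25)         # arithmetic stays in BFloat16
Float32(BFloat16(Float32(π)))  # ≈ 3.140625: note the reduced precision
```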
A mixed-precision GEMM with quantization and reorder kernels.
Programming GEMM Kernels on NVIDIA GPUs with Tensor Cores in Julia
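This entry and the previous one revolve around the same operation. As a plain-Julia reference (illustrating the arithmetic only, not either package's API): a mixed-precision GEMM computes D = A*B + C with low-precision inputs accumulated at higher precision, the pattern Tensor Cores execute per tile.

```julia
function gemm_mixed(A::Matrix{Float16}, B::Matrix{Float16}, C::Matrix{Float32})
    M, K = size(A); Kb, N = size(B)
    @assert K == Kb && size(C) == (M, N)
    D = copy(C)
    for n in 1:N, m in 1:M
        acc = 0.0f0                       # Float32 accumulator
        for k in 1:K
            acc += Float32(A[m, k]) * Float32(B[k, n])
        end
        D[m, n] += acc
    end
    return D
end

A, B = rand(Float16, 8, 8), rand(Float16, 8, 8)
C = zeros(Float32, 8, 8)
gemm_mixed(A, B, C)
```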
Julia package for inference and training of Llama-style language models
Neural Network primitives with multiple backends
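Assuming this refers to NNlib.jl (whose description this matches), a brief usage sketch: the same primitive dispatches to different backend implementations depending on the array type it receives.

```julia
using NNlib

x = randn(Float32, 4, 3)
softmax(x; dims=1)            # column-wise probabilities
relu.(x)                      # elementwise activation
```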
Julia support for the oneAPI programming toolkit.