Starred repositories
A fast and hackable fuzzy finder for the terminal.
Self-hosted app store and runtime for AI agents. Install third-party agents, run them on your infrastructure with your own model providers (Ollama, Bedrock, OpenAI, etc.). Container isolation, cred…
Fastest LLM gateway (50x faster than LiteLLM) with adaptive load balancing, cluster mode, guardrails, support for 1000+ models & <100 µs overhead at 5k RPS.
A focused launcher for your desktop — native, fast, extensible
It’s not “just dotfiles” - it's extreme modularity that looks and feels like a NixOS flake. It's Voice-driven DevOps, it's a declarative home automation system, it's a self-contained auto-documenti…
The easiest and most secure way to access and protect all of your infrastructure.
A widget framework for building desktop shells, written and configurable in Python
A Material 3 inspired desktop shell for Niri and Hyprland created with Ignis.
The ultimate all-in-one social media automation (outreach) tool 🤖
📨 The ultimate social media scheduling tool, with a bunch of AI 🤖
Qstick / LunaSea
Forked from jagandeepbrar/lunasea. A self-hosted controller for mobile and macOS built using the Flutter framework.
Filter lists for uBlock Origin & uBlock Origin Lite
🤖 A comprehensive, pure n8n solution for automating job discovery, AI-powered analysis, application submission, and tracking. Enterprise-grade job search automation with zero external dependencies.
The main control script for the Caelestia dotfiles
This is a fork of the caelestia-shell desktop shell, containing my personal modifications and customizations. The original caelestia-shell is a desktop shell built with Quickshell and designed for …
A Nix configuration for (eventually) everything I own that is able to run it.
The AI coding agent built for the terminal.
A tool for calculating the score of a hand in Balatro
AI conversations that actually remember. Never re-explain your project to your AI again. Join our Discord: https://discord.gg/tyvKNccgqN
A magical LLM desktop client that makes it easy for *anyone* to use LLMs and MCP