Shell AI command generator - convert natural language to shell commands using AI.
- Multiple AI providers: Anthropic (Claude), Google (Gemini), OpenAI (GPT), Ollama (local models)
- Shell integration: zsh with `# request` + Enter UX; bash with Ctrl-x a keybinding (optional Enter auto-expand with `SLY_BASH_ENTER=1`)
- Context-aware: Detects project type, git status, current directory
- Fast: Written in Zig 0.15.2+ with libcurl for minimal latency
- Offline mode: Echo provider for testing without API keys
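The echo provider simply wraps the request in an echo command, so the full pipeline can be exercised with no network access or API keys (output format taken from the testing example later in this document):

```sh
SLY_PROVIDER=echo sly "hello world"
# Output: echo 'hello world'
```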
Requirements:

- Zig 0.15.2 or later
- libcurl development headers:
  - Ubuntu/Debian: `sudo apt-get install libcurl4-openssl-dev`
  - macOS: `brew install curl` (or use the system libcurl)
  - Arch: `sudo pacman -S curl`
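A quick sanity check that both prerequisites are in place (assuming `zig` and `curl-config` are on your PATH; the dev packages above install `curl-config`):

```sh
zig version            # should print 0.15.2 or newer
curl-config --version  # confirms libcurl development files are present
```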
Build from source:
```sh
git clone https://codeberg.org/sam/sly.git
cd sly/
zig build -Doptimize=ReleaseSafe
```

The binary will be at `zig-out/bin/sly`:

```sh
export PATH="$PWD/zig-out/bin:$PATH"
```

Add to `~/.bashrc` or `~/.zshrc` to persist.
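For example (a sketch; adjust the path to wherever you cloned the repository):

```sh
echo 'export PATH="$HOME/src/sly/zig-out/bin:$PATH"' >> ~/.zshrc
```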
With Nix flakes (a combined example follows the list):

- Development shell: `nix develop`
- Build the CLI: `nix build` or `nix build .#sly` (binary at `./result/bin/sly`)
- Run directly: `nix run . -- "list all pdf files"` (or `nix run .#sly -- "list all pdf files"`)
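For example, building and then running the CLI from inside the checkout (commands taken from the list above):

```sh
nix build .#sly
./result/bin/sly "list all pdf files"
```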
The first build will fail with a vendor-hash mismatch for the Zig dependencies. Copy the suggested `sha256-...` value into flake.nix where `pkgs.lib.fakeHash` is used, then rebuild.
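The failure looks roughly like this (illustrative; exact wording varies by Nix version):

```sh
nix build
# error: hash mismatch in fixed-output derivation '/nix/store/...':
#   specified: sha256-AAAA...
#        got:  sha256-...
# Copy the "got" value into flake.nix in place of pkgs.lib.fakeHash, then:
nix build
```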
On NixOS, you can either add the package to your environment.systemPackages, or import the included module and set defaults/env vars:
```nix
{
  inputs.sly.url = "path:/path/to/this/checkout"; # or your git remote

  outputs = { self, nixpkgs, sly, ... }: {
    nixosConfigurations.your-host = nixpkgs.lib.nixosSystem {
      system = "x86_64-linux";
      modules = [
        sly.nixosModules.default
        ({ config, pkgs, ... }: {
          programs.sly.enable = true;
          # Optional: set defaults or API keys (beware: Nix store visibility)
          programs.sly.provider = "anthropic";
          # programs.sly.anthropicApiKey = "..."; # prefer sops-nix/agenix instead
        })
      ];
    };
  };
}
```

Note: Do not commit API keys directly in Nix configs. Use sops-nix, agenix, or other secret management to keep credentials out of the Nix store.
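If you prefer to keep the key out of Nix evaluation entirely, you can read it from a secrets file at shell startup instead (a sketch; `/run/secrets/anthropic-api-key` is a hypothetical path of the kind sops-nix provisions):

```sh
export ANTHROPIC_API_KEY="$(cat /run/secrets/anthropic-api-key)"
```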
Set environment variables to configure providers and models:
```sh
# Choose provider (default: anthropic)
export SLY_PROVIDER=anthropic # or gemini, openai, ollama, echo

# Anthropic (Claude)
export ANTHROPIC_API_KEY="sk-ant-..."
export SLY_ANTHROPIC_MODEL="claude-3-5-sonnet-20241022" # default

# Google Gemini
export GEMINI_API_KEY="..."
export SLY_GEMINI_MODEL="gemini-2.0-flash-exp" # default

# OpenAI (Responses API)
export OPENAI_API_KEY="sk-..."
export SLY_OPENAI_MODEL="gpt-4o" # default
export SLY_OPENAI_URL="https://api.openai.com/v1/responses" # default

# Ollama (local)
export SLY_OLLAMA_MODEL="llama3.2" # default
export SLY_OLLAMA_URL="http://localhost:11434" # default

# Custom system prompt extension
export SLY_PROMPT_EXTEND="Always use verbose flags"
```
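For example, to run fully locally against an Ollama server (assuming Ollama is running and the model has been pulled; the generated command depends on the model):

```sh
export SLY_PROVIDER=ollama
export SLY_OLLAMA_MODEL="llama3.2"
sly "find files modified in the last 24 hours"
# Possible output: find . -mtime -1
```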
sly "list all pdf files"
# Output: find . -name '*.pdf'
sly "show disk usage sorted by size"
# Output: du -sh * | sort -hAdd to ~/.zshrc:
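The prompt extension from the configuration above nudges generation; for instance (output is model-dependent, shown for illustration):

```sh
SLY_PROMPT_EXTEND="Always use verbose flags" sly "copy a.txt to b.txt"
# Possible output: cp --verbose a.txt b.txt
```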
Add to `~/.zshrc`:

```sh
source /path/to/sly/lib/sly.plugin.zsh
```

Then type:
```
# list all pdf files
<press Enter>
# → Buffer becomes: find . -name '*.pdf'
```

A spinner animation shows progress while the command is generated. Press Enter again to execute, or edit the command first.
Add to ~/.bashrc:
```sh
source /path/to/sly/lib/bash-sly.plugin.sh
```

Then type:
```
# list all pdf files
<press Ctrl-x a>
# → Buffer becomes: find . -name '*.pdf'
```

Press Enter to execute, or edit the command first.
Optional: enable Enter to auto-expand # lines (two-step UX) by adding:
```sh
export SLY_BASH_ENTER=1
```

With this set, on a line starting with `# `:
- First Enter expands to the generated command (does not execute)
- Second Enter executes the command
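An illustrative two-step session (the request and the generated command are hypothetical):

```
$ # compress the logs directory
<press Enter>  → buffer becomes: tar czf logs.tar.gz logs/
<press Enter>  → the command runs
```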
Run the tests:

```sh
zig build test
```

Format the code:

```sh
zig fmt src/ build.zig
```

Exercise the CLI offline with the echo provider:

```sh
SLY_PROVIDER=echo sly "test query"
# Output: echo 'test query'
```

| Variable | Default | Description |
|---|---|---|
| SLY_PROVIDER | anthropic | AI provider: anthropic, gemini, openai, ollama, echo |
| ANTHROPIC_API_KEY | - | Anthropic API key (required for anthropic provider) |
| SLY_ANTHROPIC_MODEL | claude-3-5-sonnet-20241022 | Anthropic model name |
| GEMINI_API_KEY | - | Google Gemini API key (required for gemini provider) |
| SLY_GEMINI_MODEL | gemini-2.0-flash-exp | Gemini model name |
| OPENAI_API_KEY | - | OpenAI API key (required for openai provider) |
| SLY_OPENAI_MODEL | gpt-4o | OpenAI model name |
| SLY_OPENAI_URL | https://api.openai.com/v1/responses | OpenAI API endpoint |
| SLY_OLLAMA_MODEL | llama3.2 | Ollama model name |
| SLY_OLLAMA_URL | http://localhost:11434 | Ollama server URL |
| SLY_PROMPT_EXTEND | - | Additional system prompt instructions |
MIT - see LICENSE file
Patches welcome! Send to the mailing list or open an issue on Codeberg.