A powerful command-line AI assistant built with modular architecture and xAI's Grok models. Features intelligent context management, secure tool execution, and a rich console interface.
- Dual AI Models: Switch between Grok-3 (conversational) and Grok-4 (enhanced reasoning)
- Intelligent File Operations: Read, create, and edit files with optional fuzzy matching
- Secure Shell Execution: Cross-platform shell commands with user confirmation
- Smart Context Management: Automatic conversation truncation with token estimation
- Rich Console Interface: Beautiful formatting with syntax highlighting
- Modular Architecture: Clean separation of concerns with dependency injection
- Comprehensive Testing: Full test suite with pytest
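The context-management feature above (truncation driven by token estimation) can be sketched roughly as follows. The 4-characters-per-token heuristic and the function names are illustrative assumptions, not Grok Assistant's actual implementation:

```python
# Rough sketch of token-based conversation truncation.
# The 4-chars-per-token heuristic and function names are
# illustrative assumptions, not Grok Assistant's actual code.

def estimate_tokens(text: str) -> int:
    """Crude token estimate: roughly 4 characters per token."""
    return max(1, len(text) // 4)

def truncate_history(messages: list[dict], max_tokens: int) -> list[dict]:
    """Keep the most recent messages that fit within max_tokens."""
    kept: list[dict] = []
    total = 0
    for msg in reversed(messages):  # walk newest-first
        cost = estimate_tokens(msg["content"])
        if total + cost > max_tokens:
            break
        kept.append(msg)
        total += cost
    return list(reversed(kept))  # restore chronological order

history = [
    {"role": "user", "content": "a" * 400},       # ~100 tokens
    {"role": "assistant", "content": "b" * 400},  # ~100 tokens
    {"role": "user", "content": "c" * 40},        # ~10 tokens
]
print(len(truncate_history(history, 120)))  # keeps the 2 newest messages
```

Truncating newest-first preserves the most recent conversational context, which is usually the most relevant to the model.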
- Python 3.11+
- uv package manager (recommended)
- xAI API key
1. Install uv (if not already installed):

   ```bash
   # On macOS and Linux
   curl -LsSf https://astral.sh/uv/install.sh | sh

   # On Windows
   powershell -c "irm https://astral.sh/uv/install.ps1 | iex"
   ```

2. Clone the repository:

   ```bash
   git clone https://github.com/fabiopauli/grok-cli.git
   cd grok-cli
   ```

3. Install dependencies:

   ```bash
   uv sync
   ```

4. Set up your API key:

   ```bash
   # Create a .env file
   echo "XAI_API_KEY=your_api_key_here" > .env

   # Or export as an environment variable
   export XAI_API_KEY=your_api_key_here
   ```
Alternatively, install with pip:

1. Clone the repository:

   ```bash
   git clone https://github.com/fabiopauli/grok-cli.git
   cd grok-cli
   ```

2. Create a virtual environment:

   ```bash
   python -m venv venv
   source venv/bin/activate  # On Windows: venv\Scripts\activate
   ```

3. Install dependencies:

   ```bash
   pip install -r requirements.txt
   ```

4. Set up your API key:

   ```bash
   echo "XAI_API_KEY=your_api_key_here" > .env
   ```
Start the interactive assistant:

```bash
# Using uv
uv run main.py

# Using python directly
python main.py
```
- `/add <file_pattern>` - Add files to conversation context
- `/context` - Show current conversation context and token usage
- `/reasoner` or `/r` - Switch to Grok-4 for enhanced reasoning (temporary)
- `/fuzzy` - Enable fuzzy file/code matching for current session
- `/exit` or `/quit` - Exit the application
- `Ctrl+C` - Exit the application
```
$ uv run main.py
Welcome to Grok Assistant!
Type your questions or commands. Use /help for available commands.

> /add src/core/*.py
Added 2 files to context: config.py, session.py

> Explain the configuration system
The configuration system is built around a dataclass-based approach...

> /r How can I optimize the token estimation?
[Switches to Grok-4 for enhanced reasoning]
To optimize token estimation, consider these approaches...
```
Create a `.env` file in the project root:

```bash
XAI_API_KEY=your_api_key_here
```
Create a `config.json` for advanced settings:

```json
{
  "model_name": "grok-3",
  "max_context_tokens": 100000,
  "file_size_limit": 1048576,
  "enable_fuzzy_matching": false
}
```
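A minimal sketch of how such settings could be loaded into a dataclass with sensible fallbacks. The field names mirror the JSON keys above, but the `load_config` helper itself is an assumption, not the project's actual loader:

```python
# Hedged sketch: load config.json into a dataclass, falling back to
# defaults when the file is absent. Not Grok Assistant's actual code.
import json
from dataclasses import dataclass, fields
from pathlib import Path

@dataclass
class Config:
    model_name: str = "grok-3"
    max_context_tokens: int = 100_000
    file_size_limit: int = 1_048_576
    enable_fuzzy_matching: bool = False

def load_config(path: str = "config.json") -> Config:
    p = Path(path)
    if not p.exists():
        return Config()  # no file: use defaults
    data = json.loads(p.read_text())
    known = {f.name for f in fields(Config)}
    return Config(**{k: v for k, v in data.items() if k in known})

cfg = load_config("definitely_missing_config.json")
print(cfg.model_name)  # grok-3
```

Filtering on known field names keeps the loader tolerant of extra keys in user-edited JSON.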
```bash
# Using uv
uv run pytest

# Using pytest directly
pytest
```
```
grok-cli/
├── main.py              # Application entry point
├── src/
│   ├── core/            # Core functionality
│   │   ├── config.py    # Configuration management
│   │   └── session.py   # Session and context management
│   ├── commands/        # Special command handlers
│   ├── tools/           # AI function calling tools
│   ├── ui/              # Console interface
│   └── utils/           # Utility functions
├── tests/               # Comprehensive test suite
├── pyproject.toml       # Project configuration
└── requirements.txt     # Pip dependencies
```
- Dependency Injection: Configuration passed to all components
- Modular Design: Clear separation of concerns
- Security-First: Shell commands require confirmation
- Context Management: Intelligent conversation truncation
- Cross-Platform: Works on Windows, macOS, and Linux
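The dependency-injection principle above might look like this in practice: one configuration object constructed at startup and passed into every component. The class names here are illustrative, not the project's actual API:

```python
# Sketch of configuration passed via dependency injection.
# AppConfig, Session, and ShellTool are hypothetical names.
from dataclasses import dataclass

@dataclass
class AppConfig:
    model_name: str = "grok-3"
    require_shell_confirmation: bool = True

class Session:
    def __init__(self, config: AppConfig):
        self.config = config  # injected, never constructed internally

class ShellTool:
    def __init__(self, config: AppConfig):
        self.config = config

# One AppConfig instance wired through every component:
config = AppConfig()
session = Session(config)
shell = ShellTool(config)
assert session.config is shell.config  # all components share one config
```

Because no component builds its own configuration, tests can inject a modified `AppConfig` without touching globals or files.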
- Shell Command Confirmation: All shell operations require user approval
- Path Validation: Robust file path sanitization
- File Size Limits: Configurable limits for file operations
- Exclusion Patterns: Automatically excludes system files
- Fuzzy Matching: Opt-in only for security
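The path-validation point above amounts to resolving user-supplied paths and rejecting anything that escapes the workspace. A hedged sketch, assuming a single workspace root (the helper name is hypothetical):

```python
# Sketch of path sanitization: resolve against a workspace root and
# refuse traversal outside it. Not the project's actual implementation.
from pathlib import Path

def validate_path(user_path: str, root: Path) -> Path:
    resolved = (root / user_path).resolve()
    if not resolved.is_relative_to(root.resolve()):
        raise ValueError(f"path escapes workspace: {user_path}")
    return resolved

root = Path("/tmp/workspace")
print(validate_path("src/main.py", root))  # inside the root: allowed
# validate_path("../../etc/passwd", root)  # raises ValueError
```

Resolving before checking is what defeats `..` traversal and symlink tricks that a plain string prefix check would miss.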
Get your xAI API key from console.x.ai and either:

- Add it to your `.env` file: `XAI_API_KEY=your_key_here`
- Export it as an environment variable: `export XAI_API_KEY=your_key_here`
- Pass it when running: `XAI_API_KEY=your_key_here uv run main.py`
- Fork the repository
- Create a feature branch: `git checkout -b feature-name`
- Make your changes
- Run tests: `uv run pytest`
- Submit a pull request
[License information - check LICENSE file]
For issues and questions, please open an issue on the GitHub repository.