
CODE

Code Logo

Code is a fast, local coding agent for your terminal. It's a community-driven fork of openai/codex focused on real developer ergonomics: Browser integration, multi-agents, theming, and reasoning control — all while staying compatible with upstream.

What's new in v0.4.0 (October 26, 2025)

  • Auto Drive upgraded – hand /auto a task and it now plans, coordinates agents, reruns checks, and recovers from hiccups without babysitting.
  • Unified settings – /settings centralizes limits, model routing, themes, and CLI integrations so you can audit configuration in one place.
  • Card-based activity – Agents, browser sessions, web search, and Auto Drive render as compact cards with drill-down overlays for full logs.
  • Turbocharged performance – History rendering and streaming were optimized to stay smooth even during long multi-agent sessions.
  • Smarter agents – Mix and match orchestrator CLIs (Claude, Gemini, GPT-5, Qwen, and more) per /plan, /code, or /solve run.

Read the full notes in release-notes/RELEASE_NOTES.md.

Why Code

  • 🚀 Auto Drive orchestration – Multi-agent automation that now self-heals and ships complete tasks.
  • 🌐 Browser Integration – CDP support, headless browsing, screenshots captured inline.
  • 🤖 Multi-agent commands – /plan, /code and /solve coordinate multiple CLI agents.
  • 🧭 Unified settings hub – /settings overlay for limits, theming, approvals, and provider wiring.
  • 🎨 Theme system – Switch between accessible presets, customize accents, and preview live via /themes.
  • 🔌 MCP support – Extend with filesystem, DBs, APIs, or your own tools.
  • 🔒 Safety modes – Read-only, approvals, and workspace sandboxing.

AI Videos

  • Auto Drive Overview – "Introducing Auto Drive" (video)
  • Multi-Agent Promo – "Multi-Agent Support" (video)

Quickstart

Run

npx -y @just-every/code

Install & Run

npm install -g @just-every/code
code   # or `coder` if you're using VS Code

Note: If another tool already provides a code command (e.g. VS Code), our CLI is also installed as coder. Use coder to avoid conflicts.

Authenticate (one of the following):

  • Sign in with ChatGPT (Plus/Pro/Team; uses models available to your plan)
    • Run code and pick "Sign in with ChatGPT"
  • API key (usage-based)
    • Set export OPENAI_API_KEY=xyz and run code
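
For the API-key option above, the whole flow is an environment variable plus the CLI (substitute your own key):

export OPENAI_API_KEY=xyz   # your API key (usage-based billing)
code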

Install Claude & Gemini (optional)

Code supports orchestrating other AI CLI tools. Install and configure these to use them alongside Code.

# Ensure Node.js 20+ is available locally (installs into ~/.n)
npm install -g n
export N_PREFIX="$HOME/.n"
export PATH="$N_PREFIX/bin:$PATH"
n 20.18.1

# Install the companion CLIs
export npm_config_prefix="${npm_config_prefix:-$HOME/.npm-global}"
mkdir -p "$npm_config_prefix/bin"
export PATH="$npm_config_prefix/bin:$PATH"
npm install -g @anthropic-ai/claude-code @google/gemini-cli @qwen-code/qwen-code

# Quick smoke tests
claude --version
gemini --version
qwen --version

ℹ️ Add export N_PREFIX="$HOME/.n" and export PATH="$N_PREFIX/bin:$PATH" (plus the npm_config_prefix bin path) to your shell profile so the CLIs stay on PATH in future sessions.

Commands

Browser

# Connect code to external Chrome browser (running CDP)
/chrome        # Connect with auto-detect port
/chrome 9222   # Connect to specific port

# Switch to internal browser mode
/browser       # Use internal headless browser
/browser https://example.com  # Open URL in internal browser

Agents

# Plan code changes (Claude, Gemini and GPT-5 consensus)
# All agents review task and create a consolidated plan
/plan "Stop the AI from ordering pizza at 3AM"

# Solve complex problems (Claude, Gemini and GPT-5 race)
# Fastest preferred (see https://arxiv.org/abs/2505.17813)
/solve "Why does deleting one user drop the whole database?"

# Write code! (Claude, Gemini and GPT-5 consensus)
# Creates multiple worktrees then implements the optimal solution
/code "Show dark mode when I feel cranky"

Auto Drive

# Hand off a multi-step task; Auto Drive will coordinate agents and approvals
/auto "Refactor the auth flow and add device login"

# Resume or inspect an active Auto Drive run
/auto status

General

# Try a new theme!
/themes

# Change reasoning level
/reasoning low|medium|high

# Switch models or effort presets
/model

# Start new conversation
/new

CLI reference

code [options] [prompt]

Options:
  --model <name>        Override the model (gpt-5, claude-opus, etc.)
  --read-only          Prevent file modifications
  --no-approval        Skip approval prompts (use with caution)
  --config <key=val>   Override config values
  --oss                Use local open source models
  --sandbox <mode>     Set sandbox level (read-only, workspace-write, etc.)
  --help              Show help information
  --debug             Log API requests and responses to file
  --version           Show version number
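
For example, options can be combined in a single invocation (the prompts below are purely illustrative):

# Read-only analysis with an explicit model
code --model gpt-5 --read-only "summarize the architecture of this repo"

# Override one config value for a single run
code --config model_reasoning_effort=high "review the error handling in src/"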

Memory & project docs

Code can remember context across sessions:

  1. Create an AGENTS.md or CLAUDE.md file in your project root:
# Project Context
This is a React TypeScript application with:
- Authentication via JWT
- PostgreSQL database
- Express.js backend

## Key files:
- `/src/auth/` - Authentication logic
- `/src/api/` - API client code  
- `/server/` - Backend services
  2. Session memory: Code maintains conversation history
  3. Codebase analysis: Automatically understands project structure

Non-interactive / CI mode

For automation and CI/CD:

# Run a specific task
code --no-approval "run tests and fix any failures"

# Generate reports
code --read-only "analyze code quality and generate report"

# Batch processing
code --config output_format=json "list all TODO comments"
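
As a minimal sketch, a CI step could install the CLI via npx and run one of these non-interactive tasks; the secret wiring, output file, and prompt are illustrative assumptions, not a prescribed setup:

#!/usr/bin/env bash
# Illustrative CI step (assumes OPENAI_API_KEY is injected by your CI secret store)
set -euo pipefail
: "${OPENAI_API_KEY:?OPENAI_API_KEY must be set}"

# Read-only report written to a file the pipeline can archive
npx -y @just-every/code --read-only --config output_format=json \
  "list all TODO comments" > todo-report.json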

Model Context Protocol (MCP)

Code supports MCP for extended capabilities:

  • File operations: Advanced file system access
  • Database connections: Query and modify databases
  • API integrations: Connect to external services
  • Custom tools: Build your own extensions

Configure MCP in ~/.code/config.toml. Define each server under a named table like [mcp_servers.<name>] (this maps to the JSON mcpServers object used by other clients):

[mcp_servers.filesystem]
command = "npx"
args = ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/project"]

Configuration

Main config file: ~/.code/config.toml

Note

Code reads from both ~/.code/ and ~/.codex/ for backwards compatibility, but it only writes updates to ~/.code/. If you switch back to Codex and it fails to start, remove ~/.codex/config.toml. If Code appears to miss settings after upgrading, copy your legacy ~/.codex/config.toml into ~/.code/.

# Model settings
model = "gpt-5"
model_provider = "openai"

# Behavior
approval_policy = "on-request"  # untrusted | on-failure | on-request | never
model_reasoning_effort = "medium" # low | medium | high
sandbox_mode = "workspace-write"

# UI preferences (see THEME_CONFIG.md)
[tui.theme]
name = "light-photon"

# Add config for specific models
[profiles.gpt-5]
model = "gpt-5"
model_provider = "openai"
approval_policy = "never"
model_reasoning_effort = "high"
model_reasoning_summary = "detailed"

Environment variables

  • CODE_HOME: Override config directory location
  • OPENAI_API_KEY: Use API key instead of ChatGPT auth
  • OPENAI_BASE_URL: Use alternative API endpoints
  • OPENAI_WIRE_API: Force the built-in OpenAI provider to use chat or responses wiring
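
A minimal sketch of how these might be combined before launching the CLI; the config directory and endpoint below are placeholder values for illustration:

export CODE_HOME="$HOME/.config/code"                         # custom config directory (placeholder)
export OPENAI_API_KEY="xyz"                                   # API key instead of ChatGPT auth
export OPENAI_BASE_URL="https://llm-gateway.example.com/v1"   # alternative endpoint, e.g. a proxy (placeholder)
code "summarize the open TODOs in this repo"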

FAQ

How is this different from the original?

This fork adds browser integration, multi-agent commands (/plan, /solve, /code), a theme system, and enhanced reasoning controls while maintaining full compatibility with upstream.

Can I use my existing Codex configuration?

Yes. Code reads from both ~/.code/ (primary) and legacy ~/.codex/ directories. We only write to ~/.code/, so Codex will keep running if you switch back; copy or remove legacy files if you notice conflicts.

Does this work with ChatGPT Plus?

Absolutely. Use the same "Sign in with ChatGPT" flow as the original.

Is my data secure?

Yes. Authentication stays on your machine, and we don't proxy your credentials or conversations.

Contributing

We welcome contributions! This fork maintains compatibility with upstream while adding community-requested features.

Development workflow

# Clone and setup
git clone https://github.com/just-every/code.git
cd code
npm install

# Build (use fast build for development)
./build-fast.sh

# Run locally
./code-rs/target/dev-fast/code

Opening a pull request

  1. Fork the repository
  2. Create a feature branch: git checkout -b feature/amazing-feature
  3. Make your changes
  4. Run tests: cargo test
  5. Build successfully: ./build-fast.sh
  6. Submit a pull request
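
Put together, a typical contribution loop might look like this (branch name and commit message are placeholders):

git checkout -b feature/amazing-feature    # placeholder branch name
# ...make your changes...
cargo test                                 # run the test suite
./build-fast.sh                            # confirm the fast build succeeds
git commit -am "Describe your change"      # placeholder message
git push origin feature/amazing-feature    # then open the pull request on GitHub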

Legal & Use

License & attribution

  • This project is a community fork of openai/codex under Apache-2.0. We preserve upstream LICENSE and NOTICE files.
  • Code is not affiliated with, sponsored by, or endorsed by OpenAI.

Your responsibilities

Using OpenAI, Anthropic or Google services through Code means you agree to their Terms and policies. In particular:

  • Don't programmatically scrape/extract content outside intended flows.
  • Don't bypass or interfere with rate limits, quotas, or safety mitigations.
  • Use your own account; don't share or rotate accounts to evade limits.
  • If you configure other model providers, you're responsible for their terms.

Privacy

  • Your auth file lives at ~/.code/auth.json
  • Inputs/outputs you send to AI providers are handled under their Terms and Privacy Policy; consult those documents (and any org-level data-sharing settings).

Subject to change

AI providers can change eligibility, limits, models, or authentication flows. Code supports both ChatGPT sign-in and API-key modes so you can pick what fits (local/hobby vs CI/automation).

License

Apache 2.0 - See LICENSE file for details.

This project is a community fork of the original Codex CLI. We maintain compatibility while adding enhanced features requested by the developer community.

Need help? Open an issue on GitHub or check our documentation.
