Enterprise-grade gateway for Model Context Protocol (MCP) servers. Secure, scalable, and production-ready.
MCPRelay is an open-source API gateway built specifically for MCP servers. It provides enterprise-grade security, authentication, rate limiting, and monitoring for production MCP deployments.
- Security First: MCP-aware request validation and response sanitization
- Authentication: API key and JWT-based authentication with role-based access control
- Rate Limiting: Token bucket rate limiting with per-user tiers
- Load Balancing: Intelligent load balancing with health checks and failover
- Monitoring: Prometheus metrics, structured logging, and real-time dashboard
- Easy Deployment: Docker-based deployment with YAML configuration
- Web Interface: Modern admin dashboard for configuration and monitoring
```bash
git clone https://github.com/plwp/mcprelay.git
cd mcprelay
./quickstart.sh
```

Or install and run manually:

```bash
# Install dependencies
pip install -e .

# Create configuration
cp config.example.yaml config.yaml
# Edit config.yaml to add your MCP servers

# Start the gateway
mcprelay serve
```

Or deploy with Docker:

```bash
# Using Docker Compose (recommended)
docker-compose up -d

# Or run directly
docker run -p 8080:8080 -v ./config.yaml:/app/config.yaml mcprelay/mcprelay
```

MCPRelay uses YAML configuration. Here's a minimal example:
```yaml
# Basic server settings
host: "0.0.0.0"
port: 8080

# MCP servers to proxy to
servers:
  - name: "hue-server"
    url: "http://localhost:3000"
    weight: 1
    timeout: 30

# Authentication
auth:
  enabled: true
  method: "api_key"
  api_keys:
    admin: "your-admin-key"

# Rate limiting
rate_limit:
  enabled: true
  default_requests_per_minute: 60
  burst_size: 10

# Security
mcp_safeguards_enabled: true
```

```
AI Client → MCPRelay Gateway → MCP Server(s)
                    ↓
 [Auth, Rate Limit, Load Balance, Monitor]
```
MCPRelay sits between your AI clients and MCP servers, providing:
- Request Authentication: Validates API keys or JWT tokens
- Rate Limiting: Prevents abuse with configurable limits
- Load Balancing: Distributes requests across healthy backend servers
- Request Validation: MCP-aware filtering of dangerous operations
- Response Sanitization: Strips sensitive data from responses
- Monitoring: Comprehensive metrics and logging
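As a concrete illustration of this flow, the sketch below sends an MCP JSON-RPC request through the gateway's proxy endpoint (see the endpoint list below). The `X-API-Key` header name and the `hue-server` path segment are assumptions for illustration only; check your auth configuration and routing rules for the exact values.

```python
# Example client call through the gateway (a sketch; the header name and
# proxy path are assumptions, not part of the documented API).
import requests

GATEWAY = "http://localhost:8080"
API_KEY = "your-admin-key"  # one of the keys from config.yaml

payload = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/list",  # standard MCP JSON-RPC method
    "params": {},
}

resp = requests.post(
    f"{GATEWAY}/mcp/hue-server",        # POST /mcp/{path}; path is illustrative
    json=payload,
    headers={"X-API-Key": API_KEY},     # header name assumed; check your auth config
    timeout=10,
)
resp.raise_for_status()
print(resp.json())
```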
The gateway exposes the following HTTP endpoints:

- `GET /health` - Health check endpoint
- `GET /metrics` - Prometheus metrics
- `POST /mcp/{path}` - Main proxy endpoint for MCP requests
- `GET /admin/` - Web administration interface
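For scripted liveness checks (for example, a Docker healthcheck or Kubernetes probe), the health endpoint can be polled directly. A minimal sketch, assuming `/health` is reachable without authentication; if your deployment protects it, add the appropriate header:

```python
# Simple liveness probe against the gateway (assumes /health needs no auth;
# adjust headers if your deployment requires an API key).
import requests

resp = requests.get("http://localhost:8080/health", timeout=5)
print("gateway healthy:", resp.ok)
```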
MCPRelay includes a modern web interface for administration:
- Dashboard: Real-time statistics and recent activity
- Server Management: Add, remove, and monitor backend servers
- Configuration: Edit settings through the web UI
- Logs: View and filter system logs
- Metrics: Performance analytics and visualizations
Access the dashboard at http://localhost:8080/admin/ (requires admin authentication).
```bash
# Start the server
mcprelay serve

# Validate configuration
mcprelay validate

# Check backend health
mcprelay health

# Show version
mcprelay version
```

MCPRelay provides multiple layers of security:
- MCP safeguards:
  - Validates JSON-RPC requests for proper MCP format
  - Blocks dangerous operations (file system access, code execution)
  - Sanitizes responses to prevent data leakage
- Authentication and authorization:
  - API key authentication with configurable keys
  - JWT token validation with proper claims checking
  - Role-based access control (admin vs user permissions)
- Rate limiting:
  - Token bucket algorithm with configurable rates (sketched below)
  - Per-user rate limits
  - Burst capacity for handling traffic spikes
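The token bucket used for rate limiting works as sketched below. This is an illustration of the general algorithm, not MCPRelay's internal code; the parameter names simply mirror the `rate_limit` config keys above.

```python
# Minimal token bucket sketch (illustrative only, not MCPRelay's internals).
# Refills at requests_per_minute / 60 tokens per second, up to burst_size.
import time


class TokenBucket:
    def __init__(self, requests_per_minute: int = 60, burst_size: int = 10):
        self.rate = requests_per_minute / 60.0  # tokens added per second
        self.capacity = burst_size              # maximum burst
        self.tokens = float(burst_size)
        self.last_refill = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        # Refill proportionally to elapsed time, capped at capacity.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last_refill) * self.rate)
        self.last_refill = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False


# One bucket per API key gives per-user limits.
bucket = TokenBucket(requests_per_minute=60, burst_size=10)
print(bucket.allow())  # True while tokens remain, False once the bucket is empty
```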
MCPRelay exports Prometheus metrics including:
- Request count and rate
- Response times and percentiles
- Backend server health status
- Error rates and types
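To inspect these metrics outside of a Prometheus server, the exposition text can be parsed with the official Python client library. A minimal sketch; the specific metric names MCPRelay exports are not listed here, so this simply prints whatever `/metrics` returns:

```python
# Pull and parse the gateway's Prometheus metrics (a sketch; specific metric
# names are not documented here, so we just list whatever the endpoint exports).
import requests
from prometheus_client.parser import text_string_to_metric_families

text = requests.get("http://localhost:8080/metrics", timeout=5).text

for family in text_string_to_metric_families(text):
    for sample in family.samples:
        print(sample.name, sample.labels, sample.value)
```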
Structured JSON logging with configurable levels:
- Request/response logging
- Security event logging
- Performance metrics
- Error tracking
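For quick triage without a log aggregator, the JSON log lines can be filtered with a few lines of Python. This is a sketch only: the `level` field name and the log file path are assumptions about the log schema, not documented guarantees.

```python
# Filter structured JSON log lines for warnings and errors (a sketch; the
# "level" field and the log path are assumptions -- check your deployment).
import json

with open("mcprelay.log") as fh:
    for line in fh:
        try:
            entry = json.loads(line)
        except json.JSONDecodeError:
            continue  # skip any non-JSON lines
        if entry.get("level") in ("WARNING", "ERROR"):
            print(entry)
```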
We welcome contributions! Please see our Contributing Guide for details.
This project is licensed under the Apache License 2.0 - see the LICENSE file for details.
- Issues: GitHub Issues
- Discussions: GitHub Discussions
- Documentation: mcprelay.org