A powerful tool composition platform built on the Model Context Protocol (MCP). Create sophisticated composite tools by combining multiple MCP servers using familiar Python-like Starlark syntax.
MCP Metatool transforms the MCP ecosystem from individual tools into a unified composition platform. Instead of calling tools individually, you can create intelligent workflows that combine GitHub, Slack, databases, filesystems, and any other MCP server into a single powerful tool.
Example: Automated Issue Management
```python
# Create a saved tool that combines GitHub and Slack
issue = github.createIssue({
    "title": params.title,
    "body": params.description,
    "labels": ["bug", "high-priority"]
})
notification = slack.postMessage({
    "channel": "#dev-alerts",
    "text": "Critical issue created: %s" % issue.html_url
})
result = {
    "issue_url": issue.html_url,
    "issue_number": issue.number,
    "notification_sent": True,
    "slack_ts": notification.ts
}
```

- Multi-Server Integration: Connect and orchestrate multiple MCP servers seamlessly
- Starlark Scripting: Write composite tools using familiar Python-like syntax
- Tool Composition: Combine GitHub, Slack, databases, filesystems, and more
- Data Processing: Transform and route data between different services
- Production Ready: Full test coverage, error handling, and validation
- Hot Reloading: Create and update tools without server restarts
- Schema Validation: Robust input validation with JSON Schema support
```bash
go build -o mcp-metatool .
```

The server communicates over stdio using the MCP protocol. Add it to your Claude Code configuration:
```json
{
  "mcpServers": {
    "mcp-metatool": {
      "type": "stdio",
      "command": "/path/to/mcp-metatool"
    }
  }
}
```

- `MCP_METATOOL_DIR`: Override the default storage directory (`~/.mcp-metatool`)
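For example, to keep the metatool's data under a custom location, set the variable before launching the server (the path here is illustrative):

```shell
# Point the metatool at a custom storage directory (illustrative path)
export MCP_METATOOL_DIR="$HOME/.config/mcp-metatool"
mkdir -p "$MCP_METATOOL_DIR"
```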
The metatool can connect to upstream MCP servers and proxy their tools, making them available in Starlark scripts. This enables creating composite tools that combine functionality from multiple MCP servers.
Create a `servers.json` file in your metatool directory (`~/.mcp-metatool/servers.json` or `$MCP_METATOOL_DIR/servers.json`):
Basic Example:
```json
{
  "mcpServers": {
    "github": {
      "command": "mcp-server-github",
      "args": ["--token", "${GITHUB_TOKEN}"]
    },
    "slack": {
      "command": "mcp-server-slack",
      "args": []
    }
  }
}
```

Advanced Example with Environment Variables:
```json
{
  "mcpServers": {
    "github": {
      "command": "mcp-server-github",
      "args": ["--token", "${GITHUB_TOKEN}", "--org", "${GITHUB_ORG}"],
      "env": {
        "DEBUG": "true",
        "RATE_LIMIT": "5000"
      }
    },
    "database": {
      "command": "/usr/local/bin/mcp-server-postgres",
      "args": ["--connection", "${DATABASE_URL}"],
      "env": {
        "POSTGRES_SSL": "require"
      }
    },
    "filesystem": {
      "command": "mcp-server-filesystem",
      "args": ["--allowed-dir", "${HOME}/projects"]
    }
  }
}
```

Control which tools are exposed to agents while keeping all tools available for Starlark composition:
Allowlist Mode (only specified tools exposed):
```json
{
  "mcpServers": {
    "github": {
      "command": "mcp-server-github",
      "allowedTools": ["get_issue", "list_issues", "create_*"]
    }
  }
}
```

Denylist Mode (specified tools hidden):
```json
{
  "mcpServers": {
    "slack": {
      "command": "mcp-server-slack",
      "hiddenTools": ["admin_*", "delete_*", "dangerous_operation"]
    }
  }
}
```

Wildcard Patterns:
- `admin_*` matches `admin_user`, `admin_delete`, etc.
- `*_admin` matches `delete_admin`, `user_admin`, etc.
- `get_*_info` matches `get_user_info`, `get_repo_info`, etc.
- `*` matches any tool name
Important Notes:
- Use either `allowedTools` OR `hiddenTools`, not both
- Filtered tools remain available in Starlark scripts for composition
- Perfect for wrapping raw tools with processed versions
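The wildcard patterns follow shell-style globbing, which can be sketched with Python's `fnmatch` module (the metatool's actual matcher is in Go; this only illustrates the semantics):

```python
from fnmatch import fnmatchcase

def is_hidden(tool_name, hidden_patterns):
    """Return True if tool_name matches any denylist pattern."""
    return any(fnmatchcase(tool_name, p) for p in hidden_patterns)

patterns = ["admin_*", "*_admin", "get_*_info"]

print(is_hidden("admin_user", patterns))     # True  (matches "admin_*")
print(is_hidden("get_repo_info", patterns))  # True  (matches "get_*_info")
print(is_hidden("list_issues", patterns))    # False (no pattern matches)
```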
- Environment Variable Expansion: Use `${VAR}` syntax to reference environment variables in commands, args, and env values
- Automatic Discovery: Tools from connected servers are automatically discovered at startup
- Per-Tool Filtering: Fine-grained control over which tools are exposed to agents
- Error Resilience: Failed server connections don't prevent the metatool from starting
- Clean Shutdown: Proper cleanup of all upstream connections on exit
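The `${VAR}` expansion semantics can be approximated with Python's `string.Template` (a sketch only; the metatool's actual expansion is implemented in Go and may handle missing variables differently):

```python
import os
from string import Template

def expand(value, env=None):
    """Expand ${VAR} references; unknown variables are left intact."""
    env = env if env is not None else dict(os.environ)
    return Template(value).safe_substitute(env)

env = {"GITHUB_TOKEN": "ghp_example", "GITHUB_ORG": "acme"}
args = ["--token", "${GITHUB_TOKEN}", "--org", "${GITHUB_ORG}"]
print([expand(a, env) for a in args])
# ['--token', 'ghp_example', '--org', 'acme']
```

Using `safe_substitute` means an unset variable passes through as literal `${VAR}` rather than raising an error, which mirrors a fail-soft configuration style.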
- Phase 1 Complete: Configuration, connection management, and tool discovery
- Phase 2 Complete: MCP server proxying with configurable tool visibility
- Phase 2+ Complete: Starlark integration for calling upstream tools as `serverName.toolName(params)`
- Phase 3 Planned: Advanced features like execution timeouts, audit trails, and performance optimizations
Call multiple MCP servers in a single Starlark script:
```python
# Using eval_starlark tool
echo_result = echo.echo({"message": "Hello from composition!"})
processed_data = {
    "response": echo_result["structured"]["result"],
    "timestamp": "2025-01-11",
    "processed_by": "starlark"
}
```

Save reusable tools that combine multiple services:
```python
# Save a tool that processes GitHub issues
github_issue = github.getIssue({"number": params.issue_number})
analysis = {
    "title": github_issue.title,
    "priority": "high" if "urgent" in github_issue.title.lower() else "normal",
    "assignee_count": len(github_issue.assignees),
    "needs_attention": github_issue.state == "open" and len(github_issue.comments) == 0
}
if analysis["needs_attention"]:
    slack.postMessage({
        "channel": "#dev-team",
        "text": "Issue #%d needs attention: %s" % (params.issue_number, github_issue.html_url)
    })
result = analysis
```

Transform and route data between different systems:
```python
# Fetch data from API, process it, and store results
api_data = api.fetchData({"endpoint": params.source})
processed = []
for item in api_data.items:
    if item.status == "active":
        processed.append({
            "id": item.id,
            "name": item.name.upper(),
            "score": item.score * 1.2  # Apply boost
        })

# Store processed data
database.insert({
    "table": "processed_items",
    "data": processed
})
result = {"processed_count": len(processed), "source": params.source}
```

All Starlark code has access to these standard library modules:
Functions:
- `time.now()` - Get current system time
- `time.parse_time(str, format, location)` - Parse time strings (supports ISO 8601)
  - Default format: RFC3339 (e.g., `"2025-01-15T10:30:00Z"`)
  - Custom format: Go time format (e.g., `"2006-01-02"`)
  - Default location: UTC
- `time.time(year, month, day, hour, minute, second, nanosecond, location)` - Create time values (all parameters optional, use keyword arguments)
- `time.parse_duration(str)` - Parse duration strings (e.g., `"1h30m"`, `"5s"`)
- `time.from_timestamp(sec, nsec)` - Convert Unix timestamp to time
- `time.is_valid_timezone(loc)` - Check if timezone name is valid
Constants:
- `time.nanosecond`, `time.microsecond`, `time.millisecond`
- `time.second`, `time.minute`, `time.hour`
Examples:
```python
# Parse ISO 8601 timestamp
timestamp = time.parse_time("2025-01-15T10:30:00Z")

# Get current time
now = time.now()

# Create a specific time
meeting = time.time(year=2025, month=1, day=15, hour=14, minute=30)

# Parse duration
timeout = time.parse_duration("5m30s")

# Check timezone
is_valid = time.is_valid_timezone("America/New_York")  # True
```

Functions:
- Basic: `ceil`, `floor`, `round`, `fabs` (absolute value)
- Powers: `pow`, `sqrt`, `exp`
- Trigonometry: `sin`, `cos`, `tan`, `asin`, `acos`, `atan`, `atan2`
- Hyperbolic: `sinh`, `cosh`, `tanh`, `asinh`, `acosh`, `atanh`
- Logarithms: `log(x, base)` - natural log by default
- Angles: `degrees`, `radians`
- Other: `copysign`, `mod`, `remainder`, `hypot`, `gamma`
Constants:
- `math.pi` - π (approximately 3.14159)
- `math.e` - Euler's number (approximately 2.71828)
Examples:
```python
# Calculate distance (Pythagorean theorem)
distance = math.sqrt(math.pow(3, 2) + math.pow(4, 2))  # 5.0

# Trigonometry
angle_rad = math.radians(45)
sine = math.sin(angle_rad)

# Logarithms
log_base_10 = math.log(100, 10)  # 2.0
natural_log = math.log(math.e)   # 1.0

# Rounding
rounded_up = math.ceil(3.2)     # 4
rounded_down = math.floor(3.8)  # 3
```

Functions:
- `json.encode(value)` - Convert Starlark value to JSON string
  - Handles: dicts, lists, strings, numbers, bools, None (→ null)
- `json.decode(str, default)` - Parse JSON string to Starlark value
  - Returns `default` if parsing fails (otherwise the call fails)
- `json.indent(str, prefix, indent)` - Pretty-print JSON
  - Default indent: tab character
  - Optional prefix for each line
Examples:
```python
# Encode data to JSON
data = {"name": "Alice", "age": 30, "active": True}
json_str = json.encode(data)
# Result: '{"name":"Alice","age":30,"active":true}'

# Decode JSON to Starlark
parsed = json.decode('{"x": 42, "y": "test"}')
value = parsed["x"]  # 42

# Pretty-print JSON
formatted = json.indent(json_str, indent="  ")
# Result: multi-line indented JSON

# Round-trip with data transformation
api_response = json.decode(api_data)
processed = [item for item in api_response["items"] if item["status"] == "active"]
result = json.encode({"processed": processed})
```

Execute Starlark code with access to all connected MCP servers.
Parameters:
- `code` (string): The Starlark code to execute
- `params` (object, optional): Parameters available as `params` dict in the code
Features:
- Server Access: Call any connected MCP server using `serverName.toolName(params)`
- Full Starlark: Complete Python-like language with loops, conditionals, comprehensions
- Data Processing: Built-in functions for transforming and analyzing data
- Real-time Execution: Execute code immediately with live results
Examples:
Multi-server workflow:
```python
# Call multiple services and combine results
user_data = github.getUser({"username": params.username})
recent_issues = github.listIssues({"creator": params.username, "state": "open"})
summary = {
    "user": user_data.login,
    "public_repos": user_data.public_repos,
    "open_issues": len(recent_issues),
    "most_recent": recent_issues[0].title if recent_issues else None
}
```

Create or update a composite tool definition that can be executed later.
Parameters:
- `name` (string): Tool identifier
- `description` (string): Human-readable description of what the tool does
- `inputSchema` (object): JSON Schema for tool parameters
- `code` (string): Starlark implementation of the tool
Example - GitHub Issue Processor:
```javascript
{
  "name": "github_issue_processor",
  "description": "Analyzes GitHub issues and sends Slack notifications for urgent ones",
  "inputSchema": {
    "type": "object",
    "properties": {
      "repo": {"type": "string", "description": "Repository name (owner/repo)"},
      "issue_number": {"type": "integer", "description": "Issue number to process"}
    },
    "required": ["repo", "issue_number"]
  },
  "code": `
# Fetch issue details from GitHub
issue = github.getIssue({
    "owner": params.repo.split('/')[0],
    "repo": params.repo.split('/')[1],
    "issue_number": params.issue_number
})

# Analyze issue priority
is_urgent = any([label.name in ["urgent", "critical", "P0"] for label in issue.labels])
is_stale = issue.state == 'open' and len(issue.comments) == 0

# Send Slack notification if urgent
notification_sent = False
if is_urgent:
    slack_result = slack.postMessage({
        "channel": "#urgent-issues",
        "text": "Urgent issue detected: %s\n%s" % (issue.title, issue.html_url)
    })
    notification_sent = True

result = {
    "issue_title": issue.title,
    "is_urgent": is_urgent,
    "is_stale": is_stale,
    "assignee_count": len(issue.assignees),
    "notification_sent": notification_sent,
    "issue_url": issue.html_url
}
`
}
```

List all saved composite tool definitions.
Parameters: None
Returns: A list of saved tools with their names and descriptions.
Example:
```javascript
list_saved_tools()
// Returns: {"tools": [{"name": "greet_user", "description": "A simple greeting tool"}]}
```

Show the complete definition of a saved tool including its code, schema, and metadata.
Parameters:
- `name` (string): The name of the tool to display
Example:
```javascript
show_saved_tool({"name": "greet_user"})  // Returns complete tool definition
```

Delete a saved tool definition from storage.
Parameters:
- `name` (string): The name of the tool to delete
Example:
```javascript
delete_saved_tool({"name": "greet_user"})  // Removes the tool (restart server to unregister)
```

Once saved with `save_tool`, custom tools become available as regular MCP tools:
```javascript
// Call the GitHub issue processor tool
github_issue_processor({
  "repo": "microsoft/vscode",
  "issue_number": 12345
})
// Returns: {
//   "issue_title": "Critical bug in editor",
//   "is_urgent": true,
//   "is_stale": false,
//   "assignee_count": 2,
//   "notification_sent": true,
//   "issue_url": "https://github.com/microsoft/vscode/issues/12345"
// }
```

- Incident Response: Combine monitoring alerts, GitHub issues, and Slack notifications
- Deployment Pipelines: Orchestrate builds, tests, and notifications across multiple services
- Code Review Automation: Analyze PRs, run checks, and update project management tools
- ETL Pipelines: Extract from APIs, transform data, and load into databases
- Report Generation: Aggregate data from multiple sources and distribute results
- Data Validation: Check data quality across different systems and alert on issues
- Support Ticket Routing: Analyze support requests and route to appropriate teams
- Customer Onboarding: Coordinate account setup across multiple platforms
- Health Monitoring: Track customer usage and trigger interventions
- Multi-Source Analysis: Combine data from GitHub, JIRA, Slack, and databases
- Automated Reporting: Generate insights and distribute to stakeholders
- Trend Detection: Monitor metrics across services and identify patterns
The project includes comprehensive test coverage:
```bash
# Run all tests
go test ./...

# Run with coverage
go test -cover ./...

# Run specific test suites
go test ./internal/starlark -v   # Starlark integration tests
go test ./internal/tools -v      # Tool composition tests
go test ./internal/proxy -v      # MCP server proxy tests
```

Test Coverage:
- 450+ test cases covering all major functionality
- Bridge integration tests for server namespaces and tool functions
- End-to-end workflows validating multi-server composition
- Error handling and edge case validation
- Backward compatibility ensuring existing tools continue to work
```
┌─────────────────┐      ┌──────────────────┐      ┌─────────────────┐
│   Claude Code   │─────▶│   MCP Metatool   │─────▶│   MCP Servers   │
│     Client      │      │      Server      │      │ (GitHub, Slack, │
└─────────────────┘      │                  │      │  Database, etc) │
                         │  ┌────────────┐  │      └─────────────────┘
                         │  │  Starlark  │  │
                         │  │  Runtime   │  │
                         │  └────────────┘  │
                         │  ┌────────────┐  │
                         │  │Saved Tools │  │
                         │  │  Storage   │  │
                         │  └────────────┘  │
                         └──────────────────┘
```
Components:
- Proxy Manager: Connects to and manages multiple MCP servers
- Starlark Runtime: Executes Python-like scripts with server access
- Tool Bridge: Exposes MCP tools as callable Starlark functions
- Persistence Layer: Stores and manages saved tool definitions
- Validation Engine: JSON Schema validation for tool parameters
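The validation engine's role can be sketched with a minimal required-field and type check (the real implementation uses a full JSON Schema library in Go; `validate_params` and its behavior here are illustrative assumptions):

```python
def validate_params(params, schema):
    """Check required keys and basic JSON Schema types (minimal sketch)."""
    type_map = {"string": str, "integer": int, "object": dict,
                "array": list, "boolean": bool}
    errors = []
    # Required keys must be present
    for key in schema.get("required", []):
        if key not in params:
            errors.append("missing required parameter: %s" % key)
    # Present keys must match their declared type
    for key, spec in schema.get("properties", {}).items():
        expected = type_map.get(spec.get("type"))
        if key in params and expected and not isinstance(params[key], expected):
            errors.append("parameter %s should be %s" % (key, spec["type"]))
    return errors

schema = {
    "type": "object",
    "properties": {"repo": {"type": "string"}, "issue_number": {"type": "integer"}},
    "required": ["repo", "issue_number"],
}
print(validate_params({"repo": "octocat/hello"}, schema))
# ['missing required parameter: issue_number']
```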
```
├── main.go                      # Server setup and initialization
├── internal/
│   ├── config/                  # MCP server configuration
│   ├── persistence/             # Tool storage and management
│   ├── proxy/                   # MCP server connection management
│   ├── starlark/
│   │   ├── executor.go          # Starlark execution engine
│   │   ├── bridge.go            # MCP tool integration ★
│   │   ├── convert.go           # Go↔Starlark value conversion
│   │   ├── bridge_test.go       # Integration tests (36 tests)
│   │   └── executor_test.go     # Execution tests (400+ tests)
│   ├── tools/
│   │   ├── eval.go              # eval_starlark with proxy support ★
│   │   ├── saved.go             # Saved tools with proxy support ★
│   │   ├── integration_test.go  # End-to-end tests (15 tests)
│   │   └── [other tool handlers]
│   └── validation/              # JSON Schema validation
└── spec.md                      # Complete technical specification
```

★ = New/Enhanced for Starlark integration
The metatool uses a single directory for all persistent data:
```
~/.mcp-metatool/                 # Default directory (or $MCP_METATOOL_DIR)
├── servers.json                 # MCP server configuration
└── tools/                       # Saved tool definitions
    ├── greet_user.json          # Individual tool files
    ├── data_processor.json
    └── ...
```
- Saved tools: Stored as JSON files in the `tools/` subdirectory
- Server config: Single `servers.json` file for MCP server connections
- Environment override: Use `MCP_METATOOL_DIR` to customize the location
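Each saved tool file carries the fields passed to `save_tool`. This sketch writes and reads back one such file (the exact on-disk layout may include additional metadata; the fields shown are the core ones from the `save_tool` parameters):

```python
import json
import os
import tempfile

# A saved tool definition with the core save_tool fields
tool = {
    "name": "greet_user",
    "description": "A simple greeting tool",
    "inputSchema": {
        "type": "object",
        "properties": {"name": {"type": "string"}},
        "required": ["name"],
    },
    "code": 'result = {"greeting": "Hello, %s!" % params["name"]}',
}

# Store it as tools/<name>.json, mirroring the metatool's layout
tools_dir = os.path.join(tempfile.mkdtemp(), "tools")
os.makedirs(tools_dir)
path = os.path.join(tools_dir, tool["name"] + ".json")
with open(path, "w") as f:
    json.dump(tool, f, indent=2)

# Read it back
with open(path) as f:
    loaded = json.load(f)
print(loaded["name"])  # greet_user
```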
Phase 1 - Foundation (Complete)
- MCP server discovery and connection management
- Basic tool proxying with `serverName__toolName` format
- File-based persistence and configuration
Phase 2 - Starlark Integration (Complete)
- Starlark runtime with full Python-like language support
- Tool bridge enabling `serverName.toolName(params)` syntax
- Composite tool creation with `save_tool` functionality
- Parameter validation using JSON Schema
- Comprehensive testing with 450+ test cases
Phase 2.5 - Production Hardening
- Performance profiling and optimization
- Enhanced error messages and debugging support
- Tool execution metrics and monitoring
Phase 3 - Advanced Features
- Execution timeouts and resource limits for composite tools
- Audit trails and execution logging for compliance
- Tool versioning and migration support
- Performance optimizations for high-volume usage
Phase 4 - Ecosystem Integration
- Tool marketplace for sharing composite tools
- Plugin system for custom integrations
- Analytics dashboard for tool usage insights
- Collaboration features for team tool development
Built with ❤️ using:
The MCP Metatool represents a major evolution in tool composition, transforming the MCP ecosystem from individual tools into a unified composition platform.
Ready to build the future of tool automation? Let's compose!