A model-agnostic Ruby AI Agent framework powered by unified LLM provider architecture. Provides base classes for building Generators, Actions, Tasks, and Agents that can be used to build AI-powered applications in Ruby.
🚀 Now powered by RubyLLM for unified provider support!
For more detailed documentation visit our documentation site: https://docs.sublayer.com.
Architecture Migration to RubyLLM (In Progress)
We're migrating Sublayer from custom provider implementations to RubyLLM's unified architecture for better reliability, standardization, and expanded provider support.
- ✅ Task 1: Foundation Setup - RubyLLM integration with feature flags
- ✅ Task 2: Tool System Architecture - Output adapter → RubyLLM Tools migration
- ✅ Task 3: Provider Interface Migration - Full provider replacement
- ✅ Task 4: Base Class Refactoring - Enhanced generator capabilities with deprecation warnings
- 📋 Task 5: Actions Migration - Speech and file actions upgrade
- 📋 Task 6: Testing & Validation - Comprehensive test coverage
- 📋 Task 7: Documentation - Migration guides and examples
- 📋 Task 8: Deployment - Production rollout with monitoring
Ready to migrate? See our comprehensive Migration Guide for:
- Step-by-step migration instructions
- Compatibility analysis tools
- Testing strategies
- Common issues and solutions
- Rollback procedures
Quick migration CLI:
# Analyze your project for compatibility
sublayer migrate analyze
# Enable RubyLLM backend
sublayer migrate enable
# Test all generators
sublayer migrate test

Enable the new RubyLLM backend:
# Enable RubyLLM backend (optional - defaults to legacy providers)
Sublayer.configuration.use_rubyllm = true
# Your existing code continues to work unchanged
Sublayer.configuration.ai_provider = Sublayer::Providers::Claude
Sublayer.configuration.ai_model = "claude-3-5-sonnet-20240620"

Benefits of RubyLLM Backend:
- Unified interface across all LLM providers
- Built-in multimodal support (images, audio, documents)
- Standardized tool/function calling
- Enhanced error handling and retry logic
- Streaming response support
- Production-tested reliability
Pre-1.0 we anticipate many breaking changes to the API. Our current plan is to confine breaking changes to minor (0.x) releases and to use patch releases (0.x.y) for new features and bug fixes.
To maintain stability in your application, we recommend pinning the version of Sublayer in your Gemfile to a specific minor version. For example, to pin to version 0.2.x, you would add the following line to your Gemfile:
gem 'sublayer', '~> 0.2'

Install the gem by running the following command:
gem install sublayer
Or add this line to your application's Gemfile:
gem 'sublayer', '~> 0.2'

Sublayer is model-agnostic and supports multiple LLM providers through both legacy and RubyLLM backends. The RubyLLM backend provides enhanced capabilities and unified provider management.
| Provider | Legacy Support | RubyLLM Support | Status |
|---|---|---|---|
| OpenAI | ✅ | ✅ | Stable |
| Claude (Anthropic) | ✅ | ✅ | Stable |
| Gemini | ✅ | ✅ | Stable with RubyLLM |
| Ollama | ❌ | ✅ | RubyLLM only |
| DeepSeek | ❌ | ✅ | RubyLLM only |
| OpenRouter | ❌ | ✅ | RubyLLM only |
| AWS Bedrock | ❌ | ✅ | RubyLLM only |
Expects you to have an OpenAI API key set in the OPENAI_API_KEY environment variable.
Visit OpenAI to get an API key.
Legacy Configuration:
Sublayer.configuration.ai_provider = Sublayer::Providers::OpenAI
Sublayer.configuration.ai_model = "gpt-4o"

RubyLLM Configuration:
Sublayer.configuration.use_rubyllm = true
Sublayer.configuration.ai_model = "gpt-4o"
# API key automatically detected from OPENAI_API_KEY

Expects you to have a Claude API key set in the ANTHROPIC_API_KEY environment variable.
Visit Anthropic to get an API key.
Legacy Configuration:
Sublayer.configuration.ai_provider = Sublayer::Providers::Claude
Sublayer.configuration.ai_model = "claude-3-5-sonnet-20240620"

RubyLLM Configuration:
Sublayer.configuration.use_rubyllm = true
Sublayer.configuration.ai_model = "claude-3-5-sonnet-20240620"
# API key automatically detected from ANTHROPIC_API_KEY

Legacy (UNSTABLE): Gemini's function calling API is in beta. Not recommended for production use.
RubyLLM (RECOMMENDED): Stable implementation with full feature support.
Expects you to have a Gemini API key set in the GEMINI_API_KEY environment variable.
Visit Google AI Studio to get an API key.
RubyLLM Configuration (Recommended):
Sublayer.configuration.use_rubyllm = true
Sublayer.configuration.ai_model = "gemini-1.5-pro"
# API key automatically detected from GEMINI_API_KEY

The RubyLLM backend enables access to additional providers:
Ollama (Local Models):
Sublayer.configuration.use_rubyllm = true
Sublayer.configuration.ai_model = "llama3:8b"
# Set OLLAMA_API_BASE if not using default http://localhost:11434

DeepSeek:
Sublayer.configuration.use_rubyllm = true
Sublayer.configuration.ai_model = "deepseek-chat"
# Requires DEEPSEEK_API_KEY

OpenRouter (Multi-Provider Gateway):
Sublayer.configuration.use_rubyllm = true
Sublayer.configuration.ai_model = "anthropic/claude-3.5-sonnet"
# Requires OPENROUTER_API_KEY

Generators are responsible for generating specific outputs based on input data. They focus on a single generation task and do not perform any actions or complex decision-making. Generators are the building blocks of the Sublayer framework.
With RubyLLM backend, generators gain:
- Enhanced multimodal capabilities
- Improved tool/function calling
- Better error handling and retries
- Streaming response support
Examples (in the /spec/generators/examples directory):
- CodeFromDescriptionGenerator: Generates code based on a description and the technologies used.
- DescriptionFromCodeGenerator: Generates a description of the code passed in to it.
- CodeFromBlueprintGenerator: Generates code based on a blueprint, a blueprint description, and a description of the desired code.
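The generator pattern can be shown as a dependency-free sketch. Note this is illustrative only: in a real project the class subclasses Sublayer::Generators::Base (as shown later in the migration section), and the `llm` lambda here stands in for the configured provider call that the base class performs for you.

```ruby
# Dependency-free sketch of the generator pattern: a single-purpose class
# that builds a prompt and returns one output. The `llm` lambda stands in
# for the configured LLM provider call.
class DescriptionFromCodeSketch
  def initialize(code:, llm:)
    @code = code
    @llm = llm
  end

  # Generators focus on one generation task; the prompt defines it.
  def prompt
    "Describe what this code does:\n#{@code}"
  end

  def generate
    @llm.call(prompt)
  end
end

# A fake provider so the sketch runs without any API key.
fake_llm = ->(_prompt) { "This code prints the sum of 1 and 1." }
result = DescriptionFromCodeSketch.new(code: "puts 1 + 1", llm: fake_llm).generate
```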
Actions perform specific operations to either get inputs for a Generator or use the generated output from a Generator. Actions do not involve complex decision making.
Enhanced capabilities with RubyLLM:
- Native multimodal file processing
- Improved audio/speech handling
- Better streaming support for real-time actions
Examples:
- WriteFileAction: Saves generated output to a file.
- RunTestCommandAction: Runs a generated command line command.
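A custom action follows the same single-operation shape. The sketch below is dependency-free (in a real project it would subclass Sublayer::Actions::Base); the class and parameter names are illustrative:

```ruby
require "fileutils"
require "tmpdir"

# Dependency-free sketch of an action: a single #call method that performs
# one operation -- here, saving generated output to a file.
class WriteFileActionSketch
  def initialize(file_path:, file_contents:)
    @file_path = file_path
    @file_contents = file_contents
  end

  def call
    FileUtils.mkdir_p(File.dirname(@file_path))
    File.write(@file_path, @file_contents)
    @file_path
  end
end

path = File.join(Dir.tmpdir, "sublayer_demo", "out.txt")
WriteFileActionSketch.new(file_path: path, file_contents: "hello").call
```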
Sublayer Agents are autonomous entities designed to perform specific tasks or monitor systems.
Examples:
- RSpecAgent: Runs tests whenever a file is changed. If the tests fail, the agent changes the code until they pass.
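The core of any agent is a check-and-act loop. As a dependency-free sketch (the lambdas stand in for real work such as running tests and regenerating code; all names here are illustrative):

```ruby
# Dependency-free sketch of an agent loop: until the goal is met (or a
# step limit is hit), the agent takes another step toward it.
class AgentLoopSketch
  def initialize(goal_met:, step:)
    @goal_met = goal_met
    @step = step
  end

  # Returns the number of steps taken.
  def run(max_steps: 5)
    steps = 0
    until @goal_met.call || steps >= max_steps
      @step.call
      steps += 1
    end
    steps
  end
end

fixes_applied = 0
agent = AgentLoopSketch.new(
  goal_met: -> { fixes_applied >= 3 },  # stand-in for "the tests pass"
  step:     -> { fixes_applied += 1 }   # stand-in for "change the code"
)
steps_taken = agent.run
```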
Sublayer Triggers are used by Agents. A trigger decides when an agent is activated to perform its task.
Examples:
- FileChange: A built-in Sublayer trigger that listens for file changes.
- TimeInterval: A custom trigger from our tutorial showing how to create your own trigger; this one activates on a time interval.
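The time-interval idea can be sketched without any dependencies: the trigger decides when to fire and then invokes the agent's work block. Class and method names here are illustrative, not the Sublayer trigger API:

```ruby
# Dependency-free sketch of a time-interval trigger: it activates the
# agent's work block once per interval, for a fixed number of activations.
class TimeIntervalTriggerSketch
  def initialize(interval_seconds)
    @interval = interval_seconds
  end

  def start(activations: 3, &work)
    activations.times do
      sleep @interval
      work.call
    end
  end
end

runs = 0
TimeIntervalTriggerSketch.new(0.01).start(activations: 3) { runs += 1 }
```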
The migration is designed to be seamless with backward compatibility:
- Enable RubyLLM (optional):

  Sublayer.configuration.use_rubyllm = true

- Your existing code works unchanged:

  # This continues to work with both backends
  class MyGenerator < Sublayer::Generators::Base
    llm_output_adapter type: :single_string,
      name: "response",
      description: "Generated response"

    def prompt
      "Generate a response"
    end
  end

- Test both backends during transition:

  # Test with legacy backend
  Sublayer.configuration.use_rubyllm = false
  result1 = MyGenerator.new.generate

  # Test with RubyLLM backend
  Sublayer.configuration.use_rubyllm = true
  result2 = MyGenerator.new.generate

  # Results should be identical

- Leverage new capabilities (RubyLLM only):

  # Enhanced multimodal support coming in future tasks
  # Streaming responses
  # Better error handling
Run tests for both backends:
# Test legacy providers
rake spec
# Test RubyLLM integration
bundle exec cucumber features/basic_configuration.feature
# Test migration scenarios
bundle exec cucumber --tags @migration
# Test specific providers
bundle exec cucumber --tags @gemini

There are sample Generators in the /examples/ directory that demonstrate how to build generators using the Sublayer framework. Alternatively, below are links to open source projects that are using generators in different ways:
- Blueprints - An open source AI code assistant that allows you to capture patterns in your codebase to use as a base for generating new code.
- Clag - A Ruby gem that generates command line commands from a simple description right in your terminal.
We're actively working on the RubyLLM migration. Contributions are welcome! See the Migration Epic for detailed technical tasks and user stories.
git clone https://github.com/sublayerapp/sublayer.git
cd sublayer
bin/setup
bundle install

# Run all tests
rake test
# Run RSpec only
rake spec
# Run Cucumber only
rake cucumber
# Run migration-specific tests
rake cucumber:migration

The gem is available as open source under the terms of the MIT License.