Refactor ChatBot to use modular LLMFactory architecture #54
Summary
This PR refactors the ChatBot class to use our modular LLMFactory architecture, eliminating 400+ lines of hardcoded provider logic while maintaining full backward compatibility. The change reduces code complexity, standardizes provider management, and improves maintainability across all LLM integrations.
Key Changes
Architecture Improvements
Before: Hardcoded Provider Logic
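For illustration only — the snippet below is a schematic of the pattern that was removed, not the actual deleted code, and the constructor parameter names are assumed:

```python
# Schematic of the old approach (illustrative, not the removed code):
# every provider was wired up inline with if/elif branches inside ChatBot.
class ChatBot:
    def __init__(self, llm_provider: str = "openai", api_key: str | None = None):
        if llm_provider == "openai":
            from openai import AsyncOpenAI
            self.client = AsyncOpenAI(api_key=api_key)
        elif llm_provider == "anthropic":
            from anthropic import AsyncAnthropic
            self.client = AsyncAnthropic(api_key=api_key)
        elif llm_provider == "deepseek":
            ...  # DeepSeek-specific client setup and endpoint handling
        elif llm_provider == "gemini":
            ...  # Gemini-specific client setup and streaming handling
        else:
            raise ValueError(f"Unsupported provider: {llm_provider}")
        # ...plus per-provider request/response handling duplicated in ask() and ask_tool()
```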
After: Clean Factory Pattern
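Again schematic: the exact `LLMFactory` and provider method names below are assumptions, but the shape matches the refactor — `ChatBot` delegates provider construction to the factory and keeps its public methods unchanged:

```python
# Schematic of the refactored shape (factory/provider method names are assumed).
from spoon_ai.llm.factory import LLMFactory

class ChatBot:
    def __init__(self, llm_provider: str = "openai", **provider_kwargs):
        # The factory resolves the registered provider and returns a configured client;
        # ChatBot no longer contains any provider-specific branching.
        self.llm = LLMFactory.create(llm_provider, **provider_kwargs)

    async def ask(self, messages, **kwargs):
        # Public signature unchanged; transport details live in the provider.
        return await self.llm.chat(messages, **kwargs)

    async def ask_tool(self, messages, tools, **kwargs):
        return await self.llm.chat_with_tools(messages, tools=tools, **kwargs)
```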
Technical Details
Provider Consolidation
- All providers consolidated under the `spoon_ai/llm/providers/` directory
- Configuration loaded from `config.json` via `ConfigManager` (see the sketch below)
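To make the configuration path concrete, here is a minimal sketch of the env-var-then-`config.json` precedence that the tests exercise; the key layout and helper name are hypothetical and not the real `ConfigManager` interface:

```python
# Hypothetical sketch of env-var-over-config.json precedence; not the ConfigManager API.
import json
import os

def resolve_api_key(provider: str, config_path: str = "config.json") -> str | None:
    env_value = os.getenv(f"{provider.upper()}_API_KEY")  # e.g. ANTHROPIC_API_KEY
    if env_value:
        return env_value
    with open(config_path) as f:
        config = json.load(f)
    # "providers" / "api_key" are illustrative key names only.
    return config.get("providers", {}).get(provider, {}).get("api_key")
```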

Backward Compatibility
- `ChatBot` constructor parameters preserved
- `ask()` and `ask_tool()` methods work unchanged

Provider-Specific Fixes
- Fixed the `output_queue` attribute issue that was causing crashes

Test Results
Integration Tests (7/7 Passing)
Provider Tests (7/7 Passing)
- ✅ Provider Registration (4/4 providers)
- ✅ Provider Instantiation
- ✅ Configuration Loading (env vars + config.json)
- ✅ Basic Chat Functionality
- ✅ Tool-based Chat Functionality
- ✅ Error Handling & Edge Cases
- ✅ API Endpoint Validation

Dependencies
- `pytest>=8.4.1` for improved test infrastructure
- `toml>=0.10.2` for configuration parsing

Breaking Changes
None. This is a purely internal refactoring that maintains the exact same public API.
Migration Notes
No action required for existing code. All ChatBot usage patterns continue to work:
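For example, code along these lines keeps working as before (parameter names and values here are illustrative, and the snippet assumes the awaitable `ask()`/`ask_tool()` interface described above):

```python
# Existing ChatBot usage patterns, unchanged by this PR.
import asyncio
from spoon_ai.chat import ChatBot

async def main():
    bot = ChatBot(llm_provider="anthropic")  # same constructor parameters as before

    # Plain chat via ask(), exactly as in previous versions.
    reply = await bot.ask([{"role": "user", "content": "Hello!"}])

    # Tool-based chat via ask_tool(), with the same tool-schema argument as before.
    tool_reply = await bot.ask_tool(
        messages=[{"role": "user", "content": "Summarize this PR."}],
        tools=[],  # pass your existing tool definitions here
    )
    print(reply, tool_reply)

asyncio.run(main())
```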
Files Changed
Core Refactoring
- `spoon_ai/chat.py`: Complete ChatBot refactor using LLMFactory
- `spoon_ai/llm/factory.py`: Enhanced with provider registration
- `spoon_ai/llm/providers/`: New unified provider directory

Provider Standardization
- `spoon_ai/llm/providers/anthropic.py`: Official API endpoint, cache metrics
- `spoon_ai/llm/providers/openai.py`: `config.json` integration
- `spoon_ai/llm/providers/deepseek.py`: Official API endpoint
- `spoon_ai/llm/providers/gemini.py`: Fixed streaming issues

Testing Infrastructure
- `tests/test_chatbot_integration.py`: Comprehensive integration tests
- `tests/test_providers.py`: Provider functionality tests
- `pyproject.toml`: Added testing dependencies

Validation
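As a rough illustration of the kind of check covered by the "Provider Registration (4/4 providers)" result above (the accessor name on `LLMFactory` is an assumption, not the project's real API):

```python
# Hypothetical pytest-style check mirroring the provider-registration result;
# LLMFactory.registered_providers() is an assumed accessor name.
from spoon_ai.llm.factory import LLMFactory

EXPECTED = {"openai", "anthropic", "deepseek", "gemini"}

def test_all_four_providers_are_registered():
    registered = set(LLMFactory.registered_providers())
    assert EXPECTED <= registered
```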
Future Benefits
This refactoring enables:
🤖 Generated with Claude Code