A service that analyzes OpenAPI specifications and generates natural language summaries and documentation using AI.
- Upload and parse OpenAPI specifications (JSON/YAML)
- Generate natural language summaries of API endpoints
- Asynchronous processing with job status tracking
- Structured logging and error handling
- Simple web interface for file uploads and results access
- Export summaries in multiple formats (Markdown, HTML, DOCX)
- Ad-hoc user queries
- Ask the tool to test endpoints for you
| Component | Technology | Purpose |
|---|---|---|
| Framework | FastAPI | Modern async web framework with automatic OpenAPI/Swagger docs |
| Task Queue | Celery + Redis | Background processing for API analysis and summary generation |
| Configuration | pydantic-settings | Type-safe configuration management with environment variables |
| Logging | Loguru | Structured logging with customizable formatting |
| Testing | pytest + pytest-asyncio | Async-aware testing with comprehensive fixtures |
| Code Quality | Ruff, MyPy, pre-commit | Linting, type checking, and automated code quality checks |
| AI Integration | OpenAI API | Natural language processing for API analysis |
| Frontend | Tailwind CSS | Minimal web interface for file uploads and results |
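In broad strokes, an uploaded spec is handed off to a Celery job and the client polls for the result. The sketch below is illustrative only (a simplified, single-file version with made-up names, not the project's actual modules):

```python
# Illustrative sketch of the upload -> Celery -> OpenAI flow (names are made up).
from celery import Celery
from fastapi import FastAPI, UploadFile

app = FastAPI()
celery_app = Celery("api_introspection", broker="redis://localhost:6379")

@celery_app.task
def summarize_spec(spec_text: str) -> dict:
    # In the real service this is where the OpenAI API is called to turn
    # the parsed spec into natural language summaries.
    return {"overview": "...", "endpoints": [], "schemas": []}

@app.post("/api/spec/upload")
async def upload_spec(file: UploadFile) -> dict:
    spec_text = (await file.read()).decode()
    job = summarize_spec.delay(spec_text)   # enqueue background work
    return {"job_id": job.id}               # client polls with this ID
```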
```
api_introspection/
├── src/
│   ├── api/
│   │   ├── routes.py        # FastAPI route definitions
│   │   └── models.py        # Request/response Pydantic models
│   ├── core/
│   │   ├── config.py        # Environment and app configuration
│   │   ├── storage.py       # Job data storage management
│   │   └── logging/         # Logging package
│   │       ├── __init__.py  # Package exports
│   │       ├── config.py    # Logging configuration
│   │       ├── core.py      # Core logging functionality
│   │       └── handlers.py  # Custom logging handlers
│   ├── services/
│   │   └── openai.py        # OpenAI API integration
│   └── tasks/
│       └── tasks.py         # Celery task definitions
├── web/
│   └── index.html           # Simple web interface for file uploads
├── tests/
│   ├── test_routes.py       # API endpoint tests
│   └── conftest.py          # pytest fixtures and configuration
├── celery_worker.py         # Celery worker configuration
├── pyproject.toml           # Project dependencies and tools config
└── DEBUGGING.md             # Development and debugging guide
```
The service provides a simple web interface for file uploads and results access:
- Open your browser and navigate to http://localhost:8080
- Use the file upload form to submit your OpenAPI specification
- Wait for the analysis to complete
- Download the results in your preferred format (Markdown or HTML)
For programmatic access, you can use the following API endpoints:
GET /api/health
Response:
```
{
  "status": "healthy"
}
```
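For example, using Python's `requests` (any HTTP client works; the base URL assumes a local dev server on port 8080):

```python
import requests

# Simple liveness probe against a locally running instance.
resp = requests.get("http://localhost:8080/api/health")
resp.raise_for_status()
print(resp.json())  # {'status': 'healthy'}
```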
POST /api/spec/upload
Request:
| Parameter | Type | Required | Description |
|---|---|---|---|
| file | File (multipart/form-data) | Yes | OpenAPI spec file (JSON/YAML) |
Supported Content Types:
- application/json
- text/yaml
- application/x-yaml
- text/plain
- text/x-yaml
Response:
```
{
  "job_id": "string"  // UUID for tracking the analysis job
}
```
Error Responses:
- 400: Invalid file type or no file provided
- 500: Server error during file processing
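A minimal upload call might look like this (again using `requests`; `openapi.yaml` is a placeholder for your own spec file):

```python
import requests

# Submit a spec for analysis and keep the job ID for the follow-up requests.
with open("openapi.yaml", "rb") as f:
    resp = requests.post(
        "http://localhost:8080/api/spec/upload",
        files={"file": ("openapi.yaml", f, "text/yaml")},
    )
resp.raise_for_status()
job_id = resp.json()["job_id"]
print(f"Tracking job {job_id}")
```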
GET /api/spec/{job_id}/summary
Parameters:
| Parameter | Type | Required | Description |
|---|---|---|---|
| job_id | string (path) | Yes | Job ID from upload response |
Response States:
- Processing (202 Accepted):
  ```
  {
    "detail": "Job is still processing"
  }
  ```
- Completed (200 OK):
  ```
  {
    "status": "SUCCESS",
    "result": {
      // Summary content structure
      "endpoints": [...],
      "schemas": [...],
      "overview": "string"
    }
  }
  ```
- Failed (500 Internal Server Error):
  ```
  {
    "detail": "Job failed"
  }
  ```
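Continuing the upload example above, a client can poll until the job leaves the 202 state (a simple sketch; production code would add a timeout):

```python
import time
import requests

# Poll the summary endpoint until the background job finishes.
url = f"http://localhost:8080/api/spec/{job_id}/summary"
while True:
    resp = requests.get(url)
    if resp.status_code == 202:   # still processing
        time.sleep(2)
        continue
    resp.raise_for_status()       # raises on the 500 "Job failed" case
    summary = resp.json()["result"]
    break

print(summary["overview"])
```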
GET /api/spec/{job_id}/export
Parameters:
| Parameter | Type | Required | Description |
|---|---|---|---|
| job_id | string (path) | Yes | Job ID from upload response |
| file_format | string (query) | No | Export format: "md" (default), "html", or "docx" |
Response:
- Markdown (default):
  - Content-Type: text/markdown
  - Filename: api-summary-{job_id}.md
- HTML:
  - Content-Type: text/html
  - Returned as direct HTML content
- DOCX:
  - Content-Type: application/vnd.openxmlformats-officedocument.wordprocessingml.document
  - Filename: api-summary-{job_id}.docx
Error Responses:
- 404: Job not found
- 400: Unsupported file format
- 500: Export generation error
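Once the job has succeeded, the summary from the examples above can be downloaded in any of the supported formats, e.g.:

```python
import requests

# Download the finished summary as Markdown; pass "html" or "docx" for the other formats.
resp = requests.get(
    f"http://localhost:8080/api/spec/{job_id}/export",
    params={"file_format": "md"},
)
resp.raise_for_status()
with open(f"api-summary-{job_id}.md", "wb") as out:
    out.write(resp.content)
```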
- Clone the repository.
- Install dependencies:
  ```
  poetry install
  ```
- Create a `.env` file with the required variables (a configuration sketch follows these steps):
  ```
  OPENAI_API_KEY=your_api_key
  ENV=development
  LOG_LEVEL=DEBUG
  REDIS_URL=redis://localhost:6379
  ```
- Start Redis:
  ```
  docker run -d -p 6379:6379 redis
  ```
- Start the Celery worker:
  ```
  poetry run celery -A celery_worker worker --loglevel=info
  ```
- Start the API server:
  ```
  poetry run uvicorn src.main:app --reload
  ```
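The variables in `.env` are loaded through pydantic-settings. As a rough sketch of what `src/core/config.py` could look like (field names mirror the `.env` keys above; the actual class in the repo may differ):

```python
from pydantic_settings import BaseSettings, SettingsConfigDict

class Settings(BaseSettings):
    """Typed application settings, read from the environment or a .env file."""

    model_config = SettingsConfigDict(env_file=".env")

    openai_api_key: str
    env: str = "development"
    log_level: str = "DEBUG"
    redis_url: str = "redis://localhost:6379"

settings = Settings()  # import this singleton wherever configuration is needed
```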
The project includes a Makefile with several useful commands to help with development:
```
make pc      # Run all code quality checks (ruff, mypy, pre-commit)
make fix     # Run pre-commit hooks to fix code style issues
make run     # Start the FastAPI development server with hot reload
make celery  # Start the Celery worker
make redis   # Start Redis server in daemon mode
make dev     # Start complete development environment (API + Celery + Redis)
make clean   # Stop all development processes (API, Celery, Redis)
```
| Command | Description |
|---|---|
| `make pc` | Run all code quality checks (ruff linting, formatting, mypy type checking, pre-commit hooks) |
| `make fix` | Run pre-commit hooks to automatically fix code style issues |
| `make test` | Run the test suite |
| `make run` | Start the FastAPI development server on port 8080 with hot reload |
| `make celery` | Start the Celery worker for background task processing |
| `make redis` | Start Redis server in daemon mode |
| `make dev` | Start the complete development environment (cleans up existing processes, starts Redis, API, and Celery) |
| `make clean` | Stop all development processes (API server, Celery worker, Redis) |
Run the test suite:
```
poetry run pytest
```
With coverage:
```
poetry run pytest --cov=src
```
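As a rough illustration of what a route test can look like with FastAPI's test client (the real tests and fixtures live in `tests/test_routes.py` and `tests/conftest.py`, and may differ):

```python
from fastapi.testclient import TestClient

from src.main import app  # the same app object that uvicorn serves

client = TestClient(app)

def test_health() -> None:
    # The health endpoint should always report a healthy status.
    resp = client.get("/api/health")
    assert resp.status_code == 200
    assert resp.json() == {"status": "healthy"}
```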
I created this repo for the following purposes:
- Get familiar with LLM tools I haven't used at all or have only used to a limited extent.
- Create a production-grade mini project with robust infrastructure: CI/CD, linting, tests, and documentation. I might later turn it into a template for future projects, extend the functionality, etc.