Odds Streamer: Real-Time Data Pipeline POC

A fully containerised reference implementation of a real-time sports betting odds pipeline. Odds Streamer simulates live market updates, streams them through Kafka/Redpanda, enriches the data, and exposes both current and historical odds via a low-latency API.

Highlights

  • ⚡ Low-latency lookups powered by Redis for hot odds data
  • 🧮 Time-series analytics backed by TimescaleDB hypertables
  • 🧵 Kafka/Redpanda streaming with resilient exponential backoff producers
  • 🐳 Docker-first local development that mirrors production deployment
  • 📦 TypeScript monorepo with npm workspaces and shared domain types
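
As a sketch of the shared domain contracts, the snippet below shows what an odds snapshot type exported from packages/shared might look like; the type name, field names, and helper are illustrative assumptions rather than the actual exports.

// packages/shared (hypothetical): shape of a single odds snapshot flowing
// through the pipeline. Field names here are illustrative assumptions.
export interface OddsSnapshot {
  marketId: string;   // e.g. "evt_1234_pinnacle"
  bookmaker: string;  // e.g. "Pinnacle"
  homeOdds: number;
  awayOdds: number;
  recordedAt: string; // ISO-8601; persisted to TimescaleDB as recorded_at
}

// Hypothetical helper: the kind of delta the processor computes and the
// /markets/movers endpoint ranks markets by.
export function oddsDelta(prev: OddsSnapshot, next: OddsSnapshot): number {
  return Math.abs(next.homeOdds - prev.homeOdds) + Math.abs(next.awayOdds - prev.awayOdds);
}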

Architecture

┌────────┐    ┌──────────┐    ┌────────────┐    ┌───────────────┐
│ Mock   │ => │ Ingester │ => │  Kafka /   │ => │  Processor    │
│  API   │    │ Service  │    │ Redpanda   │    │   Service     │
└────────┘    └──────────┘    └────────────┘    ├───────────────┤
                                                │ Redis (hot)   │
                                                │ TimescaleDB   │
                                                │ (historical)  │
                                                └───────┬───────┘
                                                        │
                                               ┌────────▼────────┐
                                               │   Alert API     │
                                               │ (Current + Hist)│
                                               └─────────────────┘
Component           Role
services/mock-api   Generates realistic odds snapshots and simulates upstream rate limits
services/ingester   Polls the mock API, applies exponential backoff, publishes to Kafka
services/processor  Consumes the stream, computes deltas, updates Redis and TimescaleDB
services/alert-api  REST API serving current markets from Redis and history from Timescale
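
To make the ingester's poll-and-publish loop concrete, here is a minimal sketch assuming kafkajs, a local Redpanda broker on localhost:9092, an odds-updates topic, and a hypothetical pollOddsFeed() helper; the real service may differ in all of these details.

import { Kafka } from 'kafkajs';

const kafka = new Kafka({ clientId: 'ingester', brokers: ['localhost:9092'] });
const producer = kafka.producer();

// Hypothetical helper: call the mock API and return the latest snapshots.
// Throws when the upstream rate-limits or is otherwise unavailable.
async function pollOddsFeed(): Promise<Array<{ marketId: string }>> {
  const res = await fetch('http://localhost:4000/odds'); // assumed mock API address
  if (!res.ok) throw new Error(`Upstream responded ${res.status}`);
  return res.json();
}

async function run(): Promise<void> {
  await producer.connect();
  let backoffMs = 1_000; // reset to 1s after each successful poll

  while (true) {
    try {
      const snapshots = await pollOddsFeed();
      await producer.send({
        topic: 'odds-updates', // assumed topic name
        messages: snapshots.map((s) => ({ key: s.marketId, value: JSON.stringify(s) })),
      });
      backoffMs = 1_000;
    } catch {
      backoffMs = Math.min(backoffMs * 2, 60_000); // exponential backoff, capped at 60s
    }
    await new Promise((resolve) => setTimeout(resolve, backoffMs));
  }
}

run().catch(console.error);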

Repository Layout

.
├── services/
│   ├── alert-api/      # Express API for consumers
│   ├── ingester/       # Kafka producer polling the mock upstream
│   ├── mock-api/       # Synthetic odds feed with rate limiting behaviour
│   └── processor/      # Kafka consumer with Redis + TimescaleDB writers
├── packages/
│   └── shared/         # Typed contracts and helpers shared across services
├── docker-compose.yml  # Full local stack (Redpanda, Redis, TimescaleDB, ClickHouse + services)
├── scripts/            # Database bootstrap scripts
└── .github/            # Issue / PR templates and community health files
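
For orientation, a minimal sketch of the processor's consume-and-write path, assuming kafkajs, node-redis, and pg; the topic, Redis key format, and odds_history columns other than recorded_at are assumptions, and delta computation is omitted for brevity.

import { Kafka } from 'kafkajs';
import { createClient } from 'redis';
import { Pool } from 'pg';

const kafka = new Kafka({ clientId: 'processor', brokers: ['localhost:9092'] });
const redis = createClient({ url: 'redis://localhost:6379' });
const pool = new Pool({
  connectionString: 'postgresql://postgres:postgres@localhost:5432/odds_streamer',
});

async function run(): Promise<void> {
  await redis.connect();
  const consumer = kafka.consumer({ groupId: 'processor' });
  await consumer.connect();
  await consumer.subscribe({ topic: 'odds-updates' }); // assumed topic name

  await consumer.run({
    eachMessage: async ({ message }) => {
      if (!message.value) return;
      const snapshot = JSON.parse(message.value.toString());

      // Hot path: keep the latest snapshot per market in Redis for the Alert API.
      await redis.set(`market:${snapshot.marketId}`, JSON.stringify(snapshot));

      // Historical path: append the snapshot to the odds_history hypertable.
      await pool.query(
        'INSERT INTO odds_history (market_id, payload, recorded_at) VALUES ($1, $2, $3)',
        [snapshot.marketId, snapshot, snapshot.recordedAt ?? new Date().toISOString()],
      );
    },
  });
}

run().catch(console.error);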

Getting Started

Prerequisites

  • Node.js 18.18+
  • npm 8.19+
  • Docker Engine & Docker Compose

Install dependencies

npm install

Build TypeScript

npm run build

Run the stack

docker-compose up --build

Give TimescaleDB ~30 seconds to finish booting. Once the logs settle, the Alert API is available at http://localhost:3000 and GET /health reports service and dependency status.

To stop and clean up:

docker-compose down -v

Service Commands

Run services individually in watch mode:

npm run dev:mock-api
npm run dev:ingester
npm run dev:processor
npm run dev:alert-api

Run linting and tests across the workspace:

npm run lint
npm test

Format code with Prettier:

npm run format

API Reference (Alert API)

Method  Endpoint                      Description
GET     /health                       Service and dependency status
GET     /markets/current/:marketId    Current odds snapshot cached in Redis
GET     /markets/history/:marketId    Historical odds rows from TimescaleDB
GET     /markets/movers?minutes=15    Top movers by average delta in the lookback window

Example:

curl "http://localhost:3000/markets/current/evt_1234_pinnacle?bookmaker=Pinnacle"

TimescaleDB Bootstrapping

The processor service runs schema migrations on startup, creating:

  • odds_history hypertable (partitioned by recorded_at)
  • TimescaleDB extension (idempotent)

You can also apply the SQL manually:

psql postgresql://postgres:postgres@localhost:5432/odds_streamer \
  -f scripts/init-timescaledb.sql
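
For reference, a sketch of what the startup migration boils down to, using the pg client; the odds_history columns other than recorded_at are assumptions and may not match scripts/init-timescaledb.sql exactly.

import { Client } from 'pg';

// Illustrative bootstrap: enable the extension and create the hypertable,
// both idempotently, so repeated startups are safe.
async function migrate(): Promise<void> {
  const client = new Client({
    connectionString: 'postgresql://postgres:postgres@localhost:5432/odds_streamer',
  });
  await client.connect();

  await client.query('CREATE EXTENSION IF NOT EXISTS timescaledb;');
  await client.query(`
    CREATE TABLE IF NOT EXISTS odds_history (
      market_id   TEXT        NOT NULL,
      payload     JSONB       NOT NULL,
      recorded_at TIMESTAMPTZ NOT NULL
    );
  `);
  // Partition by recorded_at; if_not_exists keeps this idempotent too.
  await client.query(
    "SELECT create_hypertable('odds_history', 'recorded_at', if_not_exists => TRUE);",
  );

  await client.end();
}

migrate().catch(console.error);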

Contributing

We welcome contributions! Please review the contributing guidelines and the code of conduct before getting started.

Open an issue or discussion if you have questions or ideas. PRs should follow Conventional Commits and include tests or docs where appropriate.

Roadmap

  • Streaming aggregates into ClickHouse for analytics use-cases
  • Metrics and tracing via OpenTelemetry
  • CI pipeline (lint/test/build) with GitHub Actions
  • Frontend dashboard for visualising market movement

Built with ❤️ to demonstrate production-grade real-time data engineering practices.
