Releases: unnoq/orpc

v1.11.3

17 Nov 08:46

Cloudflare Worker Ratelimit Adapter docs

An adapter for Cloudflare Workers' Rate Limiting binding.

import { CloudflareRatelimiter } from '@orpc/experimental-ratelimit/cloudflare-ratelimit'

export default {
  async fetch(request, env) {
    // Wrap the Workers rate limiting binding in the shared oRPC Ratelimiter interface
    const limiter = new CloudflareRatelimiter(env.MY_RATE_LIMITER)

    return new Response(`Hello World!`)
  }
}

v1.11.2

12 Nov 07:33

   🚀 Features

  • standard-server: Send initial comment in event stream to flush response headers immediately  -  by @unnoq in #1204 (91ac3)
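
A minimal sketch of the idea behind the standard-server change (not oRPC's internals): SSE clients ignore any line starting with `:`, so emitting a comment-only chunk first gives the server something to write, which flushes response headers immediately instead of waiting for the first real event. The `encodeSseChunks` helper below is hypothetical.

```javascript
// Hypothetical SSE encoder illustrating the "initial comment" trick.
function encodeSseChunks(events) {
  const chunks = [': connected\n\n'] // comment-only chunk, ignored by clients
  for (const event of events) {
    chunks.push(`data: ${JSON.stringify(event)}\n\n`)
  }
  return chunks
}

const chunks = encodeSseChunks([{ hello: 'world' }])
// chunks[0] is the header-flushing comment; chunks[1] is the first real event
```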

   🐞 Bug Fixes

  • tanstack-query: Set stream query to success immediately after stream resolves  -  by @unnoq in #1202 (bbe55)

Tip

If you find oRPC valuable and would like to support its development, you can do so here.

v1.11.1

10 Nov 02:22

   🚀 Features

  • pino:
    • Enhance getLogger util type  -  by @unnoq (8606f)
    • Update logger structure to use 'rpc' instead of 'orpc'  -  by @unnoq (734a8)

v1.11.0

09 Nov 12:04

Pino Logging Integration docs

Easily add structured logging, request tracking, and error monitoring to your oRPC app, powered by Pino.

const logger = pino()

const handler = new RPCHandler(router, {
  plugins: [
    new LoggingHandlerPlugin({
      logger, // Custom logger instance
      generateId: ({ request }) => crypto.randomUUID(), // Custom ID generator
      logRequestResponse: true, // Log request start/end (disabled by default)
      logRequestAbort: true, // Log when requests are aborted (disabled by default)
    }),
  ],
})

Ratelimit Helpers docs

The Rate Limit package provides flexible rate limiting for oRPC with multiple storage backend support. It includes adapters for in-memory, Redis, and Upstash, along with middleware and plugin helpers for seamless integration.

Ratelimiter

const ratelimiter = new MemoryRatelimiter({
  maxRequests: 10,
  window: 60000,
})

Manual usage

const result = await ratelimiter.limit('user:123')

if (!result.success) {
  throw new ORPCError('TOO_MANY_REQUESTS', {
    data: {
      limit: result.limit,
      remaining: result.remaining,
      reset: result.reset,
    },
  })
}
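
To make the `maxRequests`/`window` semantics concrete, here is a minimal fixed-window counter producing the same result shape (`{ success, limit, remaining, reset }`). It is only an illustration: the real MemoryRatelimiter may use a different algorithm, and its `limit()` is async (this sketch is synchronous for brevity).

```javascript
// Minimal fixed-window sketch of the ratelimiter semantics above.
// Not the real MemoryRatelimiter: synchronous, and the algorithm is assumed.
class SimpleFixedWindowLimiter {
  constructor({ maxRequests, window }) {
    this.maxRequests = maxRequests
    this.window = window // window length in milliseconds
    this.buckets = new Map() // key -> { count, reset }
  }

  limit(key, now = Date.now()) {
    let bucket = this.buckets.get(key)
    if (!bucket || now >= bucket.reset) {
      bucket = { count: 0, reset: now + this.window } // start a fresh window
      this.buckets.set(key, bucket)
    }
    bucket.count += 1
    return {
      success: bucket.count <= this.maxRequests,
      limit: this.maxRequests,
      remaining: Math.max(0, this.maxRequests - bucket.count),
      reset: bucket.reset, // epoch ms when the window resets
    }
  }
}

const limiter = new SimpleFixedWindowLimiter({ maxRequests: 2, window: 60_000 })
limiter.limit('user:123') // { success: true, remaining: 1, ... }
limiter.limit('user:123') // { success: true, remaining: 0, ... }
limiter.limit('user:123') // { success: false, remaining: 0, ... }
```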

Built-in middleware

const loginProcedure = os
  .$context<{ ratelimiter: Ratelimiter }>()
  .input(z.object({ email: z.email() }))
  .use(
    createRatelimitMiddleware({
      limiter: ({ context }) => context.ratelimiter,
      key: ({ context }, input) => `login:${input.email}`,
    }),
  )
  .handler(({ input }) => {
    return { success: true }
  })


const result = await call(
  loginProcedure,
  { email: 'user@example.com' },
  { context: { ratelimiter } }
)

Response Header Plugin

import { RatelimitHandlerPlugin } from '@orpc/experimental-ratelimit'

const handler = new RPCHandler(router, {
  plugins: [
    new RatelimitHandlerPlugin(),
  ],
})

Retry After Plugin docs

The Retry After Plugin automatically retries requests based on server Retry-After headers. This is particularly useful for handling rate limiting and temporary server unavailability.

import { RetryAfterPlugin } from '@orpc/client/plugins'

const link = new RPCLink({
  url: 'http://localhost:3000/rpc',
  plugins: [
    new RetryAfterPlugin({
      condition: (response, options) => {
        // Override condition to determine if a request should be retried
        return response.status === 429 || response.status === 503
      },
      maxAttempts: 5, // Maximum retry attempts
      timeout: 5 * 60 * 1000, // Maximum time to spend retrying (ms)
    }),
  ],
})
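
For reference, RFC 9110 allows Retry-After to carry either delay-seconds or an HTTP-date. A hedged sketch of resolving the header to a delay in milliseconds (the plugin's actual parsing may differ):

```javascript
// Resolve a Retry-After header value to a delay in milliseconds.
// Illustrative only; not the plugin's actual implementation.
function retryAfterToMs(headerValue, now = Date.now()) {
  if (headerValue == null) return null
  const seconds = Number(headerValue)
  if (headerValue.trim() !== '' && Number.isFinite(seconds)) {
    return Math.max(0, seconds * 1000) // delay-seconds form, e.g. "120"
  }
  const date = Date.parse(headerValue) // HTTP-date form, e.g. "Wed, 21 Oct 2015 07:28:00 GMT"
  return Number.isNaN(date) ? null : Math.max(0, date - now)
}

retryAfterToMs('120') // 120000
```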

Expanded union/intersection support in the OpenAPI generator

You can now use union/intersection types to define params, query, headers, and more.

const procedure = os
  .route({ path: '/{type}' })
  .input(z.discriminatedUnion('type', [
    z.object({
      type: z.literal('a'),
      foo: z.number().int().positive(),
    }),
    z.object({
      type: z.literal('b'),
      foo: z.number().int().negative(),
    }),
  ]))

v1.10.4

05 Nov 11:03

   🚀 Features

  • ai-sdk: Disable validation at oRPC level in createTool to avoid double validation  -  by @unnoq in #1166 (3bec7)

v1.10.3

02 Nov 10:52

AI SDK implementTool & createTool helpers

Implement a contract, or convert an existing procedure, into an AI SDK tool.

// Implement a contract as an AI SDK tool
const getWeatherTool = implementTool(getWeatherContract, {
  execute: async ({ location }) => ({
    location,
    temperature: 72 + Math.floor(Math.random() * 21) - 10,
  }),
})

// Or convert an existing procedure into a tool
const getWeatherTool = createTool(getWeatherProcedure, {
  context: {}, // provide initial context if needed
})

v1.10.2

26 Oct 09:47

OpenAPI - custom error format docs

By default, OpenAPIHandler, OpenAPIGenerator, and OpenAPILink share the same error response format. You can customize one, some, or all of them based on your requirements.

const handler = new OpenAPIHandler(router, {
  customErrorResponseBodyEncoder(error) {
    return error.toJSON()
  },
})

Native Fastify adapter docs

Previously, oRPC in Fastify used the Node adapter, which didn't integrate well with Fastify's ecosystem (e.g., cookies, helpers, middleware). This native adapter supports Fastify's request/reply APIs directly, enabling full access to Fastify features within oRPC.

import Fastify from 'fastify'
import { RPCHandler } from '@orpc/server/fastify'
import { onError } from '@orpc/server'

const handler = new RPCHandler(router, {
  interceptors: [
    onError((error) => {
      console.error(error)
    })
  ]
})

const fastify = Fastify()

fastify.addContentTypeParser('*', (request, payload, done) => {
  // Fully utilize oRPC features by allowing any content type
  // And let oRPC parse the body manually by passing `undefined`
  done(null, undefined)
})

fastify.all('/rpc/*', async (req, reply) => {
  const { matched } = await handler.handle(req, reply, {
    prefix: '/rpc',
    context: {} // Provide initial context if needed
  })

  if (!matched) {
    reply.status(404).send('Not found')
  }
})

fastify.listen({ port: 3000 }).then(() => console.log('Server running on http://localhost:3000'))


v1.10.1

24 Oct 09:14

Message port transfer docs

By default, oRPC serializes request/response messages to string/binary data before sending them over a message port. If needed, you can define the transfer option to utilize the full power of the MessagePort postMessage() method, such as transferring ownership of objects to the other side or supporting unserializable objects like OffscreenCanvas.

const handler = new RPCHandler(router, {
  experimental_transfer: (message, port) => {
    const transfer = deepFindTransferableObjects(message) // implement your own logic
    return transfer.length ? transfer : null // only enable when needed
  }
})
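
The snippet above leaves `deepFindTransferableObjects` to you. One possible implementation walks the message recursively and collects transferable values; the set of types checked below (ArrayBuffers, typed-array buffers, MessagePorts) is an assumption, so extend it for your environment:

```javascript
// Possible deepFindTransferableObjects helper: recursively collect
// transferable objects from a message. The type checks here are an
// assumption; add OffscreenCanvas etc. as your environment requires.
function deepFindTransferableObjects(value, found = [], seen = new Set()) {
  if (value === null || typeof value !== 'object' || seen.has(value)) {
    return found // primitives, null, and cycles contribute nothing
  }
  seen.add(value)
  if (
    value instanceof ArrayBuffer
    || (typeof MessagePort !== 'undefined' && value instanceof MessagePort)
  ) {
    found.push(value)
    return found
  }
  if (ArrayBuffer.isView(value)) {
    found.push(value.buffer) // transfer the underlying buffer of typed arrays
    return found
  }
  for (const nested of Object.values(value)) { // arrays and plain objects
    deepFindTransferableObjects(nested, found, seen)
  }
  return found
}
```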


v1.10.0

20 Oct 11:50

Publisher helper docs

The Publisher is a helper that enables you to listen to and publish events to subscribers. Combined with the Event Iterator, it allows you to build streaming responses, real-time updates, and server-sent events with minimal requirements.

const publisher = new MemoryPublisher<{
  'something-updated': {
    id: string
  }
}>()

const live = os
  .handler(async function* ({ input, signal, lastEventId }) {
    const iterator = publisher.subscribe('something-updated', { signal, lastEventId })
    for await (const payload of iterator) {
      // Handle payload here or yield directly to client
      yield payload
    }
  })

const publish = os
  .input(z.object({ id: z.string() }))
  .handler(async ({ input }) => {
    await publisher.publish('something-updated', { id: input.id })
  })

Available Adapters

  • MemoryPublisher: A simple in-memory publisher
  • IORedisPublisher: Adapter for ioredis
  • UpstashRedisPublisher: Adapter for Upstash Redis

eventIteratorToUnproxiedDataStream util

Prefer using eventIteratorToUnproxiedDataStream over eventIteratorToStream when integrating oRPC with the AI SDK. The AI SDK uses structuredClone internally, which doesn't support proxied data. Since oRPC may proxy events to attach metadata, you should unproxy them before passing them to the AI SDK.

const { messages, sendMessage, status } = useChat({
  transport: {
    async sendMessages(options) {
      return eventIteratorToUnproxiedDataStream(await client.chat({
        chatId: options.chatId,
        messages: options.messages,
      }, { signal: options.abortSignal }))
    },
    reconnectToStream(options) {
      throw new Error('Unsupported')
    },
  },
})
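
To see why the unproxied variant matters: structuredClone rejects Proxy instances outright, even when the underlying target is perfectly cloneable.

```javascript
// structuredClone (used internally by the AI SDK) throws a DataCloneError
// on exotic objects such as Proxy instances, regardless of their target.
const plain = { id: '1' }
const proxied = new Proxy(plain, {})

let cloneFailed = false
try {
  structuredClone(proxied) // DataCloneError: proxies are not cloneable
} catch {
  cloneFailed = true
}

// Once the proxy wrapper is removed, the plain target clones fine.
const cloned = structuredClone(plain)
```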

   🚀 Features

  • client, server: Add eventIteratorToUnproxiedDataStream util  -  by @unnoq in #1110 (e91d2)
  • publisher: Memory, ioredis, upstash redis publishers  -  by @unnoq and Joonseo Lee in #1094 (22ef1)


v1.9.4

14 Oct 10:33
