Ordinality is a framework-agnostic migration tool for any JavaScript environment. It provides a clean API for running and rolling back tasks.
- Database agnostic
- Works with any JavaScript environment - Node, Browser, WebWorker, etc.
- Supports multiple storages for migration data
- Programmatic API for migrations
The following minimal example uses the PGlite database and persists the migration data in the database itself.

```ts
import { MigrationRunner, getSQLCompilerPreset, SQLStorage } from 'ordinality';
import { PGlite } from '@electric-sql/pglite';

const db = new PGlite();

const migrations = new MigrationRunner({
	context: { db },
	// You may use one of the included storage implementations, or implement custom storage.
	// Here we use SQLStorage to persist migration data in any SQL-like storage
	storage: new SQLStorage({
		// You may use an SQL query compiler for your DB,
		// selected by preset name: sqlite, mysql, postgres
		compiler: getSQLCompilerPreset('postgres'),
		// SQLStorage requires a `query` method implementation, to run queries on the DB.
		// This method receives a query and bindings, and returns an array of rows.
		async query(query, bindings) {
			const { rows } = await db.query(query, bindings);
			return rows as Record<string, any>[];
		},
	}),
	migrations: [
		{
			uid: 'add_notes_table',
			// Context is passed to every migration as the first argument
			apply: async ({ context: { db } }) =>
				db.exec(`CREATE TABLE "notes" (
					"id" UUID PRIMARY KEY DEFAULT gen_random_uuid(),
					"created_at" TIMESTAMP WITH TIME ZONE DEFAULT now(),
					"updated_at" TIMESTAMP WITH TIME ZONE DEFAULT now(),
					"title" TEXT NOT NULL,
					"text" TEXT NOT NULL,
					"visible" BOOLEAN DEFAULT true
				)`),
			// Optionally, a migration object may provide a method to revert the migration
			revert: async ({ context: { db } }) => db.exec(`DROP TABLE "notes"`),
		},
	],
});

// Checks migrations and runs them if they are not already applied.
// To keep track of the executed migrations, a table called `schema_migrations`
// will be automatically created (if it doesn't exist already) and parsed.
await migrations.up();
```

Ordinality is built to run in any environment, including Node, Browser, WebWorker, etc.
Examples in the docs import from `ordinality`, but in production code you should use specific entry points like `ordinality/MigrationRunner`, `ordinality/storage/SQLStorage`, etc. This ensures you import only the code you actually use in the project, which also decreases bundle size.

If you import from `ordinality` directly, you may encounter errors like `fs module not found` when your target is the browser and your bundler is not configured for aggressive tree shaking.
Install the package:

```sh
npm install ordinality
```
The migration runner is a class that manages migration state: it applies and reverts migrations and lists the names of executed and pending migrations.

```ts
const migrations = new MigrationRunner({ /* options */ });
```

You can get a list of pending (i.e. not yet executed) migrations with the `pending()` method:
```ts
const migrationNames = await migrations.pending();
// returns an array of all pending migrations
```

You can get a list of already executed migrations with the `executed()` method:
```ts
const migrationNames = await migrations.executed();
// returns an array of all already executed migrations
```

The `up` method can be used to execute all pending migrations.
```ts
const executedMigrations = await migrations.up();
// returns an array of all executed migrations
```

It is also possible to pass the name of a migration in order to run the migrations from the current state up to the given migration name (inclusive).
```ts
await migrations.up({ to: '20141101203500-task' });
```

To limit the number of migrations that are run, the `step` parameter can be used:
```ts
// This will run the next two migrations
await migrations.up({ step: 2 });
```

Note that the order of migrations is critical. If any mismatch is detected between the order of applied migrations and the provided migrations or their ids, all methods will throw an error and take no action, to prevent corruption of your data.
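To make the semantics above concrete, here is a sketch of how such selection and validation can work. This is an illustration of the behavior described in the docs, not the library's actual internals; `selectPending` is a hypothetical helper.

```ts
type UpOptions = { to?: string; step?: number };

// Sketch: pick which migrations `up` would run, given the applied log
// and the full defined list
function selectPending(
	applied: string[],
	defined: string[],
	options: UpOptions = {},
): string[] {
	// The applied log must be a prefix of the defined list, in the same order;
	// any mismatch aborts before touching anything
	applied.forEach((uid, index) => {
		if (defined[index] !== uid) {
			throw new Error(
				`Order mismatch at position ${index}: applied "${uid}", defined "${defined[index]}"`,
			);
		}
	});

	let pending = defined.slice(applied.length);

	// `to` runs migrations up to the given name, inclusive
	if (options.to !== undefined) {
		const stop = pending.indexOf(options.to);
		if (stop === -1) throw new Error(`Unknown migration "${options.to}"`);
		pending = pending.slice(0, stop + 1);
	}

	// `step` limits how many migrations run
	if (options.step !== undefined) {
		pending = pending.slice(0, options.step);
	}

	return pending;
}
```

The prefix check is what makes the runner refuse to act when histories diverge, rather than silently reordering anything.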
The `down` method can be used to revert the last executed migration.

```ts
const migration = await migrations.down();
// reverts the last migration and returns it
```

To revert more than one migration, you can use the `step` parameter:
```ts
// This will revert the last two migrations
await migrations.down({ step: 2 });
```

It is also possible to pass the name of a migration down to which (exclusive) migrations should be reverted. This allows reverting multiple migrations at once.
```ts
const revertedMigrations = await migrations.down({ to: '20141031080000-task' });
// returns an array of all reverted migrations up to `20141031080000-task`, but not including it
```

To revert all migrations, you can pass `0` as the `to` parameter:

```ts
await migrations.down({ to: 0 });
```

A migration is just an object that defines a unique migration name and optional meta information, and implements a method to apply the migration and, optionally, one to revert it.
The formal definition is:

```ts
export type MigrationMeta<T, Context> = ((context: Context) => Promise<T>) | T;

export type MigrationContext<Context, Meta> = {
	context: Context;
	meta: Meta;
};

export type Migration<Context = void, Meta = void> = {
	readonly uid: string;

	/**
	 * Method to apply migration
	 */
	apply(context: MigrationContext<Context, Meta>): Promise<void>;

	/**
	 * Optional method to revert migration
	 */
	revert?(context: MigrationContext<Context, Meta>): Promise<void>;
} & (void extends Meta
	? {
			/**
			 * Migration meta
			 */
			meta?: undefined;
	  }
	: {
			/**
			 * Migration meta
			 */
			meta: MigrationMeta<Meta, Context>;
	  });
```

You may pass migrations manually, as in the examples above:
```ts
const migrations = new MigrationRunner({
	/* other options... */
	migrations: [
		{
			uid: 'add_notes_table',
			// Context is passed to every migration as the first argument
			apply: async ({ context: { db } }) =>
				db.exec(`CREATE TABLE "notes" (
					"id" UUID PRIMARY KEY DEFAULT gen_random_uuid(),
					"created_at" TIMESTAMP WITH TIME ZONE DEFAULT now(),
					"updated_at" TIMESTAMP WITH TIME ZONE DEFAULT now(),
					"title" TEXT NOT NULL,
					"text" TEXT NOT NULL,
					"visible" BOOLEAN DEFAULT true
				)`),
			// Optionally, a migration object may provide a method to revert the migration
			revert: async ({ context: { db } }) => db.exec(`DROP TABLE "notes"`),
		},
	],
});
```

Or generate migrations in a way that is idiomatic for your project:
```ts
import { Migration } from 'ordinality';
import { PGlite } from '@electric-sql/pglite';

type DBMigration = Migration<{
	db: PGlite;
}>;

/**
 * Convert an import of a .sql file to a migration that runs the SQL queries in a transaction
 *
 * The transaction is automatically committed on success and rolled back on error
 */
export async function convertSQLToMigrationObject(
	uid: string,
	module: Promise<typeof import('*.sql')>,
): Promise<DBMigration> {
	const { default: sql } = await module;

	return {
		uid,
		apply: async ({ context: { db } }) =>
			db.transaction(async (tx) => {
				await tx.exec(sql);
			}),
	};
}

export const getMigrationsList = async (): Promise<DBMigration[]> =>
	Promise.all([
		// This code implies your bundler is configured to return the content of `.sql` files as a string
		convertSQLToMigrationObject('init_db', import('./sql/init_db.sql')),
		convertSQLToMigrationObject('add_notes_table', import('./sql/add_notes_table.sql')),
		convertSQLToMigrationObject('add_workspaces_table', import('./sql/add_workspaces_table.sql')),
	]);
```
```ts
const migrations = new MigrationRunner({
	/* other options... */
	migrations: await getMigrationsList(),
});
```

How migrations are fetched depends heavily on project design: your code may fetch SQL files or import JavaScript modules to build the migrations array. This package intentionally provides no helpers for that, to avoid shipping dependencies that most users would not need.
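One piece of glue you might write yourself concerns the `meta` field of the `Migration` type above: per `MigrationMeta`, it may be either a plain value or an async factory that receives the context. A hypothetical resolver (not part of the package) is a few lines:

```ts
// Mirrors the MigrationMeta type from the formal definition above
type MigrationMeta<T, Context> = ((context: Context) => Promise<T>) | T;

// Hypothetical helper: resolve a meta that may be a value or an async factory
async function resolveMeta<T, Context>(
	meta: MigrationMeta<T, Context>,
	context: Context,
): Promise<T> {
	// A function-typed meta is treated as a factory and invoked with the context
	return typeof meta === 'function'
		? (meta as (context: Context) => Promise<T>)(context)
		: meta;
}
```

Note the caveat of this encoding: if `T` itself were a function type, the `typeof` check could not tell a meta value from a factory.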
The purpose of migration storage is to keep a log of executed migrations and their order.

Several storage implementations are included.
`InMemoryStorage` keeps migration data in RAM. It is intended for temporary environments and tests.

```ts
import { MigrationRunner, InMemoryStorage } from 'ordinality';

const migrations = new MigrationRunner({
	/* other options... */
	storage: new InMemoryStorage(),
});
```

`MigrationFileStorage` keeps data in files, using the Node `fs` API.
```ts
import { MigrationRunner, MigrationFileStorage } from 'ordinality';

const migrations = new MigrationRunner({
	/* other options... */
	storage: new MigrationFileStorage('./migrations.json'),
});
```

By default, the state is encoded as a JSON string. If you want to use another format (YAML, XML, etc.), or want to transform or encrypt the state file, you may pass the `codec` parameter:
```ts
import { MigrationRunner, MigrationFileStorage } from 'ordinality';

const migrations = new MigrationRunner({
	/* other options... */
	storage: new MigrationFileStorage('./migrations.data', {
		codec: {
			encode(data) {
				return encryptObjectToBuffer(data);
			},
			decode(buffer) {
				return decryptBufferToObject(buffer);
			},
		},
	}),
});
```

`SQLStorage` keeps migration data in any SQL-like storage.

This storage requires you to pass:

- `compiler`, to build the SQL query and bindings
- `query`, an adapter to query your database

Example:
```ts
import { MigrationRunner, SQLStorage, getSQLCompilerPreset } from 'ordinality';
import { PGlite } from '@electric-sql/pglite';

// We use PGlite here, but you may use any database
const db = new PGlite();

const migrations = new MigrationRunner({
	/* other options... */
	storage: new SQLStorage({
		// You may use an SQL query compiler for your DB,
		// selected by preset name: sqlite, mysql, postgres
		compiler: getSQLCompilerPreset('postgres'),
		// SQLStorage requires a `query` method implementation, to run queries on the DB.
		// This method receives a query and bindings, and returns an array of rows.
		async query(query, bindings) {
			const { rows } = await db.query(query, bindings);
			return rows as Record<string, any>[];
		},
	}),
});
```

You may use the `getSQLCompilerPreset` utility to build a compiler from a preset for your database.
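The main thing such presets vary is the bind-parameter placeholder style of the dialect. The sketch below is an illustration of that difference only, not the library's API; `placeholderStyles` and `placeholders` are hypothetical names:

```ts
// Each SQL dialect writes bind-parameter placeholders differently
const placeholderStyles: Record<string, (valueIndex: number) => string> = {
	postgres: (valueIndex) => '$' + (valueIndex + 1), // $1, $2, ...
	sqlite: () => '?', // ?, ?, ...
	mysql: () => '?',
};

// Build the placeholder list for a parameterized query with `count` values
function placeholders(count: number, style: (valueIndex: number) => string): string {
	return Array.from({ length: count }, (_, index) => style(index)).join(', ');
}
```

For example, a two-value `INSERT` would bind `($1, $2)` under the postgres style but `(?, ?)` under sqlite or mysql.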
If you need to tune an SQL compiler for non-standard syntax, you may create one yourself (see the docs for `nano-queries`):
```ts
import { MigrationRunner, SQLStorage, BasicSQLTypes } from 'ordinality';
import { SQLCompiler } from 'nano-queries/compilers/SQLCompiler';
import { PGlite } from '@electric-sql/pglite';

// We use PGlite here, but you may use any database
const db = new PGlite();

const migrations = new MigrationRunner({
	/* other options... */
	storage: new SQLStorage({
		compiler: new SQLCompiler<BasicSQLTypes>({
			getPlaceholder(valueIndex) {
				// Placeholder compiler for Postgres queries
				return '$' + (valueIndex + 1);
			},
		}),
		// SQLStorage requires a `query` method implementation, to run queries on the DB.
		// This method receives a query and bindings, and returns an array of rows.
		async query(query, bindings) {
			const { rows } = await db.query(query, bindings);
			return rows as Record<string, any>[];
		},
	}),
});
```

You may also implement custom storage for your app. Below is an example of storage backed by PGlite:
```ts
import { MigrationContext, MigrationStorage } from 'ordinality';
import { PGlite } from '@electric-sql/pglite';

/**
 * Custom storage that persists migration names in a Postgres DB
 */
export class PostgresMigrationsStorage<C extends MigrationContext<any, any>>
	implements MigrationStorage<C>
{
	constructor(
		private readonly db: PGlite,
		private readonly tableName = 'schema_migrations',
	) {}

	async log(uid: string, _context: C): Promise<void> {
		await this.ensureTable();
		await this.db.query(`INSERT INTO ${this.tableName} (name) VALUES ($1)`, [uid]);
	}

	async unlog(uid: string, _context: C): Promise<void> {
		await this.ensureTable();
		await this.db.query(`DELETE FROM ${this.tableName} WHERE name=$1`, [uid]);
	}

	async list(): Promise<string[]> {
		await this.ensureTable();

		const { rows } = await this.db.query(
			`SELECT name FROM ${this.tableName} ORDER BY executed_at ASC`,
		);

		return rows.map((row: any) => row.name);
	}

	// Ensure the migrations table exists
	async ensureTable() {
		await this.db.exec(
			`CREATE TABLE IF NOT EXISTS ${this.tableName} (
				id SERIAL PRIMARY KEY,
				name TEXT NOT NULL UNIQUE,
				executed_at TIMESTAMP WITH TIME ZONE DEFAULT now()
			);`,
		);
	}
}
```

Join the development! Any contributions are welcome; check the GitHub repo to file issues or send pull requests.
Do you like it? Spread the word about the project.