Implementing a Migration Framework in Node.js

By Manuel Sousa (@mlrcbsousa)
Implementing a migration framework in a Node.js backend can be essential, particularly when the backend doesn't already include a migration system from an existing web framework.

This guide demonstrates how to set up a migration framework in Node.js, even for NoSQL databases, using `node-migrate`, an agnostic library that helps ensure data consistency and handle evolving data structures. The framework supports migrations, rollbacks, and schema versioning, offering flexibility for environments that use NoSQL data stores.
Building a Migration Framework for Node.js
Establishing a migration framework for a Node.js backend serves multiple purposes. Primarily, it ensures data consistency across different environments, allowing structured migration, rollback, and versioning for database schemas and data.
Even in a NoSQL context, such a framework provides a reliable approach for managing changes in data structure over time. This is critical for maintaining a seamless user experience and addressing any structural adjustments that might otherwise lead to unexpected data issues.
Setting Up Migration Commands
To start, add the migration commands to your `package.json` file. This gives you easy access to the essential migration operations: running migrations, rolling them back, listing existing migrations, and generating new migration files. If you're familiar with Ruby on Rails, these commands work much like Rails' rake tasks for database migrations (e.g., `db:migrate`, `db:rollback`, `db:create`). Here's what each command does:
**Run all migrations:**

```bash
yarn db:migrate
```

This command uses `migrate up` to apply all pending migrations. It's comparable to Rails' `db:migrate` task, ensuring that the database schema is up to date.

**Rollback migrations:**

```bash
yarn db:rollback <name of migration>
```

The `db:rollback` command rolls back migrations to a specified state by targeting a particular migration name. You'll need to provide the exact name of the migration file, just like specifying versions in Rails with `db:rollback VERSION`. This gives you control to revert changes selectively.

**List all migrations:**

```bash
yarn migrate:list
```

This command lists all the migrations that have been applied, similar to Rails' ability to check which migrations are pending or have been executed. The list is stored in the `MigrationsStore`, which helps maintain an overview of migration states across environments.

**Generate a new migration file:**

```bash
yarn generate:migration <name of migration>
```

This command is akin to Rails' `db:migrate:create` task. It creates a new migration file in the `db/migrations/` directory, using the provided name to give it context. The generated file follows a set template, making it easy to implement the `up` and `down` methods for your migration.
Your `package.json` file should include these commands as shown below:
```json
{
  "scripts": {
    "db:migrate": "migrate up --compiler ts:./db/migratec.js --migrations-dir db/migrations --store ./src/lib/MigrationsStore.ts",
    "db:rollback": "migrate down --compiler ts:./db/migratec.js --migrations-dir db/migrations --store ./src/lib/MigrationsStore.ts",
    "migrate:list": "migrate list --compiler ts:./db/migratec.js --migrations-dir db/migrations --store ./src/lib/MigrationsStore.ts",
    "generate:migration": "migrate create --template-file db/templates/migration.ts --migrations-dir db/migrations"
  }
}
```
These commands simplify migration management by providing a structured way to run, roll back, and track migrations within the Node.js environment.
Install Dependencies
Start by installing the required dependencies with:
```bash
yarn add migrate ts-node @types/node nano config
```
These dependencies provide a solid foundation for building a migration system with support for TypeScript and CouchDB.
Configure TypeScript for Migrations
To configure TypeScript, create a file at `db/migratec.js` with the following configuration:
```js
/* eslint-disable */
const tsnode = require('ts-node');

module.exports = tsnode.register;
```
This configuration allows `ts-node` to compile TypeScript files within the migration framework, enabling TypeScript support for the migration scripts.
Database Initialization
For CouchDB integration, create the database initialization code in `src/lib/Database.ts`. This module provides a singleton instance of the database connection, ensuring only one active connection to the CouchDB instance. The following code sets up the connection and handles initialization, reconnection, and error handling:
```ts
import Nano from 'nano'
import config from 'config'

export type NanoClient = Nano.DocumentScope<unknown>
export type NanoServer = Nano.ServerScope

const RETRY_DELAY = 500
const RETRY_COUNT = 3

export class Database {
  private static instance: Database
  private server: NanoServer
  private client: NanoClient | undefined
  private connected: boolean = false
  private retries: number = 0

  /* Public methods */

  /* Singleton */
  public static getInstance(): Database {
    if (!Database.instance) {
      Database.instance = new Database()
    }
    return Database.instance
  }

  /* To use in scripts that don't need the client connected */
  public getServer(): NanoServer {
    return this.server
  }

  /* To use in migrations or scripts that need the client connected */
  public async init(): Promise<NanoClient> {
    await this.connect()
    return this.getClient()
  }

  /* To use once on app startup */
  public async connect(): Promise<void> {
    if (!this.client) {
      const dbName: string = config.get('couchdb.name')
      this.client = this.server.use(dbName)
    }
    await this.check()
  }

  /* To use in the app code after having called this.connect() on app startup */
  public getClient(): NanoClient {
    if (!this.connected) {
      throw new Error('Not connected to CouchDB')
    }
    if (!this.client) {
      throw new Error('No CouchDB client')
    }
    return this.client
  }

  /* Private methods */

  private constructor() {
    const couchUser: string = config.get('couchdb.user')
    const couchPassword: string = config.get('couchdb.password')
    const couchProtocol: string = config.get('couchdb.protocol')
    const couchHost: string = config.get('couchdb.host')
    const couchPort: string = config.get('couchdb.port')

    const couchUrl = `${couchProtocol}://${couchUser}:${couchPassword}@${couchHost}${couchPort ? `:${couchPort}` : ''}`

    this.server = Nano(couchUrl)
  }

  private async check(): Promise<void> {
    if (this.connected || !this.client) {
      return
    }
    try {
      await this.client.info()
      this.connected = true
      console.log('Connected to CouchDB')
    } catch (e) {
      this.connected = false
      if (this.retries++ < RETRY_COUNT) {
        console.warn('Failed to connect to CouchDB. Retrying...', this.retries)
        // Await the delay before retrying so connect() only resolves once the
        // connection check has actually completed (a bare setTimeout would let
        // connect() resolve while still disconnected)
        await new Promise((resolve) => setTimeout(resolve, RETRY_DELAY))
        return this.check()
      } else {
        console.error('Failed to connect to CouchDB.', e)
        throw e
      }
    }
  }
}

export default Database.getInstance()
```
This `Database` class serves as the central connection manager for the application, retrying connection attempts as needed.
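Stripped of the CouchDB specifics, the singleton boils down to a lazily created static instance; a minimal sketch of the pattern (the `Connection` class here is illustrative, not part of the codebase):

```ts
// Minimal sketch of the lazy singleton pattern used by Database above.
class Connection {
  private static instance: Connection

  // Private constructor: the only way to obtain an instance is getInstance()
  private constructor(public readonly createdAt: number = Date.now()) {}

  static getInstance(): Connection {
    // Created on first access, then reused for the life of the process
    if (!Connection.instance) {
      Connection.instance = new Connection()
    }
    return Connection.instance
  }
}

// Every caller receives the same object, so there is only one connection
const a = Connection.getInstance()
const b = Connection.getInstance()
console.log(a === b) // true
```

This is why `export default Database.getInstance()` at the bottom of the module is safe: every importer shares the same connection state.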
Migrations Store
To manage migration states, create a `MigrationsStore` class at `src/lib/MigrationsStore.ts`. This store handles saving and loading migration state, ensuring that migrations persist across runs. The following code implements the migration store for CouchDB:
```ts
import { Callback, MigrationSet } from 'migrate'
import { MaybeDocument } from 'nano'

import db from './Database'

type LoadCallback = ((err: Error) => void) & ((err: null, set: MigrationSet) => void)

const collection = 'migrations'

export default class MigrationsStore {
  async load(cb: LoadCallback): Promise<void> {
    let set: MigrationSet
    try {
      const dbClient = await db.init()
      const { docs } = await dbClient.find({ selector: { collection } })
      const doc: MaybeDocument = docs[0]
      if (!doc) {
        console.log('Cannot read migrations from the database. If this is the first time you run migrations, then this is normal.')
        return cb(null, {} as MigrationSet)
      }
      set = doc as MigrationSet
    } catch (err) {
      console.error('Failed to load migration state:', err)
      return cb(err as Error)
    }
    return cb(null, set)
  }

  async save({ migrations, lastRun }: MigrationSet, cb: Callback): Promise<void> {
    try {
      const dbClient = await db.init()
      const { docs } = await dbClient.find({ selector: { collection } })
      const doc: MaybeDocument = docs[0]
      const updatedSet = { ...(doc ?? {}), migrations, lastRun, collection }
      const { ok } = await dbClient.insert(updatedSet)
      if (ok) {
        return cb(null)
      } else {
        const err = new Error('Failed to save migration state')
        console.error(err.message)
        return cb(err)
      }
    } catch (err) {
      console.error('Failed to save migration state:', err)
      return cb(err as Error)
    }
  }
}
```
This custom store enables the migration state to be saved directly within CouchDB, offering a consistent way to track which migrations have been applied.
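For reference, the state document that `save()` writes to CouchDB ends up looking roughly like the following. The shape follows what the code above stores plus node-migrate's migration entries; the `_id`, `_rev`, and all values are illustrative:

```json
{
  "_id": "…",
  "_rev": "…",
  "collection": "migrations",
  "lastRun": "1700000000000-add-users-index.ts",
  "migrations": [
    {
      "title": "1700000000000-add-users-index.ts",
      "description": "Add an index on users",
      "timestamp": 1700000100000
    }
  ]
}
```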
Migration Template
Next, create a migration template at `db/templates/migration.ts`. This template offers placeholders for the `up` and `down` migration functions, simplifying the process of creating new migration files:
```ts
'use strict'

import db from '../../src/lib/Database'

export const description = 'Fill in the description here'

export const up = async () => {
  const client = await db.init()
  client
}

export const down = async () => {
  const client = await db.init()
  client
}
```
This file provides the basic structure needed to implement new migrations by adding logic to the `up` and `down` functions.
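As an illustration (not part of the template), a real migration might backfill a field across documents. Keeping the document transform a pure function makes it easy to unit test; the `UserDoc` shape, the `status` field, and `addDefaultStatus` are all hypothetical:

```ts
// Hypothetical document shape for illustration
interface UserDoc {
  _id: string
  name: string
  status?: string
}

// Pure transform for the migration's up step: backfill a default
// `status` on documents that lack one, leaving the rest untouched.
const addDefaultStatus = (docs: UserDoc[]): UserDoc[] =>
  docs.map((doc) => (doc.status ? doc : { ...doc, status: 'active' }))

// Inside the migration's `up`, the transformed docs would then be written
// back, e.g. with nano's bulk API:
//   const { docs } = await client.find({ selector: { type: 'user' } })
//   await client.bulk({ docs: addDefaultStatus(docs as UserDoc[]) })
const sample: UserDoc[] = [
  { _id: '1', name: 'Ada' },
  { _id: '2', name: 'Grace', status: 'admin' },
]
console.log(addDefaultStatus(sample))
```

The matching `down` would apply the inverse transform (removing the backfilled field), which is what makes the rollback command meaningful.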
Checking Migrations
To verify that all migrations have been applied, add a migration check script at `src/lib/check-migrations.ts`. This script checks for pending migrations and throws an error if any are found:
```ts
import path from 'path'

import { FileStore, MigrationSet, load } from 'migrate'

import MigrationsStore from './MigrationsStore'

export const checkMigrations = async () => {
  const set = await new Promise((resolve, reject) => {
    load({
      stateStore: new MigrationsStore() as FileStore,
      migrationsDirectory: path.resolve('db', 'migrations'),
    }, (err, set) => {
      if (err) return reject(err)
      resolve(set)
    })
  })

  const { lastRun, migrations } = set as unknown as MigrationSet

  if (!lastRun) {
    throw new Error('No migrations have been run.')
  }

  const pendingMigrations = migrations.filter(migration => !migration.timestamp)
  if (pendingMigrations.length > 0) {
    throw new Error(`There are pending migrations: ${pendingMigrations.map(m => m.title).join(', ')}`)
  }
}
```
This step helps ensure that your application doesn’t run without the necessary migrations applied.
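The pending check above relies on node-migrate leaving `timestamp` unset for migrations that have not run. Reduced to a pure function (with the entry shape simplified for illustration), the logic is:

```ts
// Simplified shape of a migration entry as loaded by node-migrate
interface MigrationEntry {
  title: string
  timestamp?: number | null // set once the migration has been run
}

// A migration is pending when it has no run timestamp
const pendingTitles = (migrations: MigrationEntry[]): string[] =>
  migrations.filter((m) => !m.timestamp).map((m) => m.title)

console.log(pendingTitles([
  { title: 'a.ts', timestamp: 1700000000000 },
  { title: 'b.ts' },
]))
```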
Database Error Handling Wrapper
Adding a wrapper for error handling can streamline database interactions and ensure consistent error responses. Create a wrapper module at src/db.ts
:
```ts
import { ServerError, Status } from 'nice-grpc'

import db from './lib/Database'

export default {
  async connect() {
    try {
      await db.connect()
    } catch (error) {
      throw new ServerError(Status.FAILED_PRECONDITION, 'Failed to connect to CouchDB')
    }
  },
  getClient() {
    try {
      return db.getClient()
    } catch (error) {
      throw new ServerError(Status.FAILED_PRECONDITION, 'Not connected to CouchDB')
    }
  },
}
```
This module wraps the database connection logic and throws errors with clear messages if issues arise, making it easier to identify and resolve connection issues.
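The translate-and-rethrow pattern generalizes beyond gRPC. This standalone sketch shows the idea without the `nice-grpc` dependency; the `mapError` helper and `PreconditionError` class are hypothetical, not part of any library:

```ts
// Generic translate-and-rethrow: run an operation and convert any failure
// into a domain-specific error with a stable, user-facing message.
class PreconditionError extends Error {}

const mapError = <T>(fn: () => T, message: string): T => {
  try {
    return fn()
  } catch (error) {
    // The low-level error is replaced by one with a clear, consistent message
    throw new PreconditionError(message)
  }
}

// Usage: any thrown error surfaces as a PreconditionError
try {
  mapError(() => { throw new Error('socket hang up') }, 'Not connected to CouchDB')
} catch (e) {
  console.log(e instanceof PreconditionError, (e as Error).message)
}
```

One trade-off of this approach is that the original error is discarded; in production you would typically also log it (as the `Database` class does) before rethrowing.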
Integrate with the Application
Integrate the migration framework with the main application in `src/index.ts`. This lets the application initialize the database connection and check migrations during startup:
```ts
import { createServer } from 'nice-grpc'
import config from 'config'

import db from './db'

async function main() {
  await db.connect()

  // Check migrations here

  const server = createServer()

  await server.listen(config.get('server.bindAddress'))

  console.log('Server started on', config.get('server.bindAddress'))
}

main()
```
This setup ensures the database connection is established, and migrations are checked before the application starts accepting requests.
Conclusion
By following this guide, you can establish a Node.js migration framework that provides versioning, migration, and rollback support for your database. Using `node-migrate` offers flexibility and control, making it an excellent choice even for NoSQL databases. This setup is invaluable for maintaining data consistency and managing schema changes across environments, ensuring your application can handle data-structure evolution and deliver a consistent experience across deployments.