Effect Typescript Database Migration
Explore how to use the Effect TypeScript library to manage database migrations with a focus on functional programming and type safety.
Published: May 12, 2024
Author: Hugo Mufraggi

Recently, I came across Effect, a library designed to manage asynchronous effects safely and predictably, building on advanced functional concepts and the strong type safety offered by TypeScript. It simplifies the composition of asynchronous programs, improves error handling, and encourages a declarative, pure approach to effect programming.
To try out this library and its ecosystem, I propose building a small API.
I will use the SQL repository from this article to implement dependency injection in a future service.
Package
I'll be using @effect/schema to define the schema and @effect/sql, an SQL toolkit tailored for Effect-ts.
The goal of @effect/schema is to unify type definitions. You can learn more about its objectives here.
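As a quick illustration of what @effect/schema brings, here is a minimal sketch of a Task schema matching the table we will create below. The field names mirror the SQL columns; the exact constructor names (Schema.Struct, Schema.String, ...) may differ slightly between versions of the library.
import * as Schema from "@effect/schema/Schema";

// Runtime schema describing a task row (sketch).
const Task = Schema.Struct({
  id: Schema.UUID,
  task_name: Schema.String,
  task_description: Schema.optional(Schema.String),
  status: Schema.String,
});

// Derive the static TypeScript type from the schema (API may vary by version).
type Task = Schema.Schema.Type<typeof Task>;

// Decode unknown input (e.g. a row returned by the SQL client) into a typed Task.
const decodeTask = Schema.decodeUnknownSync(Task);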
Practice
Project Init
I use Bun as a runtime tool. You can adapt it to your runtime environment if you don't have it installed.
bun init
bun add effect
bun add @effect/sql
bun add @effect/schema
Your project should be ready to go once you’ve completed these steps.
You can find more information about the effect/sql package here.
SQL Script
First, define the SQL schema for the tasks table.
Create a folder named db, and within it, create a subfolder named migrations.
Next, create a new file named db/migrations/0001_add_tasks.ts.
import * as Effect from "effect/Effect";
import * as Sql from "@effect/sql-pg";

// Ask the environment for the PgClient, then run the DDL statement with it.
// Note: uuid_generate_v4() requires the uuid-ossp extension to be enabled in the database.
export default Effect.flatMap(
  Sql.client.PgClient,
  (sql) => sql`
    CREATE TABLE tasks (
      id UUID PRIMARY KEY DEFAULT uuid_generate_v4(),
      task_name VARCHAR(255) NOT NULL,
      task_description TEXT,
      status VARCHAR(50) DEFAULT 'Todo',
      created_at TIMESTAMP NOT NULL DEFAULT NOW(),
      updated_at TIMESTAMP NOT NULL DEFAULT NOW()
    )
  `
);
This code snippet is straightforward: it defines an SQL script that creates our table. The first three lines are the functional part: Effect.flatMap retrieves the PgClient from the environment and passes it to the function that runs the SQL statement, so the file exports an Effect describing the migration. As far as I understand, this exported value is a monad (although I'm not entirely certain of the terminology).
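To make that more concrete, the same migration can be written with Effect.gen. This is only a sketch equivalent to the flatMap version above, using the same Sql.client.PgClient tag:
import * as Effect from "effect/Effect";
import * as Sql from "@effect/sql-pg";

// Same behavior as the flatMap version: acquire the client, then run the DDL.
export default Effect.gen(function* (_) {
  const sql = yield* _(Sql.client.PgClient);
  yield* _(sql`
    CREATE TABLE tasks (
      id UUID PRIMARY KEY DEFAULT uuid_generate_v4(),
      task_name VARCHAR(255) NOT NULL,
      task_description TEXT,
      status VARCHAR(50) DEFAULT 'Todo',
      created_at TIMESTAMP NOT NULL DEFAULT NOW(),
      updated_at TIMESTAMP NOT NULL DEFAULT NOW()
    )
  `);
});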
In the future, if you wish to add new tables, you can create a new file such as db/migrations/0002_add_users.ts.
import * as Effect from "effect/Effect";
import * as Sql from "@effect/sql-pg";

export default Effect.flatMap(
  Sql.client.PgClient,
  (sql) => sql`
    ADD YOUR SQL SCRIPT
  `
);
Just add your own SQL statement and reuse the rest of the logic.
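As an illustration, a users migration could look like the following. This is only a sketch; the columns are assumptions and not part of the original project:
// db/migrations/0002_add_users.ts (illustrative sketch)
import * as Effect from "effect/Effect";
import * as Sql from "@effect/sql-pg";

export default Effect.flatMap(
  Sql.client.PgClient,
  (sql) => sql`
    CREATE TABLE users (
      id UUID PRIMARY KEY DEFAULT uuid_generate_v4(),
      email VARCHAR(255) NOT NULL UNIQUE,
      created_at TIMESTAMP NOT NULL DEFAULT NOW()
    )
  `
);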
Migrations
The next step is to create the script that applies the migrations, db/migrate.ts. You can find detailed documentation here.
import { Config, Effect, Layer, pipe } from "effect";
import { NodeContext, NodeRuntime } from "@effect/platform-node";
import * as Sql from "@effect/sql-pg";
import { fileURLToPath } from "node:url";
import * as Secret from "effect/Secret";

// The main program; it will run with the services provided below.
const program = Effect.gen(function* (_) {
  // ...
});

// Configures the PgClient with the connection settings of the local database.
const SqlLive = Sql.client.layer({
  database: Config.succeed("gdg"),
  username: Config.succeed("user"),
  port: Config.succeed(5432),
  password: Config.succeed(Secret.fromString("password"))
});

// Configures the migrator: load the migration files from db/migrations.
const MigratorLive = Sql.migrator
  .layer({
    loader: Sql.migrator.fromFileSystem(
      fileURLToPath(new URL("migrations", import.meta.url))
    ),
    schemaDirectory: "migrations"
  })
  .pipe(Layer.provide(SqlLive));

// Merges the SQL client and migrator layers into a single environment.
const EnvLive = Layer.mergeAll(SqlLive, MigratorLive).pipe(
  Layer.provide(NodeContext.layer)
);

// Provide the environment and run the program.
pipe(program, Effect.provide(EnvLive), NodeRuntime.runMain);
Now, let’s delve into the practical details!
Following the Effect documentation, the script defines four key pieces:
- program: defines the main program.
- SqlLive: configures the SqlClient.
- MigratorLive: sets up the migrator.
- EnvLive: combines the contexts of SqlLive and MigratorLive.
These pieces are composed and executed within a pipe, as described in the documentation.
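The body of program is left empty above. As a sketch of what it could do once the layers are provided (the query and the log call are assumptions, not part of the original code), you could, for example, list the tables of the database:
// A possible body for `program` (sketch).
const program = Effect.gen(function* (_) {
  const sql = yield* _(Sql.client.PgClient);
  const tables = yield* _(
    sql`SELECT table_name FROM information_schema.tables WHERE table_schema = 'public'`
  );
  yield* _(Effect.log(tables));
});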
Next, we need to add a script inside the package.json.
"scripts": {
"migrate": "bun run db/migrate.ts"
},
Just before running this script, make sure to set up a local database. You can find a seed for your local environment here.
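Once the database is up, apply the migrations with the script defined above:
bun run migrate
The migrator should pick up the files in db/migrations and apply any pending ones.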
Conclusion
This article is the first installment of a series; I plan to write two more articles, one focusing on the SQL repository and another on the handler API part. You can follow me on Medium and subscribe to my mailing list to stay updated and not miss the rest of the series.