Integrating Slonik with Express.js

Gajus Kuizinas


What is Slonik?

For those uninitiated, Slonik is a battle-tested SQL query building and execution library for Node.js. Its primary goal is to allow you to write and compose SQL queries in a safe and convenient way. Now, let's see how it pairs with Express.js.

Install the Necessary Packages

First things first, you'll need to install the necessary packages:

npm install slonik express
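
Since the examples that follow are written in TypeScript, you will likely also want the compiler and the type definitions for Express (Slonik ships with its own types):

npm install --save-dev typescript @types/express @types/node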

Basic App Setup

Let's create a basic Express app:

import express from 'express';

const app = express();
const port = 3000;

app.get('/', (req, res) => {
  res.send('Hello, World!');
});

app.listen(port, () => {
  console.log(`Example app listening on port ${port}`);
});

Running this app will start a server on port 3000. If you visit http://localhost:3000, you should see the message "Hello, World!".

Connecting to a Database

Slonik manages database connections using connection pools. A connection pool manages a set of database connections, which can be used and reused, minimizing the overhead of creating new connections.

Here is an example of how to create a Slonik connection pool:

import { createPool } from 'slonik';

const main = async () => {
  await createPool('postgres://...');
};

main();

Note: Your connection URI goes in place of postgres://....
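
For reference, a typical PostgreSQL connection URI looks like this (the credentials, host, port, and database name are placeholders):

postgres://user:password@localhost:5432/my_database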

You may have noticed that we are awaiting the result of createPool. This is because Slonik needs to query the database in order to set up the connection pool. This requires us to slightly modify our app.

Using Slonik with Express.js

Let's see how we can use Slonik with Express.js:

import express from 'express';
import { createPool } from 'slonik';

const main = async () => {
  const pool = await createPool('postgres://...');

  const app = express();
  const port = 3000;

  app.get('/', (req, res) => {
    res.send('Hello, World!');
  });

  app.listen(port, () => {
    console.log(`Example app listening on port ${port}`);
  });
};

void main();

We've added a few lines to our app. We now create a connection pool using the createPool function, and a main function wraps pool creation and starts the Express.js server. The only reason for the wrapper is that createPool must be awaited, and await is not available at the top level in every setup (for example, in CommonJS modules), so we wrap it in an async function.
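
As an aside, if your project is set up as an ES module (for example, "type": "module" in package.json on a recent Node.js version), top-level await is available and the wrapper is not strictly necessary. A minimal sketch:

import express from 'express';
import { createPool } from 'slonik';

// Top-level await works here because this file is an ES module.
const pool = await createPool('postgres://...');

const app = express();
const port = 3000;

app.get('/', (req, res) => {
  res.send('Hello, World!');
});

app.listen(port, () => {
  console.log(`Example app listening on port ${port}`);
});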

Integrating Slonik with an Express.js Route

With the pool ready, it's straightforward to execute queries within your Express routes:

app.get('/users', async (req, res) => {
  // `sql` is imported from Slonik alongside createPool:
  // import { createPool, sql } from 'slonik';
  const users = await pool.any(sql.unsafe`SELECT * FROM users`);

  res.json(users);
});

Here, we're fetching all users from the database and sending them as a JSON response.
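
When a query depends on user input, interpolating a value inside the sql tag binds it as a query parameter rather than splicing it into the SQL text. A minimal sketch, assuming a users table with an integer id column and using Slonik's maybeOne (which returns null when no row matches):

app.get('/users/:id', async (req, res) => {
  // ${req.params.id} is sent as a bound parameter ($1), not as inline SQL.
  const user = await pool.maybeOne(sql.unsafe`
    SELECT id, name
    FROM users
    WHERE id = ${req.params.id}
  `);

  if (!user) {
    res.status(404).json({ message: 'User not found' });

    return;
  }

  res.json(user);
});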

Because we are using a connection pool to manage connections, we don't need to worry about opening and closing connections. Slonik handles this for us. For most use cases, this is the recommended way to execute queries. However, if you need more control over the connection, you can use the transaction or connect methods.

Transactions

Slonik provides a transaction method that allows you to execute queries within the same connection and a single transaction:

await pool.transaction(async (transaction) => {
  await transaction.query(sql.unsafe`INSERT INTO logs (message) VALUES ('foo')`);
  await transaction.query(sql.unsafe`INSERT INTO logs (message) VALUES ('bar')`);
});

With this, both insertions either succeed together or fail together.
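
If the transaction callback throws, Slonik rolls the transaction back and re-throws the error, so nothing is committed. A quick sketch of that behaviour:

try {
  await pool.transaction(async (transaction) => {
    await transaction.query(sql.unsafe`INSERT INTO logs (message) VALUES ('foo')`);

    // Throwing here rolls the transaction back, so the row above is never committed.
    throw new Error('Something went wrong');
  });
} catch (error) {
  // The error thrown inside the transaction callback surfaces here.
}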

Using Transactions with Express.js

Let's see how we can use transactions with Express.js. First, we'll create a route that inserts a user into the database:

// Note: reading req.body requires a body parser, e.g. app.use(express.json());
app.post('/users', async (req, res) => {
  const { name } = req.body;

  await pool.query(sql.unsafe`INSERT INTO users (name) VALUES (${name})`);

  res.json({ message: 'User created!' });
});

If it is just a single query, we don't need to use a transaction. However, let's say we want to log the creation of the user in the database:

app.post('/users', async (req, res) => {
  const { name } = req.body;

  await pool.transaction(async (transaction) => {
    await transaction.query(sql.unsafe`INSERT INTO users (name) VALUES (${name})`);
    await transaction.query(sql.unsafe`INSERT INTO logs (message) VALUES (${`${name} was created!`})`);
  });

  res.json({ message: 'User created!' });
});

Here, we are using the transaction method to execute two queries within the same connection and a single transaction. This ensures that both queries either succeed together or fail together. If either query fails, the transaction is rolled back and the error is thrown.
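
If you would rather respond with an explicit error instead of relying on Express's error handling, you can catch the rejection in the route yourself. A sketch reusing the route above:

app.post('/users', async (req, res) => {
  const { name } = req.body;

  try {
    await pool.transaction(async (transaction) => {
      await transaction.query(sql.unsafe`INSERT INTO users (name) VALUES (${name})`);
      await transaction.query(sql.unsafe`INSERT INTO logs (message) VALUES (${`${name} was created!`})`);
    });
  } catch (error) {
    // The transaction has already been rolled back at this point.
    res.status(500).json({ message: 'Failed to create user' });

    return;
  }

  res.json({ message: 'User created!' });
});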

Optimizing Transactions

This section is tangential to the main topic of this article, but I think it's important to mention.

One key principle when working with transactions in relational databases is to keep them as concise as possible. Transactions that extend for long durations can lead to a host of issues.

Extended transaction durations can block other operations, increasing contention for resources. This can, in turn, escalate to more severe problems such as deadlocks. In a deadlock scenario, two or more transactions wait indefinitely for each other to release locks.

Such problems can significantly degrade the performance of the system. Especially in situations with high concurrency, bottlenecks can emerge, stifling the smooth operation of the database.

Moreover, there's the risk of long-running transactions exhausting connection pool resources. When this happens, it can result in application slowdowns, as various parts of the application are queued, waiting for available connections.

In light of these potential pitfalls, it becomes paramount to ensure that transactions are designed to encapsulate only the absolutely necessary operations. They should be optimized for both speed and efficiency. By adhering to these principles, you not only uphold the integrity of the transaction but also ensure the responsiveness and scalability of the broader application and database system.

In short, keep your transactions short and sweet.
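
One cheap safeguard is to measure how long your transactions take and log anything that exceeds a budget. A sketch, assuming Slonik's DatabaseTransactionConnection type and an arbitrary 500 ms threshold:

import type { DatabaseTransactionConnection } from 'slonik';

// Wraps pool.transaction and warns when a transaction exceeds the budget.
const timedTransaction = async <T>(
  handler: (transaction: DatabaseTransactionConnection) => Promise<T>,
): Promise<T> => {
  const startedAt = Date.now();

  try {
    return await pool.transaction(handler);
  } finally {
    const duration = Date.now() - startedAt;

    if (duration > 500) {
      console.warn(`Slow transaction: ${duration}ms`);
    }
  }
};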

Reserving a Connection

Warning: Reserving a connection from the pool is an advanced feature. It is recommended to use transactions instead.

The connect method allows you to reserve a connection from the pool and execute queries on it.

It is important to note that this is a rare use case, especially in the context of services that produce responses to user requests. Most of the time, you should use a connection from the connection pool or a transaction instead. However, there are some cases where you may need to reserve a connection from the pool. For example, you may want to set the time zone for the connection:

app.get('/events', async (req, res) => {
  const events = await pool.connect(async (connection) => {
    // SET (rather than SET LOCAL) applies for the lifetime of the reserved
    // connection; SET LOCAL only takes effect inside a transaction.
    await connection.query(sql.unsafe`SET timezone = 'America/New_York'`);

    return await connection.any(sql.unsafe`SELECT event_name, event_time FROM events`);
  });

  res.json(events);
});

Here, we are reserving a connection from the pool and executing two queries on it. The first query sets the time zone for the connection, and the second query fetches all events from the database. Once the callback returns, the connection is released back to the pool.

Passing a Connection to a Function

A common question I am asked is how to pass a connection to a function. For example, let's say you have a function that fetches all users from the database:

const getUsers = async () => {
  // `connection` is not defined in this scope, so this does not work as written.
  return await connection.any(sql.unsafe`SELECT * FROM users`);
};

app.get('/users', async (req, res) => {
  const users = await getUsers();

  res.json(users);
});

The simple solution is to pass the connection as an argument:

import type { DatabasePool } from 'slonik';

const getUsers = async (pool: DatabasePool) => {
  return await pool.any(sql.unsafe`SELECT * FROM users`);
};

app.get('/users', async (req, res) => {
  const users = await getUsers(pool);

  res.json(users);
});

I realize that this is a very simple example, but the same holds even if the function that requires access to the database handle is nested deep within a call stack. You must pass an instance of the connection pool down to every function in the chain.
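
The same pattern scales to deeper call chains: every layer that needs the database receives the handle explicitly. A sketch with hypothetical createUser and createUserHandler helpers:

import type { DatabasePool } from 'slonik';

const createUser = async (pool: DatabasePool, name: string) => {
  return await pool.one(sql.unsafe`
    INSERT INTO users (name)
    VALUES (${name})
    RETURNING id, name
  `);
};

// The route handler receives the pool once and passes it further down.
const createUserHandler = (pool: DatabasePool) => {
  return async (req: express.Request, res: express.Response) => {
    const user = await createUser(pool, req.body.name);

    res.json(user);
  };
};

app.post('/users', createUserHandler(pool));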

Anti-Patterns: AsyncLocalStorage

AsyncLocalStorage is a class from Node.js's built-in async_hooks module that provides a mechanism to store and retrieve data based on the current execution context. It might be tempting to use it to pass a connection to a function:

import express from 'express';
import { createPool, sql, type DatabasePoolConnection } from 'slonik';
import { AsyncLocalStorage } from 'async_hooks';

const asyncLocalStorage = new AsyncLocalStorage<DatabasePoolConnection>();

const main = async () => {
  const pool = await createPool('postgres://...');

  const app = express();
  const port = 3000;

  app.use((req, res, next) => {
    // Reserve a connection for the duration of the request and expose it
    // through AsyncLocalStorage. This is the anti-pattern being illustrated.
    void pool.connect(async (connection) => {
      await new Promise<void>((resolve) => {
        asyncLocalStorage.run(connection, () => {
          res.on('finish', () => {
            resolve();
          });

          next();
        });
      });
    });
  });

  const fetchUsersFromDb = async () => {
    // Retrieves whatever connection the middleware stored for this request.
    const connection = asyncLocalStorage.getStore();

    if (!connection) {
      throw new Error('No connection found in the current execution context.');
    }

    return await connection.any(sql.unsafe`SELECT * FROM users`);
  };

  app.get('/users', async (req, res, next) => {
    try {
      const users = await fetchUsersFromDb();

      res.json(users);
    } catch (error) {
      next(error);
    }
  });

  app.listen(port, () => {
    console.log(`Example app listening on port ${port}`);
  });
};

void main();

I strongly recommend against this. It is a bad idea in terms of both performance and design. If you need to pass a connection to a function, pass it as an argument.

Refer to the Passing a Connection to a Function section for more information.

Anti-Patterns: wrapping the entire request in a transaction

Another anti-pattern I see is wrapping the entire request in a transaction:

app.post('/users', async (req, res) => {
  await pool.transaction(async (transaction) => {
    // do something that is part of the request, but not part of the transaction
    await foo();

    await transaction.query(sql.unsafe`INSERT INTO users (name) VALUES ('foo')`);

    res.json(
      await transaction.one(sql.unsafe`SELECT * FROM users WHERE name = 'foo'`)
    );
  });
});

This is not a good idea. Transactions should be as short as possible. If you need to do something that is not part of the transaction, you should do it outside of the transaction.

app.post('/users', async (req, res) => {
  await foo();

  await pool.transaction(async (transaction) => {
    await transaction.query(sql.unsafe`INSERT INTO users (name) VALUES ('foo')`);

    res.json(
      await transaction.one(sql.unsafe`SELECT * FROM users WHERE name = 'foo'`)
    );
  });
});

Refer to the Optimizing Transactions section for more information.

Anti-Patterns: wrapping the entire request in a connection

Another anti-pattern I see is wrapping the entire request in a connection:

app.post('/users', async (req, res) => {
  await pool.connect(async (connection) => {
    res.json(await foo());
  });
});

This is not a good idea. Implementing this pattern means that your application will be able to handle only as many concurrent requests as there are connections in the connection pool. If you have a connection pool with 10 connections, your application will be able to handle only 10 concurrent requests.

Instead, you should pass an instance of the connection pool to the function that needs access to the database handle and use the connection pool to execute queries and transactions.
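
For what it's worth, the size of the pool is configurable when the pool is created. A sketch, assuming Slonik's maximumPoolSize option:

const pool = await createPool('postgres://...', {
  // The maximum number of concurrent connections the pool will open.
  maximumPoolSize: 10,
});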

Refer to the Passing a Connection to a Function section for more information.

Async Middleware

Express.js does not support asynchronous route handlers out of the box: a rejected promise in an async handler is not automatically forwarded to your error-handling middleware. For the purpose of keeping the examples simple, we used await in our routes anyway. You can use express-promise-router or express-async-handler to support asynchronous routes in your application.
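
For example, with express-promise-router (assuming it is installed), a rejected promise returned from an async handler is passed to Express's error handling middleware automatically:

import express from 'express';
import Router from 'express-promise-router';
import { createPool, sql } from 'slonik';

const main = async () => {
  const pool = await createPool('postgres://...');

  const router = Router();

  // If this handler rejects, the error is forwarded to next() for us.
  router.get('/users', async (req, res) => {
    res.json(await pool.any(sql.unsafe`SELECT * FROM users`));
  });

  const app = express();

  app.use(router);

  app.listen(3000);
};

void main();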

Conclusion

By now, you should have a good grasp on integrating Slonik with Express.js in TypeScript. If you have any questions, feel free to reach out to me on Twitter or leave an issue on GitHub.
