Class 29: Connecting Node.js/Express to a Relational Database
We've built a CRUD API with Express.js using in-memory data, and we've learned the fundamentals of relational databases and SQL. Now, it's time to connect our Node.js/Express application to a real relational database to persist our data.
For this class, we'll focus on PostgreSQL, a powerful and popular open-source relational database. We'll use a query builder called Knex.js to interact with the database, which provides a more convenient and secure way to write SQL queries in JavaScript.
Choosing a Relational Database & Setup
While SQLite is great for local practice, for a production-ready application, you'd typically choose a robust RDBMS like PostgreSQL or MySQL.
Setting up PostgreSQL Locally:
There are several ways to install PostgreSQL.
- Using a native installer: Download from postgresql.org/download/. This usually includes psql (the command-line client) and pgAdmin (a GUI tool).
- Using Docker (recommended for development): Docker provides an isolated, consistent environment.
# Pull the PostgreSQL image
docker pull postgres

# Run a PostgreSQL container
#   -d: detached mode
#   --name: name your container
#   -p 5432:5432: map host port 5432 to container port 5432
#   -e POSTGRES_PASSWORD=mysecretpassword: set the superuser password via environment variable
docker run --name my-postgres -e POSTGRES_PASSWORD=mysecretpassword -p 5432:5432 -d postgres
Creating a New Database and Sample Table:
Once PostgreSQL is running, you can connect to it with psql, pgAdmin, or a tool like DBeaver to create your database and tables.
-- Connect to PostgreSQL (e.g., using psql or pgAdmin query tool)
-- psql -U postgres -h localhost -p 5432
-- Create a new database for our application
CREATE DATABASE book_api_db;
-- Connect to the new database (if using psql: \c book_api_db)
-- Create the books table
CREATE TABLE books (
id SERIAL PRIMARY KEY, -- SERIAL automatically increments and is unique
title VARCHAR(255) NOT NULL,
author VARCHAR(255) NOT NULL,
year INTEGER
);
-- Insert some sample data
INSERT INTO books (title, author, year) VALUES
('The Hitchhiker''s Guide to the Galaxy', 'Douglas Adams', 1979),
('1984', 'George Orwell', 1949),
('To Kill a Mockingbird', 'Harper Lee', 1960);
Database Connection in Node.js
Directly using a database driver (like pg for PostgreSQL) means writing raw SQL queries by hand. That works, but it's less convenient, and building queries by concatenating strings invites SQL injection. Instead, we'll use a query builder.
Environment Variables for Sensitive Credentials:
Never hardcode database credentials (usernames, passwords) directly in your code. Use environment variables instead. The dotenv package is commonly used for this in Node.js.
npm install dotenv
Create a .env file in your project root:
# .env
DB_HOST=localhost
DB_PORT=5432
DB_USER=postgres
DB_PASSWORD=mysecretpassword
DB_NAME=book_api_db
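After dotenv loads these values into process.env, it's worth validating them before handing them to Knex, so a typo in .env fails fast instead of producing a cryptic connection error later. Here is a minimal sketch; loadDbConfig is a hypothetical helper name, not part of dotenv or Knex:

```javascript
// config.js — validate database settings from an env object,
// failing fast when a required variable is missing and applying defaults.
function loadDbConfig(env) {
  const required = ['DB_USER', 'DB_PASSWORD', 'DB_NAME'];
  const missing = required.filter((key) => !env[key]);
  if (missing.length > 0) {
    throw new Error(`Missing required environment variables: ${missing.join(', ')}`);
  }
  return {
    host: env.DB_HOST || 'localhost',          // sensible local defaults
    port: parseInt(env.DB_PORT || '5432', 10), // ports should be numbers
    user: env.DB_USER,
    password: env.DB_PASSWORD,
    database: env.DB_NAME,
  };
}

module.exports = { loadDbConfig };
```

You could then call loadDbConfig(process.env) inside knexfile.js instead of reading process.env fields one by one.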
Introduction to ORMs and Query Builders
Why use an ORM/Query Builder?
- Abstraction: Write database operations using JavaScript objects and methods instead of raw SQL.
- Security: Helps prevent SQL injection by passing values as query parameters instead of concatenating them into SQL strings.
- Productivity: Faster development with less boilerplate code.
- Portability: Many ORMs/Query Builders support multiple database systems, making it easier to switch.

Overview of popular options:
- ORMs (Object-Relational Mappers): Map database tables to JavaScript objects. Examples: Sequelize, Prisma. They provide a higher level of abstraction.
- Query Builders: Provide a programmatic interface for building SQL queries. Example: Knex.js. They stay closer to SQL but add convenience and security features.

Focus on Knex.js:
Knex.js is a "batteries included" SQL query builder for PostgreSQL, MySQL, SQLite3, Oracle, and MSSQL. It's a great choice for its flexibility and ease of use.
npm install knex pg # 'pg' is the PostgreSQL client library
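To see concretely what a query builder buys you, here is a toy sketch of the core safety idea — SQL text with placeholders is kept separate from user-supplied values, so those values are never spliced into the SQL string. This is not Knex's actual internals, just an illustration of the technique real builders use:

```javascript
// A toy query-builder fragment illustrating parameterized queries.
function buildSelect(table, where) {
  const keys = Object.keys(where);
  // Placeholders ($1, $2, ...) stand in for values in the SQL text.
  const conditions = keys.map((key, i) => `${key} = $${i + 1}`).join(' AND ');
  return {
    sql: `SELECT * FROM ${table} WHERE ${conditions}`,
    // Values travel separately; the database driver binds them safely.
    bindings: keys.map((key) => where[key]),
  };
}

// Even a malicious-looking value never becomes part of the SQL text:
const query = buildSelect('books', { author: "x'; DROP TABLE books; --" });
// query.sql      → "SELECT * FROM books WHERE author = $1"
// query.bindings → ["x'; DROP TABLE books; --"]
```

Note that table and column names cannot be parameterized this way; real builders handle identifiers by quoting them, which is one more reason to use a library rather than rolling your own.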
Knex.js Configuration:
Create a knexfile.js in your project root to configure database connections for different environments.
// knexfile.js
require('dotenv').config(); // Load environment variables
module.exports = {
development: {
client: 'pg', // PostgreSQL client
connection: {
host: process.env.DB_HOST,
port: process.env.DB_PORT,
user: process.env.DB_USER,
password: process.env.DB_PASSWORD,
database: process.env.DB_NAME,
},
migrations: {
tableName: 'knex_migrations',
directory: './db/migrations' // Where your migration files will live
},
seeds: {
directory: './db/seeds' // Where your seed files will live
}
},
// You can add production, staging configurations here
};
Then, create the Knex instance in your server.js (or a dedicated db.js file):
// db.js
require('dotenv').config();
const knexfile = require('./knexfile'); // db.js sits next to knexfile.js in the project root
const knex = require('knex')(knexfile.development); // Use the development configuration
module.exports = knex;
// server.js
// const knex = require('./db'); // Import the knex instance
Performing CRUD Operations with Node.js/Express
Now, let's refactor our Express API to use Knex.js to interact with our PostgreSQL database.
// server.js (updated to use Knex)
require('dotenv').config(); // Load environment variables from .env
const express = require('express');
const app = express();
const port = process.env.PORT || 3000; // Use port from env or default
// Initialize Knex.js
const knexfile = require('./knexfile');
const knex = require('knex')(knexfile.development);
app.use(express.json());
// Test database connection (optional, but good practice)
knex.raw('SELECT 1')
.then(() => {
console.log('Database connected successfully!');
})
.catch((err) => {
console.error('Database connection failed:', err);
process.exit(1); // Exit if DB connection fails
});
// --- CRUD Operations ---
// GET /api/books - Get all books
app.get('/api/books', async (req, res) => {
try {
const books = await knex('books').select('*'); // SELECT * FROM books
res.status(200).json(books);
} catch (error) {
console.error('Error fetching books:', error);
res.status(500).json({ message: 'Internal server error.' });
}
});
// GET /api/books/:id - Get a single book by ID
app.get('/api/books/:id', async (req, res) => {
try {
const { id } = req.params;
const book = await knex('books').where({ id }).first(); // SELECT * FROM books WHERE id = :id LIMIT 1
if (book) {
res.status(200).json(book);
} else {
res.status(404).json({ message: `Book with ID ${id} not found.` });
}
} catch (error) {
console.error('Error fetching book by ID:', error);
res.status(500).json({ message: 'Internal server error.' });
}
});
// POST /api/books - Create a new book
app.post('/api/books', async (req, res) => {
try {
const { title, author, year } = req.body;
if (!title || !author || !year) {
return res.status(400).json({ message: 'Title, author, and year are required.' });
}
// Insert the new book and get the inserted record back via .returning('*') (supported by PostgreSQL)
const [newBook] = await knex('books').insert({ title, author, year: parseInt(year) }).returning('*');
// For SQLite, .returning('*') might not work directly, you might need to query by last_insert_rowid()
res.status(201).json(newBook);
} catch (error) {
console.error('Error creating book:', error);
res.status(500).json({ message: 'Internal server error.' });
}
});
// PUT /api/books/:id - Fully update a book
app.put('/api/books/:id', async (req, res) => {
try {
const { id } = req.params;
const { title, author, year } = req.body;
if (!title || !author || !year) {
return res.status(400).json({ message: 'Title, author, and year are required for a full update.' });
}
const [updatedBook] = await knex('books')
.where({ id })
.update({ title, author, year: parseInt(year) })
.returning('*');
if (updatedBook) {
res.status(200).json(updatedBook);
} else {
res.status(404).json({ message: `Book with ID ${id} not found for update.` });
}
} catch (error) {
console.error('Error updating book:', error);
res.status(500).json({ message: 'Internal server error.' });
}
});
// PATCH /api/books/:id - Partially update a book
app.patch('/api/books/:id', async (req, res) => {
try {
const { id } = req.params;
const updates = req.body;
// Filter out undefined/null values from updates to only apply specified fields
const validUpdates = Object.keys(updates).reduce((acc, key) => {
if (updates[key] !== undefined && updates[key] !== null) {
acc[key] = key === 'year' ? parseInt(updates[key]) : updates[key];
}
return acc;
}, {});
if (Object.keys(validUpdates).length === 0) {
return res.status(400).json({ message: 'No valid fields provided for partial update.' });
}
const [updatedBook] = await knex('books')
.where({ id })
.update(validUpdates)
.returning('*');
if (updatedBook) {
res.status(200).json(updatedBook);
} else {
res.status(404).json({ message: `Book with ID ${id} not found for partial update.` });
}
} catch (error) {
console.error('Error patching book:', error);
res.status(500).json({ message: 'Internal server error.' });
}
});
// DELETE /api/books/:id - Delete a book
app.delete('/api/books/:id', async (req, res) => {
try {
const { id } = req.params;
const deletedCount = await knex('books').where({ id }).del(); // Returns number of deleted rows
if (deletedCount > 0) {
res.status(204).send(); // 204 No Content
} else {
res.status(404).json({ message: `Book with ID ${id} not found for deletion.` });
}
} catch (error) {
console.error('Error deleting book:', error);
res.status(500).json({ message: 'Internal server error.' });
}
});
// Basic root route
app.get('/', (req, res) => {
res.send('Welcome to the Book API powered by PostgreSQL and Knex.js!');
});
// Start the server
app.listen(port, () => {
console.log(`Server running on port ${port}`);
});
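One caveat about the PATCH handler above: it forwards whatever keys the client sends straight to .update(), so a request body like { id: 999 } could modify a column you never intended to expose. A small whitelist filter guards against that; pickAllowed and ALLOWED_BOOK_FIELDS are hypothetical names for this sketch:

```javascript
// Keep only explicitly allowed, non-null fields from a request body.
const ALLOWED_BOOK_FIELDS = ['title', 'author', 'year'];

function pickAllowed(body, allowed) {
  return Object.keys(body)
    .filter((key) => allowed.includes(key) && body[key] !== undefined && body[key] !== null)
    .reduce((acc, key) => {
      // Coerce year to an integer, as in the PATCH handler above.
      acc[key] = key === 'year' ? parseInt(body[key], 10) : body[key];
      return acc;
    }, {});
}

// Inside the PATCH handler you would then write:
// const validUpdates = pickAllowed(req.body, ALLOWED_BOOK_FIELDS);
```

This keeps the client-facing API surface explicit: adding a new updatable column means consciously adding it to the whitelist.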
Important Notes:
- Each database operation is wrapped in a try...catch block to handle potential database errors.
- We use async/await to handle asynchronous database operations, which keeps the code readable.
- .returning('*') works with PostgreSQL to get the full inserted/updated row back. For SQLite, you might need a different approach (e.g., re-querying by last_insert_rowid()).
Error Handling for Database Operations
When interacting with a database, various errors can occur: connection issues, unique constraint violations, foreign key violations, invalid SQL syntax, etc.
- Catching database connection errors: As shown in the knex.raw('SELECT 1') example, it's good to check the connection status at startup.
- Handling query errors: All database operations (.select(), .insert(), etc.) are asynchronous and can throw errors. Always wrap them in try...catch blocks or use .catch() with Promises.
- Returning appropriate HTTP status codes:
  - 400 Bad Request: for client-side validation errors (e.g., missing required fields).
  - 404 Not Found: when a resource with the given ID does not exist.
  - 409 Conflict: when a unique constraint is violated (e.g., trying to insert a user with an email that already exists).
  - 500 Internal Server Error: for any unexpected server-side or database errors.
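In practice, the pg driver surfaces these situations through an error object whose code property holds a standard PostgreSQL error code (for example, '23505' is unique_violation). A small mapping helper can translate those codes into HTTP statuses; statusForDbError is a hypothetical name for this sketch, but the codes themselves come from PostgreSQL's documented error-code table:

```javascript
// Map PostgreSQL error codes (exposed as err.code by the 'pg' driver)
// to appropriate HTTP status codes.
function statusForDbError(err) {
  switch (err.code) {
    case '23505': return 409; // unique_violation (duplicate key)
    case '23503': return 409; // foreign_key_violation
    case '23502': return 400; // not_null_violation (missing required column)
    case '22P02': return 400; // invalid_text_representation (e.g., malformed integer)
    default:      return 500; // anything unexpected
  }
}

// In a catch block you might then write:
// res.status(statusForDbError(error)).json({ message: 'Database error.' });
```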
Database Migrations
Database Migrations are version control for your database schema. They allow you to define changes to your database structure (creating tables, adding columns, etc.) in code, track them, and apply them consistently across different environments (development, staging, production).
Knex.js has built-in support for migrations.
# Create a migration file (npx runs the locally installed Knex CLI)
npx knex migrate:make create_books_table
# This creates a file like: db/migrations/YYYYMMDDHHMMSS_create_books_table.js
Inside the migration file, you define an up function (to apply changes) and a down function (to revert them):
// db/migrations/YYYYMMDDHHMMSS_create_books_table.js
exports.up = function(knex) {
return knex.schema.createTable('books', function(table) {
table.increments('id').primary(); // Auto-incrementing primary key
table.string('title').notNullable();
table.string('author').notNullable();
table.integer('year');
table.timestamps(true, true); // Adds created_at and updated_at columns
});
};
exports.down = function(knex) {
return knex.schema.dropTable('books');
};
# Run all pending migrations to apply changes to the database
npx knex migrate:latest
Using migrations is a best practice for managing database schema changes in a team environment and for deploying applications.
You've now successfully connected your Express.js API to a PostgreSQL database and implemented all CRUD operations using Knex.js. This is a major step towards building robust, data-driven full-stack applications! In the next class, we'll explore another important category of databases: NoSQL databases, focusing on MongoDB.