How to Back Up a PostgreSQL Database in a Docker Container

12/18/2024 · Matija Žiberna · Coding
Imagine you’re actively working on a project with PostgreSQL and Prisma. You decide to modify your Prisma schema by adding a new model. Unfortunately, this model happens to share the same name as an older one you had previously removed.

This conflict triggers a cascade of issues: Prisma migrations fail due to the name conflict, and even manually removing the failed migrations from both your codebase and your database doesn't solve the problem.

Prisma’s solution? Reset the database.

But resetting the database would erase all your data — a nightmare scenario if you don’t have a proper backup.

This situation highlights the importance of creating regular backups that can save you from losing critical work.

This article will guide you step-by-step on how to back up your PostgreSQL database when it’s running in a Docker container. By the end of this guide, you’ll have a secure backup of your database that you can restore or migrate whenever needed.

Step-by-Step Guide

While this guide is for use with Docker and Prisma, the concepts are applicable even without them.

1. Identify the Running Container

First, locate the Docker container running your PostgreSQL database. You can use the docker ps command to list all running containers:

docker ps

Example output:

CONTAINER ID   IMAGE         COMMAND                  CREATED       STATUS       PORTS                     NAMES
0dc49acc42d8   postgres:17   "docker-entrypoint.s…"   5 weeks ago   Up 5 weeks   0.0.0.0:54321->5432/tcp   scraper-saas-db

In this example, the container ID is 0dc49acc42d8, and the container name is scraper-saas-db. You can use either identifier in the subsequent commands.
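If you'd rather not eyeball the `docker ps` table, you can look the ID up by name. A small sketch, assuming the container is named scraper-saas-db as in the example above (the guard lets it degrade gracefully on a machine without Docker):

```shell
# Look up the container ID by name so later commands don't depend on
# copying it manually (container name from the example above).
if command -v docker >/dev/null 2>&1; then
  CID=$(docker ps --filter "name=scraper-saas-db" --format '{{.ID}}')
  echo "container id: ${CID:-not running}"
else
  CID=""
  echo "docker is not installed on this machine"
fi
```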

2. Create the Backup

You have two options for creating the backup, depending on whether you want to execute commands inside the container or connect to it remotely.

Option A: Using docker exec

Run the following command from your host machine:

docker exec scraper-saas-db pg_dump -U postgres -d <database_name> > backup.sql

  • Replace <database_name> with the name of your database.
  • The -U postgres flag specifies the PostgreSQL user (update it if you’re using a different user).
  • The output is redirected to a file named backup.sql on your host machine. Avoid docker exec’s -t (TTY) flag here — it can inject carriage returns into the dump.
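For completeness, restoring that dump later is the mirror image: stream the file back through psql inside the container. A sketch under the same assumptions (container name and user from the example; the database name is a placeholder — use a fresh, empty database to avoid conflicts):

```shell
# Restore: feed the dump from the host back into psql in the container.
DB=my_database   # placeholder database name
if command -v docker >/dev/null 2>&1; then
  docker exec -i scraper-saas-db psql -U postgres -d "$DB" < backup.sql
else
  echo "docker not available here; run this on the host with the container"
fi
```

Note the `-i` flag: it keeps stdin open so the dump can be piped in.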

Option B: Using pg_dump on the Host

If pg_dump is installed on your host machine and you’ve mapped the database port to the host (e.g., 54321), use this command:

pg_dump -h localhost -p 54321 -U postgres -d <database_name> > backup.sql

  • -h localhost: Host to connect to.
  • -p 54321: Port on which the database is accessible.
  • -U postgres: PostgreSQL user.
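Running pg_dump this way will prompt for a password. For scripted, non-interactive runs, pg_dump reads the PGPASSWORD environment variable — a sketch with placeholder credentials and database name (a ~/.pgpass file is the more secure alternative):

```shell
# Placeholder password so pg_dump doesn't prompt interactively.
export PGPASSWORD='your-password'
if command -v pg_dump >/dev/null 2>&1; then
  pg_dump -h localhost -p 54321 -U postgres -d my_database > backup.sql
else
  echo "pg_dump not installed on this host; use Option A instead"
fi
```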

3. Rename the Backup File

To make your backup easier to identify, rename it to include a timestamp:

mv backup.sql backup_scraper_saas_$(date +%d_%m_%Y).sql

This renames the file to something like backup_scraper_saas_18_12_2024.sql.
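If you plan to keep several backups around, an ISO-style date (year first) makes them sort chronologically in a directory listing. A small variant of the command above:

```shell
# YYYY-MM-DD sorts correctly in `ls`, unlike DD_MM_YYYY.
stamp=$(date +%Y-%m-%d)
name="backup_scraper_saas_${stamp}.sql"
echo "$name"
# mv backup.sql "$name"   # rename the dump produced in step 2
```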

4. Verify the Backup

Before proceeding, ensure that the backup was successfully created. List the file with the following command:

ls -lh backup_scraper_saas_18_12_2024.sql

Check the file’s contents if needed to ensure it’s a valid SQL dump (it should start with a `-- PostgreSQL database dump` header). The size should roughly match what you’d expect for the amount of data in your database:

  • Small database (10MB of data): ~10MB (uncompressed), ~2–5MB (compressed).
  • Medium database (500MB of data): ~500MB (uncompressed), ~50–100MB (compressed).
  • Large database (5GB of data): ~5GB (uncompressed), ~500MB-1GB (compressed).
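The compressed sizes above assume something like gzip; since a plain-SQL dump is just text, it compresses well. A sketch — the first line writes a stand-in file so the snippet runs on its own; in practice, compress the dump produced in step 2:

```shell
# Stand-in for the real dump, so this snippet is self-contained.
printf -- '-- PostgreSQL database dump\n' > backup.sql

# Plain-SQL dumps typically shrink several-fold under gzip.
# -k keeps the original file alongside backup.sql.gz; -f overwrites any old archive.
gzip -kf backup.sql
ls -lh backup.sql backup.sql.gz
```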

5. Secure the Backup

Having the backup on the same device defeats its purpose in case of a hardware failure. Save your backup in a secure location, such as:

  • Cloud storage services (e.g., Google Drive, AWS S3).
  • A physical external drive.
  • A remote server.

The lazy way (my preferred one)? Upload the file to a temporary file-sharing service like file.io:

curl -F "file=@./backup_scraper_saas_18_12_2024.sql" https://file.io

This returns a temporary URL that you can use to download the file elsewhere. Note that file.io deletes the file after the first download, so treat this as a one-off transfer rather than long-term storage.

Conclusion

Backing up your PostgreSQL database in a Docker container is a straightforward but critical task. By following the steps outlined above, you can avoid losing valuable data when things go wrong, like migration issues or accidental deletions.

Having regular, automated backups is even better. Interested in learning how to automate this process and upload backups to the cloud daily?

In the next article, we’ll look at how to automate these backups and upload them to the cloud on a schedule.

Thanks,
Matija