How to Back Up Your PostgreSQL Database

Updated by Jared Kobos


If you are using PostgreSQL in a production environment, it is important to take precautions to ensure that your users' data is not lost. By frequently backing up your database, and/or automating backups with a cron task, you will be able to quickly restore your system in the event that your database is lost or corrupted. Fortunately, PostgreSQL includes tools to make this task simple and easy to manage.

Before You Begin

You should have a working installation of PostgreSQL on your system before beginning this guide. Go through our How to Install PostgreSQL on Ubuntu guide to install PostgreSQL and create a sample database.

The steps in this guide require root privileges. Be sure to run the steps below as root or with the sudo prefix. For more information on privileges, see our Users and Groups guide.

One-Time SQL Dump

Single Database

PostgreSQL provides the pg_dump utility to simplify backing up a single database. This command must be run as a user with read permissions to the database you intend to back up.

  1. Log in as the postgres user:

    su - postgres
  2. Dump the contents of a database to a file by running the following command. Replace dbname with the name of the database to be backed up.

    pg_dump dbname > dbname.bak

    The resulting backup file, dbname.bak, can be transferred to another host with scp or stored locally for later use.

  3. To demonstrate restoring lost data, delete your example database and create an empty database in its place:

    dropdb dbname
    createdb dbname
  4. Restore the database using psql:

    psql dbname < dbname.bak

    There are several options for the backup format. The format is selected with pg_dump's -F flag rather than by the file extension; the extensions below are conventions:

    • *.sql: plaintext dump (the default, restored with psql)
    • *.bak: custom compressed format (pg_dump -Fc, restored with pg_restore)
    • *.tar: tarball (pg_dump -Ft, restored with pg_restore)
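The one-time dump above can be wrapped in a small script that stamps each backup file with the date, so successive dumps do not overwrite each other. This is a minimal sketch: the database name dbname and the backup directory are placeholders to adjust for your setup.

```shell
#!/bin/sh
# Sketch of a date-stamped backup. "dbname" and BACKUP_DIR are
# placeholders -- substitute your own database and directory.
DB="dbname"
BACKUP_DIR="${BACKUP_DIR:-$HOME/postgres/backups}"

# Build a file name such as dbname_2024-01-31.sql
backup_file() {
    printf '%s/%s_%s.sql\n' "$BACKUP_DIR" "$DB" "$(date +%F)"
}

mkdir -p "$BACKUP_DIR"
# Run the dump only if pg_dump is on the PATH.
if command -v pg_dump >/dev/null 2>&1; then
    pg_dump "$DB" > "$(backup_file)"
fi
```

Running this daily or weekly produces one file per day, which also makes it easy to prune old dumps later.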

Remote Database

Just as psql allows you to connect to a remote host, pg_dump can be run from a client computer to back up data on a remote server. Use the -h flag to specify the IP address of your Linode and -p to identify the port on which PostgreSQL is listening:

pg_dump -h 198.51.100.0 -p 5432 dbname > dbname.bak

Replace 198.51.100.0 with the IP address of your Linode.
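When scripting remote dumps, it can help to keep the connection details in variables. A minimal sketch, assuming a hypothetical host 192.0.2.10 (a documentation-only address) and the default port; it only prints the command it would run, so nothing is executed against the server:

```shell
#!/bin/sh
# Sketch: assemble a remote pg_dump command from variables.
# 192.0.2.10 and "dbname" are illustrative placeholders.
DB_HOST="${DB_HOST:-192.0.2.10}"
DB_PORT="${DB_PORT:-5432}"
DB_NAME="${DB_NAME:-dbname}"

remote_dump_cmd() {
    printf 'pg_dump -h %s -p %s %s' "$DB_HOST" "$DB_PORT" "$DB_NAME"
}

# Dry run: show the command instead of executing it.
echo "Would run: $(remote_dump_cmd) > $DB_NAME.bak"
```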

All Databases

Because pg_dump only creates a backup of one database at a time, it does not store information about database roles or other cluster-wide configuration. To store this information, and back up all of your databases simultaneously, you can use pg_dumpall.

  1. Create a backup file:

    pg_dumpall > pg_backup.bak
  2. Restore all databases from the backup:

    psql -f pg_backup.bak postgres
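Plaintext dumps produced by pg_dumpall include CREATE ROLE statements for cluster-wide roles, so a quick sanity check is possible with grep alone. A minimal sketch; the file name pg_backup.bak follows the step above:

```shell
#!/bin/sh
# Sketch: sanity-check a pg_dumpall plaintext backup. A dump that
# captured cluster-wide objects should define at least one role.
verify_dump() {
    # $1: path to a plaintext dump file
    grep -q 'CREATE ROLE' "$1"
}

# Example usage (uncomment after creating the backup):
# verify_dump pg_backup.bak && echo "dump defines roles"
```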

Automate Backups with a Cron Task

You may want to set up a cron job so that your database will be backed up automatically at regular intervals. The steps in this section will set up a cron task that will run pg_dump once every week.

  1. Make sure you are logged in as the postgres user:

    su - postgres
  2. Create a directory to store the automatic backups:

    mkdir -p ~/postgres/backups
  3. Edit the crontab to create the new cron task:

    crontab -e
  4. Add the following line to the end of the crontab:

    File: crontab
    0 0 * * 0 pg_dump -U postgres dbname > ~/postgres/backups/dbname.bak
  5. Save and exit from the editor. Your database will be backed up at midnight every Sunday. To change the time or frequency of the updates, see our Schedule Tasks with Cron guide.
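A weekly cron job accumulates backup files indefinitely, so you may also want a cleanup step. The following sketch deletes dumps older than 30 days; the directory and the retention window are assumptions to adjust for your own policy:

```shell
#!/bin/sh
# Sketch: prune old backups. BACKUP_DIR and the 30-day retention
# window are assumptions -- adjust them to your policy.
BACKUP_DIR="${BACKUP_DIR:-$HOME/postgres/backups}"

prune_backups() {
    # Delete .bak files in $1 last modified more than 30 days ago.
    find "$1" -name '*.bak' -type f -mtime +30 -delete
}

mkdir -p "$BACKUP_DIR"
prune_backups "$BACKUP_DIR"
```

This could run from the same crontab, on a line of its own after the weekly pg_dump entry.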

Next Steps

PostgreSQL also offers more advanced ways to back up your databases. The official docs describe how to set up continuous archiving and point-in-time recovery. This is a much more complex process, but it maintains a constant archive of your database and makes it possible to replay PostgreSQL’s write-ahead logs to recover the state of the database at any point in the past.

This method can also be helpful if you have a very large database: because the archiving process is ongoing, there is no need for frequent, time-consuming full backups. Note, however, that continuously archiving a large database does consume additional resources.
