
How to Back Up a Supabase Database


What you'll learn

  • How to create manual backups with pg_dump using your Supabase connection string
  • How automatic daily backups and point-in-time recovery work on Pro plans
  • How to schedule automated backups with cron
  • How to verify backup integrity by testing restores
  • Difficulty: Intermediate
  • Read time: 7 min
  • Hands-on time: 10-15 min
  • Requirements: Supabase (all plans for pg_dump, Pro+ for automatic backups), PostgreSQL 15+
  • Last updated: March 2026
  • Author: RapidDev Engineering Team
TL;DR

To back up a Supabase database, use pg_dump with your project's connection string to create a full SQL dump. Pro plan users also get automatic daily backups and point-in-time recovery through the Dashboard. For scheduled backups, set up a cron job that runs pg_dump and stores the output in a secure location. Always test restoring from your backups to verify they work before you need them.

Backing Up Your Supabase Database with pg_dump and Dashboard Tools

Your Supabase project is a full PostgreSQL database, which means you can use standard PostgreSQL backup tools like pg_dump. This tutorial covers three backup strategies: manual pg_dump for on-demand backups, Supabase's built-in automatic backups for Pro+ plans, and scheduled cron jobs for automated recurring backups. You will also learn how to verify your backups work by testing a restore.

Prerequisites

  • A Supabase project with data you want to back up
  • PostgreSQL client tools installed (pg_dump, psql) — install via brew install libpq or apt-get install postgresql-client
  • Your database connection string from Dashboard > Settings > Database
  • Terminal/command line access

Step-by-step guide

1

Find your database connection string

Go to your Supabase Dashboard and navigate to Settings > Database. Copy the connection string under the Connection string section, selecting the URI format. This string contains your host, port, database name, and credentials. For pg_dump, use a connection on port 5432 (the direct connection, or the pooler in session mode), not the transaction-mode pooler on port 6543: pg_dump needs a single long-lived session, which transaction-mode pooling does not provide.

bash
# Connection string format:
# postgresql://postgres.[project-ref]:[password]@aws-0-[region].pooler.supabase.com:5432/postgres

# Find it in Dashboard > Settings > Database > Connection string > URI

Expected result: You have the direct database connection string copied and ready to use.
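Before running any backup command, it can help to sanity-check that the string you copied targets port 5432. This is a minimal sketch; the connection string below is a placeholder with fake credentials.

```shell
#!/bin/sh
# Sketch: warn if a connection string points at the transaction pooler
# (port 6543) instead of port 5432. The URL is a fake placeholder.
DB_URL="postgresql://postgres.abcd1234:secret@aws-0-eu-west-1.pooler.supabase.com:5432/postgres"

case "$DB_URL" in
  *:6543/*) echo "WARNING: transaction pooler port 6543; use 5432 for pg_dump" ;;
  *:5432/*) echo "OK: port 5432" ;;   # -> OK: port 5432
  *)        echo "Could not detect a port in the connection string" ;;
esac
```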

2

Create a manual backup with pg_dump

Run pg_dump with your connection string to create a full SQL dump of your database. The --clean flag adds DROP statements before CREATE statements, making the dump restorable to a fresh database. The --if-exists flag prevents errors if objects do not exist during restore. Redirect the output to a timestamped file for easy identification.

bash
# Full database backup to a timestamped SQL file
pg_dump "postgresql://postgres.[ref]:[password]@aws-0-[region].pooler.supabase.com:5432/postgres" \
  --clean \
  --if-exists \
  --no-owner \
  --no-privileges \
  > backup_$(date +%Y%m%d_%H%M%S).sql

# Compressed backup (recommended for large databases)
pg_dump "postgresql://postgres.[ref]:[password]@aws-0-[region].pooler.supabase.com:5432/postgres" \
  --clean \
  --if-exists \
  --no-owner \
  --no-privileges \
  | gzip > backup_$(date +%Y%m%d_%H%M%S).sql.gz

# Schema-only backup (no data)
pg_dump "postgresql://postgres.[ref]:[password]@aws-0-[region].pooler.supabase.com:5432/postgres" \
  --schema-only \
  > schema_$(date +%Y%m%d_%H%M%S).sql

Expected result: A .sql or .sql.gz backup file is created in your current directory.
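Once a dump exists, a quick inspection catches obviously broken files early. The sketch below generates a stand-in file so it is self-contained; point FILE at your real backup instead.

```shell
#!/bin/sh
# Sketch: sanity-check a plain-SQL dump before trusting it.
# A stand-in file is created here so the example runs on its own.
FILE="backup_example.sql"
printf -- '-- PostgreSQL database dump\nCREATE TABLE public.orders (id int);\n' > "$FILE"

# A plain-text pg_dump file starts with a header comment.
head -n 1 "$FILE"                # -> -- PostgreSQL database dump

# Count table definitions as a rough completeness check.
grep -c "CREATE TABLE" "$FILE"   # -> 1
```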

3

Back up specific schemas or tables

You can back up specific schemas or tables instead of the entire database. This is useful when you only need to back up your application data (public schema) without system schemas. You can also back up individual tables if you need a quick snapshot of specific data.

bash
# Back up only the public schema
pg_dump "postgresql://postgres.[ref]:[password]@aws-0-[region].pooler.supabase.com:5432/postgres" \
  --schema=public \
  --clean \
  --if-exists \
  > public_schema_backup.sql

# Back up a specific table
pg_dump "postgresql://postgres.[ref]:[password]@aws-0-[region].pooler.supabase.com:5432/postgres" \
  --table=public.orders \
  --data-only \
  > orders_data_backup.sql

# Back up storage metadata (bucket configs and policies)
pg_dump "postgresql://postgres.[ref]:[password]@aws-0-[region].pooler.supabase.com:5432/postgres" \
  --schema=storage \
  > storage_schema_backup.sql

Expected result: Targeted backup files are created for the specified schemas or tables.

4

Use automatic daily backups on Pro plans

The Supabase Pro plan and above include automatic daily backups with 7-day retention. Go to Dashboard > Database > Backups to view available backups and their timestamps. Point-in-time recovery (PITR) is also available as an add-on on Pro plans and above, allowing you to restore your database to any point within the retention window. These backups are managed by Supabase and require no configuration.

bash
# No code needed: automatic backups are managed by Supabase
# View backups: Dashboard > Database > Backups
# PITR: available as an add-on on Pro plans and above
# To restore: use the Dashboard restore button or contact Supabase support

Expected result: Automatic backups are visible in the Dashboard with timestamps and restore options.

5

Schedule automated backups with cron

For production systems, set up a cron job that runs pg_dump on a schedule. This script creates compressed backups, retains only the last 7 days of backups, and can optionally upload to cloud storage. Save the script and add it to your system's crontab.

bash
#!/bin/bash
# backup-supabase.sh: run daily via cron

DB_URL="postgresql://postgres.[ref]:[password]@aws-0-[region].pooler.supabase.com:5432/postgres"
BACKUP_DIR="/home/user/backups/supabase"
RETENTION_DAYS=7

mkdir -p "$BACKUP_DIR"

# Create compressed backup
pg_dump "$DB_URL" --clean --if-exists --no-owner --no-privileges \
  | gzip > "$BACKUP_DIR/backup_$(date +%Y%m%d_%H%M%S).sql.gz"

# Remove backups older than retention period
find "$BACKUP_DIR" -name "backup_*.sql.gz" -mtime +"$RETENTION_DAYS" -delete

echo "Backup completed: $(date)"

# Add to crontab (run daily at 2 AM):
# crontab -e
# 0 2 * * * /home/user/scripts/backup-supabase.sh >> /home/user/logs/backup.log 2>&1

Expected result: Automated backups run daily, creating compressed SQL files with automatic cleanup of old backups.
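If you want to see the retention logic in isolation before wiring it into cron, this sketch exercises the same find -mtime expression against throwaway files in a temporary directory. GNU touch -d is assumed; BSD/macOS touch uses -t.

```shell
#!/bin/sh
# Sketch: exercise the retention cleanup against throwaway files.
DIR=$(mktemp -d)
RETENTION_DAYS=7

touch "$DIR/backup_new.sql.gz"
touch -d "10 days ago" "$DIR/backup_old.sql.gz"   # age one file past the window

# Same expression as the cron script: delete files older than the window.
find "$DIR" -name "backup_*.sql.gz" -mtime +"$RETENTION_DAYS" -delete

ls "$DIR"   # -> backup_new.sql.gz
```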

Complete working example

backup-supabase.sh
#!/bin/bash
# Supabase Database Backup Script
# Schedule with cron: 0 2 * * * /path/to/backup-supabase.sh

set -euo pipefail

# Configuration
DB_URL="${SUPABASE_DB_URL:?'Set SUPABASE_DB_URL environment variable'}"
BACKUP_DIR="${BACKUP_DIR:-/home/user/backups/supabase}"
RETENTION_DAYS="${RETENTION_DAYS:-7}"
TIMESTAMP=$(date +%Y%m%d_%H%M%S)

# Create backup directory
mkdir -p "$BACKUP_DIR"

echo "[$(date)] Starting Supabase backup..."

# Full compressed backup (custom format is compressed by default)
pg_dump "$DB_URL" \
  --clean \
  --if-exists \
  --no-owner \
  --no-privileges \
  --format=custom \
  --file="$BACKUP_DIR/backup_${TIMESTAMP}.dump"

# Verify the backup file is not empty
BACKUP_SIZE=$(stat -f%z "$BACKUP_DIR/backup_${TIMESTAMP}.dump" 2>/dev/null || stat --printf="%s" "$BACKUP_DIR/backup_${TIMESTAMP}.dump")
if [ "$BACKUP_SIZE" -lt 1000 ]; then
  echo "[$(date)] ERROR: Backup file suspiciously small ($BACKUP_SIZE bytes)"
  exit 1
fi

echo "[$(date)] Backup created: backup_${TIMESTAMP}.dump ($BACKUP_SIZE bytes)"

# Clean up old backups
DELETED=$(find "$BACKUP_DIR" -name "backup_*.dump" -mtime +"$RETENTION_DAYS" -delete -print | wc -l)
echo "[$(date)] Cleaned up $DELETED old backup(s)"

# Optional: Upload to S3 or cloud storage
# aws s3 cp "$BACKUP_DIR/backup_${TIMESTAMP}.dump" \
#   "s3://my-backups/supabase/backup_${TIMESTAMP}.dump"

echo "[$(date)] Backup complete"

Common mistakes when backing up a Supabase Database

The mistake: Using the connection pooler URL (port 6543) for pg_dump instead of the direct connection (port 5432)

How to avoid: pg_dump needs a full session-level connection. Use port 5432 (direct connection or session-mode pooler), not the transaction pooler on port 6543. Check your connection string in Dashboard > Settings > Database.

The mistake: Never testing backup restores, then discovering the backup is corrupted or incomplete when needed

How to avoid: Regularly test restores to a local Supabase instance (supabase start, then psql < backup.sql) or a separate test project. A backup you have not tested is not a real backup.

The mistake: Storing backup files with database credentials in a public or unencrypted location

How to avoid: Store backups in an encrypted location with restricted access. Encrypt backup files with gpg if storing on disk. Never commit backup files or connection strings to version control.
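For the encryption step, gpg in symmetric mode is one option. The sketch below encrypts and decrypts a stand-in file to show the round trip; in a real script, read the passphrase from a secrets manager or a protected file, never from the command line as shown here.

```shell
#!/bin/sh
# Sketch: symmetric gpg encryption of a backup before it leaves the host.
# The file and passphrase are stand-ins for demonstration only.
FILE="backup_example.sql.gz"
printf 'fake compressed dump\n' > "$FILE"

# Encrypt (gpg 2.1+ needs loopback pinentry for --passphrase in batch mode)
gpg --batch --yes --pinentry-mode loopback --symmetric --cipher-algo AES256 \
    --passphrase "example-passphrase" \
    --output "$FILE.gpg" "$FILE"

# Decrypt to verify the round trip
gpg --batch --yes --pinentry-mode loopback \
    --passphrase "example-passphrase" \
    --output "$FILE.out" --decrypt "$FILE.gpg"

cmp "$FILE" "$FILE.out" && echo "round-trip OK"
```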

Best practices

  • Run pg_dump with --clean --if-exists --no-owner --no-privileges for portable, restorable backups
  • Use compressed format (--format=custom or gzip) for large databases to save storage space
  • Maintain your own pg_dump backups even if you have Supabase's automatic daily backups on Pro plans
  • Test restoring from backups regularly to verify integrity — at least once per month
  • Store backups in a different location than your database (separate cloud provider, different region)
  • Use environment variables for database credentials in backup scripts, never hardcode them
  • Keep a minimum of 7 days of backup retention, with longer retention for critical production databases
  • Log backup operations with timestamps so you can audit when backups ran and whether they succeeded
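Two of the practices above, environment-variable credentials and timestamped logging, take only a few lines of shell. SUPABASE_DB_URL is an assumed variable name in this sketch.

```shell
#!/bin/sh
# Sketch: timestamped logging plus a fail-fast credentials check.
# SUPABASE_DB_URL is an assumed variable name for this example.
log() { printf '[%s] %s\n' "$(date '+%Y-%m-%d %H:%M:%S')" "$*"; }

if [ -z "${SUPABASE_DB_URL:-}" ]; then
  log "ERROR: SUPABASE_DB_URL is not set"
else
  log "Credentials found; starting backup"
fi
```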

Still stuck?

Copy one of these prompts to get a personalized, step-by-step explanation.

ChatGPT Prompt

I need to set up automated daily backups for my Supabase PostgreSQL database. Show me a bash script that runs pg_dump, compresses the output, uploads it to AWS S3, and cleans up backups older than 30 days. Include error handling and logging.

Supabase Prompt

Create a backup strategy for my Supabase project that includes: a pg_dump script for daily full backups, a separate schema-only backup for version control, and instructions for testing the restore process on a local Supabase instance.

Frequently asked questions

Does Supabase automatically back up my database?

Yes, but only on Pro plans and above. Free plan projects do not have automatic backups. The Pro plan includes daily backups with 7-day retention, and point-in-time recovery is available as an add-on. Regardless of your plan, you should maintain your own pg_dump backups.

Can I use pg_dump on the free plan?

Yes. pg_dump works with any Supabase plan because it connects directly to your PostgreSQL database. The connection string is available in Dashboard > Settings > Database regardless of your plan.

How large can a pg_dump backup file get?

It depends on your data. As a rough illustration, a database with 1 million rows might produce a SQL file of a few hundred megabytes, depending on row size. Using --format=custom or gzip compression typically reduces the file size substantially. For very large databases, consider backing up individual schemas or tables.

Does pg_dump back up RLS policies and functions?

Yes. pg_dump includes table definitions, RLS policies, functions, triggers, indexes, and all other database objects. Use --schema-only if you want structure without data.

Can I restore a backup to a different Supabase project?

Yes. For a plain SQL dump, use psql with the target project's connection string: psql 'postgresql://...' < backup.sql. For a custom-format dump (--format=custom), use pg_restore instead. Create the backup with --no-owner and --no-privileges to avoid permission issues.

What about backing up Supabase Storage files?

pg_dump backs up storage metadata (bucket configurations, policies) but not the actual files stored in S3. For file backups, use the storage API to list and download files programmatically, or use the Supabase CLI.

Can RapidDev set up automated backups for my Supabase project?

Yes. RapidDev can configure automated backup pipelines including scheduled pg_dump scripts, cloud storage uploads, backup verification, and disaster recovery procedures for your Supabase database.
