Learn how to back up n8n workflows using UI exports, CLI commands, database backups, and automated scripts to securely save and restore your automation data.
To back up n8n workflows, you can use n8n's built-in export functionality, the command-line interface (CLI), or database-level backups, optionally automated with scripts and cron jobs. The best method depends on how you deploy n8n, but all of them let you save your workflows and related data for safekeeping or migration.
Step 1: Understanding n8n Backup Options
Before diving into specific backup methods, it's important to understand the approaches available:
1. Manual export from the web interface (individual or multiple workflows as JSON)
2. CLI export of workflows and credentials (n8n export:workflow / n8n export:credentials)
3. Direct database backups (SQLite, PostgreSQL, or MySQL/MariaDB)
4. Docker volume backups for containerized deployments
5. Automated scripts that combine the above, scheduled with cron
The approach you choose will depend on your specific n8n deployment (self-hosted, Docker, desktop app) and your backup requirements.
Step 2: Backing Up Workflows Through the Web Interface
The simplest way to back up one or more workflows is through the n8n web interface.
To export a single workflow:
1. Log into your n8n instance
2. Navigate to the Workflows section
3. Open the workflow you want to back up
4. Click on the three dots (⋮) in the top-right corner
5. Select "Download"
6. Your workflow will be saved as a JSON file to your local computer
To export multiple workflows at once:
1. Log into your n8n instance
2. Navigate to the Workflows section
3. Select the checkboxes next to the workflows you want to export
4. Click the "Export" button that appears at the top of the list
5. Your selected workflows will be downloaded as a single JSON file
This method is best for occasional backups of specific workflows or when you need to transfer workflows between instances.
Step 3: Using n8n CLI for Complete Backups
For more comprehensive backups, you can use the n8n command-line interface (CLI). This approach lets you export all workflows and credentials in a scriptable way, which makes it easy to automate.
# First, ensure you have n8n installed globally (if not using Docker)
npm install -g n8n
# To export all workflows
n8n export:workflow --all --output=workflows-backup.json
# To export all credentials (encrypted with your instance's encryption key)
n8n export:credentials --all --output=credentials-backup.json
# To export a specific workflow by its ID
n8n export:workflow --id=123 --output=specific-workflow.json
If you're using Docker, you need to run these commands within your n8n container:
docker exec -it your-n8n-container n8n export:workflow --all --output=/data/backups/workflows-backup.json
Make sure to replace "your-n8n-container" with your actual container name or ID, and point --output at a path that exists inside the container (ideally on a mounted volume) so the file survives container restarts.
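If the export path is not on a mounted volume, you can copy the file out of the container afterwards. A minimal sketch, assuming the container name and output path used above:
# Copy the exported file from the container to the current directory on the host
docker cp your-n8n-container:/data/backups/workflows-backup.json ./workflows-backup.json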
Step 4: Setting Up Automated Backups Using Cron Jobs
For regular automatic backups, you can set up a cron job on your server:
# Create a backup script (e.g., backup-n8n.sh)
#!/bin/bash
# Set variables
BACKUP_DIR="/path/to/backups"
DATE=$(date +"%Y-%m-%d_%H-%M")
FILENAME="n8n_backup_$DATE.json"
# Create backup directory if it doesn't exist
mkdir -p "$BACKUP_DIR"
# Export all workflows
n8n export:workflow --all --output="$BACKUP_DIR/$FILENAME"
# Export all credentials (stored encrypted with your instance's encryption key)
n8n export:credentials --all --output="$BACKUP_DIR/credentials_$DATE.json"
# Optional: Compress the backups
gzip "$BACKUP_DIR/$FILENAME" "$BACKUP_DIR/credentials_$DATE.json"
# Optional: Remove backups older than 30 days
find "$BACKUP_DIR" -name "n8n_backup_*.json.gz" -mtime +30 -delete
find "$BACKUP_DIR" -name "credentials_*.json.gz" -mtime +30 -delete
Make the script executable:
chmod +x backup-n8n.sh
Add a cron job to run this script regularly:
# Edit crontab
crontab -e
# Add a line to run the backup daily at 2 AM
0 2 * * * /path/to/backup-n8n.sh
If you're using Docker, the script needs to run the export command inside your container instead (see Step 7 for a full example); a minimal crontab entry is shown below.
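As a minimal sketch, assuming your container is named n8n-container and its /home/node/.n8n directory is backed by a mounted volume, you could call docker exec directly from the host's crontab:
# Host crontab entry: export all workflows inside the container daily at 2 AM
0 2 * * * docker exec n8n-container n8n export:workflow --all --output=/home/node/.n8n/workflows-backup.json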
Step 5: Backing Up n8n Database Directly
Since n8n stores all workflows and credentials in its database, backing up the database is another effective approach.
For SQLite (default database):
# Find your SQLite database file (typically ~/.n8n/database.sqlite)
# Create a copy of the database file
cp ~/.n8n/database.sqlite /path/to/backups/database_$(date +"%Y-%m-%d").sqlite
# For a running n8n instance, it's safer to use the SQLite .backup command
sqlite3 ~/.n8n/database.sqlite ".backup /path/to/backups/database_$(date +"%Y-%m-%d").sqlite"
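To confirm the copy is usable, you can run SQLite's built-in integrity check against the backup file (a quick sanity check, not a substitute for a full restore test):
# Should print "ok" if the backup file is not corrupted
sqlite3 /path/to/backups/database_$(date +"%Y-%m-%d").sqlite "PRAGMA integrity_check;"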
For PostgreSQL:
# Replace with your actual database details
pg_dump -U username -d n8n_db -f /path/to/backups/n8n_db_$(date +"%Y-%m-%d").sql
For MySQL/MariaDB:
# Replace with your actual database details
mysqldump -u username -p n8n_db > /path/to/backups/n8n_db_$(date +"%Y-%m-%d").sql
Step 6: Backing Up Docker Volumes
If you're running n8n in Docker and using volumes for persistence, you can back up the volumes directly:
# Identify the n8n Docker volume
docker volume ls | grep n8n
# Create a backup of the volume
docker run --rm -v n8n_data:/source -v $(pwd):/backup alpine tar -czf /backup/n8n_data_$(date +"%Y-%m-%d").tar.gz -C /source .
Replace "n8n_data" with your actual volume name. For a consistent snapshot, stop the n8n container before archiving the volume; otherwise the SQLite database inside it may be captured mid-write.
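To restore, you can reverse the process and extract the archive back into the volume. A minimal sketch, assuming the same volume name and replacing the date stamp with that of your archive; run it while n8n is stopped:
# Restore the archive into the n8n_data volume
docker run --rm -v n8n_data:/target -v $(pwd):/backup alpine tar -xzf /backup/n8n_data_YYYY-MM-DD.tar.gz -C /target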
Step 7: Automating Docker-Based Backups
For Docker deployments, you can create a more comprehensive backup script:
#!/bin/bash
# Set variables
BACKUP_DIR="/path/to/backups"
DATE=$(date +"%Y-%m-%d_%H-%M")
CONTAINER_NAME="n8n-container"
VOLUME_NAME="n8n_data"
# Create backup directory if it doesn't exist
mkdir -p "$BACKUP_DIR"
# Export workflows and credentials using the n8n CLI inside the container
docker exec $CONTAINER_NAME n8n export:workflow --all --output=/tmp/n8n_workflows_$DATE.json
docker exec $CONTAINER_NAME n8n export:credentials --all --output=/tmp/n8n_credentials_$DATE.json
docker cp $CONTAINER_NAME:/tmp/n8n_workflows_$DATE.json "$BACKUP_DIR/"
docker cp $CONTAINER_NAME:/tmp/n8n_credentials_$DATE.json "$BACKUP_DIR/"
# Back up the Docker volume
docker run --rm -v $VOLUME_NAME:/source -v "$BACKUP_DIR":/backup alpine tar -czf /backup/n8n_volume_$DATE.tar.gz -C /source .
# Optional: Remove backups older than 30 days
find "$BACKUP_DIR" -name "n8n_*" -mtime +30 -delete
echo "Backup completed successfully to $BACKUP_DIR"
Schedule this script using cron as shown in Step 4.
Step 8: Backing Up Credentials Securely
When backing up n8n credentials, it's important to handle them securely:
# Export credentials (encrypted with your instance's encryption key by default)
n8n export:credentials --all --output=credentials-backup.json
# Export credentials in decrypted (plain-text) form; handle the output file with extreme care
n8n export:credentials --all --decrypted --output=credentials-decrypted.json
Important security notes:
1. Encrypted credential exports can only be restored on an instance that uses the same encryption key, so back up the key (from ~/.n8n/config or the N8N_ENCRYPTION_KEY environment variable) alongside your backups.
2. Decrypted exports contain secrets in plain text; restrict file permissions and delete them as soon as they are no longer needed.
3. Store credential backups in encrypted storage and never commit them to version control.
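For an extra layer of protection, you can encrypt backup files at rest before moving them off the host. A minimal sketch using GnuPG symmetric encryption (assumes gpg is installed; keep the passphrase somewhere safe):
# Encrypt the export with a passphrase (produces credentials-backup.json.gpg)
gpg --symmetric --cipher-algo AES256 credentials-backup.json
# Decrypt it again when needed
gpg --output credentials-backup.json --decrypt credentials-backup.json.gpg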
Step 9: Restoring Workflows from Backups
Knowing how to restore from your backups is just as important as creating them.
To restore a workflow through the web interface:
1. Log into your n8n instance
2. Open a new (or existing) workflow in the editor
3. Click the three dots (⋮) in the top-right corner and select "Import from File"
4. Select your workflow JSON file
5. Check that the imported workflow looks correct and save it
To restore using the CLI:
# Import workflows from a backup file
n8n import:workflow --input=workflows-backup.json
# Import credentials from a backup file (the instance must use the same encryption key)
n8n import:credentials --input=credentials-backup.json
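For Docker deployments, a sketch of the same restore: copy the backup file into the container, then run the import command inside it (the container name n8n-container is an assumption to replace with your own):
# Copy the backup into the container and import it
docker cp workflows-backup.json n8n-container:/tmp/workflows-backup.json
docker exec n8n-container n8n import:workflow --input=/tmp/workflows-backup.json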
To restore from a database backup instead:
For SQLite:
# Stop n8n first
# Then replace the database file
cp backup-file.sqlite ~/.n8n/database.sqlite
For PostgreSQL:
# Create a new database if needed
createdb -U username n8n_db
# Restore from backup
psql -U username -d n8n_db -f backup-file.sql
For MySQL/MariaDB:
# Create a new database if needed
mysql -u username -p -e "CREATE DATABASE IF NOT EXISTS n8n_db"
# Restore from backup
mysql -u username -p n8n_db < backup-file.sql
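After a database restore, it can be worth confirming that the workflow table is populated before starting n8n again. A hedged sketch for PostgreSQL; n8n's table is typically named workflow_entity, but names can vary between versions:
# Count restored workflows (adjust database, user, and table name to your setup)
psql -U username -d n8n_db -c "SELECT COUNT(*) FROM workflow_entity;"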
Step 10: Creating a Comprehensive Backup Strategy
For production n8n instances, implement a complete backup strategy:
1. Combine methods: export workflows and credentials with the CLI and back up the database (or Docker volume).
2. Schedule backups automatically and keep multiple generations with a defined retention period.
3. Store at least one copy off the host (remote server, object storage, etc.).
4. Back up the encryption key and environment configuration together with the data.
5. Test restores regularly; an untested backup is not a backup.
Example of a comprehensive backup strategy:
#!/bin/bash
# Comprehensive n8n backup script
# ----------------------------------
# Configuration
BACKUP_DIR="/path/to/backups"
REMOTE_BACKUP_DIR="user@remote-server:/path/to/backups"
DATE=$(date +"%Y-%m-%d")
N8N_DIR="/path/to/n8n"
DB_TYPE="sqlite" # or "postgres" or "mysql"
DB_NAME="n8n_db"
DB_USER="n8n_user"
RETENTION_DAYS=30
# Create backup directory structure
mkdir -p "$BACKUP_DIR/$DATE"
# 1. Export workflows and credentials using the n8n CLI
cd "$N8N_DIR"
n8n export:workflow --all --output="$BACKUP_DIR/$DATE/workflows.json"
n8n export:credentials --all --output="$BACKUP_DIR/$DATE/credentials.json"
# 2. Back up the database
if [ "$DB_TYPE" = "sqlite" ]; then
  sqlite3 "$N8N_DIR/.n8n/database.sqlite" ".backup $BACKUP_DIR/$DATE/database.sqlite"
elif [ "$DB_TYPE" = "postgres" ]; then
  pg_dump -U "$DB_USER" "$DB_NAME" > "$BACKUP_DIR/$DATE/database.sql"
elif [ "$DB_TYPE" = "mysql" ]; then
  # -p prompts for a password; use ~/.my.cnf or MYSQL_PWD for unattended runs
  mysqldump -u "$DB_USER" -p "$DB_NAME" > "$BACKUP_DIR/$DATE/database.sql"
fi
# 3. Create a single archive of the day's backup
tar -czf "$BACKUP_DIR/n8n-backup-$DATE.tar.gz" -C "$BACKUP_DIR" "$DATE"
# 4. Copy to remote location (optional)
rsync -av "$BACKUP_DIR/n8n-backup-$DATE.tar.gz" "$REMOTE_BACKUP_DIR/"
# 5. Clean up old backups
find "$BACKUP_DIR" -name "n8n-backup-*.tar.gz" -mtime +$RETENTION_DAYS -delete
rm -rf "$BACKUP_DIR/$DATE"
echo "n8n backup completed: $BACKUP_DIR/n8n-backup-$DATE.tar.gz"
Step 11: Specific Considerations for Cloud-Hosted n8n
If you're using n8n Cloud (the hosted service at n8n.io) rather than self-hosting, you don't have access to the CLI, database, or file system. You can still export workflows manually from the web interface, or build a scheduled workflow that backs itself up through n8n's own API, as outlined below; a minimal command-line sketch of the underlying API call follows the outline.
// Example n8n self-backup workflow (pseudocode)
1. Schedule Trigger node (e.g., runs weekly)
2. HTTP Request node that calls your instance's API to list all workflows
3. For each workflow, request its full definition from /workflows/{id}
4. Combine all workflow definitions into a single JSON file
5. Send the file to storage (S3, Dropbox, etc.) or email it to yourself
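As a minimal sketch of the underlying API call, here is how you might pull all workflows with curl. This assumes your instance exposes n8n's public REST API at /api/v1 and that you have created an API key in the instance settings; the URL shown is a placeholder to replace with your own:
# Fetch all workflows as JSON and save them with a date stamp
curl -s -H "X-N8N-API-KEY: your-api-key" \
  "https://your-instance.app.n8n.cloud/api/v1/workflows" \
  -o "n8n-workflows-$(date +%Y-%m-%d).json"
Note that the API may paginate results, so large instances might need to follow the cursor returned in the response across several requests.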
Step 12: Troubleshooting Common Backup Issues
When backing up n8n, you might encounter these common issues:
# Ensure your backup script has appropriate permissions
chmod +x backup-script.sh
# When backing up database files, you might need sudo
sudo cp ~/.n8n/database.sqlite /path/to/backups/
# For Docker volumes, ensure you have Docker permissions
sudo docker run --rm -v n8n_data:/source -v $(pwd):/backup alpine tar -czf /backup/n8n_volume_backup.tar.gz -C /source .
# If you see encryption errors, check that the instance uses the correct encryption key
# The key is stored in ~/.n8n/config (or provided via the N8N_ENCRYPTION_KEY environment variable)
cat ~/.n8n/config
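When restoring credentials on a new machine, a sketch of setting the key before importing; the key value is a placeholder and must match the instance that created the export:
# Use the same encryption key as the original instance, then import
export N8N_ENCRYPTION_KEY="your-original-encryption-key"
n8n import:credentials --input=credentials-backup.json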
# For SQLite, the database might be locked if n8n is running
# Stop n8n before backing up, or use the .backup command
systemctl stop n8n # if using systemd
sqlite3 ~/.n8n/database.sqlite ".backup /path/to/backup.sqlite"
systemctl start n8n
Conclusion and Best Practices
Remember these key points for effective n8n workflow backups:
1. Automate backups rather than relying on manual exports.
2. Back up credentials and the encryption key, not just the workflow JSON.
3. Keep at least one copy off the host and apply a retention policy.
4. Test your restore procedure before you need it.
By following this comprehensive guide, you should now have a solid understanding of how to back up your n8n workflows and related data, ensuring that your automation work remains protected against data loss or corruption.