To back up n8n workflows, use the built-in export feature to download individual workflows as JSON files, or use the n8n CLI command n8n export:workflow --all to export all workflows at once. For comprehensive backups, also back up the n8n database and the N8N_ENCRYPTION_KEY. Automate backups by creating a workflow that exports data on a schedule and stores it in cloud storage.
Backing Up n8n Workflows: Manual Export, CLI, and Automated Strategies
Losing workflows to accidental deletion, server failures, or botched upgrades can cost hours or days of rebuilding. n8n offers multiple backup strategies: manual JSON export from the editor, bulk CLI export, database-level backups, and even self-referencing workflows that back themselves up. This tutorial covers all approaches so you can choose the right one for your setup.
Prerequisites
- A running n8n instance (self-hosted for CLI and database backups)
- Terminal access to the n8n server for CLI exports
- Basic familiarity with the n8n editor
Step-by-step guide
Export a single workflow from the editor
Open any workflow in the n8n editor, click the three-dot menu in the top-right corner, and select Download. n8n saves the workflow as a JSON file to your local downloads folder. This JSON file contains the complete workflow definition including all nodes, connections, settings, and static data. Credential values are not included in the export for security. Repeat this for each workflow you want to back up. This method is best for quick one-off backups of individual workflows.
Expected result: A .json file is downloaded containing the complete workflow definition, which can be imported into any n8n instance.
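For reference, the downloaded file is plain JSON. A minimal sketch of its shape (field names follow n8n's export format; the single node shown is illustrative, not from a real export):

```json
{
  "name": "My workflow",
  "nodes": [
    {
      "id": "a1b2c3",
      "name": "Manual Trigger",
      "type": "n8n-nodes-base.manualTrigger",
      "typeVersion": 1,
      "position": [250, 300],
      "parameters": {}
    }
  ],
  "connections": {},
  "settings": {},
  "staticData": null
}
```

The `nodes` array and `connections` object are what the editor rebuilds on import; credential values are absent, as noted above.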
Bulk export all workflows using the n8n CLI
For self-hosted installations, use the n8n CLI to export all workflows at once. Run the export:workflow command with the --all and --separate flags and specify an output directory; each workflow is then saved as its own JSON file named by its ID. This is the fastest way to back up an entire n8n instance. Run the command on a schedule using cron for regular automated backups.
```bash
# Export all workflows as separate files to a backup directory
n8n export:workflow --all --separate --output=./n8n-backups/workflows/

# Export all credentials (encrypted, requires same encryption key to restore)
n8n export:credentials --all --separate --output=./n8n-backups/credentials/

# Docker: run the export inside the container
docker exec -it n8n n8n export:workflow --all --separate --output=/home/node/.n8n/backups/workflows/

# Cron job for daily backups at 2 AM
# Add to crontab: crontab -e
0 2 * * * cd /path/to/n8n && n8n export:workflow --all --separate --output=/backups/n8n/workflows/$(date +\%Y-\%m-\%d)/ >> /var/log/n8n-backup.log 2>&1
```

Expected result: All workflows are exported as individual JSON files in the specified directory, one file per workflow.
Back up the n8n database for a complete restore
Exporting workflows as JSON does not capture execution history, credential data, or user accounts. For a complete backup, create a database dump. If you use SQLite (the default), copy the database.sqlite file from the .n8n directory. If you use PostgreSQL, use pg_dump. Always stop n8n before copying the SQLite file to avoid corruption. Also back up the N8N_ENCRYPTION_KEY — without it, credential data in the database backup is useless.
```bash
# SQLite backup (stop n8n first)
systemctl stop n8n
cp ~/.n8n/database.sqlite /backups/n8n/database-$(date +%Y-%m-%d).sqlite
systemctl start n8n

# PostgreSQL backup (n8n can keep running)
pg_dump -U n8n -h localhost n8n > /backups/n8n/n8n-db-$(date +%Y-%m-%d).sql

# Docker PostgreSQL backup
docker exec n8n-postgres pg_dump -U n8n n8n > /backups/n8n/n8n-db-$(date +%Y-%m-%d).sql

# Back up the encryption key (critical!)
echo "N8N_ENCRYPTION_KEY=$N8N_ENCRYPTION_KEY" > /backups/n8n/encryption-key-$(date +%Y-%m-%d).env
chmod 600 /backups/n8n/encryption-key-$(date +%Y-%m-%d).env
```

Expected result: A complete database dump and encryption key backup exist in your backup directory, allowing full n8n restoration.
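If stopping n8n for the SQLite copy is inconvenient, sqlite3's built-in `.backup` command takes a consistent snapshot even while the database is being written to. A self-contained sketch using a throwaway database as a stand-in for `~/.n8n/database.sqlite` (all paths here are illustrative):

```shell
# Throwaway database standing in for n8n's database.sqlite
SRC=$(mktemp)
DEST=/tmp/n8n-snapshot-$(date +%Y-%m-%d).sqlite
sqlite3 "$SRC" "CREATE TABLE workflow_entity (id INTEGER); INSERT INTO workflow_entity VALUES (1);"

# .backup produces a consistent snapshot without stopping the writer
sqlite3 "$SRC" ".backup '$DEST'"

# The snapshot is a complete, queryable copy
sqlite3 "$DEST" "SELECT COUNT(*) FROM workflow_entity;"   # prints 1
```

In practice you would point `SRC` at `~/.n8n/database.sqlite`; this avoids the corruption risk of a plain `cp` on a live database, though stopping n8n remains the simplest guarantee.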
Build an automated backup workflow inside n8n
Create an n8n workflow that backs up your other workflows automatically. Use a Schedule trigger node to run daily, an HTTP Request node to call the n8n API endpoint GET /api/v1/workflows, a Code node to process the data, and a node to store the backup (such as an S3, Google Drive, or local file node). This self-referencing approach keeps backups running as long as n8n itself is running and sends the data off-server for disaster recovery.
```javascript
// Code node: Process workflow data for backup storage
const workflows = $input.all();
const timestamp = new Date().toISOString().split('T')[0];

const backupData = {
  backupDate: timestamp,
  totalWorkflows: workflows.length,
  workflows: workflows.map(item => ({
    id: item.json.id,
    name: item.json.name,
    active: item.json.active,
    updatedAt: item.json.updatedAt,
    nodes: item.json.nodes,
    connections: item.json.connections,
    settings: item.json.settings
  }))
};

return [{
  json: backupData,
  binary: {
    data: {
      data: Buffer.from(
        JSON.stringify(backupData, null, 2)
      ).toString('base64'),
      mimeType: 'application/json',
      fileName: `n8n-backup-${timestamp}.json`
    }
  }
}];
```

Expected result: A scheduled workflow exports all workflow definitions and stores them in cloud storage daily.
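The same endpoint can also be called from outside n8n, which is useful on n8n Cloud where there is no CLI access. A hedged sketch assuming an API key created under Settings > n8n API (the URL and key below are placeholders; the API returns pages of up to `limit` workflows plus a `nextCursor` value for fetching subsequent pages):

```shell
# Placeholder values — substitute your instance URL and API key
N8N_URL="http://localhost:5678"
N8N_API_KEY="replace-with-your-api-key"
OUT="n8n-api-backup-$(date +%Y-%m-%d).json"

# Fetch the first page of workflows (up to 100). For larger instances,
# repeat the request with ?cursor=<nextCursor> until nextCursor is null.
curl -s -H "X-N8N-API-KEY: $N8N_API_KEY" \
  "$N8N_URL/api/v1/workflows?limit=100" -o "$OUT" || echo "n8n not reachable"
```

Piping the result to cloud storage (or committing it to Git) gives you the same off-server copy as the in-workflow approach above.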
Test restoring from a backup
A backup is only useful if you can restore from it. Test the restore process by importing a workflow JSON file into a fresh n8n instance. Go to the Workflows panel, click the Import from File button (or use the three-dot menu and select Import from File), and select your backup JSON file. The workflow appears in your list with all nodes and connections intact. You will need to reconnect credentials since they are not included in the export. For database restores, stop n8n, replace the database file or restore the pg_dump, ensure the encryption key matches, and restart.
```bash
# Restore all workflows from a CLI backup directory
n8n import:workflow --separate --input=/backups/n8n/workflows/

# Restore credentials (requires same N8N_ENCRYPTION_KEY)
n8n import:credentials --separate --input=/backups/n8n/credentials/

# Restore PostgreSQL database
psql -U n8n -h localhost n8n < /backups/n8n/n8n-db-2026-03-27.sql

# Restore SQLite (stop n8n first)
systemctl stop n8n
cp /backups/n8n/database-2026-03-27.sqlite ~/.n8n/database.sqlite
systemctl start n8n
```

Expected result: Workflows are restored from backup and functional after reconnecting credentials.
Complete working example
```bash
#!/bin/bash
# n8n Complete Backup Script
# Run daily via cron: 0 2 * * * /opt/scripts/n8n-backup.sh
# Set DB_TYPE=postgresdb in the environment (matching n8n's DB_TYPE) to use pg_dump.

BACKUP_DIR="/backups/n8n"
DATE=$(date +%Y-%m-%d)
DAY_BACKUP_DIR="$BACKUP_DIR/$DATE"
RETENTION_DAYS=30

# Create backup directories
mkdir -p "$DAY_BACKUP_DIR/workflows"
mkdir -p "$DAY_BACKUP_DIR/credentials"

echo "[$DATE] Starting n8n backup..."

# Export workflows via CLI (one file per workflow)
n8n export:workflow --all --separate --output="$DAY_BACKUP_DIR/workflows/" 2>&1
WF_COUNT=$(ls -1 "$DAY_BACKUP_DIR/workflows/" | wc -l)
echo "Exported $WF_COUNT workflows"

# Export credentials (encrypted)
n8n export:credentials --all --separate --output="$DAY_BACKUP_DIR/credentials/" 2>&1
CRED_COUNT=$(ls -1 "$DAY_BACKUP_DIR/credentials/" | wc -l)
echo "Exported $CRED_COUNT credentials"

# Database backup (PostgreSQL) — keep stderr out of the dump file
if [ "$DB_TYPE" = "postgresdb" ]; then
    pg_dump -U n8n -h localhost n8n > "$DAY_BACKUP_DIR/database.sql"
    echo "PostgreSQL database backed up"
else
    # SQLite backup (stop n8n first for a consistent copy)
    cp ~/.n8n/database.sqlite "$DAY_BACKUP_DIR/database.sqlite"
    echo "SQLite database backed up"
fi

# Compress the backup
tar -czf "$BACKUP_DIR/n8n-backup-$DATE.tar.gz" -C "$BACKUP_DIR" "$DATE/"
rm -rf "$DAY_BACKUP_DIR"
echo "Backup compressed to n8n-backup-$DATE.tar.gz"

# Remove backups older than retention period
find "$BACKUP_DIR" -name "n8n-backup-*.tar.gz" -mtime +$RETENTION_DAYS -delete
echo "Old backups cleaned up (retention: $RETENTION_DAYS days)"

echo "[$DATE] Backup complete."
```

Common mistakes when backing up n8n workflows
Mistake: Backing up workflows but not the N8N_ENCRYPTION_KEY, making credentials unrecoverable
How to avoid: Always include the encryption key in your backup process. Store it securely in a password manager or secrets vault.
Mistake: Copying the SQLite database while n8n is running, resulting in a corrupted backup
How to avoid: Stop n8n before copying database.sqlite, or switch to PostgreSQL which supports hot backups via pg_dump.
Mistake: Assuming exported workflow JSON files include credential values
How to avoid: Credential values are excluded from workflow exports for security. After restoring, you must re-enter all credential values manually or restore from a credential export with the matching encryption key.
Mistake: Only keeping backups on the same server as n8n
How to avoid: Copy backups to a separate server, cloud storage, or external drive. A server failure that takes down n8n will also destroy backups stored on the same machine.
Best practices
- Always back up the N8N_ENCRYPTION_KEY separately — without it, credential backups are useless
- Use PostgreSQL instead of SQLite for production instances to enable hot backups without stopping n8n
- Store backups off-server using cloud storage like S3 or Google Cloud Storage for disaster recovery
- Test your restore process at least once to verify backups actually work before you need them
- Set a retention policy to automatically delete backups older than 30 days to manage storage
- Include both workflow exports and database dumps in your backup strategy for different recovery scenarios
- Run backups during low-traffic hours to minimize impact on active workflows
- Version control your workflows by connecting n8n to a Git repository for change tracking
Still stuck?
Copy one of these prompts to get a personalized, step-by-step explanation.
I need a comprehensive backup strategy for my self-hosted n8n instance running on Docker with PostgreSQL. Help me create a bash script that exports workflows, dumps the database, backs up the encryption key, compresses everything, and uploads to S3 daily with 30-day retention.
Build an n8n workflow with a Schedule trigger that runs daily at 2 AM, calls the n8n API to fetch all workflows, converts them to a JSON backup file, and uploads the file to an S3 bucket using the AWS S3 node.
Frequently asked questions
Are credentials included in workflow exports?
No, credential values are never included in workflow JSON exports for security. Only the credential name and type are referenced. You must re-enter credential values after importing, or use the n8n CLI credential export/import with the same encryption key.
Can I back up n8n Cloud workflows?
n8n Cloud does not provide CLI or database access. You can export individual workflows as JSON from the editor. For bulk backups, create a workflow that calls the n8n API to fetch all workflows and stores them externally.
How often should I back up my n8n instance?
Daily backups are sufficient for most setups. If you make frequent changes to workflows throughout the day, consider running backups every 6-12 hours. Critical production instances may warrant real-time replication of the PostgreSQL database.
What is the N8N_ENCRYPTION_KEY and why is it critical for backups?
The N8N_ENCRYPTION_KEY is used to encrypt and decrypt all credential data stored in the database. If you restore a database backup without the matching encryption key, all credentials will be unreadable and you will need to re-enter every API key and password.
Can I use Git to version control my workflows?
Yes, export workflows as JSON and commit them to a Git repository. Some teams automate this by running the CLI export command in a pre-commit hook or a cron job that exports and commits changes daily.
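A minimal sketch of the Git side of this, using a throwaway repository and a dummy export file (the commented-out line is where the real CLI export would run on a live server; the Git identity values are placeholders):

```shell
# Throwaway repository for demonstration
REPO=$(mktemp -d)
cd "$REPO"
git init -q
git config user.email "backup@example.com"   # placeholder identity
git config user.name  "n8n-backup"

mkdir -p workflows
# On a real server this line replaces the echo below:
# n8n export:workflow --all --separate --output=./workflows/
echo '{"name": "demo workflow"}' > workflows/demo.json

git add workflows/
git commit -q -m "n8n workflow backup $(date +%F)"
git log --oneline -1
```

Because each workflow is its own file, `git diff` shows exactly which workflows changed between backups, which plain tarballs cannot do.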
How do I restore a single workflow without overwriting everything else?
Use the editor's Import from File feature to import a single workflow JSON file. It creates a new workflow without affecting existing ones. If a workflow with the same ID exists, n8n may prompt you to overwrite or create a copy.
What is the maximum size of an n8n workflow export?
There is no hard limit on workflow JSON export size. However, workflows with large amounts of static data or pinned data can produce files several megabytes in size. Most workflows export to under 100KB.
Can RapidDev help set up automated backups for my n8n instance?
Yes, RapidDev can configure automated backup pipelines for your n8n deployment, including scheduled exports, database dumps, encryption key management, cloud storage integration, and restore testing.
Talk to an Expert
Our team has built 600+ apps. Get personalized help with your project.
Book a free consultation