RapidDev - Software Development Agency

How to Back Up n8n Workflows

To back up n8n workflows, use the built-in export feature to download individual workflows as JSON files, or use the n8n CLI command n8n export:workflow --all to export all workflows at once. For comprehensive backups, also back up the n8n database and the N8N_ENCRYPTION_KEY. Automate backups by creating a workflow that exports data on a schedule and stores it in cloud storage.

What you'll learn

  • How to export individual workflows as JSON from the n8n editor
  • How to bulk export all workflows using the n8n CLI
  • How to back up the n8n database for a complete restore
  • How to build an automated backup workflow inside n8n
Beginner · 8 min read · 20-30 minutes · n8n 1.0+ (self-hosted; limited on Cloud) · March 2026 · RapidDev Engineering Team
Backing Up n8n Workflows: Manual Export, CLI, and Automated Strategies

Losing workflows to accidental deletion, server failures, or botched upgrades can cost hours or days of rebuilding. n8n offers multiple backup strategies: manual JSON export from the editor, bulk CLI export, database-level backups, and even self-referencing workflows that back themselves up. This tutorial covers all approaches so you can choose the right one for your setup.

Prerequisites

  • A running n8n instance (self-hosted for CLI and database backups)
  • Terminal access to the n8n server for CLI exports
  • Basic familiarity with the n8n editor

Step-by-step guide

1. Export a single workflow from the editor

Open any workflow in the n8n editor, click the three-dot menu in the top-right corner, and select Download. n8n saves the workflow as a JSON file to your local downloads folder. This JSON file contains the complete workflow definition including all nodes, connections, settings, and static data. Credential values are not included in the export for security. Repeat this for each workflow you want to back up. This method is best for quick one-off backups of individual workflows.

Expected result: A .json file is downloaded containing the complete workflow definition, which can be imported into any n8n instance.
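To see what that file contains, the snippet below builds a trimmed, hypothetical workflow export in Node.js; the top-level field names match real exports, but the workflow itself is invented, and real files also carry fields such as staticData, pinData, and per-node parameters.

```javascript
// A trimmed, hypothetical workflow export -- real files also include
// staticData, pinData, and detailed node parameters.
const exportedWorkflow = {
  name: "My Workflow",
  nodes: [
    { name: "Schedule Trigger", type: "n8n-nodes-base.scheduleTrigger" },
    { name: "HTTP Request", type: "n8n-nodes-base.httpRequest" }
  ],
  connections: {
    "Schedule Trigger": { main: [[{ node: "HTTP Request", type: "main", index: 0 }]] }
  },
  settings: {}
};

// Note: nodes reference credentials by name/id only -- no secret values.
console.log(`${exportedWorkflow.name}: ${exportedWorkflow.nodes.length} nodes`);
// → My Workflow: 2 nodes
```

Because credentials appear only as references, the same file can be safely committed to version control or shared.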

2. Bulk export all workflows using the n8n CLI

For self-hosted installations, use the n8n CLI to export all workflows at once. Run the export:workflow command with the --backup flag (shorthand for --all --pretty --separate) and specify an output directory. Each workflow is saved as a separate JSON file named with its ID. This is the fastest way to back up an entire n8n instance, and you can run it on a schedule with cron for regular automated backups.

bash
# Export all workflows as separate JSON files (--backup = --all --pretty --separate)
n8n export:workflow --backup --output=./n8n-backups/workflows/

# Export all credentials (encrypted; restoring requires the same encryption key)
n8n export:credentials --backup --output=./n8n-backups/credentials/

# Docker: run the export inside the container
docker exec -it n8n n8n export:workflow --backup --output=/home/node/.n8n/backups/workflows/

# Cron job for daily backups at 2 AM (add via: crontab -e)
0 2 * * * cd /path/to/n8n && n8n export:workflow --backup --output=/backups/n8n/workflows/$(date +\%Y-\%m-\%d)/ >> /var/log/n8n-backup.log 2>&1

Expected result: All workflows are exported as individual JSON files in the specified directory, one file per workflow.

3. Back up the n8n database for a complete restore

Exporting workflows as JSON does not capture execution history, credential data, or user accounts. For a complete backup, create a database dump. If you use SQLite (the default), copy the database.sqlite file from the .n8n directory. If you use PostgreSQL, use pg_dump. Always stop n8n before copying the SQLite file to avoid corruption. Also back up the N8N_ENCRYPTION_KEY — without it, credential data in the database backup is useless.

bash
# SQLite backup (stop n8n first)
systemctl stop n8n
cp ~/.n8n/database.sqlite /backups/n8n/database-$(date +%Y-%m-%d).sqlite
systemctl start n8n

# PostgreSQL backup (n8n can keep running)
pg_dump -U n8n -h localhost n8n > /backups/n8n/n8n-db-$(date +%Y-%m-%d).sql

# Docker PostgreSQL backup
docker exec n8n-postgres pg_dump -U n8n n8n > /backups/n8n/n8n-db-$(date +%Y-%m-%d).sql

# Back up the encryption key (critical!)
echo "N8N_ENCRYPTION_KEY=$N8N_ENCRYPTION_KEY" > /backups/n8n/encryption-key-$(date +%Y-%m-%d).env
chmod 600 /backups/n8n/encryption-key-$(date +%Y-%m-%d).env

Expected result: A complete database dump and encryption key backup exist in your backup directory, allowing full n8n restoration.

4. Build an automated backup workflow inside n8n

Create an n8n workflow that backs up your other workflows automatically. Use a Schedule trigger node to run daily, an HTTP Request node to call the n8n API endpoint GET /api/v1/workflows, a Code node to process the data, and a node to store the backup (such as an S3, Google Drive, or local file node). This self-referencing approach keeps backups running as long as n8n itself is running and sends the data off-server for disaster recovery.

javascript
// Code node: package all workflow data as a JSON backup file
const workflows = $input.all();
const timestamp = new Date().toISOString().split('T')[0];

const backupData = {
  backupDate: timestamp,
  totalWorkflows: workflows.length,
  workflows: workflows.map(item => ({
    id: item.json.id,
    name: item.json.name,
    active: item.json.active,
    updatedAt: item.json.updatedAt,
    nodes: item.json.nodes,
    connections: item.json.connections,
    settings: item.json.settings
  }))
};

// Return both JSON and a base64-encoded binary file for the storage node
return [{
  json: backupData,
  binary: {
    data: {
      data: Buffer.from(JSON.stringify(backupData, null, 2)).toString('base64'),
      mimeType: 'application/json',
      fileName: `n8n-backup-${timestamp}.json`
    }
  }
}];

Expected result: A scheduled workflow exports all workflow definitions and stores them in cloud storage daily.
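The HTTP Request node can also be prototyped as a standalone script against the n8n public API, which paginates results with a cursor. The sketch below assumes you have created an API key in your instance's settings; the base URL and key are placeholders you must supply.

```javascript
// Fetch every workflow from the n8n public API, following cursor-based
// pagination. baseUrl and apiKey are placeholders for your instance.
async function fetchAllWorkflows(baseUrl, apiKey, fetchImpl = globalThis.fetch) {
  const all = [];
  let cursor;
  do {
    const url = new URL('/api/v1/workflows', baseUrl);
    if (cursor) url.searchParams.set('cursor', cursor);
    const res = await fetchImpl(url, { headers: { 'X-N8N-API-KEY': apiKey } });
    if (!res.ok) throw new Error(`n8n API returned ${res.status}`);
    const body = await res.json();
    all.push(...body.data);
    cursor = body.nextCursor; // null/undefined when there are no more pages
  } while (cursor);
  return all;
}

// Usage (placeholders):
// const wfs = await fetchAllWorkflows('http://localhost:5678', process.env.N8N_API_KEY);
```

The optional fetchImpl parameter exists so the function can be exercised with a stubbed fetch in tests without a running instance.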

5. Test restoring from a backup

A backup is only useful if you can restore from it. Test the restore process by importing a workflow JSON file into a fresh n8n instance. Go to the Workflows panel, click the Import from File button (or use the three-dot menu and select Import from File), and select your backup JSON file. The workflow appears in your list with all nodes and connections intact. You will need to reconnect credentials since they are not included in the export. For database restores, stop n8n, replace the database file or restore the pg_dump, ensure the encryption key matches, and restart.

bash
# Restore all workflows from a CLI backup directory (--separate reads one file per workflow)
n8n import:workflow --separate --input=/backups/n8n/workflows/

# Restore credentials (requires the same N8N_ENCRYPTION_KEY)
n8n import:credentials --separate --input=/backups/n8n/credentials/

# Restore PostgreSQL database
psql -U n8n -h localhost n8n < /backups/n8n/n8n-db-2026-03-27.sql

# Restore SQLite (stop n8n first)
systemctl stop n8n
cp /backups/n8n/database-2026-03-27.sqlite ~/.n8n/database.sqlite
systemctl start n8n

Expected result: Workflows are restored from backup and functional after reconnecting credentials.

Complete working example

n8n-backup-script.sh
#!/bin/bash
# n8n Complete Backup Script
# Run daily via cron: 0 2 * * * /opt/scripts/n8n-backup.sh

BACKUP_DIR="/backups/n8n"
DATE=$(date +%Y-%m-%d)
DAY_BACKUP_DIR="$BACKUP_DIR/$DATE"
RETENTION_DAYS=30

# Create backup directories
mkdir -p "$DAY_BACKUP_DIR/workflows" "$DAY_BACKUP_DIR/credentials"

echo "[$DATE] Starting n8n backup..."

# Export workflows as separate JSON files (--backup = --all --pretty --separate)
n8n export:workflow --backup --output="$DAY_BACKUP_DIR/workflows/" 2>&1
WF_COUNT=$(ls -1 "$DAY_BACKUP_DIR/workflows/" | wc -l)
echo "Exported $WF_COUNT workflows"

# Export credentials (encrypted with N8N_ENCRYPTION_KEY)
n8n export:credentials --backup --output="$DAY_BACKUP_DIR/credentials/" 2>&1
CRED_COUNT=$(ls -1 "$DAY_BACKUP_DIR/credentials/" | wc -l)
echo "Exported $CRED_COUNT credentials"

# Database backup
if [ "$DB_TYPE" = "postgresdb" ]; then
    pg_dump -U n8n -h localhost n8n > "$DAY_BACKUP_DIR/database.sql"
    echo "PostgreSQL database backed up"
else
    # SQLite: stop n8n first in production to avoid a corrupted copy
    cp ~/.n8n/database.sqlite "$DAY_BACKUP_DIR/database.sqlite"
    echo "SQLite database backed up"
fi

# Compress the day's backup and remove the working directory
tar -czf "$BACKUP_DIR/n8n-backup-$DATE.tar.gz" -C "$BACKUP_DIR" "$DATE/"
rm -rf "$DAY_BACKUP_DIR"
echo "Backup compressed to n8n-backup-$DATE.tar.gz"

# Remove backups older than the retention period
find "$BACKUP_DIR" -name "n8n-backup-*.tar.gz" -mtime +"$RETENTION_DAYS" -delete
echo "Old backups cleaned up (retention: $RETENTION_DAYS days)"

echo "[$DATE] Backup complete."

Common mistakes when backing up n8n workflows

Mistake: Backing up workflows but not the N8N_ENCRYPTION_KEY.

Why it's a problem: Without the key, encrypted credential data in the backup is unrecoverable.

How to avoid: Always include the encryption key in your backup process. Store it securely in a password manager or secrets vault.

Mistake: Copying the SQLite database while n8n is running.

Why it's a problem: A copy taken mid-write can produce a corrupted, unrestorable backup.

How to avoid: Stop n8n before copying database.sqlite, or switch to PostgreSQL, which supports hot backups via pg_dump.

Mistake: Assuming exported workflow JSON files include credential values.

Why it's a problem: Credential values are excluded from workflow exports for security, so a workflow-only backup cannot restore working integrations.

How to avoid: After restoring, re-enter all credential values manually, or restore from a credential export with the matching encryption key.

Mistake: Only keeping backups on the same server as n8n.

Why it's a problem: A server failure that takes down n8n will also destroy backups stored on the same machine.

How to avoid: Copy backups to a separate server, cloud storage, or an external drive.

Best practices

  • Always back up the N8N_ENCRYPTION_KEY separately — without it, credential backups are useless
  • Use PostgreSQL instead of SQLite for production instances to enable hot backups without stopping n8n
  • Store backups off-server using cloud storage like S3 or Google Cloud Storage for disaster recovery
  • Test your restore process at least once to verify backups actually work before you need them
  • Set a retention policy to automatically delete backups older than 30 days to manage storage
  • Include both workflow exports and database dumps in your backup strategy for different recovery scenarios
  • Run backups during low-traffic hours to minimize impact on active workflows
  • Version control your workflows by connecting n8n to a Git repository for change tracking

Still stuck?

Copy one of these prompts to get a personalized, step-by-step explanation.

ChatGPT Prompt

I need a comprehensive backup strategy for my self-hosted n8n instance running on Docker with PostgreSQL. Help me create a bash script that exports workflows, dumps the database, backs up the encryption key, compresses everything, and uploads to S3 daily with 30-day retention.

n8n Prompt

Build an n8n workflow with a Schedule trigger that runs daily at 2 AM, calls the n8n API to fetch all workflows, converts them to a JSON backup file, and uploads the file to an S3 bucket using the AWS S3 node.

Frequently asked questions

Are credentials included in workflow exports?

No, credential values are never included in workflow JSON exports for security. Only the credential name and type are referenced. You must re-enter credential values after importing, or use the n8n CLI credential export/import with the same encryption key.

Can I back up n8n Cloud workflows?

n8n Cloud does not provide CLI or database access. You can export individual workflows as JSON from the editor. For bulk backups, create a workflow that calls the n8n API to fetch all workflows and stores them externally.

How often should I back up my n8n instance?

Daily backups are sufficient for most setups. If you make frequent changes to workflows throughout the day, consider running backups every 6-12 hours. Critical production instances may warrant real-time replication of the PostgreSQL database.

What is the N8N_ENCRYPTION_KEY and why is it critical for backups?

The N8N_ENCRYPTION_KEY is used to encrypt and decrypt all credential data stored in the database. If you restore a database backup without the matching encryption key, all credentials will be unreadable and you will need to re-enter every API key and password.

Can I use Git to version control my workflows?

Yes, export workflows as JSON and commit them to a Git repository. Some teams automate this by running the CLI export command in a pre-commit hook or a cron job that exports and commits changes daily.

How do I restore a single workflow without overwriting everything else?

Use the editor's Import from File feature to import a single workflow JSON file. It creates a new workflow without affecting existing ones. If a workflow with the same ID exists, n8n may prompt you to overwrite or create a copy.

What is the maximum size of an n8n workflow export?

There is no hard limit on workflow JSON export size. However, workflows with large amounts of static data or pinned data can produce files several megabytes in size. Most workflows export to under 100KB.

Can RapidDev help set up automated backups for my n8n instance?

Yes, RapidDev can configure automated backup pipelines for your n8n deployment, including scheduled exports, database dumps, encryption key management, cloud storage integration, and restore testing.
