n8n Backup and Restore: How to Never Lose Your Workflows
n8n stores everything — your workflows, credentials, execution history, and settings — in a combination of a database and the local filesystem (by default, both live under the /home/node/.n8n directory). Lose either one and you lose your automations. Most self-hosted users find this out at the worst possible time: after a server crash, a botched upgrade, or an accidental deletion.
This guide covers what you need to back up, how to do it, and how to restore when something goes wrong.
What Actually Needs to Be Backed Up
n8n has four data stores:
- Workflows: Stored in the database. The core of what you have built — node configurations, connections, settings.
- Credentials: Stored in the database, encrypted with your n8n encryption key. Includes all OAuth tokens, API keys, and passwords you have connected.
- Execution history: Stored in the database. Input/output data for every workflow run. This grows indefinitely without pruning — see our guide on n8n memory fixes for pruning setup.
- Binary data / files: Stored in /home/node/.n8n/storage/. Files downloaded or processed by workflows live here.
For most teams, workflows and credentials are what matter. Execution history and binary data are secondary — useful for debugging, but not worth restoring if the server is gone.
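If you do want everything in one shot, the simplest full copy is a tarball of the data directory taken while n8n is stopped. A sketch, assuming the default Docker mount path and the container name used later in this guide:

```shell
#!/bin/sh
# Cold copy of the n8n data directory: database.sqlite, storage/, and config.
# DATA_DIR is the default Docker mount point; all paths here are assumptions.
set -u
DATA_DIR="${DATA_DIR:-/home/node/.n8n}"
SNAP="/tmp/n8n-data-$(date +%Y%m%d).tar.gz"
if [ -d "$DATA_DIR" ]; then
  # Stop n8n first so the SQLite file is not written to mid-copy:
  # docker stop n8n-n8n-1
  tar -czf "$SNAP" -C "$(dirname "$DATA_DIR")" "$(basename "$DATA_DIR")"
  # docker start n8n-n8n-1
  echo "wrote $SNAP"
else
  echo "nothing to copy: $DATA_DIR not found" >&2
fi
```

The stop/start wrapping matters for SQLite: copying the file while n8n is writing to it can produce a corrupt snapshot.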
Export Workflows from the UI
The fastest way to back up individual workflows: open the workflow → top menu → Download. This exports the workflow as a JSON file that can be re-imported into any n8n instance.
To export all workflows at once: go to Settings → Import/Export → Export All. This creates a single JSON file containing every workflow in your instance. Store this in a safe location — Google Drive, S3, or a Git repository. Do this before every major change or upgrade.
Credentials are not included in workflow exports by default — connection references are exported but the actual secrets are not. To export credentials, go to Settings → Import/Export → Export Credentials. Keep this file separate and encrypted.
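If you prefer the command line, the n8n CLI has export subcommands that can be scripted. A sketch, assuming the container name from the restore example later in this guide; the output paths are assumptions, and credentials export encrypted with your instance key by default:

```shell
#!/bin/sh
# CLI alternative to the UI export, using n8n's export subcommands.
# Container name and output paths are assumptions; adjust to your setup.
set -u
if command -v docker >/dev/null 2>&1; then
  docker exec n8n-n8n-1 n8n export:workflow --all --output=/tmp/workflows.json \
    || echo "workflow export failed" >&2
  docker exec n8n-n8n-1 n8n export:credentials --all --output=/tmp/credentials.json \
    || echo "credentials export failed" >&2
else
  echo "docker not found; run the n8n CLI directly instead" >&2
fi
```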
Automate Workflow Export via API
Manual exports are better than nothing, but an automated backup that runs daily is far more reliable. Use the n8n API to pull all workflows and save them to persistent storage.
First, generate an API key: Settings → API → Create API Key. Then set up a workflow that runs on a daily schedule and calls the n8n API:
// Schedule Trigger (daily at 3am) → HTTP Request node
// Method: GET
// URL: http://localhost:5678/api/v1/workflows
// Headers:
// X-N8N-API-KEY: your-api-key
// This returns all workflows as JSON
// Then: Google Drive node → Upload File
// Or: HTTP Request → PUT to S3
For a complete automated backup pipeline: Schedule Trigger → HTTP Request (GET /api/v1/workflows) → Code node (stringify JSON) → Google Drive (upload file with date-stamped filename). This runs silently every night and builds a rolling archive of your workflow state.
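If you would rather keep the backup outside n8n itself, the same pipeline works as a plain cron script. A minimal sketch using curl: the URL and header come from the example above, while the backup directory, filename pattern, and N8N_API_KEY variable are assumptions:

```shell
#!/bin/sh
# Nightly workflow export via the n8n public API, as a cron script.
# Note: the API paginates results; very large instances may need to
# follow the cursor in the response rather than a single GET.
set -u
BACKUP_DIR="${BACKUP_DIR:-/tmp/n8n-backups}"
OUT="$BACKUP_DIR/workflows-$(date +%Y%m%d).json"
mkdir -p "$BACKUP_DIR"
if curl -sf -H "X-N8N-API-KEY: ${N8N_API_KEY:-}" \
     "http://localhost:5678/api/v1/workflows" -o "$OUT"; then
  echo "saved $OUT"
else
  echo "export failed; is n8n reachable on localhost:5678?" >&2
fi
```

Running this from cron (rather than an n8n workflow) means the backup still happens even when n8n itself is the thing that is broken.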
Database Backups
Workflow exports cover your workflow logic, but not execution history, tags, or some settings. For a full backup, copy the database itself.
SQLite (default)
n8n uses SQLite by default. The database file is at /home/node/.n8n/database.sqlite. Back this up by copying the file while n8n is stopped, or use SQLite's backup command while running:
sqlite3 /home/node/.n8n/database.sqlite ".backup /tmp/n8n-backup-$(date +%Y%m%d).sqlite"
PostgreSQL
If you have configured n8n to use PostgreSQL (required for queue mode), use pg_dump:
pg_dump -U n8n_user -h localhost n8n_db > /tmp/n8n-backup-$(date +%Y%m%d).sql
Schedule this as a cron job and upload the dump to S3, R2, or any object storage. A daily backup with 30-day retention is sufficient for most teams.
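A sketch of that cron job with retention built in. The database name and user match the pg_dump example above; the backup directory and script path are assumptions:

```shell
#!/bin/sh
# Daily PostgreSQL dump with 30-day retention.
# Example crontab entry (daily at 03:30): 30 3 * * * /usr/local/bin/n8n-pg-backup.sh
set -u
BACKUP_DIR="${BACKUP_DIR:-/tmp/n8n-backups}"
mkdir -p "$BACKUP_DIR"
DUMP="$BACKUP_DIR/n8n-backup-$(date +%Y%m%d).sql"
pg_dump -U n8n_user -h localhost n8n_db > "$DUMP" 2>/dev/null \
  || echo "pg_dump failed; check connection settings and .pgpass" >&2
# Prune dumps older than 30 days.
find "$BACKUP_DIR" -name 'n8n-backup-*.sql' -mtime +30 -delete
# Remember: back up the N8N_ENCRYPTION_KEY separately, or restored
# credentials will be unreadable.
```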
Restore from Backup
To restore workflows from a JSON export: Settings → Import/Export → Import. Select the JSON file. n8n imports all workflows — existing workflows with the same ID are overwritten, new ones are created.
To restore from a database backup:
- Stop n8n: docker stop n8n-n8n-1
- Replace the database file: cp /tmp/n8n-backup-20260410.sqlite /home/node/.n8n/database.sqlite
- Restart n8n: docker start n8n-n8n-1
For PostgreSQL: drop and recreate the database, then restore from the dump file using psql.
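A sketch of that sequence; the database name, user, and dump filename reuse the earlier examples and are assumptions. Stop n8n before running it so nothing writes mid-restore:

```shell
#!/bin/sh
# Drop, recreate, and reload the n8n PostgreSQL database from a dump.
set -u
DB="${DB:-n8n_db}"; DB_USER="${DB_USER:-n8n_user}"
DUMP="${DUMP:-/tmp/n8n-backup-20260410.sql}"
if command -v psql >/dev/null 2>&1 && [ -f "$DUMP" ]; then
  dropdb --if-exists -U "$DB_USER" -h localhost "$DB"
  createdb -U "$DB_USER" -h localhost "$DB"
  psql -U "$DB_USER" -h localhost -d "$DB" -f "$DUMP"
else
  echo "skipping: psql or $DUMP not available" >&2
fi
```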
One critical detail: credentials are encrypted with the key in the N8N_ENCRYPTION_KEY environment variable. If you restore a database to a new instance, you must set the same encryption key — otherwise all credentials will be unreadable. Keep this key backed up separately from the database.
How Managed Hosting Handles Backups
Self-hosted backup management is reliable when set up correctly, but it is one more thing to maintain. On n8nautomation.cloud, automated backups run daily and are stored in Cloudflare R2 with 30-day retention. Restoring to a previous backup is a one-click operation from the dashboard.
You still get full workflow export capability — the n8n UI and API work the same way. The managed layer adds an additional safety net: even if your workflows run fine, the underlying database and filesystem are snapshotted independently of anything n8n does.