Artur
Founder

Migrating n8n from SQLite to PostgreSQL

April 24, 2026

n8n-migration, sqlite-to-postgres, n8n-database, queue-mode, self-hosted-n8n

PostgreSQL is required for n8n queue mode. If you started with SQLite and need to scale, here's how to migrate without losing workflows or credentials.

The process is non-destructive - you export everything first, and the original SQLite database stays untouched as a fallback. The critical piece most guides miss: your encryption key must stay identical, or every stored credential becomes unreadable.

Before You Start

Back up your encryption key. Find it in your environment variables as N8N_ENCRYPTION_KEY. Store it somewhere safe. Using a different key after migration renders every stored credential useless.
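It's worth scripting this backup. A minimal sketch, assuming a default install where the key lives either in the environment or in the `encryptionKey` field of ~/.n8n/config (adjust the path for your setup):

```shell
# Write the key to a locked-down backup file. If N8N_ENCRYPTION_KEY
# isn't set, fall back to the "encryptionKey" field in ~/.n8n/config
# (default install path - adjust for yours).
KEY="${N8N_ENCRYPTION_KEY:-$(grep -o '"encryptionKey": *"[^"]*"' ~/.n8n/config 2>/dev/null | cut -d '"' -f 4)}"
printf '%s\n' "$KEY" > ./n8n-key.backup
chmod 600 ./n8n-key.backup
```

Store the backup file somewhere outside the machine you're migrating.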

Lock your n8n version. Export and import must use identical n8n versions. Don't use the :latest tag during migration - pin to your current version number. Version mismatches cause import failures.

Stop n8n. The SQLite file must not change during export. Shut down your instance completely before proceeding.

Check your n8n version:

n8n --version

Export Your Data

n8n versions 1.67+ use the export:entities command. Earlier versions require separate export:workflow and export:credentials commands.

For n8n 1.67+:

n8n export:entities --backup --output=./n8n-backup.json

This exports workflows, credentials, users, projects, and folders in a single file.

For earlier versions:

n8n export:workflow --all --output=./workflows.json
n8n export:credentials --all --output=./credentials.json

The --all flag captures everything. Without it, you'd need to specify individual IDs.

Verify your export files exist and contain data before proceeding. Open them in a text editor - you should see JSON with your workflow names visible.
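A quick scripted sanity check, assuming the pre-1.67 filenames from the commands above (adjust for the single-file export if you used export:entities):

```shell
# Warn if either export file is missing or empty; print a rough
# count of "name" fields as a smoke test.
MISSING=0
for f in ./workflows.json ./credentials.json; do
  if [ -s "$f" ]; then
    echo "$f: $(grep -c '"name"' "$f") name fields"
  else
    echo "WARNING: $f is missing or empty" >&2
    MISSING=$((MISSING + 1))
  fi
done
echo "$MISSING export file(s) missing"
```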

Set Up PostgreSQL

Create a fresh PostgreSQL instance. Here's a Docker Compose setup that works:

version: '3.8'

services:
  postgres:
    image: postgres:15
    restart: always
    environment:
      POSTGRES_USER: n8n
      POSTGRES_PASSWORD: your_secure_password
      POSTGRES_DB: n8n
    volumes:
      - postgres_data:/var/lib/postgresql/data
    ports:
      - "5432:5432"

  n8n:
    image: n8nio/n8n:1.67.0  # Pin to your current version
    restart: always
    environment:
      - DB_TYPE=postgresdb
      - DB_POSTGRESDB_HOST=postgres
      - DB_POSTGRESDB_PORT=5432
      - DB_POSTGRESDB_DATABASE=n8n
      - DB_POSTGRESDB_USER=n8n
      - DB_POSTGRESDB_PASSWORD=your_secure_password
      - N8N_ENCRYPTION_KEY=your_existing_encryption_key
    ports:
      - "5678:5678"
    volumes:
      - n8n_data:/home/node/.n8n
    depends_on:
      - postgres

volumes:
  postgres_data:
  n8n_data:

Replace your_existing_encryption_key with the exact key from your SQLite setup. This is non-negotiable.

For managed PostgreSQL (Supabase, AWS RDS, etc.), just update the connection parameters - the n8n configuration stays the same.

Import Your Data

Start your new PostgreSQL-backed n8n instance first. It needs to create the database schema before import.

docker-compose up -d

Wait 30 seconds for initialization, then run the import:
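If you'd rather poll than guess at a fixed sleep, a sketch like this waits until PostgreSQL accepts connections (service and database names match the compose file above; pg_isready ships inside the official postgres image):

```shell
# Poll until the database accepts connections instead of a fixed
# sleep. Gives up after ~10 tries; tune the retry count as needed.
READY=0
if command -v docker-compose >/dev/null 2>&1; then
  for i in 1 2 3 4 5 6 7 8 9 10; do
    if docker-compose exec -T postgres pg_isready -U n8n -d n8n >/dev/null 2>&1; then
      READY=1
      break
    fi
    sleep 1
  done
fi
echo "postgres ready: $READY"
```

Once it reports ready, proceed with the import.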

For n8n 1.67+:

docker exec -it n8n n8n import:entities --input=/path/to/n8n-backup.json

For earlier versions:

docker exec -it n8n n8n import:workflow --input=/path/to/workflows.json
docker exec -it n8n n8n import:credentials --input=/path/to/credentials.json

Note that the --input path must be visible inside the container - copy the backup file in first with docker cp, or mount its directory as a volume. The import command prints each workflow and credential as it processes them. Watch for errors - a clean import shows no red text.

Validation Checklist

Don't assume success. Verify explicitly:

Workflow check: Open the n8n editor. Every workflow should appear in the sidebar. Open three random workflows and confirm nodes display correctly.

Credential check: Go to Credentials in the menu. Every credential should be listed. Edit one and verify the connection test passes - this confirms decryption worked.

Execution test: Run one workflow manually. Check that it executes and the execution appears in the history.

Create new data: Make a simple test workflow and save it. Create a test credential. Restart n8n. Verify both persist after restart. This confirms PostgreSQL is actually being used, not a stale SQLite file.

If credentials show as "invalid" or won't decrypt, your encryption key doesn't match. You'll need to re-export from the original SQLite instance with the correct key.
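You can also verify at the database level. This sketch counts rows directly in PostgreSQL - workflow_entity is n8n's internal table name, which may change between versions:

```shell
# Count workflows directly in the database. The table name
# workflow_entity is n8n's internal schema (an assumption that may
# change across versions); service name matches the compose file.
WORKFLOW_COUNT=""
if command -v docker-compose >/dev/null 2>&1; then
  WORKFLOW_COUNT=$(docker-compose exec -T postgres \
    psql -U n8n -d n8n -tAc "SELECT count(*) FROM workflow_entity;" 2>/dev/null)
fi
echo "workflows in postgres: ${WORKFLOW_COUNT:-unknown}"
```

The count should match the number of workflows you exported.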

What About Execution History?

Execution history can be migrated but often isn't worth it. The history table grows large quickly, and old executions rarely matter. The CLI commands above don't include execution history by default.

If you need it, the ServerSpace guide covers manual table migration using DBeaver to connect both databases simultaneously.
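If you skip migrating history, it's worth enabling n8n's built-in pruning on the new instance so the PostgreSQL history table stays small. These are documented n8n environment variables; 336 hours is two weeks:

```shell
# Automatically delete execution records older than 336 hours
# (14 days) - adjust the retention window to your needs.
export EXECUTIONS_DATA_PRUNE=true
export EXECUTIONS_DATA_MAX_AGE=336
```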

Troubleshooting

Import fails with version error: Your export and import n8n versions don't match. Pin both to the same version number.

Credentials won't decrypt: Encryption key mismatch. Check N8N_ENCRYPTION_KEY in both environments - they must be byte-for-byte identical.

Workflows appear empty: Database connection might be failing silently. Check PostgreSQL logs and verify the connection parameters.

Import command not found: You're on an older n8n version. Use export:workflow and export:credentials separately instead of export:entities.

Why Bother With PostgreSQL?

SQLite works fine for small deployments. PostgreSQL becomes necessary when you need queue mode for horizontal scaling, or when your execution history grows large enough to slow down the UI.

Queue mode - which distributes workflow executions across multiple workers - requires PostgreSQL as a hard dependency. SQLite's file-based locking prevents concurrent access from multiple processes.
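As a rough sketch of where this migration leads, queue mode is switched on through environment variables (names are n8n's documented settings; Redis is also required, and the host value is a placeholder for your setup):

```shell
# Queue mode needs Redis alongside PostgreSQL.
export EXECUTIONS_MODE=queue
export QUEUE_BULL_REDIS_HOST=redis
export QUEUE_BULL_REDIS_PORT=6379
# Each worker process is then started separately with: n8n worker
```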

Alternative Method: Direct Database Copy

If CLI commands aren't working, some users prefer connecting both databases in DBeaver (a database GUI tool) and copying tables directly. This bypasses n8n's export/import logic entirely but requires more database knowledge.

FAQ

Will I lose any workflows during migration?

No. The export captures all workflows completely. As long as you export with --all and import to a fresh database, nothing is lost.

Do my credentials survive the migration?

Yes, if you preserve the same N8N_ENCRYPTION_KEY. Credentials are encrypted at rest - changing the key makes them unreadable.

Can I migrate back to SQLite if PostgreSQL doesn't work out?

Yes. The same export/import process works in reverse. Export from PostgreSQL, spin up a SQLite instance with the same encryption key, import.

How long does migration take?

Minutes for most deployments. Export and import are fast - the time is mostly spent setting up PostgreSQL and verifying everything works.

Do I need to stop n8n during migration?

Yes, during export. A SQLite file can corrupt if it's written to while being copied. Once you're on PostgreSQL, future backups can run without stopping the service (e.g. with pg_dump).

What if I'm using npm/pm2 instead of Docker?

The CLI commands are identical. Just run them directly instead of through docker exec. Make sure your PostgreSQL connection environment variables are set before running the import.
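For example, a bare-metal import might look like this - the variable values are placeholders matching the compose example above:

```shell
# Same connection settings as the compose example, exported in the
# shell; password and key are placeholders for your real values.
export DB_TYPE=postgresdb
export DB_POSTGRESDB_HOST=localhost
export DB_POSTGRESDB_PORT=5432
export DB_POSTGRESDB_DATABASE=n8n
export DB_POSTGRESDB_USER=n8n
export DB_POSTGRESDB_PASSWORD=your_secure_password
export N8N_ENCRYPTION_KEY=your_existing_encryption_key

# Then run the import directly (guarded so the sketch is safe to
# paste even where n8n isn't on PATH yet).
if command -v n8n >/dev/null 2>&1; then
  n8n import:workflow --input=./workflows.json
  n8n import:credentials --input=./credentials.json
fi
```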


Need help with n8n migrations or setting up queue mode for production workloads? n8n Logic provides automation consulting and managed n8n deployments. Get in touch to discuss your requirements.

