
How to use n8n with self-hosted databases?

Learn how to integrate n8n with self-hosted databases like MySQL, PostgreSQL, MongoDB, and more. Set up connections, create workflows, ensure security, and optimize performance for powerful automation.

Matt Graham, CEO of Rapid Developers



n8n is a workflow automation tool that can be integrated with self-hosted databases to create powerful automation workflows. To use n8n with a self-hosted database, you'll need to set up the n8n instance, configure database connections using the appropriate database nodes, create workflows that read from and write to your database, and implement proper error handling and security measures.

 

Setting Up N8n for Database Integration

 

Step 1: Install and Deploy N8n

Before connecting to self-hosted databases, you need to have n8n installed and running. Here are the options:

Option A: Install using NPM


# Install n8n globally
npm install -g n8n

# Start n8n
n8n

Option B: Install using Docker


docker run -it --rm \
  --name n8n \
  -p 5678:5678 \
  -v ~/.n8n:/home/node/.n8n \
  n8nio/n8n

Option C: Install using Docker Compose

Create a docker-compose.yml file:


version: '3'

services:
  n8n:
    image: n8nio/n8n
    restart: always
    ports:
    - "5678:5678"
    volumes:
    - ~/.n8n:/home/node/.n8n
    environment:
    - N8N_ENCRYPTION_KEY=your-secure-encryption-key
    - N8N_PROTOCOL=https
    - NODE_ENV=production

Then run:


docker-compose up -d

After installation, access n8n at http://localhost:5678 (or your server IP/domain).

 

Step 2: Database Preparation

 

Before connecting n8n to your self-hosted database, ensure your database is:

  • Properly secured with strong credentials
  • Accessible from the network where n8n is running
  • Configured with user permissions appropriate to the operations n8n will perform

Additionally, prepare the database connection details:

  • Hostname/IP address
  • Port number
  • Database name
  • Username and password
  • Any required SSL certificates

 

Connecting N8n to Different Self-Hosted Databases

 

Step 3: Connecting to MySQL/MariaDB

 

A. Using the MySQL Node

  1. In your n8n workflow, add a new node by clicking the "+" button.
  2. Search for "MySQL" and select it.
  3. Configure the connection:

{
  "host": "your-mysql-server",
  "port": 3306,
  "database": "your-database",
  "user": "your-username",
  "password": "your-password",
  "ssl": false  // Set to true if using SSL
}
  4. Set up the operation:
  • Choose an operation (Execute Query, Insert, Update, etc.)
  • For "Execute Query", provide your SQL query
  • For other operations, select the table and configure the required fields

B. Creating a Basic MySQL Query Workflow

  1. Start with a trigger node (e.g., Webhook or Schedule).
  2. Add the MySQL node and connect it to the trigger.
  3. Configure the MySQL node for a query:

SELECT * FROM customers WHERE active = 1 LIMIT 10
  4. Add an output node (e.g., Respond to Webhook) to return the results.

 

Step 4: Connecting to PostgreSQL

 

A. Using the PostgreSQL Node

  1. Add a new node to your workflow.
  2. Search for "PostgreSQL" and select it.
  3. Configure the connection:

{
  "host": "your-postgres-server",
  "port": 5432,
  "database": "your-database",
  "user": "your-username",
  "password": "your-password",
  "ssl": {
    "enabled": false,  // Set to true if using SSL
    "rejectUnauthorized": true
  }
}
  4. Set the operation (similar to MySQL):
  • Choose "Execute Query" for custom SQL
  • Or select other operations like Insert, Update, etc.

B. Example: PostgreSQL with Parameterized Queries

For secure database interactions, use parameterized queries:


SELECT * FROM users WHERE user_id = $1 AND status = $2

Set the "Additional Fields" → "Query Parameters" to:


[
  "{{$node["Input_Node"].data.userId}}",
  "active"
]

 

Step 5: Connecting to MongoDB

 

A. Using the MongoDB Node

  1. Add a new node to your workflow.
  2. Search for "MongoDB" and select it.
  3. Configure the connection:

{
  "url": "mongodb://your-username:your-password@your-mongodb-server:27017/your-database",
  "options": {
    "ssl": false  // Set to true if using SSL
  }
}

Or provide individual connection parameters:


{
  "hostname": "your-mongodb-server",
  "port": 27017,
  "database": "your-database",
  "user": "your-username",
  "password": "your-password",
  "ssl": false
}
  4. Set the operation:
  • Find
  • Find One
  • Insert
  • Update
  • Delete

B. Example: Finding Documents in MongoDB

To query MongoDB documents:

  1. Select the "Find" operation.
  2. Set the collection name (e.g., "users").
  3. Configure the query filter:

{
  "status": "active",
  "age": { "$gt": 21 }
}
  4. Optional: Set sort, limit, and skip parameters for pagination.
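For step 4, the pagination options might look like the following (the field names follow MongoDB's standard query options; the values are placeholders for your own paging scheme):

```json
{
  "sort": { "created_at": -1 },
  "limit": 20,
  "skip": 40
}
```

With `limit` 20 and `skip` 40, this returns "page 3" of results ordered newest-first.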

 

Step 6: Connecting to Microsoft SQL Server

 

A. Using the Microsoft SQL Node

  1. Add a new node to your workflow.
  2. Search for "Microsoft SQL" and select it.
  3. Configure the connection:

{
  "server": "your-mssql-server",
  "port": 1433,
  "database": "your-database",
  "user": "your-username",
  "password": "your-password",
  "domain": "",  // If using domain authentication
  "connectionTimeout": 15000,
  "requestTimeout": 15000,
  "options": {
    "encrypt": true,  // Set to true for encrypted connections
    "trustServerCertificate": false  // Set to true to trust self-signed certs
  }
}
  4. Set the operation (similar to other database nodes).

B. Example: Executing a Stored Procedure

For SQL Server stored procedures:

  1. Choose "Execute Query" operation.
  2. Enter the stored procedure call:

EXEC GetCustomerDetails @CustomerId = 12345

Or with parameterized values:


EXEC GetCustomerDetails @CustomerId = @param1

And set the query parameters to:


[
  "{{$node["Previous_Node"].data.customerId}}"
]

 

Step 7: Connecting to SQLite

 

A. Using the SQLite Node

  1. Add a new node to your workflow.
  2. Search for "SQLite" and select it.
  3. Configure the connection:

{
  "database": "/path/to/your/database.sqlite"
}

For Docker installations, place the SQLite database file in a mounted volume.
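For example, with the Docker Compose setup from Step 1 you could mount an extra host directory and point the SQLite node at the file inside the container (the `./data` path and filename here are assumptions; adjust them to your environment):

```yaml
services:
  n8n:
    image: n8nio/n8n
    volumes:
      - ~/.n8n:/home/node/.n8n
      - ./data:/data
# In the SQLite node, set "database" to /data/database.sqlite
```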

  4. Set the operation (similar to other SQL database nodes).

B. Example: Creating and Populating a SQLite Database

To create a new table and insert data:

  1. Choose "Execute Query" operation.
  2. Enter the SQL to create a table:

CREATE TABLE IF NOT EXISTS tasks (
  id INTEGER PRIMARY KEY AUTOINCREMENT,
  title TEXT NOT NULL,
  description TEXT,
  status TEXT DEFAULT 'pending',
  created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
)
  3. Add another SQLite node for insertion:

INSERT INTO tasks (title, description, status)
VALUES ('{{$json["title"]}}', '{{$json["description"]}}', '{{$json["status"]}}')

 

Step 8: Connecting to Redis

 

A. Using the Redis Node

  1. Add a new node to your workflow.
  2. Search for "Redis" and select it.
  3. Configure the connection:

{
  "host": "your-redis-server",
  "port": 6379,
  "password": "your-password",
  "database": 0,
  "ssl": false
}
  4. Set the operation:
  • Get
  • Set
  • Delete
  • Incr
  • Keys
  • TTL
  • Type
  • Expire
  • etc.

B. Example: Using Redis for Caching

To implement a simple caching mechanism:

  1. Add a Function node to check if data exists in cache:

return [
  {
    key: 'user:' + $input.item.json.userId,
    checkCache: true
  }
];
  2. Add a Redis node with "Get" operation using the key from the Function node.
  3. Add an IF node to check if Redis returned data.
  4. If no cache hit, query your main database and then cache the result with another Redis node:

// Redis Set operation
{
  "operation": "set",
  "key": "{{$node["Function"].json["key"]}}",
  "value": "{{$json}}",
  "expireTime": 3600
}
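The cache-aside pattern behind steps 1–4 can be sketched in plain JavaScript. Here an in-memory Map stands in for Redis and `fetchUserFromDb` is a hypothetical loader; in the workflow, the Redis node performs the real get/set:

```javascript
// In-memory stand-in for Redis: key -> { value, expiresAt }
const cache = new Map();

function cacheGet(key) {
  const entry = cache.get(key);
  if (!entry || entry.expiresAt < Date.now()) return null; // miss or expired
  return entry.value;
}

function cacheSet(key, value, ttlSeconds) {
  cache.set(key, { value, expiresAt: Date.now() + ttlSeconds * 1000 });
}

// Cache-aside: try the cache first, fall back to the database, then cache the result
function getUser(userId, fetchUserFromDb) {
  const key = 'user:' + userId;
  const cached = cacheGet(key);
  if (cached !== null) return cached;    // cache hit: skip the database entirely
  const fresh = fetchUserFromDb(userId); // cache miss: query the main database
  cacheSet(key, fresh, 3600);            // cache for one hour, like expireTime above
  return fresh;
}
```

The second lookup for the same user is served from the cache, so the database loader runs only once.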

 

Advanced Database Integration Techniques

 

Step 9: Managing Database Transactions

 

For operations requiring transactions, use the "Execute Query" operation with transaction statements:

A. MySQL/PostgreSQL Transactions

  1. Start a transaction:

BEGIN;
  2. Execute operations:

UPDATE accounts SET balance = balance - 100 WHERE user_id = 1;
UPDATE accounts SET balance = balance + 100 WHERE user_id = 2;
  3. Commit or rollback:

COMMIT;
-- or to revert: ROLLBACK;

B. Transaction Error Handling

For better transaction management:

  1. Add an Error Trigger node to handle errors.
  2. Connect it to a database node to execute the ROLLBACK statement.
  3. Use a Function node to log the error:

// Function node to log error and prepare rollback
if ($input.item.json.error) {
  console.log('Transaction error:', $input.item.json.error);
  return [{ command: 'ROLLBACK;' }];
}
return [{ command: 'COMMIT;' }];

 

Step 10: Implementing Database Migrations

 

Use n8n to manage database schema changes:

A. Creating a Migration Workflow

  1. Create a new workflow triggered by a webhook or schedule.
  2. Add a Function node to determine migration actions:

// Check current schema version
return [
  {
    query: "SELECT value FROM app_settings WHERE key = 'schema_version'"
  }
];
  3. Add a database node to execute the query.
  4. Add a Switch node to run different migrations based on the current version.
  5. For each version, add database nodes with the appropriate schema changes:

-- Version 1 to 2
ALTER TABLE users ADD COLUMN last_login TIMESTAMP NULL;
UPDATE app_settings SET value = '2' WHERE key = 'schema_version';
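The Switch-node dispatch in steps 4–5 amounts to running every migration above the current version, in order. A minimal sketch, where the version numbers and the second migration's SQL are illustrative assumptions:

```javascript
// Registry of migrations: target schema version -> SQL to reach it
const migrations = {
  2: "ALTER TABLE users ADD COLUMN last_login TIMESTAMP NULL;",
  3: "CREATE INDEX idx_users_email ON users (email);" // hypothetical later migration
};

// Return the SQL statements needed to move from currentVersion to the latest version
function pendingMigrations(currentVersion) {
  return Object.keys(migrations)
    .map(Number)
    .sort((a, b) => a - b)        // apply migrations in ascending version order
    .filter(v => v > currentVersion)
    .map(v => migrations[v]);
}
```

Each returned statement would be executed by a database node, followed by the `UPDATE app_settings` bump for that version.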

 

Step 11: Working with Multiple Databases

 

A. Synchronizing Data Between Databases

To copy data from one database to another:

  1. Start with a trigger (Schedule or Webhook).
  2. Add the source database node (e.g., MySQL) to retrieve data:

SELECT id, name, email, updated_at FROM customers 
WHERE updated_at > '{{$json["last_sync_time"]}}'
  3. Add a Function node to transform data if needed:

return $input.all().map(item => ({
  json: {
    customer_id: item.json.id,
    full_name: item.json.name,
    email_address: item.json.email,
    last_modified: item.json.updated_at
  }
}));
  4. Add the target database node (e.g., PostgreSQL) to insert/update data:

INSERT INTO customer_mirror (customer_id, full_name, email_address, last_modified)
VALUES ({{$json["customer_id"]}}, '{{$json["full_name"]}}', '{{$json["email_address"]}}', '{{$json["last_modified"]}}')
ON CONFLICT (customer_id) 
DO UPDATE SET full_name = EXCLUDED.full_name, 
              email_address = EXCLUDED.email_address,
              last_modified = EXCLUDED.last_modified
  5. Add a final database node to update the last sync time.
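The final node in step 5 can be a simple UPDATE against a small bookkeeping table (the `sync_state` table and `job` column here are assumptions; any single-row settings table works):

```sql
UPDATE sync_state
SET last_sync_time = NOW()
WHERE job = 'customer_mirror';
```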

 

Step 12: Implementing Database Security Best Practices

 

A. Managing Credentials

Store database credentials securely:

  1. In n8n, go to Settings → Credentials.
  2. Create new credentials for your database type.
  3. Fill in the connection details and save.
  4. Reference these credentials in your workflow nodes instead of hardcoding them.

B. Implementing Least Privilege

Create database users with minimal required permissions:


-- MySQL example for read-only user
CREATE USER 'n8n_readonly'@'%' IDENTIFIED BY 'strong-password';
GRANT SELECT ON database_name.* TO 'n8n_readonly'@'%';
FLUSH PRIVILEGES;

-- PostgreSQL example for specific table access
CREATE USER n8n_app WITH PASSWORD 'strong-password';
GRANT CONNECT ON DATABASE my_database TO n8n_app;
GRANT SELECT, INSERT, UPDATE ON table_name TO n8n_app;

C. Using Parameterized Queries

Always use parameterized queries to prevent SQL injection:


-- Instead of:
SELECT * FROM users WHERE username = '{{$json["username"]}}'

-- Use:
SELECT * FROM users WHERE username = $1

With query parameters set to:


[
  "{{$json["username"]}}"
]

 

Step 13: Optimizing Database Performance

 

A. Implementing Connection Pooling

For high-volume workflows, set up connection pooling:

  1. For PostgreSQL, modify your connection string:

{
  "host": "your-postgres-server",
  "port": 5432,
  "database": "your-database",
  "user": "your-username",
  "password": "your-password",
  "ssl": false,
  "poolOptions": {
    "min": 2,
    "max": 10,
    "idleTimeoutMillis": 30000
  }
}
  1. For MySQL:

{
  "host": "your-mysql-server",
  "port": 3306,
  "database": "your-database",
  "user": "your-username",
  "password": "your-password",
  "ssl": false,
  "connectionLimit": 10
}

B. Query Optimization

Optimize database queries for performance:

  1. Select only needed columns instead of using SELECT *
  2. Add appropriate indexes to your database tables
  3. Use LIMIT to restrict result sets
  4. Consider using database views for complex queries

-- Create an optimized view
CREATE VIEW active_customers AS
SELECT customer_id, name, email
FROM customers
WHERE status = 'active'

Then in n8n:


SELECT * FROM active_customers LIMIT 100

 

Step 14: Implementing Data Validation and Error Handling

 

A. Validate Data Before Database Operations

Use a Function node to validate data:


// Data validation example
const data = $input.item.json;
const errors = [];

if (!data.email || !data.email.includes('@')) {
  errors.push('Invalid email address');
}

if (!data.name || data.name.length < 2) {
  errors.push('Name is too short');
}

if (errors.length > 0) {
  return { valid: false, errors };
} else {
  return { valid: true, data };
}

B. Handling Database Errors

  1. Add an Error Trigger node to catch database errors.
  2. Connect it to notification nodes (e.g., Slack, Email) to alert about failures.
  3. Implement retry mechanisms for transient errors:

// Function node to implement retry logic
const maxRetries = 3;
const retryCount = $input.item.json.retryCount || 0;

if ($input.item.json.error && retryCount < maxRetries) {
  // Wait with exponential backoff
  const waitTime = Math.pow(2, retryCount) * 1000;
  
  // Sleep and then retry
  return new Promise((resolve) => {
    setTimeout(() => {
      resolve([{
        ...($input.item.json.originalData || {}),
        retryCount: retryCount + 1
      }]);
    }, waitTime);
  });
} else if (retryCount >= maxRetries) {
  // Max retries reached, log and alert
  return [{ 
    status: 'failed',
    error: $input.item.json.error,
    message: 'Max retries exceeded'
  }];
} else {
  // No error, continue normally
  return $input.item;
}
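The exponential backoff used above doubles the wait on every attempt. Isolated as a helper, it is just:

```javascript
// Wait 2^retryCount seconds, expressed in milliseconds (as in the retry logic above)
function backoffMs(retryCount) {
  return Math.pow(2, retryCount) * 1000;
}
// backoffMs(0) -> 1000, backoffMs(1) -> 2000, backoffMs(2) -> 4000
```

Capping the result (e.g., `Math.min(backoffMs(n), 30000)`) is a common refinement so retries never wait unboundedly long.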

 

Step 15: Creating a Complete Database CRUD Workflow

 

Let's create a complete workflow for managing a "products" table:

A. Setting Up the API Endpoints

  1. Create a new workflow with a Webhook node configured for:
  • GET: List/retrieve products
  • POST: Create a product
  • PUT: Update a product
  • DELETE: Delete a product

// Webhook configuration
{
  "path": "products",
  "responseMode": "lastNode",
  "options": {
    "allowedMethods": ["GET", "POST", "PUT", "DELETE"]
  }
}

B. Implementing the CRUD Operations

  1. Add a Switch node to handle different HTTP methods:

// Switch based on request method
{
  "rules": [
    {
      "value": "GET",
      "conditions": [{
        "value1": "={{$node["Webhook"].json["method"]}}",
        "operation": "equals",
        "value2": "GET"
      }]
    },
    {
      "value": "POST",
      "conditions": [{
        "value1": "={{$node["Webhook"].json["method"]}}",
        "operation": "equals",
        "value2": "POST"
      }]
    },
    {
      "value": "PUT",
      "conditions": [{
        "value1": "={{$node["Webhook"].json["method"]}}",
        "operation": "equals",
        "value2": "PUT"
      }]
    },
    {
      "value": "DELETE",
      "conditions": [{
        "value1": "={{$node["Webhook"].json["method"]}}",
        "operation": "equals",
        "value2": "DELETE"
      }]
    }
  ]
}
  2. For GET requests, add a Database node:

// If query parameter id exists, get one product
{{$node["Webhook"].json["query"]["id"] !== undefined ? 
  `SELECT * FROM products WHERE id = ${$node["Webhook"].json["query"]["id"]}` : 
  "SELECT * FROM products ORDER BY id"}}
  3. For POST requests, add a Function node for validation:

const data = $node["Webhook"].json["body"];
const required = ["name", "price", "category"];
const errors = [];

for (const field of required) {
  if (!data[field]) {
    errors.push(`Missing required field: ${field}`);
  }
}

if (errors.length > 0) {
  return { valid: false, errors };
}

return { 
  valid: true, 
  data: {
    name: data.name,
    price: parseFloat(data.price),
    category: data.category,
    description: data.description || null
  }
};
  4. Add a database node for insertion:

INSERT INTO products (name, price, category, description)
VALUES (
  '{{$node["Validation"].json["data"]["name"]}}',
  {{$node["Validation"].json["data"]["price"]}},
  '{{$node["Validation"].json["data"]["category"]}}',
  {{$node["Validation"].json["data"]["description"] ? "'" + $node["Validation"].json["data"]["description"] + "'" : "NULL"}}
)
RETURNING *
  5. For PUT requests, add validation and update nodes:

UPDATE products
SET 
  name = '{{$node["Update_Validation"].json["data"]["name"]}}',
  price = {{$node["Update_Validation"].json["data"]["price"]}},
  category = '{{$node["Update_Validation"].json["data"]["category"]}}',
  description = {{$node["Update_Validation"].json["data"]["description"] ? "'" + $node["Update_Validation"].json["data"]["description"] + "'" : "NULL"}}
WHERE id = {{$node["Webhook"].json["query"]["id"]}}
RETURNING *
  6. For DELETE requests, add a database node:

DELETE FROM products
WHERE id = {{$node["Webhook"].json["query"]["id"]}}
RETURNING id
  7. Add a final Set node to format the response:

// Format successful response
{
  "statusCode": 200,
  "data": {{$json}},
  "success": true
}

 

Step 16: Implementing Database Backup with N8n

 

Create an automated backup workflow:

A. Setting Up Regular Backups

  1. Start with a Schedule trigger node (e.g., daily at midnight).
  2. Add an Execute Command node for PostgreSQL backup:

# For PostgreSQL
PGPASSWORD=your-password pg_dump -h your-postgres-server -U your-username -d your-database -F c -f /backups/backup-{{$today}}.dump

Or for MySQL:


# For MySQL
mysqldump -h your-mysql-server -u your-username -p'your-password' your-database > /backups/backup-{{$today}}.sql
  3. Add an error handler to notify of backup failures.

B. Uploading Backups to Cloud Storage

  1. Add an S3 or other storage node to upload the backup:

// S3 Upload configuration
{
  "operation": "upload",
  "bucket": "your-backup-bucket",
  "fileName": "database-backups/backup-{{$today}}.sql",
  "filePath": "/backups/backup-{{$today}}.sql"
}
  2. Add a cleanup node to remove local backups after successful upload:

# Cleanup old backups
find /backups -name "*.sql" -mtime +7 -delete

 

Step 17: Monitoring Database Health with N8n

 

Create a workflow to monitor database health:

A. Checking Database Connection

  1. Start with a Schedule trigger (e.g., every 5 minutes).
  2. Add a database node with a simple query:

-- Health check query
SELECT 1 as health_check
  3. Add an IF node to check if the query was successful.
  4. On failure, send notifications via email, Slack, etc.

B. Monitoring Database Metrics

  1. Add database nodes to collect performance metrics:

-- PostgreSQL metrics
SELECT 
  count(*) as active_connections,
  max(now() - query_start) as longest_running_query_seconds
FROM pg_stat_activity
WHERE state = 'active'
  2. Add a Function node to evaluate metrics:

// Check if metrics are within acceptable ranges
const metrics = $input.item.json;
const alerts = [];

if (metrics.active_connections > 100) {
  alerts.push(`High connection count: ${metrics.active_connections}`);
}

if (metrics.longest_running_query_seconds > 300) {
  alerts.push(`Long-running query: ${Math.round(metrics.longest_running_query_seconds)}s`);
}

if (alerts.length > 0) {
  return { 
    hasAlerts: true, 
    alerts: alerts.join(', '),
    metrics
  };
} else {
  return { hasAlerts: false, metrics };
}
  3. Add an IF node to send alerts only when needed.

 

Conclusion and Best Practices

 

Step 18: Optimizing N8n Database Workflows

 

A. Performance Best Practices

  • Use batching for large data sets to reduce database load
  • Implement proper indexes on database tables
  • Use connection pooling for high-volume workflows
  • Schedule resource-intensive operations during off-peak hours
  • Implement caching using Redis for frequently accessed data
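The batching suggestion above can be implemented with a small Function-node helper that splits a large result set into fixed-size chunks before each insert (the chunk size of 2 below is only for illustration; hundreds of rows per batch is more typical):

```javascript
// Split an array of rows into batches of at most `size` rows each
function chunk(rows, size) {
  const batches = [];
  for (let i = 0; i < rows.length; i += size) {
    batches.push(rows.slice(i, i + size));
  }
  return batches;
}
```

Each batch can then be written with a single multi-row INSERT, reducing round trips to the database.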

B. Security Best Practices

  • Use dedicated database users with least privilege permissions
  • Store credentials securely in n8n's credential store
  • Always use parameterized queries to prevent SQL injection
  • Implement proper input validation before database operations
  • Use SSL/TLS for database connections when possible
  • Regularly audit database access patterns

C. Reliability Best Practices

  • Implement comprehensive error handling
  • Use retries with exponential backoff for transient failures
  • Set up monitoring and alerting for database workflows
  • Implement database transaction management for multi-step operations
  • Create regular automated backups
  • Test recovery procedures periodically

By following this guide, you can effectively integrate n8n with your self-hosted databases to create powerful, secure, and reliable workflow automations that leverage your data for business processes.
