
How to store workflow data in n8n?

Learn how to store workflow data in n8n using workflow static data, key-value Set/Get storage, binary files, databases, Redis caching, and best practices for secure, persistent, and organized automation data management.

Matt Graham, CEO of Rapid Developers



To store workflow data in n8n, you can use several approaches, including workflow static data, binary file storage, access to previous execution data, and integrations with external databases and caches. These methods let you persist data between workflow runs, share information across different workflows, and maintain state during complex automation processes.

 

Comprehensive Guide to Storing Workflow Data in n8n

 

Step 1: Understanding Data Storage Options in n8n

 

Before diving into implementation, it's important to understand the different data storage options available in n8n:

  • Workflow static data: key-value data attached to the workflow via $getWorkflowStaticData(), persisted across production executions
  • Binary data storage: store binary files like images or documents
  • Execution data: access previous execution results
  • n8n nodes for data shaping and logic: such as the Set (Edit Fields) and Function/Code nodes
  • External database integration: connect to databases like MySQL, PostgreSQL, or MongoDB

Each option has specific use cases depending on your data persistence needs.

 

Step 2: Using Workflow Static Data for Simple Data Storage

 

Workflow static data is the simplest built-in way to store and reuse data in a workflow. Note that n8n does not expose a writable $workflow.variables object; the supported mechanism is $getWorkflowStaticData(), and changes to static data are only saved for production (active) executions, not manual test runs.


// In a Function (Code) node, store values in workflow static data
const staticData = $getWorkflowStaticData('global');

// Set values
staticData.myData = 'This is stored data';
staticData.counter = 1;

// Pass the input items through to continue the workflow
return items;

To access these values later in the workflow:


// In another Function (Code) node, read the stored values
const staticData = $getWorkflowStaticData('global');
const storedData = staticData.myData;
const counter = staticData.counter;

// Attach the values to the outgoing item
items[0].json.retrievedData = storedData;
items[0].json.currentCount = counter;

return items;

Remember that static data is shared across executions of the same workflow and is only persisted when the workflow runs in production; values written during manual test executions are discarded when the execution ends.
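For example, here is a minimal counter that increments on every production run (a sketch assuming the workflow is active and triggered in production):

// Function (Code) node: count production executions
const staticData = $getWorkflowStaticData('global');
staticData.runCount = (staticData.runCount || 0) + 1;

items[0].json.runCount = staticData.runCount;
return items;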

 

Step 3: Using the Set and Get Nodes for Persistent Storage

 

For data that needs to persist between workflow executions, key-value "Set" and "Get" storage nodes can be used. Note that the stock Set node (now called "Edit Fields") only shapes item data and does not persist anything; the key/TTL behavior described below requires a community key-value storage node, an external store such as Redis (see Step 7), or the static-data approach from Step 2.

To store data:

  1. Add a "Set" node to your workflow
  2. Configure it with the following settings:
  • Key: A unique identifier for your data (e.g., "customerData")
  • Value: The data you want to store (can be text, numbers, or JSON)
  • TTL (Time to Live): How long to keep the data (optional)

To retrieve data:

  1. Add a "Get" node to your workflow
  2. Configure it with the following settings:
  • Key: The same identifier you used in the Set node
  • Default Value: What to return if the key doesn't exist (optional)

Example configuration for Set node:


// This would be configured in the Set node interface
Key: customerData
Value: {{ $json.customer }}
TTL: 3600 // Store for 1 hour (in seconds)

Example configuration for Get node:


// This would be configured in the Get node interface
Key: customerData
Default Value: {{ {"name": "Unknown", "id": 0} }}
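If no dedicated storage node is installed, a Function (Code) node can emulate the same key/TTL contract on top of workflow static data. A minimal sketch (the kvSet/kvGet helpers and the kv_ key prefix are illustrative, not an n8n API):

// Function (Code) node: key/value storage with TTL on workflow static data
const staticData = $getWorkflowStaticData('global');

function kvSet(key, value, ttlSeconds) {
  staticData[`kv_${key}`] = {
    value: value,
    expires: ttlSeconds ? Date.now() + ttlSeconds * 1000 : null
  };
}

function kvGet(key, defaultValue = null) {
  const entry = staticData[`kv_${key}`];
  if (!entry) return defaultValue;
  if (entry.expires && entry.expires < Date.now()) {
    // Expired: clean up and fall back to the default
    delete staticData[`kv_${key}`];
    return defaultValue;
  }
  return entry.value;
}

kvSet('customerData', items[0].json.customer, 3600); // keep for 1 hour
items[0].json.customerData = kvGet('customerData', { name: 'Unknown', id: 0 });
return items;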

 

Step 4: Storing Binary Data with the Binary File Nodes

 

For files and binary data:

  1. Use a Function node to prepare binary data for the "Write Binary File" node:

// In a Function node preparing data for Write Binary File
const content = 'This is the content of my file';

return [{
  json: {
    filename: '/tmp/example.txt'
  },
  binary: {
    data: {
      data: Buffer.from(content).toString('base64'),
      mimeType: 'text/plain',
      fileName: 'example.txt'
    }
  }
}];

  2. Configure the "Write Binary File" node:
  • File Name: {{ $json.filename }}
  • Property Name: data (the binary property created above)

  3. To retrieve the data, use the "Read Binary File" node with the same file path.

 

Step 5: Working with Database Nodes for Structured Data Storage

 

For more complex data storage needs, connect to a database:

  1. PostgreSQL example:

First, add the PostgreSQL node and configure the connection:


// Configuration in PostgreSQL node
Operation: Insert
Table: user_data
Columns: id, name, email, data
Return Fields: id

Then, prepare your data in a Function node:


// In a Function node before PostgreSQL
const userData = {
  id: 1001,
  name: 'John Doe',
  email: 'john.doe@example.com',
  data: JSON.stringify({
    preferences: {
      theme: 'dark',
      notifications: true
    },
    lastLogin: new Date().toISOString()
  })
};

return [{ json: userData }];
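If the same record may be written more than once, an idempotent upsert avoids duplicate-key errors. The PostgreSQL node's Execute Query operation can run one directly (parameter wiring varies by node version, so treat this as a sketch):

// Configuration in PostgreSQL node
Operation: Execute Query
Query: INSERT INTO user_data (id, name, email, data)
       VALUES ($1, $2, $3, $4)
       ON CONFLICT (id) DO UPDATE
       SET name = EXCLUDED.name, email = EXCLUDED.email, data = EXCLUDED.data;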
  2. MongoDB example:

Similarly for MongoDB:


// Configuration in MongoDB node
Operation: Insert
Collection: workflow_data
Fields: {
  "id": "={{ $json.id }}",
  "workflowName": "={{ $workflow.name }}",
  "data": "={{ $json.dataToStore }}",
  "timestamp": "={{ $now.toISOString() }}"
}
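Because MongoDB stores whatever JSON it receives, it helps to normalize the payload first. A minimal Function-node sketch feeding the mapping above (the field names are the ones that mapping assumes):

// Function node before MongoDB: ensure the payload is plain JSON
const raw = items[0].json;

return [{
  json: {
    id: raw.id,
    // A JSON round-trip drops undefined values and functions
    dataToStore: JSON.parse(JSON.stringify(raw.dataToStore || {}))
  }
}];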

 

Step 6: Using the Function Node for Custom Storage Logic

 

For complex storage scenarios, create custom logic with the Function node:


// In a Function node - Custom storage logic with error handling
const dataToStore = items[0].json.dataToStore;
const uniqueId = items[0].json.id || Date.now().toString();

// Custom storage logic
try {
  // Store in workflow static data (persisted for production executions)
  const staticData = $getWorkflowStaticData('global');
  staticData[`data_${uniqueId}`] = dataToStore;
  
  // For long-term storage, you would typically use a database or Set node
  // This is just an example of custom logic
  
  // Add metadata
  const storedData = {
    id: uniqueId,
    data: dataToStore,
    storedAt: new Date().toISOString(),
    workflowId: $workflow.id,
    workflowName: $workflow.name
  };
  
  // Return success response
  return [{
    json: {
      success: true,
      message: 'Data stored successfully',
      dataId: uniqueId,
      storedData: storedData
    }
  }];
} catch (error) {
  // Return error response
  return [{
    json: {
      success: false,
      message: `Error storing data: ${error.message}`,
      error: error.toString()
    }
  }];
}

 

Step 7: Implementing a Cache with Redis Node

 

For high-performance caching of workflow data:

  1. Add the Redis node and configure your connection.
  2. Store data with Redis SET operation:

// Redis node configuration
Operation: Set
Key: workflow:{{ $workflow.id }}:data:{{ $json.key }}
Value: {{ $json.dataToCache }}
Expire: 3600  // Optional TTL in seconds
  3. Retrieve data with Redis GET operation:

// Redis node configuration
Operation: Get
Key: workflow:{{ $workflow.id }}:data:{{ $json.key }}
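Redis stores strings, so serialize objects before the SET and parse them after the GET. A small Function-node sketch for either side of the Redis nodes (the property name on the GET output can vary by node version, so adjust as needed):

// Function node before the Redis SET: serialize the object
items[0].json.dataToCache = JSON.stringify(items[0].json.dataToCache);
return items;

// Function node after the Redis GET: parse the cached string
const cached = items[0].json.value;
items[0].json.parsed = cached ? JSON.parse(cached) : null;
return items;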

 

Step 8: Creating a Data Versioning System

 

For tracking changes to your workflow data:


// Function node for versioned data storage
const data = items[0].json.data;
const dataKey = items[0].json.key || 'default';

// Get current timestamp for versioning
const timestamp = Date.now();
const versionId = `v_${timestamp}`;

// Create versioned data structure
const versionedData = {
  key: dataKey,
  currentVersion: versionId,
  versions: {
    [versionId]: {
      data: data,
      timestamp: timestamp,
      createdAt: new Date().toISOString(),
      workflowId: $workflow.id,
      executionId: $execution.id
    }
  },
  history: [{
    versionId: versionId,
    timestamp: timestamp,
    operation: 'create'
  }]
};

// Get existing data if it exists
// This would typically involve a Get node or database query before this Function node
// For this example, workflow static data serves as the backing store
const staticData = $getWorkflowStaticData('global');
if (staticData[`versioned_${dataKey}`]) {
  const existingData = staticData[`versioned_${dataKey}`];
  
  // Update with new version
  versionedData.versions = {
    ...existingData.versions,
    [versionId]: versionedData.versions[versionId]
  };
  
  // Update history
  versionedData.history = [
    ...existingData.history,
    versionedData.history[0]
  ];
  
  // Preserve creation info
  versionedData.createdAt = existingData.createdAt;
  versionedData.createdBy = existingData.createdBy;
}

// Store the versioned data
staticData[`versioned_${dataKey}`] = versionedData;

// Return the updated versioned data
return [{
  json: {
    success: true,
    key: dataKey,
    currentVersion: versionId,
    versionedData: versionedData
  }
}];

 

Step 9: Working with Execution Data Storage

 

n8n allows you to access data from previous workflow executions:

  1. Enable saving of execution data in your n8n instance settings (or the corresponding environment variables)
  2. Use the "n8n" node (Resource: Execution) to fetch previous execution data:

// Configuration in the n8n node
Resource: Execution
Operation: Get Many
Workflow: the current workflow
Limit: 5
  3. Process the execution data in a Function node:

// In a Function node after the n8n node (each execution arrives as an item)
const executions = items.map(item => item.json);

// Extract data from previous executions
const previousData = executions.map(execution => {
  // Find the output from a specific node
  const nodeOfInterest = "My Node Name";
  const nodeData = execution.data.resultData.runData[nodeOfInterest];
  
  if (nodeData && nodeData.length > 0 && nodeData[0].data) {
    return {
      executionId: execution.id,
      timestamp: execution.startedAt,
      data: nodeData[0].data.main[0][0].json
    };
  }
  return null;
}).filter(item => item !== null);

// Return the processed data
return [{ json: { previousExecutionData: previousData } }];

 

Step 10: Creating a Data Storage Abstraction Layer

 

For more flexibility, create a reusable data storage abstraction:


// Function node implementing a storage abstraction layer
// Backing store for the 'workflow' type is workflow static data
const staticData = $getWorkflowStaticData('global');

function DataStorage() {
  // Storage operations
  this.save = function(key, data, options = {}) {
    const storageType = options.type || 'workflow'; // 'workflow', 'database', 'redis', etc.
    const ttl = options.ttl || null;
    
    switch (storageType) {
      case 'workflow':
        // Store in workflow static data
        staticData[`storage_${key}`] = {
          data: data,
          storedAt: Date.now(),
          expires: ttl ? Date.now() + (ttl * 1000) : null
        };
        return true;
        
      case 'global':
        // This would require the Set node in practice
        // For demonstration purposes only
        console.log(`Storing ${key} in global storage`);
        return true;
        
      case 'database':
        // This would require a database node in practice
        // For demonstration purposes only
        console.log(`Storing ${key} in database`);
        return true;
        
      default:
        throw new Error(`Unknown storage type: ${storageType}`);
    }
  };
  
  this.get = function(key, options = {}) {
    const storageType = options.type || 'workflow';
    const defaultValue = options.default || null;
    
    switch (storageType) {
      case 'workflow':
        const storedData = staticData[`storage_${key}`];
        
        // Check if data exists
        if (!storedData) return defaultValue;
        
        // Check if data has expired
        if (storedData.expires && storedData.expires < Date.now()) {
          // Data expired, clean up and return default
          delete staticData[`storage_${key}`];
          return defaultValue;
        }
        
        return storedData.data;
        
      case 'global':
        // This would require the Get node in practice
        console.log(`Retrieving ${key} from global storage`);
        return defaultValue;
        
      case 'database':
        // This would require a database node in practice
        console.log(`Retrieving ${key} from database`);
        return defaultValue;
        
      default:
        throw new Error(`Unknown storage type: ${storageType}`);
    }
  };
  
  this.delete = function(key, options = {}) {
    const storageType = options.type || 'workflow';
    
    switch (storageType) {
      case 'workflow':
        if (staticData[`storage_${key}`]) {
          delete staticData[`storage_${key}`];
          return true;
        }
        return false;
        
      case 'global':
        // Would require additional nodes in practice
        console.log(`Deleting ${key} from global storage`);
        return true;
        
      case 'database':
        // Would require additional nodes in practice
        console.log(`Deleting ${key} from database`);
        return true;
        
      default:
        throw new Error(`Unknown storage type: ${storageType}`);
    }
  };
}

// Initialize storage
const storage = new DataStorage();

// Example usage
try {
  // Store data
  storage.save('userPreferences', {
    theme: 'dark',
    language: 'en',
    notifications: true
  }, { ttl: 3600 }); // Store for 1 hour
  
  // Retrieve data
  const preferences = storage.get('userPreferences', { default: { theme: 'light' } });
  
  // Process data
  const result = {
    success: true,
    message: 'Data operation successful',
    currentPreferences: preferences
  };
  
  // Return result
  return [{ json: result }];
} catch (error) {
  return [{
    json: {
      success: false,
      message: `Storage operation failed: ${error.message}`
    }
  }];
}

 

Step 11: Working with Multi-Environment Data Storage

 

For workflows that run in different environments (dev, staging, prod):


// Function node for environment-aware storage
// This assumes an N8N_ENVIRONMENT environment variable is set for the instance;
// $env is n8n's accessor for environment variables (access can be blocked
// by the N8N_BLOCK_ENV_ACCESS_IN_NODE setting)

const environment = $env.N8N_ENVIRONMENT || 'development';
const key = items[0].json.key;
const data = items[0].json.data;

// Create environment-specific key
const envKey = `${environment}:${key}`;

// Store with environment prefix in workflow static data
const staticData = $getWorkflowStaticData('global');
staticData[envKey] = {
  data: data,
  environment: environment,
  timestamp: Date.now()
};

// Return confirmation
return [{
  json: {
    success: true,
    message: `Data stored for ${environment} environment`,
    key: envKey
  }
}];

To retrieve environment-specific data:


// Function node for environment-aware retrieval
const environment = $env.N8N_ENVIRONMENT || 'development';
const key = items[0].json.key;

// Create environment-specific key
const envKey = `${environment}:${key}`;

// Retrieve data from workflow static data
const staticData = $getWorkflowStaticData('global');
const storedData = staticData[envKey];

if (storedData) {
  return [{
    json: {
      success: true,
      environment: environment,
      data: storedData.data,
      metadata: {
        storedAt: new Date(storedData.timestamp).toISOString(),
        environment: storedData.environment
      }
    }
  }];
} else {
  return [{
    json: {
      success: false,
      message: `No data found for key "${key}" in ${environment} environment`
    }
  }];
}

 

Step 12: Implementing a JSON File Storage System

 

For storing data in local JSON files:


// Function node to prepare data for Write Binary File
const dataToStore = items[0].json.dataToStore;
const filename = items[0].json.filename || 'workflow-data.json';

// Prepare data as JSON string
const jsonData = JSON.stringify(dataToStore, null, 2);

// Create binary data for file writing
const newItem = {
  json: {
    filename: filename,
    dataSize: jsonData.length
  },
  binary: {
    data: {
      data: Buffer.from(jsonData).toString('base64'),
      mimeType: 'application/json',
      fileName: filename
    }
  }
};

return [newItem];

Then, use a "Write Binary File" node to save the file to disk.

To read the data back:


// Function node after Read Binary File to process the data
if (items[0].binary && items[0].binary.data) {
  // Convert binary data back to string
  const binaryData = items[0].binary.data;
  const fileContent = Buffer.from(binaryData.data, 'base64').toString();
  
  try {
    // Parse JSON content
    const parsedData = JSON.parse(fileContent);
    
    // Return parsed data
    return [{
      json: {
        success: true,
        data: parsedData,
        source: binaryData.fileName
      }
    }];
  } catch (error) {
    return [{
      json: {
        success: false,
        message: `Failed to parse JSON: ${error.message}`,
        fileContent: fileContent.substring(0, 100) + '...' // First 100 chars for debugging
      }
    }];
  }
} else {
  return [{
    json: {
      success: false,
      message: 'No binary data found'
    }
  }];
}

 

Step 13: Creating a Metadata System for Stored Data

 

Enhance your data storage with metadata tracking:


// Function node for metadata-enhanced storage
const data = items[0].json.data;
const key = items[0].json.key || 'default';

// Create metadata
const metadata = {
  key: key,
  createdAt: new Date().toISOString(),
  createdBy: items[0].json.userId || 'system',
  workflowId: $workflow.id,
  executionId: $execution.id,
  dataType: typeof data,
  size: JSON.stringify(data).length,
  checksum: calculateChecksum(data), // This is a placeholder function
  tags: items[0].json.tags || [],
  description: items[0].json.description || ''
};

// Create storage object with data and metadata
const storageObject = {
  data: data,
  metadata: metadata
};

// Store the data in workflow static data
const staticData = $getWorkflowStaticData('global');
staticData[`data_${key}`] = storageObject;

// Return the metadata
return [{
  json: {
    success: true,
    key: key,
    metadata: metadata
  }
}];

// Placeholder function - in a real scenario, you would implement a proper checksum algorithm
function calculateChecksum(data) {
  return 'checksum-' + Math.random().toString(36).substring(2, 10);
}
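A real checksum can be computed with Node's built-in crypto module. In the Function/Code node, built-in modules must be allowed via the NODE_FUNCTION_ALLOW_BUILTIN environment variable (e.g. NODE_FUNCTION_ALLOW_BUILTIN=crypto) — a sketch under that assumption:

// Function node: SHA-256 checksum of the payload
const crypto = require('crypto');

function calculateChecksum(data) {
  return crypto
    .createHash('sha256')
    .update(JSON.stringify(data))
    .digest('hex');
}

items[0].json.checksum = calculateChecksum(items[0].json.data);
return items;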

 

Step 14: Implementing a Data Sharing Mechanism Between Workflows

 

To share data between different workflows:

  1. First, create a data storage workflow:

// Function node in a dedicated storage workflow
// This workflow would be triggered by other workflows via a Webhook node

// Determine operation from input
const operation = $input.first().json.operation || 'get';
const key = $input.first().json.key;
const data = $input.first().json.data;
const options = $input.first().json.options || {};

// Storage operations
switch (operation.toLowerCase()) {
  case 'set':
    // Store data using the Set node after this Function node
    return [{
      json: {
        key: key,
        value: data,
        ttl: options.ttl || 0
      }
    }];
    
  case 'get':
    // Will use a Get node after this Function node
    return [{
      json: {
        key: key,
        defaultValue: options.default || null
      }
    }];
    
  case 'delete':
    // Will use a Set node with null value after this Function node
    return [{
      json: {
        key: key,
        value: null,
        ttl: 1 // Very short TTL to effectively delete
      }
    }];
    
  default:
    throw new Error(`Unknown operation: ${operation}`);
}
  2. Then, in any workflow that needs to store or retrieve data:

// HTTP Request node configuration to call the storage workflow
Method: POST
URL: https://your-n8n-instance.com/webhook/storage-workflow
Body Content Type: JSON
Body:
{
  "operation": "set",
  "key": "shared-data-key",
  "data": {{ $json.dataToShare }},
  "options": {
    "ttl": 86400
  }
}
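Retrieval works the same way; only the request body changes (the response shape depends on how the storage workflow formats its output):

// HTTP Request node configuration to read shared data
Method: POST
URL: https://your-n8n-instance.com/webhook/storage-workflow
Body Content Type: JSON
Body:
{
  "operation": "get",
  "key": "shared-data-key",
  "options": {
    "default": null
  }
}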

 

Step 15: Securing Sensitive Data in Storage

 

To securely store sensitive information:


// Function node for secure data storage
const sensitiveData = items[0].json.sensitiveData;
const key = items[0].json.key;

// In a production environment, you would use proper encryption
// This is a simplified example for demonstration purposes
function encryptData(data, encryptionKey) {
  // This is NOT real encryption - just a placeholder
  // In production, use proper cryptographic libraries
  const serializedData = JSON.stringify(data);
  return {
    data: Buffer.from(serializedData).toString('base64'),
    encryptedAt: new Date().toISOString(),
    encryptionMethod: 'aes-256' // This would be the actual method used
  };
}

// "Encrypt" the data
const encryptedData = encryptData(sensitiveData, 'encryption-key');

// Store encrypted data in workflow static data
const staticData = $getWorkflowStaticData('global');
staticData[`secure_${key}`] = encryptedData;

// Return confirmation without the actual data
return [{
  json: {
    success: true,
    message: 'Data encrypted and stored securely',
    key: key,
    metadata: {
      encryptedAt: encryptedData.encryptedAt,
      method: encryptedData.encryptionMethod
    }
  }
}];

To retrieve and decrypt:


// Function node to retrieve and decrypt data
const key = items[0].json.key;

// Get encrypted data from workflow static data
const staticData = $getWorkflowStaticData('global');
const encryptedData = staticData[`secure_${key}`];

if (!encryptedData) {
  return [{
    json: {
      success: false,
      message: `No encrypted data found for key: ${key}`
    }
  }];
}

// Decrypt function (placeholder - use proper decryption in production)
function decryptData(encryptedData, encryptionKey) {
  try {
    const decodedString = Buffer.from(encryptedData.data, 'base64').toString();
    return JSON.parse(decodedString);
  } catch (error) {
    throw new Error(`Decryption failed: ${error.message}`);
  }
}

try {
  // "Decrypt" the data
  const decryptedData = decryptData(encryptedData, 'encryption-key');
  
  // Return decrypted data
  return [{
    json: {
      success: true,
      data: decryptedData,
      metadata: {
        encryptedAt: encryptedData.encryptedAt,
        decryptedAt: new Date().toISOString()
      }
    }
  }];
} catch (error) {
  return [{
    json: {
      success: false,
      message: error.message
    }
  }];
}
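For actual encryption, Node's crypto module provides AES-256-GCM. As in the checksum example, this assumes built-in modules are allowed via NODE_FUNCTION_ALLOW_BUILTIN=crypto, and that a 32-byte hex key is supplied in an ENCRYPTION_KEY environment variable (an assumption for this sketch, not an n8n default):

// Function node: AES-256-GCM encryption/decryption sketch
const crypto = require('crypto');

// 32-byte key, hex-encoded in the ENCRYPTION_KEY environment variable
const key = Buffer.from($env.ENCRYPTION_KEY, 'hex');

function encrypt(data) {
  const iv = crypto.randomBytes(12); // standard GCM nonce size
  const cipher = crypto.createCipheriv('aes-256-gcm', key, iv);
  const ciphertext = Buffer.concat([
    cipher.update(JSON.stringify(data), 'utf8'),
    cipher.final()
  ]);
  return {
    iv: iv.toString('base64'),
    tag: cipher.getAuthTag().toString('base64'),
    data: ciphertext.toString('base64')
  };
}

function decrypt(payload) {
  const decipher = crypto.createDecipheriv(
    'aes-256-gcm',
    key,
    Buffer.from(payload.iv, 'base64')
  );
  decipher.setAuthTag(Buffer.from(payload.tag, 'base64'));
  const plaintext = Buffer.concat([
    decipher.update(Buffer.from(payload.data, 'base64')),
    decipher.final()
  ]);
  return JSON.parse(plaintext.toString('utf8'));
}

const encrypted = encrypt(items[0].json.sensitiveData);
items[0].json.encrypted = encrypted;
items[0].json.roundTrip = decrypt(encrypted); // sanity check
return items;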

 

Step 16: Implementing Data Storage with Webhooks

 

Create a webhook-based storage system:

  1. Create a workflow with a Webhook node configured as follows:
  • Authentication: Header Auth
  • HTTP Method: POST
  • Path: /store-data
  • Response Mode: Last Node
  2. Add a Function node after the Webhook to process the storage request:

// Function node after Webhook node
const operation = items[0].json.operation || 'set';
const key = items[0].json.key;
const data = items[0].json.data;
const apiKey = items[0].json.headers['x-api-key'];

// Validate API key (you would implement proper validation)
if (apiKey !== 'your-secure-api-key') {
  return [{
    json: {
      success: false,
      message: 'Invalid API key'
    }
  }];
}

// Process operation
switch (operation) {
  case 'set':
    // Use Set node after this
    return [{
      json: {
        key: key,
        value: data
      }
    }];
    
  case 'get':
    // Use Get node after this
    return [{
      json: {
        key: key
      }
    }];
    
  default:
    return [{
      json: {
        success: false,
        message: `Unknown operation: ${operation}`
      }
    }];
}
  3. Add Set or Get nodes depending on the operation.

  4. Add a final Function node to format the response:


// Final Function node before response
const operation = items[0].json.operation;
const key = items[0].json.key;

// Format response based on operation
if (operation === 'get') {
  const retrievedData = items[0].json.value;
  
  return [{
    json: {
      success: true,
      operation: 'get',
      key: key,
      data: retrievedData,
      timestamp: Date.now()
    }
  }];
} else if (operation === 'set') {
  return [{
    json: {
      success: true,
      operation: 'set',
      key: key,
      message: 'Data stored successfully',
      timestamp: Date.now()
    }
  }];
}

 

Step 17: Advanced Error Handling for Data Storage

 

Implement robust error handling for your storage operations:


// Function node with advanced error handling
const operation = items[0].json.operation;
const key = items[0].json.key;
const data = items[0].json.data;

// Logger function
function logStorageOperation(status, details) {
  const logEntry = {
    timestamp: new Date().toISOString(),
    operation: operation,
    key: key,
    workflowId: $workflow.id,
    executionId: $execution.id,
    status: status,
    details: details
  };
  
  // In a real scenario, you might want to store logs
  // For this example, we just console.log
  console.log('STORAGE OPERATION:', JSON.stringify(logEntry));
  
  // Return the log entry for potential further processing
  return logEntry;
}

try {
  // Validate inputs
  if (!key) {
    throw new Error('Storage key is required');
  }
  
  if (operation === 'set' && data === undefined) {
    throw new Error('Data is required for set operation');
  }
  
  // Validate key format (example: only allow alphanumeric and hyphens)
  if (!/^[a-zA-Z0-9_-]+$/.test(key)) {
    throw new Error('Invalid key format. Use only letters, numbers, hyphens, and underscores.');
  }
  
  // Perform operation-specific validation
  if (operation === 'set') {
    // Check data size (prevent storing very large objects)
    const dataSize = JSON.stringify(data).length;
    if (dataSize > 1048576) { // 1MB limit example
      throw new Error(`Data size exceeds limit: ${dataSize} bytes`);
    }
  }
  
  // Log successful validation
  logStorageOperation('validation_success', { dataType: typeof data });
  
  // Return validated data for further processing
  return [{
    json: {
      operation: operation,
      key: key,
      data: data,
      validationPassed: true
    }
  }];
} catch (error) {
  // Log error
  const logEntry = logStorageOperation('error', { message: error.message, stack: error.stack });
  
  // Return error response
  return [{
    json: {
      success: false,
      operation: operation,
      key: key,
      error: error.message,
      validationPassed: false,
      logEntry: logEntry
    }
  }];
}

 

Step 18: Implementing Data Transformation During Storage

 

Transform data as you store it for better organization:


// Function node for data transformation during storage
const rawData = items[0].json.data;
const key = items[0].json.key;

// Define transformation rules
const transformations = {
  // Convert dates to ISO strings
  transformDates: function(obj) {
    if (!obj) return obj;
    
    const transformed = { ...obj };
    
    // Process each property
    Object.keys(transformed).forEach(prop => {
      const value = transformed[prop];
      
      // Check if value is a Date object
      if (value instanceof Date) {
        transformed[prop] = value.toISOString();
      } 
      // Check if value is a date string
      else if (typeof value === 'string' && /^\d{4}-\d{2}-\d{2}/.test(value)) {
        const dateObj = new Date(value);
        if (!isNaN(dateObj.getTime())) {
          transformed[prop] = dateObj.toISOString();
        }
      }
      // Recursively process nested objects
      else if (typeof value === 'object' && value !== null) {
        transformed[prop] = this.transformDates(value);
      }
    });
    
    return transformed;
  },
  
  // Add metadata fields
  addMetadata: function(obj, key) {
    return {
      ...obj,
      _metadata: {
        storedAt: new Date().toISOString(),
        key: key,
        workflowId: $workflow.id,
        version: '1.0'
      }
    };
  },
  
  // Sanitize data (remove sensitive fields)
  sanitize: function(obj) {
    if (!obj) return obj;
    
    const sensitiveFields = ['password', 'credit_card', 'ssn', 'secret'];
    const sanitized = { ...obj };
    
    // Remove sensitive fields
    Object.keys(sanitized).forEach(prop => {
      // Check for sensitive field names
      if (sensitiveFields.includes(prop.toLowerCase())) {
        delete sanitized[prop];
      }
      // Recursively process nested objects
      else if (typeof sanitized[prop] === 'object' && sanitized[prop] !== null) {
        sanitized[prop] = this.sanitize(sanitized[prop]);
      }
    });
    
    return sanitized;
  }
};

// Apply transformations
try {
  // Create a copy of the data
  let transformedData = JSON.parse(JSON.stringify(rawData));
  
  // Apply transformations in sequence
  transformedData = transformations.sanitize(transformedData);
  transformedData = transformations.transformDates(transformedData);
  transformedData = transformations.addMetadata(transformedData, key);
  
  // Store the transformed data (here in workflow static data)
  const staticData = $getWorkflowStaticData('global');
  staticData[`transformed_${key}`] = transformedData;
  
  // Return the transformed data
  return [{
    json: {
      success: true,
      key: key,
      originalData: rawData,
      transformedData: transformedData,
      transformations: Object.keys(transformations)
    }
  }];
} catch (error) {
  return [{
    json: {
      success: false,
      message: `Transformation failed: ${error.message}`,
      data: rawData
    }
  }];
}

 

Step 19: Creating a Data Storage Monitoring System

 

Monitor your data storage usage and performance:


// Function node for storage monitoring
// This would be in a separate monitoring workflow

// Initialize storage stats in workflow static data if not present
const staticData = $getWorkflowStaticData('global');
if (!staticData.storageStats) {
  staticData.storageStats = {
    operations: {
      set: 0,
      get: 0,
      delete: 0
    },
    errors: 0,
    totalDataSize: 0,
    keyCount: 0,
    lastOperation: null,
    storageKeys: []
  };
}

// Get current stats
const stats = staticData.storageStats;

// Update stats based on latest operation
// This assumes you're passing operation details to this node
const operation = items[0].json.operation;
const key = items[0].json.key;
const dataSize = items[0].json.dataSize || 0;
const success = items[0].json.success === true;

// Update operation counts
if (operation && stats.operations[operation] !== undefined) {
  stats.operations[operation]++;
}

// Update error count
if (!success) {
  stats.errors++;
}

// Update key tracking
if (operation === 'set' && key) {
  if (!stats.storageKeys.includes(key)) {
    stats.storageKeys.push(key);
    stats.keyCount = stats.storageKeys.length;
  }
  // Update data size (approximate)
  stats.totalDataSize += dataSize;
} else if (operation === 'delete' && key) {
  stats.storageKeys = stats.storageKeys.filter(k => k !== key);
  stats.keyCount = stats.storageKeys.length;
}

// Record last operation
stats.lastOperation = {
  timestamp: Date.now(),
  operation: operation,
  key: key,
  success: success
};

// Persist the updated stats
staticData.storageStats = stats;

// Return current stats
return [{
  json: {
    monitoringTimestamp: new Date().toISOString(),
    storageStats: stats,
    latestOperation: {
      operation: operation,
      key: key,
      success: success,
      timestamp: new Date().toISOString()
    }
  }
}];

 

Step 20: Best Practices for n8n Data Storage

 

Finally, here are some best practices for storing workflow data in n8n:

  • Choose the right storage method based on data persistence needs:
    • Use workflow variables for temporary data within a single execution
    • Use Set/Get nodes for data that needs to persist between executions
    • Use database nodes for complex, structured data or large datasets
  • Implement proper error handling for all storage operations
  • Add metadata to stored data for better organization and tracking
  • Encrypt sensitive information before storage
  • Create modular storage workflows that can be reused across your n8n instance
  • Monitor your storage usage to prevent performance issues
  • Implement data cleanup routines to remove old or unused data (see the sketch after this list)
  • Use TTL (Time to Live) settings when appropriate to automatically expire data
  • Document your storage strategies for team members

By following these steps and best practices, you can implement robust data storage solutions in your n8n workflows, enabling more complex automations while maintaining data integrity and organization.
