
How to persist user session data for an AI agent across executions in n8n?

Learn how to persist user session data for AI agents in n8n across executions using databases, file storage, or workflow variables to maintain conversation context, preferences, and history securely.

Matt Graham, CEO of Rapid Developers



To persist user session data for an AI agent across executions in n8n, you need to implement a storage mechanism that saves and retrieves user interaction data between workflow runs. This can be accomplished using n8n's workflow static data, external databases, or persistent file storage to maintain conversation context, user preferences, and interaction history.

 

Introduction to Session Persistence in n8n

 

Session persistence refers to maintaining a user's state and interaction data across multiple workflow executions. When working with AI agents in n8n, this is crucial for creating conversational experiences that remember past interactions. Without persistence, each workflow execution would treat the user as brand new, losing all context from previous conversations.

In n8n, workflows are typically stateless by default, meaning they don't automatically remember information from previous executions. To overcome this limitation, we need to implement explicit mechanisms to store and retrieve user data.

 

Step 1: Understanding the Session Data Requirements

 

Before implementing a solution, let's identify what types of data we typically need to persist:

  • Conversation history: Previous messages exchanged between the user and AI
  • User preferences: Settings or configurations specific to each user
  • Authentication data: User identification information
  • Context information: Any data that helps maintain the conversation flow
  • State tracking: Where in a multi-step process the user currently is
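As a concrete sketch, a single user's session record covering these categories might look like the following (the field names are illustrative, not a fixed schema):

```javascript
// Illustrative session record; field names are assumptions, not an n8n schema
const session = {
  userId: "user-123",                          // authentication / identification
  conversationHistory: [                       // messages exchanged so far
    { role: "user", content: "Hi", timestamp: "2024-01-01T10:00:00Z" },
    { role: "assistant", content: "Hello!", timestamp: "2024-01-01T10:00:02Z" }
  ],
  preferences: { language: "en" },             // per-user settings
  context: { topic: "billing" },               // data that keeps the conversation coherent
  currentStep: 2                               // position in a multi-step process
};
```

Everything that follows boils down to loading an object like this at the start of an execution and saving it back at the end.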

 

Step 2: Choose a Storage Method

 

n8n offers several approaches for data persistence:

  • Database Storage: Using external databases via n8n database nodes
  • Workflow static data: using n8n's built-in workflow variables for temporary storage
  • File Storage: Writing data to files on disk
  • External APIs: Storing data in third-party services

For robust persistence, a database is typically the best option. The steps below walk through the first three approaches in detail.

 

Step 3: Implementing Database Storage (Recommended Approach)

 

This method uses a database to store session data indexed by a unique user identifier.

Step 3.1: Set Up a Database Connection

Add a database node to your workflow. n8n supports various databases, including MySQL, PostgreSQL, and MongoDB.


// Example PostgreSQL connection parameters
{
  "host": "localhost",
  "database": "ai_sessions",
  "user": "n8n_user",
  "password": "your_password",
  "port": 5432
}

Step 3.2: Create a Session Storage Table

Execute a query to create a table structure for storing session data:


// SQL for creating a session table
CREATE TABLE IF NOT EXISTS user_sessions (
  user_id VARCHAR(50) PRIMARY KEY,
  session_data JSONB,
  last_updated TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);
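Because session_data is stored as JSONB, Postgres can also query inside the stored object. For example, assuming the preferences field written later in Step 3.3 contains a language key (an illustrative field, not part of the schema above):

```sql
-- Find sessions whose stored preferences set language to 'en'
SELECT user_id, last_updated
FROM user_sessions
WHERE session_data -> 'preferences' ->> 'language' = 'en'
ORDER BY last_updated DESC;
```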

Step 3.3: Store Session Data Function

Create a workflow that saves session data when a user interaction occurs:


// Function node to prepare session data
function storeSessionData(items) {
  const userId = items[0].json.userId; // Get user ID from incoming data
  const sessionData = {
    conversationHistory: items[0].json.history || [],
    preferences: items[0].json.preferences || {},
    lastInteraction: new Date().toISOString()
  };
  
  return [{
    json: {
      userId,
      sessionData: JSON.stringify(sessionData),
      query: "INSERT INTO user_sessions (user_id, session_data) VALUES ($1, $2) " +
             "ON CONFLICT (user_id) DO UPDATE SET session_data = $2, last_updated = CURRENT_TIMESTAMP"
    }
  }];
}

Step 3.4: Retrieve Session Data Function

Create a function to load session data at the beginning of a workflow:


// Function node to load session data
function loadSessionData(items) {
  const userId = items[0].json.userId;
  
  return [{
    json: {
      userId,
      query: "SELECT session_data FROM user_sessions WHERE user_id = $1",
      additionalFields: {
        parametersArray: [userId]
      }
    }
  }];
}

Step 3.5: Process Retrieved Data

After retrieving the data, parse it and make it available to your AI agent:


// Function node to parse session data
function parseSessionData(items) {
  let sessionData = { conversationHistory: [], preferences: {} };
  
  if (items[0].json.data && items[0].json.data.length > 0) {
    try {
      sessionData = JSON.parse(items[0].json.data[0].session_data);
    } catch (error) {
      console.error("Error parsing session data:", error);
    }
  }
  
  // Merge with current request data
  return [{
    json: {
      ...items[0].json,
      sessionData
    }
  }];
}

 

Step 4: Using Workflow Static Data for Simple Persistence

 

For simpler use cases, n8n's workflow static data can hold small amounts of state between executions without an external database. It is accessed from a Function or Code node with $getWorkflowStaticData().

Step 4.1: Accessing Workflow Static Data

Retrieve the workflow's persistent static data object at the start of your node:


// Code node: access workflow static data
const staticData = $getWorkflowStaticData('global');
staticData.sessionData = staticData.sessionData || {};

Step 4.2: Store and Update Session Data


// Function node to update session data in workflow static data
function updateSessionVariables(items) {
  const staticData = $getWorkflowStaticData('global');
  const currentSessionData = staticData.sessionData || {};
  const userId = items[0].json.userId;
  
  // Create or update user data
  const userData = currentSessionData[userId] || {
    conversationHistory: [],
    preferences: {}
  };
  
  // Add new message to history
  if (items[0].json.message) {
    userData.conversationHistory.push({
      role: "user",
      content: items[0].json.message,
      timestamp: new Date().toISOString()
    });
  }
  
  // Update user data in the session
  currentSessionData[userId] = userData;
  
  // Persist the updated session data back to static data
  staticData.sessionData = currentSessionData;
  
  return [{
    json: {
      ...items[0].json,
      sessionData: userData
    }
  }];
}

Note: Workflow static data is only saved for production (trigger-started) executions, not manual test runs, and it is intended for small amounts of data. It's best for short-term storage or workflows that run continuously.

 

Step 5: File-Based Storage Method

 

For more persistent storage without a database, you can use the filesystem.

Step 5.1: Create File Storage Functions


// Function node to save session to file
function saveSessionToFile(items) {
  const userId = items[0].json.userId;
  const sessionData = items[0].json.sessionData || {
    conversationHistory: [],
    preferences: {}
  };
  
  // Add new data to session
  if (items[0].json.message) {
    sessionData.conversationHistory.push({
      role: "user",
      content: items[0].json.message,
      timestamp: new Date().toISOString()
    });
  }
  
  // Format for file storage
  const fileData = JSON.stringify(sessionData, null, 2);
  
  return [{
    json: {
      userId,
      sessionData,
      fileData,
      filePath: `/tmp/n8n_sessions/${userId}.json`
    }
  }];
}

Step 5.2: Set Up Write File Node

Because the session JSON is a plain string, convert it to a binary property first (for example with a Move Binary Data node), then configure a Write Binary File node to save it:


// Write Binary File node configuration
{
  "fileName": "={{ $json.filePath }}",
  "dataPropertyName": "fileData"
}

Step 5.3: Set Up Read File Node

Configure a Read Binary File node to retrieve the session data:


// Read Binary File node configuration
{
  "filePath": "={{ '/tmp/n8n_sessions/' + $json.userId + '.json' }}",
  "options": {
    "encoding": "utf8"
  }
}

Step 5.4: Handle File Reading Errors

Add error handling for when files don't exist yet:


// Function node to handle reading errors
function handleFileReadResult(items) {
  // If file doesn't exist or has an error, initialize empty session
  if (items[0].json.error) {
    return [{
      json: {
        ...items[0].json,
        sessionData: {
          conversationHistory: [],
          preferences: {}
        }
      }
    }];
  }
  
  // Parse the file content
  try {
    // n8n stores binary data base64-encoded, so decode it before parsing
    const sessionData = JSON.parse(Buffer.from(items[0].binary.data.data, 'base64').toString('utf8'));
    return [{
      json: {
        ...items[0].json,
        sessionData
      }
    }];
  } catch (error) {
    console.error("Error parsing session file:", error);
    return [{
      json: {
        ...items[0].json,
        sessionData: {
          conversationHistory: [],
          preferences: {}
        }
      }
    }];
  }
}

 

Step 6: Integrating with AI Models

 

Now that we have persistence set up, let's integrate it with an AI agent.

Step 6.1: Prepare Conversation History for AI


// Function to format history for OpenAI
function prepareAIPrompt(items) {
  const sessionData = items[0].json.sessionData || { conversationHistory: [] };
  const currentMessage = items[0].json.message;
  
  // Format the conversation history for the AI
  const messages = [];
  
  // Add system message if not present
  if (sessionData.conversationHistory.length === 0 || 
      sessionData.conversationHistory[0].role !== 'system') {
    messages.push({
      role: 'system',
      content: 'You are a helpful assistant that remembers previous conversations.'
    });
  }
  
  // Add conversation history
  sessionData.conversationHistory.forEach(msg => {
    messages.push({
      role: msg.role,
      content: msg.content
    });
  });
  
  // Add the current user message
  if (currentMessage) {
    messages.push({
      role: 'user',
      content: currentMessage
    });
  }
  
  return [{
    json: {
      ...items[0].json,
      messages
    }
  }];
}

Step 6.2: Configure the OpenAI Node


// OpenAI node configuration (the API key is stored in n8n credentials, not node parameters)
{
  "resource": "chatCompletion",
  "operation": "create",
  "model": "gpt-3.5-turbo",
  "messages": "={{ $json.messages }}",
  "options": {
    "temperature": 0.7
  }
}

Step 6.3: Update Session with AI Response


// Function to update session with AI response
function updateSessionWithAIResponse(items) {
  const sessionData = items[0].json.sessionData || { conversationHistory: [] };
  const aiResponse = items[0].json.response.choices[0].message.content;
  
  // Add AI response to conversation history
  sessionData.conversationHistory.push({
    role: 'assistant',
    content: aiResponse,
    timestamp: new Date().toISOString()
  });
  
  return [{
    json: {
      ...items[0].json,
      sessionData,
      aiResponse
    }
  }];
}

 

Step 7: Creating a Complete Workflow

 

Let's put everything together into a complete workflow:

Step 7.1: Workflow Trigger

Start with a Webhook node (n8n's HTTP trigger) to receive user messages:


// Webhook node configuration
{
  "authentication": "none",
  "httpMethod": "POST",
  "path": "chat",
  "responseMode": "lastNode",
  "options": {}
}

Step 7.2: Extract User Information


// Function node to extract user info
function extractUserInfo(items) {
  const body = items[0].json.body;
  
  // Extract user ID and message
  return [{
    json: {
      userId: body.userId || 'anonymous',
      message: body.message || '',
      timestamp: new Date().toISOString()
    }
  }];
}

Step 7.3: Complete Workflow Structure

Here's the logical flow of the complete workflow:

  1. Webhook Trigger
  2. Extract User Info
  3. Load Session Data (database query or file read)
  4. Prepare AI Prompt with History
  5. Call OpenAI API
  6. Update Session with AI Response
  7. Save Updated Session (database update or file write)
  8. Return Response to User
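The data flow of those eight steps can be sketched as one pipeline. Each function below stands in for the corresponding node; the storage backend and the OpenAI call are stubbed in-memory (assumed names, not n8n APIs) so the shape of each step stays visible:

```javascript
// End-to-end sketch of the workflow's data flow; storage and the model call
// are in-memory stand-ins, since they depend on the backend you chose
const store = {}; // stands in for the database / file storage

const loadSession = (userId) =>
  store[userId] || { conversationHistory: [], preferences: {} };

const callModel = (messages) =>
  `Echo: ${messages[messages.length - 1].content}`; // stand-in for the OpenAI node

function handleChatRequest(body) {
  const userId = body.userId || 'anonymous';                                    // 2. extract user info
  const session = loadSession(userId);                                          // 3. load session data
  session.conversationHistory.push({ role: 'user', content: body.message });    // 4. prepare prompt with history
  const aiResponse = callModel(session.conversationHistory);                    // 5. call the model
  session.conversationHistory.push({ role: 'assistant', content: aiResponse }); // 6. update session
  store[userId] = session;                                                      // 7. save updated session
  return { userId, aiResponse };                                                // 8. return response
}

const first = handleChatRequest({ userId: 'u1', message: 'Hello' });
const second = handleChatRequest({ userId: 'u1', message: 'Again' });
console.log(store['u1'].conversationHistory.length); // → 4
```

The second call sees the history written by the first, which is exactly the behavior the persistence layer has to guarantee.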

 

Step 8: Advanced Features and Optimizations

 

Let's enhance our solution with some advanced features.

Step 8.1: Session Cleanup and Expiration

To prevent unlimited growth of session data, implement a cleanup mechanism:


// Function to trim conversation history
function trimConversationHistory(items) {
  const sessionData = items[0].json.sessionData;
  
  // Keep only the last 10 messages
  if (sessionData.conversationHistory && 
      sessionData.conversationHistory.length > 10) {
    // Always keep the system message if it exists
    const systemMessage = sessionData.conversationHistory.find(m => m.role === 'system');
    
    // Get the last 9 messages
    const recentMessages = sessionData.conversationHistory.slice(-9);
    
    // Reconstruct history with system message (if exists) and recent messages
    if (systemMessage) {
      sessionData.conversationHistory = [systemMessage, ...recentMessages];
    } else {
      sessionData.conversationHistory = recentMessages;
    }
  }
  
  return [{
    json: {
      ...items[0].json,
      sessionData
    }
  }];
}

Step 8.2: Database Cleanup Cron Job

Create a separate workflow that runs periodically to clean up old sessions:


// Cron node configuration
{
  "triggerTimes": {
    "item": [
      {
        "mode": "everyX",
        "value": 1,
        "unit": "days"
      }
    ]
  }
}

// Database query to clean up old sessions
{
  "operation": "executeQuery",
  "query": "DELETE FROM user_sessions WHERE last_updated < NOW() - INTERVAL '30 days'"
}

Step 8.3: Session Context Management

Implement a mechanism to manage different conversation contexts:


// Function to switch conversation context
function switchConversationContext(items) {
  const sessionData = items[0].json.sessionData;
  const newContext = items[0].json.context || 'default';
  
  // Initialize contexts object if it doesn't exist
  if (!sessionData.contexts) {
    sessionData.contexts = {};
  }
  
  // Save current conversation to the previous context
  const currentContext = sessionData.currentContext || 'default';
  sessionData.contexts[currentContext] = sessionData.conversationHistory || [];
  
  // Load or initialize the new context
  sessionData.conversationHistory = sessionData.contexts[newContext] || [];
  sessionData.currentContext = newContext;
  
  return [{
    json: {
      ...items[0].json,
      sessionData
    }
  }];
}

 

Step 9: Handling Multiple AI Agents

 

If you have multiple AI agents, you might want to maintain separate session data for each:


// Function to select agent-specific session data
function selectAgentSession(items) {
  const sessionData = items[0].json.sessionData || {};
  const agentId = items[0].json.agentId || 'default';
  
  // Initialize agents container if needed
  if (!sessionData.agents) {
    sessionData.agents = {};
  }
  
  // Initialize or retrieve the specific agent's data
  if (!sessionData.agents[agentId]) {
    sessionData.agents[agentId] = {
      conversationHistory: [],
      preferences: {}
    };
  }
  
  // Extract agent-specific data for use in this execution
  const agentSessionData = sessionData.agents[agentId];
  
  return [{
    json: {
      ...items[0].json,
      sessionData,
      agentSessionData
    }
  }];
}

 

Step 10: Security Considerations

 

When storing user session data, security is crucial:

Step 10.1: Data Encryption

For sensitive data, implement encryption. Note that require() in a Code node only works for modules allowed via the NODES_FUNCTION_ALLOW_BUILTIN (built-ins such as crypto) or NODES_FUNCTION_ALLOW_EXTERNAL environment variables:


// Function to encrypt session data
function encryptSessionData(items) {
  const crypto = require('crypto');
  const sessionData = items[0].json.sessionData;
  
  // Encryption key (store securely, e.g., in n8n credentials); hash it
  // so aes-256-cbc receives the 32-byte key it requires
  const encryptionKey = crypto.createHash('sha256')
    .update('your-secure-encryption-key')
    .digest();
  
  // Encrypt the data
  const algorithm = 'aes-256-cbc';
  const iv = crypto.randomBytes(16);
  const cipher = crypto.createCipheriv(algorithm, encryptionKey, iv);
  
  let encrypted = cipher.update(JSON.stringify(sessionData));
  encrypted = Buffer.concat([encrypted, cipher.final()]);
  
  return [{
    json: {
      ...items[0].json,
      encryptedData: {
        iv: iv.toString('hex'),
        content: encrypted.toString('hex')
      }
    }
  }];
}

// Function to decrypt session data
function decryptSessionData(items) {
  const crypto = require('crypto');
  const encryptedData = items[0].json.encryptedData;
  
  // If no encrypted data, return empty session
  if (!encryptedData) {
    return [{
      json: {
        ...items[0].json,
        sessionData: {
          conversationHistory: [],
          preferences: {}
        }
      }
    }];
  }
  
  // Derive the same 32-byte key used for encryption
  const encryptionKey = crypto.createHash('sha256')
    .update('your-secure-encryption-key')
    .digest();
  
  // Decrypt the data
  const algorithm = 'aes-256-cbc';
  const iv = Buffer.from(encryptedData.iv, 'hex');
  const encryptedText = Buffer.from(encryptedData.content, 'hex');
  const decipher = crypto.createDecipheriv(algorithm, encryptionKey, iv);
  
  let decrypted = decipher.update(encryptedText);
  decrypted = Buffer.concat([decrypted, decipher.final()]);
  
  try {
    const sessionData = JSON.parse(decrypted.toString());
    return [{
      json: {
        ...items[0].json,
        sessionData
      }
    }];
  } catch (error) {
    console.error("Error decrypting session data:", error);
    return [{
      json: {
        ...items[0].json,
        sessionData: {
          conversationHistory: [],
          preferences: {}
        }
      }
    }];
  }
}

Step 10.2: User Authentication

Ensure only authorized users can access their session data:


// Function to verify user authentication
function verifyUserAuth(items) {
  const authToken = items[0].json.authToken;
  
  // Implement your authentication logic
  // This could involve JWT validation, checking against a user database, etc.
  
  // For demonstration purposes:
  if (!authToken) {
    throw new Error('Authentication required');
  }
  
  // Decode JWT token (example; the jsonwebtoken module must be allowed via NODES_FUNCTION_ALLOW_EXTERNAL)
  try {
    const jwt = require('jsonwebtoken');
    const secret = 'your-jwt-secret';
    const decoded = jwt.verify(authToken, secret);
    
    return [{
      json: {
        ...items[0].json,
        userId: decoded.userId,
        authenticated: true
      }
    }];
  } catch (error) {
    throw new Error('Invalid authentication token');
  }
}

 

Step 11: Handling High Volume Usage

 

For applications with many users, optimize performance:

Step 11.1: Implement Caching


// Function to implement in-memory caching
function implementSessionCache(items) {
  // This would typically use a shared cache like Redis
  // For demonstration, we'll use a simple module-level cache
  
  // Access or initialize the cache (in a real implementation, use Redis or similar)
  const sessionCache = global.sessionCache = global.sessionCache || {};
  const userId = items[0].json.userId;
  
  // Check if we have a recent cache entry
  const cachedSession = sessionCache[userId];
  if (cachedSession && 
      (new Date().getTime() - cachedSession.timestamp) < 300000) { // 5 min cache
    return [{
      json: {
        ...items[0].json,
        sessionData: cachedSession.data,
        fromCache: true
      }
    }];
  }
  
  // If we get here, we need to load from persistent storage
  // We'll add a flag to indicate this isn't from cache
  return [{
    json: {
      ...items[0].json,
      fromCache: false
    }
  }];
}

// Function to update cache after database load
function updateSessionCache(items) {
  // Only run this after loading from database
  if (items[0].json.fromCache === true) {
    return items; // Already using cached data
  }
  
  const sessionCache = global.sessionCache = global.sessionCache || {};
  const userId = items[0].json.userId;
  const sessionData = items[0].json.sessionData;
  
  // Update the cache
  sessionCache[userId] = {
    data: sessionData,
    timestamp: new Date().getTime()
  };
  
  return items;
}

Step 11.2: Implement Batched Writes

For high-volume applications, batch database writes:


// Function to queue session updates
function queueSessionUpdate(items) {
  // In a real implementation, use a message queue
  const updateQueue = global.updateQueue = global.updateQueue || [];
  const userId = items[0].json.userId;
  const sessionData = items[0].json.sessionData;
  
  // Add to queue
  updateQueue.push({
    userId,
    sessionData,
    timestamp: new Date().getTime()
  });
  
  // Check if we need to trigger a batch update
  if (updateQueue.length >= 10 || global.lastBatchUpdate === undefined || 
      (new Date().getTime() - global.lastBatchUpdate) > 60000) { // 1 minute
    global.lastBatchUpdate = new Date().getTime();
    
    // In a real implementation, this would trigger a separate workflow
    // For demonstration, we'll just include the queue in the output
    return [{
      json: {
        ...items[0].json,
        updateQueue: [...updateQueue],
        triggerBatchUpdate: true
      }
    }];
  }
  
  return items;
}

 

Step 12: Testing and Debugging

 

Implement logging and testing to ensure your persistence mechanism works properly:

Step 12.1: Session Logging Function


// Function to log session operations
function logSessionOperation(items) {
  const operation = items[0].json.operation || 'unknown';
  const userId = items[0].json.userId;
  const timestamp = new Date().toISOString();
  
  console.log(`[${timestamp}] Session ${operation} for user ${userId}`);
  
  // For debugging, optionally log more details
  if (process.env.DEBUG_SESSIONS === 'true') {
    console.log('Session data:', JSON.stringify(items[0].json.sessionData, null, 2));
  }
  
  return items;
}

Step 12.2: Create a Test Workflow

Create a separate workflow to test your session persistence:


// Manual Trigger node for testing (no parameters required)
{}

// Test data input
{
  "output": {
    "item": [
      {
        "json": {
          "userId": "test-user-1",
          "message": "Hello, I'm testing session persistence"
        }
      }
    ]
  }
}

 

Conclusion: Putting It All Together

 

With the steps above, you've learned how to implement robust session persistence for AI agents in n8n. Here's a summary of the key points:

  • Database storage provides the most robust solution for persistent session data
  • File-based storage offers a simpler alternative without requiring a database
  • Workflow static data can be used for short-term persistence of small amounts of state
  • Session data should be structured to include conversation history and user preferences
  • Proper security measures, including encryption and authentication, are essential
  • For high-volume applications, consider caching and batched writes
  • Regular maintenance like session cleanup prevents unlimited data growth

By implementing these patterns, your AI agent will maintain context across multiple interactions, creating more natural and useful conversational experiences. The exact implementation you choose will depend on your specific requirements, including the volume of users, sensitivity of data, and performance needs.
