
How to handle multi-turn conversations failing with Cohere in n8n?

Learn how to handle multi-turn conversation failures with Cohere in n8n by managing context, session state, and token limits, adding error handling, and configuring the API correctly for smooth chatbot workflows.

Matt Graham, CEO of Rapid Developers



To handle multi-turn conversations that fail with Cohere in n8n, you need to properly manage conversation context, implement error handling, use appropriate API parameters, and maintain session state. The main issues typically stem from improper context management, token limitations, or configuration errors in the n8n workflow.

 
Step 1: Understanding the Problem
 

Multi-turn conversations with Cohere in n8n can fail for several reasons:

  • Insufficient context management between conversation turns
  • Token limitations being exceeded
  • Improper API configuration
  • Lack of error handling mechanisms
  • Session state not being properly maintained

Before implementing fixes, it's important to understand how Cohere handles conversations. Unlike stateful chatbots, Cohere's API is stateless, meaning you must provide the entire conversation history with each request to maintain context.
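
For example, if you call the Chat endpoint directly from an HTTP Request node instead of using the Cohere node, a second-turn request still has to resend everything that came before. Below is a minimal sketch of such a request body, assuming Cohere's v1 Chat endpoint (POST https://api.cohere.ai/v1/chat with an Authorization: Bearer header) and the USER/CHATBOT role names used throughout this guide:


// Sketch: body of a second-turn request to Cohere's v1 Chat endpoint
// chat_history must contain every earlier turn, or the model answers without context
const requestBody = {
  model: "command",
  message: "And what about tomorrow?", // the current turn only
  chat_history: [
    { role: "USER", message: "What's the weather like today?" },
    { role: "CHATBOT", message: "It looks sunny today." } // example reply from the previous turn
  ]
};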

 
Step 2: Setting Up Proper Context Management
 

The most common reason for failed multi-turn conversations is improper context management. Here's how to fix it:


// Example of properly structured conversation history for Cohere
{
  "message": "What's the weather like today?",
  "chat\_history": [
    {"role": "USER", "message": "Hello, can you help me?"},
    {"role": "CHATBOT", "message": "Hi there! I'd be happy to help. What do you need assistance with?"}
  ]
}

In n8n, you need to maintain this structure by:

  1. Creating a conversation array in a Function node
  2. Appending each user message and Cohere response to this array
  3. Passing the updated array to each subsequent Cohere API call

 
Step 3: Implementing a Conversation Storage Mechanism
 

Create a Function node to manage the conversation history:


// Function node: Initialize or update conversation history
const incomingMessage = items[0].json.message;
const existingHistory = items[0].json.conversation_history || [];

// Add the new user message to history
existingHistory.push({
  role: "USER",
  message: incomingMessage
});

// Prepare output for Cohere API
return {
  json: {
    message: incomingMessage,
    chat_history: existingHistory,
    conversation_id: items[0].json.conversation_id || Date.now().toString()
  }
};

 
Step 4: Configuring the Cohere Node Properly
 

When using the Cohere node in n8n, ensure it's configured correctly:

  1. Open the Cohere node settings
  2. Select "Chat" as the operation
  3. Set these parameters:
  • Model: Choose an appropriate model (e.g., "command" or "command-light")
  • Message: Map to the current user message
  • Chat History: Map to your maintained chat history array
  • Temperature: Set between 0-1 (lower for more deterministic responses)
  • Max Tokens: Set high enough to allow for full responses (1024-2048)
  • Connector ID: Only needed if you're routing the request through a specific Cohere connector

// Example parameter mapping in Cohere node
{
  "model": "command",
  "message": "={{ $json.message }}",
  "chat_history": "={{ $json.chat_history }}",
  "temperature": 0.7,
  "max\_tokens": 1024,
  "preamble": "You are a helpful assistant."
}

 
Step 5: Handling Responses and Updating History
 

After receiving a response from Cohere, update the conversation history:


// Function node: Process Cohere response and update history
const cohereResponse = items[0].json.response.text;
const updatedHistory = items[0].json.chat_history || [];

// Add the chatbot response to history
updatedHistory.push({
  role: "CHATBOT",
  message: cohereResponse
});

return {
  json: {
    message: items[0].json.message,
    chat_history: updatedHistory,
    conversation_id: items[0].json.conversation_id,
    last_response: cohereResponse
  }
};

 
Step 6: Implementing Error Handling
 

Add error handling to gracefully manage failures:


// Function node: Error handling for Cohere API
try {
  // Your existing code to handle the Cohere response
  const cohereResponse = items[0].json.response.text;
  const updatedHistory = items[0].json.chat_history || [];
  
  updatedHistory.push({
    role: "CHATBOT",
    message: cohereResponse
  });
  
  return {
    json: {
      message: items[0].json.message,
      chat_history: updatedHistory,
      conversation_id: items[0].json.conversation_id,
      last_response: cohereResponse,
      error: false
    }
  };
} catch (error) {
  // Handle the error
  console.error('Cohere API error:', error.message);
  
  return {
    json: {
      error: true,
      error_message: error.message || 'Failed to process the conversation',
      conversation_id: items[0].json.conversation_id,
      chat_history: items[0].json.chat_history // Preserve existing history
    }
  };
}

 
Step 7: Managing Token Limitations
 

If your conversations are failing due to token limitations, implement a conversation history truncation mechanism:


// Function node: Truncate conversation history if too long
function truncateHistory(history, maxEntries = 10) {
  if (history.length <= maxEntries) return history;
  
  // Keep the first message for context and the most recent messages
  return [history[0], ...history.slice(history.length - maxEntries + 1)];
}

const chatHistory = items[0].json.chat_history || [];
const truncatedHistory = truncateHistory(chatHistory);

return {
  json: {
    message: items[0].json.message,
    chat_history: truncatedHistory,
    conversation_id: items[0].json.conversation_id
  }
};

 
Step 8: Implementing a Complete Workflow
 

Here's how to structure a complete workflow for multi-turn conversations:

  1. Trigger node (Webhook, HTTP Request, etc.)
  2. Function node: Process incoming message
  3. IF node: Check if conversation exists (a condition sketch follows this list)
  • If YES: Retrieve conversation history
  • If NO: Initialize new conversation
  4. Function node: Prepare data for Cohere
  5. Cohere node: Generate response
  6. Function node: Process response and update history
  7. Function node: Error handling
  8. Function node: Store updated conversation (to a database or workflow static data)
  9. Respond to user (e.g., via a Respond to Webhook node)
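
For the IF node check in step 3, one option is to have the preceding Function node set an explicit flag that the IF node can branch on. The sketch below assumes history is stored in workflow static data under a 'conversation_<id>' key (as in Step 9); the conversation_exists field is just an illustrative name:


// Sketch: flag whether a stored conversation already exists, for the IF node to branch on
const staticData = $getWorkflowStaticData('global');
const conversationId = items[0].json.conversation_id || Date.now().toString();
const stored = staticData['conversation_' + conversationId];

return {
  json: {
    ...items[0].json,
    conversation_id: conversationId,
    conversation_exists: Boolean(stored && (stored.chat_history || []).length > 0)
  }
};

The IF node can then branch on a boolean condition such as ={{ $json.conversation_exists }} to either retrieve the stored history or initialize a new conversation.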

 
Step 9: Storing Conversation State
 

For persistence between workflow runs, store conversation data:


// Function node: Store conversation in workflow static data
// Code node syntax shown; in the legacy Function node use getWorkflowStaticData('global')
// Note: static data only persists between executions of a saved, active workflow
const workflowData = {
  conversation_id: items[0].json.conversation_id,
  chat_history: items[0].json.chat_history,
  last_updated: new Date().toISOString()
};

const staticData = $getWorkflowStaticData('global');
staticData['conversation_' + workflowData.conversation_id] = workflowData;

return items;

For more permanent storage, use a database node:


// Function node: Prepare data for database storage
return {
  json: {
    table: "conversations",
    operation: "upsert",
    data: {
      conversation_id: items[0].json.conversation_id,
      user_id: items[0].json.user_id || "anonymous",
      chat_history: JSON.stringify(items[0].json.chat_history),
      last_updated: new Date().toISOString()
    },
    where_clause: {
      conversation_id: items[0].json.conversation_id
    }
  }
};

 
Step 10: Implementing a Session Timeout Mechanism
 

For better resource management, implement a session timeout:


// Function node: Check for session timeout
const conversationData = JSON.parse(items[0].json.stored_conversation || '{}');
const lastUpdateTime = new Date(conversationData.last_updated || 0);
const currentTime = new Date();
const timeoutMinutes = 30; // Set your desired timeout

// Check if the conversation has timed out
const minutesDifference = (currentTime - lastUpdateTime) / (1000 * 60);
const isTimedOut = minutesDifference > timeoutMinutes;

if (isTimedOut || !conversationData.chat_history) {
  // Start a new conversation
  return {
    json: {
      message: items[0].json.message,
      chat_history: [],
      conversation_id: Date.now().toString(),
      is_new_conversation: true
    }
  };
} else {
  // Continue existing conversation
  return {
    json: {
      message: items[0].json.message,
      chat_history: conversationData.chat_history,
      conversation_id: conversationData.conversation_id,
      is_new_conversation: false
    }
  };
}

 
Step 11: Implementing Rate Limiting Handling
 

If you're experiencing failures due to rate limiting, add a mechanism to handle this:


// Function node: Handle rate limiting
function sleep(ms) {
  return new Promise(resolve => setTimeout(resolve, ms));
}

// In an async function or with async node support
async function handleRateLimiting(items) {
  try {
    // Your Cohere API call logic
    return items;
  } catch (error) {
    if (error.message.includes('rate limit') || error.status === 429) {
      console.log('Rate limited, retrying after 2 seconds...');
      await sleep(2000);
      // You would need to trigger the Cohere node again here
      // This might require custom handling in n8n
      return { retry: true };
    } else {
      throw error; // Re-throw other errors
    }
  }
}

// Return the result
return handleRateLimiting(items);
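
As an alternative to hand-rolled retry code, you can use the retry options built into every n8n node (Settings tab: Retry On Fail, Max Tries, Wait Between Tries). In an exported workflow these show up as node properties; the fragment below is only a sketch of how that looks, and the exact property names may vary between n8n versions:


// Sketch: built-in retry settings on the Cohere node, as they appear in exported workflow JSON
{
  "name": "Cohere",
  "retryOnFail": true,
  "maxTries": 3,
  "waitBetweenTries": 2000
}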

 
Step 12: Testing and Debugging
 

To effectively debug conversation failures:

  1. Add logging at key points in your workflow
  2. Use n8n's built-in testing features

// Function node: Add debug logging
const debugMode = true; // Set to false in production

if (debugMode) {
  console.log('=== DEBUG INFO ===');
  console.log('Conversation ID:', items[0].json.conversation_id);
  console.log('Message:', items[0].json.message);
  console.log('History Length:', (items[0].json.chat_history || []).length);
  console.log('History:', JSON.stringify(items[0].json.chat_history));
}

return items;

 
Step 13: Advanced Error Recovery
 

For robust handling of persistent failures, implement a progressive fallback system:


// Function node: Progressive fallback system
let attempts = items[0].json.retry_attempts || 0;
const maxAttempts = 3;

if (items[0].json.error && attempts < maxAttempts) {
  // Increment attempt counter
  attempts++;
  
  // Apply fallback strategies based on attempt number
  if (attempts === 1) {
    // First fallback: Try with truncated history
    const truncatedHistory = items[0].json.chat_history.slice(-4); // Keep only the last 4 messages
    
    return {
      json: {
        message: items[0].json.message,
        chat_history: truncatedHistory,
        conversation_id: items[0].json.conversation_id,
        retry_attempts: attempts,
        fallback_level: 1
      }
    };
  } else if (attempts === 2) {
    // Second fallback: Try with only the current message
    return {
      json: {
        message: items[0].json.message,
        chat_history: [],
        conversation_id: items[0].json.conversation_id,
        retry_attempts: attempts,
        fallback_level: 2
      }
    };
  } else {
    // Final fallback: Use predefined response
    return {
      json: {
        error: false,
        last\_response: "I'm sorry, I'm having trouble processing your request right now. Could you please try again later?",
        conversation_id: items[0].json.conversation_id,
        chat_history: items[0].json.chat_history,
        retry\_attempts: 0,
        fallback\_level: 3
      }
    };
  }
} else if (!items[0].json.error) {
  // Successful response, reset attempt counter
  return {
    json: {
      ...items[0].json,
      retry_attempts: 0
    }
  };
} else {
  // Max attempts reached, provide error message
  return {
    json: {
      error: true,
      last\_response: "I'm sorry, but I'm experiencing technical difficulties. Please try again later.",
      conversation_id: items[0].json.conversation_id,
      chat_history: items[0].json.chat_history,
      retry\_attempts: 0
    }
  };
}

 
Step 14: Implementing a Complete Example Workflow
 

Here's a complete workflow example combining all the steps:

  1. Create a new n8n workflow
  2. Add a Webhook node as the trigger
  3. Add these Function nodes in sequence:

Function Node 1: Process incoming message and retrieve history


// Get incoming message and user ID
const message = $input.item.json.message;
const userId = $input.item.json.user_id || 'anonymous';
const conversationId = $input.item.json.conversation_id || Date.now().toString();

// Try to get existing conversation history
let chatHistory = [];
try {
  // Retrieve from workflow static data (swap in a database lookup if you store history there)
  const staticData = $getWorkflowStaticData('global');
  const storedData = staticData['conversation_' + conversationId];
  if (storedData && storedData.conversation_id === conversationId) {
    chatHistory = storedData.chat_history || [];
  }
} catch (error) {
  console.log('No existing conversation found or error retrieving:', error.message);
}

// Add user message to history
chatHistory.push({
  role: "USER",
  message: message
});

return {
  json: {
    message: message,
    chat_history: chatHistory,
    conversation_id: conversationId,
    user_id: userId
  }
};

Function Node 2: Check and handle token limits


// Ensure we don't exceed token limits
const chatHistory = $input.item.json.chat_history;
const estimatedTokens = chatHistory.reduce((total, msg) => {
  // Rough estimate: 1 token ≈ 4 characters
  return total + Math.ceil((msg.message.length || 0) / 4);
}, 0);

// If approaching token limit (e.g., 3000 tokens for context)
const MAX_TOKENS = 3000;
if (estimatedTokens > MAX_TOKENS) {
  // Truncate history strategy:
  // 1. Keep first user message for context
  // 2. Keep most recent messages
  const firstMessage = chatHistory[0];
  const recentMessages = chatHistory.slice(-10); // Keep last 10 messages
  
  return {
    json: {
      message: $input.item.json.message,
      chat_history: [firstMessage, ...recentMessages],
      conversation_id: $input.item.json.conversation_id,
      user_id: $input.item.json.user_id,
      truncated: true
    }
  };
}

return $input.item;

Cohere Node: Configure as described in Step 4

Function Node 3: Process response and update history


try {
  // Get Cohere response
  const cohereResponse = $input.item.json.response.text;
  const chatHistory = $input.item.json.chat_history || [];
  
  // Add bot response to history
  chatHistory.push({
    role: "CHATBOT",
    message: cohereResponse
  });
  
  // Save updated conversation
  const conversationData = {
    conversation_id: $input.item.json.conversation_id,
    user_id: $input.item.json.user_id,
    chat_history: chatHistory,
    last_updated: new Date().toISOString()
  };
  
  // Store the conversation data in workflow static data (or swap in a database write)
  const staticData = $getWorkflowStaticData('global');
  staticData['conversation_' + $input.item.json.conversation_id] = conversationData;
  
  return {
    json: {
      response: cohereResponse,
      conversation_id: $input.item.json.conversation_id,
      error: false
    }
  };
} catch (error) {
  console.error('Error processing Cohere response:', error.message);
  
  return {
    json: {
      response: "I'm sorry, I encountered an error processing your request.",
      conversation_id: $input.item.json.conversation_id,
      error: true,
      error_message: error.message
    }
  };
}

Function Node 4: Send response to user


// Final response formatting
return {
  json: {
    message: $input.item.json.response,
    conversation_id: $input.item.json.conversation_id,
    error: $input.item.json.error || false
  }
};

 
Step 15: Monitoring and Maintaining the Solution
 

To ensure ongoing reliability:

  1. Set up monitoring of your workflow using n8n's execution history
  2. Implement periodic cleanup of old conversations:

// Function node for periodic cleanup
// Run this on a schedule (e.g., a Schedule Trigger branch in the same workflow, since
// workflow static data is scoped per workflow; use a database for cross-workflow storage)

const staticData = $getWorkflowStaticData('global');
const currentTime = new Date();
const expirationDays = 7; // Keep conversations for 7 days

for (const key of Object.keys(staticData)) {
  if (key.startsWith('conversation_')) {
    try {
      const conversationData = staticData[key];
      const lastUpdated = new Date(conversationData.last_updated);
      const daysDifference = (currentTime - lastUpdated) / (1000 * 60 * 60 * 24);
      
      if (daysDifference > expirationDays) {
        // Delete expired conversation
        delete staticData[key];
        console.log(`Deleted expired conversation: ${key}`);
      }
    } catch (error) {
      console.error(`Error processing entry ${key}:`, error.message);
    }
  }
}

return { json: { cleanup_completed: true, timestamp: currentTime.toISOString() } };

Following these detailed steps will help you effectively handle multi-turn conversations with Cohere in n8n, preventing failures and ensuring smooth operation of your conversational workflows. The key is to properly manage conversation context, implement robust error handling, and maintain appropriate session state throughout the interaction.
