Learn how to send follow-up questions to Gemini in n8n by creating a workflow that maintains conversation context and thread history using HTTP Request and Function nodes.
To send follow-up questions to Gemini with correct thread history in n8n, you need to create a workflow that maintains conversation context by saving and passing conversation IDs between nodes. This requires using HTTP Request nodes to interact with the Gemini API, along with properly configured Function nodes to manage the request payloads and response handling.
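Conceptually, the thread history is just an array of role-tagged messages that each node appends to and passes forward. A minimal sketch in plain JavaScript (the function name is illustrative, not part of n8n or the Gemini API):

```javascript
// Minimal model of the thread history the workflow carries between nodes.
// Each turn appends one user message and one assistant reply, so every
// follow-up request can see the full conversation so far.
function appendTurn(history, userText, assistantText) {
  return [
    ...history,
    { role: "user", content: userText },
    { role: "assistant", content: assistantText },
  ];
}
```

The Function nodes in the steps below do exactly this with a `messageHistory` field on the workflow item.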
Step 1: Set Up Your n8n Environment
Before getting started, ensure you have:
- A running n8n instance (self-hosted or cloud)
- A Google Gemini API key
- Basic familiarity with n8n's Function and HTTP Request nodes
If you haven't already set up n8n, you can install it via npm:
npm install n8n -g
n8n start
Step 2: Create a New Workflow
Open your n8n interface (typically at http://localhost:5678), click "New Workflow", and give it a descriptive name such as "Gemini Conversation".
Step 3: Add Trigger Node
Add a trigger to start your workflow, such as a Manual Trigger node for testing or a Webhook node for external input.
Step 4: Create Initial Conversation Function
Add a Function node to prepare the initial request:
// This function prepares the initial request to Gemini
return [
  {
    json: {
      apiKey: "YOUR_GEMINI_API_KEY", // Replace with your actual API key
      initialPrompt: "Hello, I have some questions about machine learning.",
      geminiModel: "gemini-pro", // or whatever model you want to use
      requestPayload: {
        contents: [
          {
            role: "user",
            parts: [
              {
                text: "Hello, I have some questions about machine learning."
              }
            ]
          }
        ]
      }
    }
  }
];
Step 5: Set Up HTTP Request for Initial Message
Add an HTTP Request node to send the initial message to the Gemini API.
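The node's settings can be sketched in plain JavaScript as the request it ultimately issues. The URL below follows Google's public generateContent REST endpoint and is an assumption here; adjust it if your API version or model differs:

```javascript
// Sketch of what the HTTP Request node sends. In the node UI, set:
//   Method: POST
//   URL:    .../models/<model>:generateContent with key as a query parameter
//   Body:   the requestPayload prepared by the previous Function node
// The endpoint shape is assumed from Google's generateContent REST API.
function buildInitialRequest(item) {
  const base = "https://generativelanguage.googleapis.com/v1beta/models";
  return {
    method: "POST",
    url: `${base}/${item.geminiModel}:generateContent?key=${item.apiKey}`,
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(item.requestPayload),
  };
}
```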
Step 6: Process Initial Response
Add a Function node to process the initial response:
// Extract and process the initial response
const response = $input.item.json;

// Check if we have a valid response
if (!response || !response.conversation || !response.conversation.conversationId) {
  throw new Error('Failed to get a valid conversation ID from Gemini');
}

// Store important data for follow-up
return [
  {
    json: {
      conversationId: response.conversation.conversationId,
      apiKey: response.apiKey,
      geminiModel: response.geminiModel,
      initialPrompt: response.initialPrompt,
      assistantResponse: response.response.candidates[0].content.parts[0].text,
      messageHistory: [
        {
          role: "user",
          content: response.initialPrompt
        },
        {
          role: "assistant",
          content: response.response.candidates[0].content.parts[0].text
        }
      ]
    }
  }
];
Step 7: Create Follow-Up Question Function
Add a Function node to prepare the follow-up request:
// This function prepares a follow-up request to Gemini
// For this example, we'll use a predefined follow-up question
const followUpQuestion = "Can you explain neural networks in simple terms?";

return [
  {
    json: {
      apiKey: $input.item.json.apiKey,
      conversationId: $input.item.json.conversationId,
      geminiModel: $input.item.json.geminiModel,
      followUpQuestion: followUpQuestion,
      messageHistory: $input.item.json.messageHistory,
      requestPayload: {
        conversation: {
          conversationId: $input.item.json.conversationId
        },
        prompt: {
          text: followUpQuestion
        }
      }
    }
  }
];
Step 8: Set Up HTTP Request for Follow-Up Message
Add an HTTP Request node for the follow-up, configured like the node in Step 5 but sending the requestPayload prepared by the follow-up Function node.
Step 9: Process Follow-Up Response
Add a Function node to process the follow-up response:
// Extract and process the follow-up response
const response = $input.item.json;

// Check if we have a valid response
if (!response || !response.response || !response.response.candidates || response.response.candidates.length === 0) {
  throw new Error('Failed to get a valid response from Gemini');
}

// Update message history with the new exchange
const updatedHistory = [
  ...response.messageHistory,
  {
    role: "user",
    content: response.followUpQuestion
  },
  {
    role: "assistant",
    content: response.response.candidates[0].content.parts[0].text
  }
];

// Return all the important data
return [
  {
    json: {
      conversationId: response.conversationId,
      apiKey: response.apiKey,
      geminiModel: response.geminiModel,
      initialPrompt: response.initialPrompt,
      followUpQuestion: response.followUpQuestion,
      followUpResponse: response.response.candidates[0].content.parts[0].text,
      messageHistory: updatedHistory
    }
  }
];
Step 10: Connect the Nodes
Connect all the nodes in sequence: Trigger → Initial Conversation Function → HTTP Request (initial message) → Process Initial Response → Follow-Up Question Function → HTTP Request (follow-up message) → Process Follow-Up Response.
Step 11: Add More Follow-Up Questions (Optional)
To add additional follow-up questions, you can repeat Steps 7-9 with new Function and HTTP Request nodes, making sure to pass the updated message history each time.
For example, to add a second follow-up question:
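As a sketch, a second follow-up node could mirror the Step 7 node; `buildFollowUp` below is a hypothetical helper shown as a plain function (inside n8n you would read `$input.item.json` instead of passing a parameter):

```javascript
// Hypothetical helper mirroring the Step 7 Function node: given the JSON
// from the previous node, build the item for one more follow-up question.
function buildFollowUp(prev, question) {
  return {
    apiKey: prev.apiKey,
    conversationId: prev.conversationId,
    geminiModel: prev.geminiModel,
    followUpQuestion: question,
    messageHistory: prev.messageHistory,
    requestPayload: {
      conversation: { conversationId: prev.conversationId },
      prompt: { text: question },
    },
  };
}

// In the n8n Function node itself you would write:
// return [{ json: buildFollowUp($input.item.json, "What data do neural networks need?") }];
```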
Step 12: Make the Workflow Dynamic (Advanced)
For a more flexible approach, you can replace the hardcoded follow-up questions with dynamic inputs:
Alternatively, create a loop structure for multiple follow-ups:
// In a Function node to handle dynamic follow-ups
const userQuestions = [
  "Can you explain neural networks in simple terms?",
  "How is deep learning different from machine learning?",
  "What are some applications of AI in healthcare?"
];

// Get current question index (or default to 0)
const questionIndex = $input.item.json.questionIndex || 0;

// Check if we have more questions
if (questionIndex >= userQuestions.length) {
  return [
    {
      json: {
        ...$input.item.json,
        conversationComplete: true
      }
    }
  ];
}

// Prepare the next question
return [
  {
    json: {
      ...$input.item.json,
      followUpQuestion: userQuestions[questionIndex],
      questionIndex: questionIndex + 1,
      requestPayload: {
        conversation: {
          conversationId: $input.item.json.conversationId
        },
        prompt: {
          text: userQuestions[questionIndex]
        }
      }
    }
  }
];
Step 13: Implement Error Handling
Add error handling to your workflow:
Example error handling function:
// Error handling function
if ($input.item.json.error) {
  // Log the error
  console.error('Gemini API Error:', $input.item.json.error);

  // Return a formatted error message
  return [
    {
      json: {
        success: false,
        errorMessage: 'Failed to communicate with Gemini API: ' + $input.item.json.error.message,
        errorDetails: $input.item.json.error
      }
    }
  ];
}

// If no error, pass through the data
return $input.item;
Step 14: Save and Test Your Workflow
Save your workflow and test it: click Save, then Execute Workflow, and inspect each node's output to confirm the conversation ID and message history are passed along correctly.
Step 15: Store Conversation History in Database (Optional)
For persistent conversations, add database nodes:
Example schema for storing conversations:
// MongoDB schema example
{
  conversationId: String,
  startedAt: Date,
  updatedAt: Date,
  model: String,
  messages: [
    {
      role: String,
      content: String,
      timestamp: Date
    }
  ]
}
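A Function node placed before the database node could map the workflow item into that shape; a sketch, with field names taken from the schema above and timestamps assumed to be set at save time since the workflow does not record them:

```javascript
// Map a workflow item's messageHistory into the document shape above.
// Timestamps are assigned here because the workflow itself does not
// record when each message was exchanged.
function toConversationDoc(item) {
  const now = new Date();
  return {
    conversationId: item.conversationId,
    startedAt: now,
    updatedAt: now,
    model: item.geminiModel,
    messages: item.messageHistory.map((m) => ({
      role: m.role,
      content: m.content,
      timestamp: now,
    })),
  };
}
```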
Step 16: Create a Complete Interactive Workflow (Advanced)
For a fully interactive experience, create a webhook-based workflow:
Example webhook handling code:
// Function to handle webhook input
const incomingData = $input.item.json;
let outputData = {};

// Check if we're continuing an existing conversation
if (incomingData.conversationId) {
  outputData = {
    apiKey: "YOUR_GEMINI_API_KEY",
    conversationId: incomingData.conversationId,
    geminiModel: incomingData.model || "gemini-pro",
    followUpQuestion: incomingData.message,
    requestPayload: {
      conversation: {
        conversationId: incomingData.conversationId
      },
      prompt: {
        text: incomingData.message
      }
    }
  };
} else {
  // Starting a new conversation
  outputData = {
    apiKey: "YOUR_GEMINI_API_KEY",
    geminiModel: incomingData.model || "gemini-pro",
    initialPrompt: incomingData.message,
    requestPayload: {
      contents: [
        {
          role: "user",
          parts: [
            {
              text: incomingData.message
            }
          ]
        }
      ]
    }
  };
}

return [{ json: outputData }];
Step 17: Optimize Your Workflow
Fine-tune your workflow for better performance: store your API key in environment variables instead of hardcoding it, enable retries and sensible timeouts on the HTTP Request nodes, and keep the message history from growing without bound.
Example environment variable usage:
// Using environment variables
const apiKey = $env.GEMINI_API_KEY;

if (!apiKey) {
  throw new Error('Gemini API key not found in environment variables');
}

return [
  {
    json: {
      apiKey: apiKey,
      // rest of the properties
    }
  }
];
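Another optimization worth considering is capping the stored message history, since request payloads grow with every exchange. A sketch (the cap of 20 messages is an arbitrary example value):

```javascript
// Keep only the most recent messages so request payloads stay small.
// maxMessages is an illustrative default, not an API limit.
function trimHistory(history, maxMessages = 20) {
  return history.length <= maxMessages
    ? history
    : history.slice(history.length - maxMessages);
}
```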
Step 18: Deploy Your Workflow
When you're ready to use the workflow in production: activate the workflow, switch any Webhook nodes to their production URLs, and make sure the required environment variables are set on the host running n8n.
Troubleshooting Common Issues
Authentication Errors: verify that your API key is valid, has access to the Gemini model you specified, and is included with every request.
Invalid Conversation ID: confirm the ID extracted in Step 6 is passed unchanged to every follow-up request; a missing or stale ID breaks the thread.
Workflow Not Executing: check that all nodes are connected in order, that the trigger has actually fired, and review the execution log of the first failing node.
By following these steps, you'll have a functional n8n workflow that can maintain conversation context with Gemini across multiple interactions, allowing for natural follow-up questions while preserving the thread history.