
How to use splitInBatches node in n8n?

Learn how to use the Split In Batches node in n8n to break large datasets into manageable batches for efficient processing, API rate limit handling, and optimized workflows.

Matt Graham, CEO of Rapid Developers



The Split In Batches node in n8n is a powerful utility that breaks down large datasets into smaller, manageable batches for processing. It's particularly useful when dealing with API rate limits or when you need to process data in chunks. This node takes your input data and divides it into specified batch sizes, allowing subsequent nodes to process each batch separately.

 

Step 1: Understanding the Split In Batches Node

 

The Split In Batches node is a core utility node in n8n that allows you to split an array of items into multiple batches of a specified size. This is particularly useful when:

  • Working with APIs that have rate limits
  • Processing large datasets that might cause memory issues
  • Creating parallel processing workflows
  • Controlling the flow of data through your workflow

The node takes an array of items as input and outputs multiple batches, each containing a subset of the original items. Each batch is processed separately by subsequent nodes in your workflow.
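To make the idea concrete, here is a minimal JavaScript sketch (not the node's actual implementation) of how a list is divided into fixed-size batches:

```javascript
// Divide a list of items into batches of a fixed size,
// the way Split In Batches does conceptually.
function splitIntoBatches(items, batchSize) {
  const batches = [];
  for (let i = 0; i < items.length; i += batchSize) {
    batches.push(items.slice(i, i + batchSize));
  }
  return batches;
}

const items = Array.from({ length: 25 }, (_, i) => ({ id: i + 1 }));
const batches = splitIntoBatches(items, 10);
console.log(batches.map(b => b.length)); // [ 10, 10, 5 ]
```

With 25 items and a batch size of 10, the last batch simply holds the remainder.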

 

Step 2: Adding the Split In Batches Node to Your Workflow

 

To add the Split In Batches node to your workflow:

  • Create or open an existing workflow in n8n
  • Click on the "+" button to add a new node
  • In the search bar, type "Split In Batches"
  • Select the "Split In Batches" node from the results

The node will be added to your workflow. Connect it to a node that outputs an array of items that you want to split into batches.

 

Step 3: Configuring the Split In Batches Node

 

Once added to your workflow, you need to configure the Split In Batches node:

  • Connect it to the preceding node that outputs the data you want to split
  • Set the "Batch Size" parameter to specify how many items should be in each batch
  • Optionally, configure other parameters based on your workflow needs

The main parameter to configure is "Batch Size", which determines how many items will be in each output batch. The default is 10 items per batch.

 

Step 4: Creating a Sample Workflow with Split In Batches

 

Let's create a simple workflow to demonstrate how the Split In Batches node works:

  • Start with a "Manual Trigger" node
  • Add a "Function" node to generate sample data
  • Connect a "Split In Batches" node
  • Add a "No Operation" node to view the results

For the Function node, use the following code to generate sample data:


// Generate 25 sample items, one n8n item per record
const items = [];
for (let i = 1; i <= 25; i++) {
  items.push({
    json: {
      id: i,
      name: `Item ${i}`,
      value: Math.floor(Math.random() * 100)
    }
  });
}

// Return one item per record so Split In Batches can batch them
return items;

 

Step 5: Configuring the Split In Batches Node Parameters

 

After adding the Split In Batches node to your workflow, you need to configure its parameters:

  • Batch Size: The number of items to include in each batch (default is 10)
  • Options > Reset: When enabled, the node starts batching from the beginning again, which is useful if the node runs more than once in the same workflow

In our example, we'll set the "Batch Size" to 5, which means our 25 items will be split into 5 batches of 5 items each.

The node batches whatever items it receives, so make sure the preceding node outputs one item per record. If your records are nested inside a single item (for example under an items property), split them out first, for instance with the Item Lists node, before they reach the Split In Batches node.

 

Step 6: Understanding Batch Processing Flow

 

When you execute a workflow with the Split In Batches node, the flow works as follows:

  • The node receives the input data from the previous node
  • It divides the data into batches according to the specified batch size
  • It outputs the first batch, which the subsequent nodes process
  • The last node in the loop is connected back to the Split In Batches node, which triggers the next batch
  • This repeats until all batches have been processed; in recent n8n versions, the node then continues the workflow via its "done" output

This sequential processing helps manage resource usage and respects API rate limits when working with external services.
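In plain JavaScript terms, the behavior is roughly equivalent to this sketch (illustrative only, not n8n's internals): each batch is handled completely before the next one starts.

```javascript
// Each call to handler() must finish before the next batch is taken,
// mirroring the sequential flow of the Split In Batches loop.
function processInBatches(items, batchSize, handler) {
  for (let i = 0; i < items.length; i += batchSize) {
    handler(items.slice(i, i + batchSize));
  }
}

const order = [];
processInBatches([1, 2, 3, 4, 5], 2, (batch) => order.push(batch));
console.log(order); // [ [ 1, 2 ], [ 3, 4 ], [ 5 ] ]
```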

 

Step 7: Working with the Batch Output

 

Each batch output from the Split In Batches node is an array of items. The items retain their original structure but are grouped into smaller arrays based on the batch size.

To process each batch, you'll typically:

  • Connect nodes after the Split In Batches node that can handle multiple items
  • Use loops or map functions if needed to process individual items within a batch
  • Set up error handling to manage issues with specific batches

Remember that each batch is processed entirely before the next batch begins, so any node connected after the Split In Batches node will be executed multiple times (once per batch).
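For illustration, here is what a batch looks like in n8n's item format and how you might transform each item while preserving that shape (the field names are made up for the example):

```javascript
// A batch as n8n presents it: an array of items, each wrapping
// its data under the `json` key.
const batch = [
  { json: { id: 1, value: 40 } },
  { json: { id: 2, value: 60 } },
];

// Transform every item in the batch while keeping the n8n item shape.
const processed = batch.map(item => ({
  json: { ...item.json, doubled: item.json.value * 2 },
}));

console.log(processed[0].json.doubled); // 80
```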

 

Step 8: Using Split In Batches with API Requests

 

One common use case for the Split In Batches node is when working with APIs that have rate limits. Here's how to set this up:

  • Add a node that retrieves or generates a large dataset
  • Connect the Split In Batches node and set an appropriate batch size (e.g., 10)
  • Add an HTTP Request node after the Split In Batches node
  • Configure the HTTP Request node to process each batch

For example, if you're updating user records via an API that allows only 10 updates per minute, you would set the batch size to 10 and add appropriate delays between batches.
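The arithmetic behind that choice is straightforward; this sketch (with illustrative numbers, not a real API's limits) estimates how long a rate-limited run will take:

```javascript
// Assumed numbers for illustration: 95 records, 10 API calls allowed per minute.
const totalRecords = 95;
const limitPerMinute = 10;
const batchSize = limitPerMinute;          // one batch per minute stays under the limit

const batchCount = Math.ceil(totalRecords / batchSize);
const minutesNeeded = batchCount;          // with a 60-second delay between batches
console.log({ batchCount, minutesNeeded }); // { batchCount: 10, minutesNeeded: 10 }
```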

 

Step 9: Advanced Configuration - Adding Delays Between Batches

 

To add delays between batch processing (useful for API rate limiting), you can use n8n's built-in "Wait" node, or add a short pause in a Function node placed after the Split In Batches node:


// Add a delay of 2 seconds
await new Promise(resolve => setTimeout(resolve, 2000));

// Pass through the data unchanged
return items;

This adds a 2-second delay between processing each batch, which can help prevent hitting rate limits with external APIs.

 

Step 10: Example Workflow: Processing CSV Data in Batches

 

Let's create a practical example of using the Split In Batches node to process a large CSV file:

  • Start with a "Read Binary File" node to read a CSV file
  • Add a "Spreadsheet File" node to parse the CSV data
  • Connect a "Split In Batches" node with a batch size of 50
  • Add a "Function" node to process each batch
  • Finish with a "No Operation" node to see the results

For the Function node, you could use code like this to process each batch:


// Process each item in the batch
const processedItems = items.map(item => {
  // Example transformation: calculate a new field
  item.json.total = item.json.price * item.json.quantity;
  
  // Add a timestamp for tracking
  item.json.processedAt = new Date().toISOString();
  
  return item;
});

// Return the processed batch
return processedItems;

 

Step 11: Handling Errors in Batch Processing

 

When processing data in batches, it's important to handle errors properly so that one failing batch doesn't stop the entire workflow. Here's how to implement error handling:

  • Create a separate workflow that starts with an "Error Trigger" node
  • In your main workflow's settings, select that workflow as the "Error Workflow"
  • In the error workflow, add nodes that handle the failure (e.g., sending notifications or logging)

You can also use try/catch blocks in Function nodes to handle errors within each batch:


// Process items with error handling
const processedItems = [];

for (const item of items) {
  try {
    // Process the item
    const processedItem = {
      json: {
        ...item.json,
        processed: true,
        processedAt: new Date().toISOString()
      }
    };
    processedItems.push(processedItem);
  } catch (error) {
    // Handle the error for this item
    console.error(`Error processing item ${item.json.id}: ${error.message}`);
    
    // Add the item with error information
    processedItems.push({
      json: {
        ...item.json,
        processed: false,
        error: error.message
      }
    });
  }
}

return processedItems;

 

Step 12: Monitoring Batch Progress

 

To monitor the progress of batch processing, you can add a Function node that logs information about each batch:


// Get the batch number from workflow data if available
const workflowData = $getWorkflowStaticData("global");
if (!workflowData.batchCount) {
  workflowData.batchCount = 0;
  workflowData.totalProcessed = 0;
}

// Increment the batch counter
workflowData.batchCount += 1;
workflowData.totalProcessed += items.length;

// Log batch information
console.log(`Processing batch #${workflowData.batchCount} with ${items.length} items`);
console.log(`Total items processed so far: ${workflowData.totalProcessed}`);

// Pass through the items unchanged
return items;

This allows you to track how many batches have been processed and how many items have been handled in total.

 

Step 13: Using Split In Batches with Webhooks

 

If you're receiving large amounts of data via webhooks, you can use the Split In Batches node to process this data in manageable chunks:

  • Start with a "Webhook" node to receive data
  • Add a "Split In Batches" node to divide the incoming data
  • Connect processing nodes to handle each batch

This approach helps prevent timeout issues when processing large webhook payloads.

 

Step 14: Combining Split In Batches with Merge Node

 

Sometimes you'll want to recombine the processed batches after they've been handled. For this, you can use the "Merge" node:

  • Add your "Split In Batches" node and processing nodes
  • Add a "Merge" node after all processing is complete
  • Configure the Merge node to combine all batches

The key is to set the Merge node's mode to "Append" to combine all processed batches into a single output.
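Conceptually, "Append" just concatenates the processed batches back into one flat list, in processing order, as in this sketch:

```javascript
// Processed batches, still grouped as the loop produced them.
const processedBatches = [
  [{ json: { id: 1 } }, { json: { id: 2 } }],
  [{ json: { id: 3 } }, { json: { id: 4 } }],
  [{ json: { id: 5 } }],
];

// "Append" amounts to flattening the groups into a single list.
const merged = processedBatches.flat();
console.log(merged.length); // 5
```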

 

Step 15: Real-world Example: Batch Processing Database Records

 

Let's create a comprehensive example of using the Split In Batches node to update records in a database:

  • Start with a "Database" node to retrieve records (e.g., "SELECT * FROM customers")
  • Add a "Split In Batches" node with a batch size of 20
  • Add a "Function" node to transform data
  • Add a "Database" node to update records
  • Add a "No Operation" node to see results

For the Function node that transforms data:


// Process customer data to prepare for update
return items.map(item => {
  // Calculate loyalty tier based on purchase history
  let loyaltyTier = 'Bronze';
  
  if (item.json.totalPurchases > 50) {
    loyaltyTier = 'Platinum';
  } else if (item.json.totalPurchases > 25) {
    loyaltyTier = 'Gold';
  } else if (item.json.totalPurchases > 10) {
    loyaltyTier = 'Silver';
  }
  
  // Add calculated fields
  item.json.loyaltyTier = loyaltyTier;
  item.json.lastUpdated = new Date().toISOString();
  
  return item;
});

For the Database update node, you would configure an SQL query like:


UPDATE customers 
SET loyalty_tier = :loyaltyTier, 
    last_updated = :lastUpdated 
WHERE customer_id = :id

 

Step 16: Performance Optimization with Split In Batches

 

To optimize performance when using the Split In Batches node:

  • Choose an appropriate batch size based on your system resources and API limits
  • Use smaller batch sizes for complex processing operations
  • Use larger batch sizes for simple operations to reduce overhead
  • Monitor memory usage during batch processing
  • Consider adding delays between batches for external API calls

Finding the optimal batch size often requires experimentation based on your specific use case.

 

Step 17: Using Environment Variables with Split In Batches

 

You can make your batch processing more flexible by using environment variables for batch size configuration:

  • Create an environment variable in n8n (e.g., BATCH_SIZE)
  • In the Split In Batches node, use an expression to set the batch size: {{$env.BATCH_SIZE}}

This allows you to adjust batch sizes without modifying your workflow, which is particularly useful when moving between development and production environments.
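When reading a batch size from the environment, it is worth guarding against a missing or malformed value. A sketch in plain Node.js style (BATCH_SIZE is the variable name assumed above):

```javascript
// Fall back to a sane default if BATCH_SIZE is unset or not a positive integer.
const DEFAULT_BATCH_SIZE = 10;
const parsed = parseInt(process.env.BATCH_SIZE ?? "", 10);
const batchSize = Number.isInteger(parsed) && parsed > 0 ? parsed : DEFAULT_BATCH_SIZE;

console.log(batchSize); // 10 when BATCH_SIZE is unset
```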

 

Step 18: Troubleshooting Common Issues

 

Here are solutions to common issues when working with the Split In Batches node:

  • Data not splitting correctly: Make sure the previous node outputs one item per record; split out nested arrays (for instance with the Item Lists node) before batching
  • Workflow timing out: Reduce the batch size to process smaller chunks of data
  • Memory errors: Lower the batch size and ensure efficient data processing in subsequent nodes
  • Batches seeming to process out of order: The node emits batches strictly in sequence; apparent reordering usually comes from asynchronous operations in downstream nodes
  • Missing data: Check that the loop connection back to the Split In Batches node is wired correctly and that no batch fails partway through
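A quick way to check for missing data is to compare item counts before and after processing; a sketch with illustrative numbers:

```javascript
// Simple integrity check: the per-batch counts should sum
// back to the original input count.
const inputCount = 25;
const batchSizes = [10, 10, 5]; // sizes observed for each processed batch
const processedCount = batchSizes.reduce((sum, n) => sum + n, 0);

console.log(processedCount === inputCount); // true
```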

 

Step 19: Best Practices for Using Split In Batches

 

Follow these best practices when working with the Split In Batches node:

  • Always test with small datasets before processing large amounts of data
  • Start with a conservative batch size and adjust based on performance
  • Add appropriate error handling for each batch
  • Implement logging to track batch processing progress
  • Use the node's "Options" section to customize behavior for specific use cases
  • Consider adding delays between batches when working with external APIs
  • Validate data before and after batch processing to ensure integrity

 

Step 20: Conclusion and Next Steps

 

The Split In Batches node is a powerful tool for managing large datasets in n8n workflows. By breaking data into manageable chunks, you can process information more efficiently, respect API rate limits, and avoid system resource constraints.

As you become more comfortable with batch processing in n8n, consider exploring these advanced topics:

  • Parallel processing of batches using multiple workflows
  • Implementing retry mechanisms for failed batches
  • Creating dynamic batch sizes based on data characteristics
  • Building monitoring dashboards for batch processing workflows
  • Optimizing database operations with batch processing

With these techniques, you can handle virtually any volume of data efficiently within your n8n workflows.
