RapidDev - Software Development Agency

How to Iterate Over Database Records in Bubble

What you'll learn

  • How to use Schedule API Workflow on a List for batch processing
  • How to build recursive backend workflows for sequential processing
  • How to handle large datasets with offset-based batching
  • How to choose the right iteration pattern for different use cases
Intermediate · 6 min read · 20-25 min to complete · Requires: paid Bubble plan (backend workflows) · March 2026 · RapidDev Engineering Team
TL;DR

Bubble does not have a traditional for-each loop, but you can iterate over database records using Schedule API Workflow on a List, recursive backend workflows, and batch processing with offsets. This tutorial covers processing records one by one with backend workflows, handling large datasets with batch offsets to avoid timeouts, implementing recursive workflows with termination conditions, and choosing the right iteration pattern for your use case.

Overview: Iterating Over Records in Bubble

This tutorial teaches you the patterns for processing database records one by one or in batches in Bubble, replacing traditional loop constructs with Bubble-native approaches.

Prerequisites

  • A Bubble app on a paid plan
  • Backend workflows enabled in Settings → API
  • Basic understanding of Backend Workflows
  • A Data Type with records to process

Step-by-step guide

1

Use Schedule API Workflow on a List

This is the simplest iteration method. Create a backend workflow called 'process_single_record' that accepts a parameter of your Data Type. Add the processing logic inside it. From a frontend workflow, use the 'Schedule API Workflow on a List' action, passing your search results as the list and specifying the backend workflow. Bubble automatically calls the backend workflow once per item in the list. You can set a delay between each call if the processing involves rate-limited external APIs.

Expected result: Each record in the list is processed by the backend workflow individually.
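Bubble workflows are configured visually rather than written as code, but the behavior of 'Schedule API Workflow on a List' can be sketched in Python: one backend-workflow call per list item, each scheduled an interval apart. The names process_single_record and schedule_on_a_list are illustrative stand-ins, not Bubble APIs.

```python
def process_single_record(record):
    # Stand-in for the backend workflow's processing logic.
    record["processed"] = True
    return record

def schedule_on_a_list(records, workflow, interval_seconds=0):
    # Bubble queues one workflow run per list item; the interval
    # parameter spaces the runs out (useful for rate-limited APIs).
    runs = []
    for i, record in enumerate(records):
        scheduled_at = i * interval_seconds  # seconds from now
        runs.append((scheduled_at, workflow(record)))
    return runs

records = [{"id": n} for n in range(3)]
runs = schedule_on_a_list(records, process_single_record, interval_seconds=2)
```

With a 2-second interval, the three runs are scheduled at 0, 2, and 4 seconds, which is how you would throttle calls to a rate-limited external API.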

2

Build a recursive backend workflow

For more control over iteration, create a self-scheduling backend workflow. The workflow accepts two parameters: offset (number, starting at 0) and batch_size (number, e.g. 50). It searches for records using the ':items from #' operator with the offset, limited to batch_size, processes them, then schedules itself with the offset increased by batch_size. Add a termination condition by putting an 'Only when' condition on the self-scheduling step: it runs only when the search returned records, so an empty batch ends the recursion. This pattern handles any dataset size because it works through the data in manageable chunks.

Pro tip: Add a small delay (1-2 seconds) between each self-schedule to avoid overloading Bubble's workflow queue and consuming too many WUs at once.

Expected result: Records are processed in batches with automatic pagination until all records are handled.
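The recursive pattern above can be modeled as a function that processes one batch and then calls itself with an advanced offset until the search comes back empty. This is a logic sketch only; process_batch and fetch_records are hypothetical names, and DATABASE stands in for a 'Do a Search for' result.

```python
DATABASE = list(range(120))  # stand-in for the Data Type's records

def fetch_records(offset, limit):
    # Equivalent of a search with ':items from #' offset and a limit.
    return DATABASE[offset:offset + limit]

def process_batch(offset, batch_size, processed):
    batch = fetch_records(offset, batch_size)
    if not batch:
        # Termination condition: empty search result, stop recursing.
        return processed
    processed.extend(batch)  # per-record processing would go here
    # "Schedule self" with the offset advanced by one batch.
    return process_batch(offset + batch_size, batch_size, processed)

done = process_batch(offset=0, batch_size=50, processed=[])
```

With 120 records and a batch size of 50, the workflow runs at offsets 0, 50, and 100, then stops when the search at offset 150 returns nothing.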

3

Handle large datasets with batch processing

For datasets with thousands of records, process in batches to avoid timeouts. Create a backend workflow that accepts an offset parameter. Search for records with items from the offset, limited to 100 items. Process the batch (update records, make API calls, etc.). If the batch returned 100 items (more may exist), schedule the next batch at offset + 100. If fewer than 100, all records are processed — stop. Track progress by logging the offset to a Status record so you can monitor batch completion.

Expected result: Large datasets are processed reliably in 100-record batches without timeouts.
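The "full batch means more may exist" check and the progress log can be sketched the same way. Here status_log stands in for the Status record mentioned above; the 250-record DATABASE and run_batch are illustrative, not Bubble built-ins.

```python
DATABASE = [f"rec-{n}" for n in range(250)]
BATCH_SIZE = 100
status_log = []  # stand-in for a Status record tracking each offset

def run_batch(offset):
    batch = DATABASE[offset:offset + BATCH_SIZE]
    for record in batch:
        pass  # update the record, call an API, etc.
    status_log.append(offset)  # log progress so completion is monitorable
    if len(batch) == BATCH_SIZE:
        # A full batch means more records may exist: schedule the next one.
        run_batch(offset + BATCH_SIZE)
    # Fewer than BATCH_SIZE: all records processed, stop.

run_batch(0)
```

With 250 records, batches run at offsets 0, 100, and 200; the last batch returns only 50 items, so no further batch is scheduled.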

4

Choose the right pattern for your use case

Schedule on a List is best when you need to process each item independently with simple logic and moderate list sizes (under 1,000). Recursive workflows with offset batching are best for large datasets (1,000+) or when you need granular control over processing order and timing. For real-time per-item processing, consider a database trigger on the specific Data Type that fires whenever a record is created or modified. For one-time bulk operations, use the batch approach with progress tracking.

Expected result: You understand which iteration pattern to use for different scenarios.
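The rule of thumb above can be condensed into a small decision helper. The thresholds are this article's recommendations, not Bubble limits, and the function name is illustrative.

```python
def choose_pattern(record_count, realtime=False, one_time_bulk=False):
    # Encodes the guidance above: triggers for real-time per-record work,
    # offset batching for large or one-time bulk jobs, Schedule on a List
    # for moderate lists with simple independent logic.
    if realtime:
        return "database trigger"
    if one_time_bulk or record_count >= 1000:
        return "recursive offset batching"
    return "schedule on a list"
```

For example, a 500-record cleanup maps to Schedule on a List, while a 5,000-record migration maps to recursive offset batching.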

5

Monitor and handle errors in batch processing

Add error handling to your iteration workflows. Wrap processing logic in error-catching patterns by enabling 'Include errors in response' on API calls. Log each processed record's ID and status to a Progress Data Type. If an error occurs on one record, log it and continue processing the next one rather than stopping the entire batch. After all records are processed, send a summary notification with the count of successful and failed operations. This makes bulk operations robust and auditable.

Expected result: Batch processing continues through individual record errors with full logging and summary reporting.
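The continue-on-error loop with a summary can be sketched as follows. The progress list stands in for the Progress Data Type, and process simulates per-record logic that occasionally fails; all names are illustrative.

```python
progress = []  # stand-in for a Progress Data Type

def process(record):
    # Simulated per-record logic; one record fails.
    if record == "bad":
        raise ValueError("simulated per-record failure")
    return record.upper()

def run_with_error_handling(records):
    ok, failed = 0, 0
    for record in records:
        try:
            process(record)
            progress.append((record, "success"))
            ok += 1
        except ValueError as err:
            # Log the failure and keep going instead of stopping the batch.
            progress.append((record, f"error: {err}"))
            failed += 1
    # Stand-in for the summary notification after completion.
    return f"Done: {ok} succeeded, {failed} failed"

summary = run_with_error_handling(["a", "bad", "c"])
```

One failing record is logged with its error, the remaining records still get processed, and the summary reports both counts.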

Complete working example

Workflow summary
ITERATION PATTERNS SUMMARY
=====================================

PATTERN 1: SCHEDULE ON A LIST
  Backend workflow: process_single_record
  Parameter: record (Data Type)
  Logic: process one record
  Frontend trigger:
    Schedule API Workflow on a List
    List: Do a Search for [Data Type]
    Workflow: process_single_record
  Best for: < 1,000 records, simple logic

PATTERN 2: RECURSIVE WITH BATCHING
  Backend workflow: process_batch
  Parameters: offset (number), batch_size (number)
  1. Search: items from offset, limit batch_size
  2. Process each item in batch
  3. If batch count = batch_size:
       schedule self (offset + batch_size)
  4. If batch count < batch_size: stop
  Best for: 1,000+ records, controlled pace

PATTERN 3: DATABASE TRIGGER
  Event: A thing of type X is modified
  Only when: specific condition
  Process the changed record
  Best for: real-time per-record processing

ERROR HANDLING:
  Log each record: ID, status, error
  Continue on individual errors
  Summary notification after completion
  Progress tracking: processed / total

PERFORMANCE:
  Batch size: 50-100 records
  Delay between batches: 1-2 seconds
  Monitor WU consumption during runs
  Schedule large jobs during off-peak hours

Common mistakes when iterating over database records in Bubble

Mistake: Not adding a termination condition to recursive workflows

How to avoid: Check whether the current batch returned fewer items than the batch size, and use that as the stop condition

Mistake: Processing too many records at once without batching

How to avoid: Use batch processing with 50-100 records per batch and offset-based pagination

Mistake: Using frontend workflows to iterate over large lists

How to avoid: Always use backend workflows for bulk processing — they continue running even if the user closes their browser

Best practices

  • Use backend workflows for all bulk processing operations
  • Add termination conditions to every recursive workflow
  • Process in batches of 50-100 records for reliability
  • Add delays between batches to manage WU consumption
  • Log progress for monitoring and error tracking
  • Handle individual record errors without stopping the entire batch
  • Schedule large batch jobs during low-traffic periods

Still stuck?

Copy one of these prompts to get a personalized, step-by-step explanation.

ChatGPT Prompt

I need to update 5,000 database records in my Bubble.io app by adding a new calculated field. What is the best way to iterate through all records without causing timeouts?

Bubble Prompt

Help me create a backend workflow that processes all my Product records in batches of 100, updates a calculated field on each one, and sends me a notification when complete.

Frequently asked questions

Does Bubble have a for-each loop?

Not in the traditional programming sense. The closest equivalent is Schedule API Workflow on a List, which calls a backend workflow once per item in a list.

How many records can I process in one batch?

50-100 records per batch is optimal. Larger batches may time out depending on processing complexity; smaller batches are more reliable but take longer overall.

How much do batch operations cost in WUs?

Each backend workflow execution costs WUs based on its operations. Processing 5,000 records typically costs 5,000-25,000 WUs depending on what each iteration does.

Can I cancel a running batch job?

You cannot directly cancel scheduled workflows. Add a 'should_continue' flag that the workflow checks before processing each batch. Set the flag to false to stop the next batch from running.
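The cooperative-cancel flag described above can be sketched as a job record that each batch checks before running; flipping the flag stops the chain at the next batch boundary. The job dictionary and run_batch_chain are illustrative stand-ins for a Bubble Data Type and a recursive workflow.

```python
job = {"should_continue": True, "batches_run": 0}
TOTAL_BATCHES = 5

def run_batch_chain():
    for _ in range(TOTAL_BATCHES):
        if not job["should_continue"]:
            # The flag was flipped: skip the remaining batches.
            break
        job["batches_run"] += 1
        if job["batches_run"] == 2:
            # Simulate a user cancelling the job mid-run.
            job["should_continue"] = False

run_batch_chain()
```

Only the batches scheduled before the flag flipped actually run; already-scheduled batches can't be unscheduled, but each one exits early when it sees the flag.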

How do I track progress of a batch job?

Create a BatchJob Data Type with fields for total records, processed count, and status. Update it after each batch. Display progress on an admin page.

Can RapidDev help with bulk data processing?

Yes. RapidDev can implement efficient batch processing workflows in Bubble including progress tracking, error handling, and performance optimization for large datasets.

Need help with your project?

Our experts have built 600+ apps and can accelerate your development. Book a free consultation — no strings attached.

We put the rapid in RapidDev

Need a dedicated strategic tech and growth partner? Discover what RapidDev can do for your business! Book a call with our team to schedule a free, no-obligation consultation. We'll discuss your project and provide a custom quote at no cost.