Bubble does not have a traditional for-each loop, but you can iterate over database records using Schedule API Workflow on a List, recursive backend workflows, and batch processing with offsets. This tutorial covers processing records one by one with backend workflows, handling large datasets with batch offsets to avoid timeouts, implementing recursive workflows with termination conditions, and choosing the right iteration pattern for your use case.
Overview: Iterating Over Records in Bubble
This tutorial teaches you the patterns for processing database records one by one or in batches in Bubble, replacing traditional loop constructs with Bubble-native approaches.
Prerequisites
- A Bubble app on a paid plan
- Backend workflows enabled in Settings → API
- Basic understanding of Backend Workflows
- A Data Type with records to process
Step-by-step guide
Use Schedule API Workflow on a List
This is the simplest iteration method. Create a backend workflow called 'process_single_record' that accepts a parameter of your Data Type. Add the processing logic inside it. From a frontend workflow, use the 'Schedule API Workflow on a List' action, passing your search results as the list and specifying the backend workflow. Bubble automatically calls the backend workflow once per item in the list. You can set a delay between each call if the processing involves rate-limited external APIs.
Expected result: Each record in the list is processed by the backend workflow individually.
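Bubble configures this visually rather than in code, but the per-item semantics can be sketched in Python. Everything below (the records list, process_single_record, INTERVAL_SECONDS) is an illustrative stand-in, not a Bubble API:

```python
import time

# Stand-in for "Do a Search for [Data Type]" results.
records = [{"id": i, "name": f"Item {i}"} for i in range(5)]

def process_single_record(record):
    """Stand-in for the 'process_single_record' backend workflow."""
    record["processed"] = True

# Schedule API Workflow on a List is conceptually one workflow call
# per item, optionally spaced out for rate-limited external APIs.
INTERVAL_SECONDS = 0  # e.g. 1-2 s when each call hits an external API

for record in records:
    process_single_record(record)
    time.sleep(INTERVAL_SECONDS)

processed_count = sum(1 for r in records if r.get("processed"))
```

The delay parameter matters mainly when the backend workflow calls third-party APIs with rate limits; for purely internal updates it can stay at zero.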
Build a recursive backend workflow
For more control over iteration, create a self-scheduling backend workflow. The workflow accepts two parameters: offset (number, starting at 0) and batch_size (number, e.g. 50). It searches for records starting at the offset (using :items from #), limited to batch_size, processes them, then schedules itself with the offset increased by batch_size. Add a termination condition so the workflow only reschedules itself when the search returned records; if the search comes back empty, stop. This pattern handles any dataset size because it processes the data in manageable chunks.
Pro tip: Add a small delay (1-2 seconds) between each self-schedule to avoid overloading Bubble's workflow queue and consuming too many WUs at once.
Expected result: Records are processed in batches with automatic pagination until all records are handled.
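The recursive control flow above can be sketched in Python. This is a simulation, not Bubble's API: database stands in for your Data Type, search mimics "Do a Search for ... :items from # offset, limited to batch_size", and the recursive call stands in for "Schedule self":

```python
# 235 hypothetical records in place of a Bubble Data Type.
database = list(range(1, 236))
processed = []

def search(offset, batch_size):
    """Mimics a paginated search: items from `offset`, at most `batch_size`."""
    return database[offset:offset + batch_size]

def process_batch(offset, batch_size):
    batch = search(offset, batch_size)
    if not batch:
        return  # termination condition: empty batch means we're done
    processed.extend(batch)  # stand-in for the per-record processing step
    # "Schedule self" with the offset advanced by one batch.
    process_batch(offset + batch_size, batch_size)

process_batch(offset=0, batch_size=50)
```

Note that the termination check runs before any processing, so a dataset whose size is an exact multiple of batch_size still stops cleanly on the first empty batch.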
Handle large datasets with batch processing
For datasets with thousands of records, process in batches to avoid timeouts. Create a backend workflow that accepts an offset parameter. Search for records with items from the offset, limited to 100 items. Process the batch (update records, make API calls, etc.). If the batch returned 100 items (more may exist), schedule the next batch at offset + 100. If fewer than 100, all records are processed — stop. Track progress by logging the offset to a Status record so you can monitor batch completion.
Expected result: Large datasets are processed reliably in 100-record batches without timeouts.
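The "fewer than 100 means finished" logic and the Status-record progress log can be sketched as follows; all names (fetch_batch, status_log, BATCH_SIZE) are hypothetical stand-ins for the Bubble pieces described above:

```python
BATCH_SIZE = 100
database = [{"id": i} for i in range(1, 251)]  # 250 hypothetical records
status_log = []  # stand-in for logging each offset to a Status record

def fetch_batch(offset):
    """Mimics "Do a Search ... :items from # offset, limited to 100"."""
    return database[offset:offset + BATCH_SIZE]

offset = 0
while True:
    batch = fetch_batch(offset)
    for record in batch:
        record["updated"] = True  # per-record work: updates, API calls, etc.
    status_log.append({"offset": offset, "count": len(batch)})
    if len(batch) < BATCH_SIZE:
        break                     # a short batch means no more records exist
    offset += BATCH_SIZE          # otherwise, schedule the next batch
```

With 250 records this runs three batches (100, 100, 50) and stops on the short final batch.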
Choose the right pattern for your use case
Schedule on a List is best when you need to process each item independently with simple logic and moderate list sizes (under 1,000). Recursive workflows with offset batching are best for large datasets (1,000+) or when you need granular control over processing order and timing. For real-time per-item processing, consider a database trigger on the specific Data Type that fires whenever a record is created or modified. For one-time bulk operations, use the batch approach with progress tracking.
Expected result: You understand which iteration pattern to use for different scenarios.
Monitor and handle errors in batch processing
Add error handling to your iteration workflows. Wrap processing logic in error-catching patterns by enabling 'Include errors in response' on API calls. Log each processed record's ID and status to a Progress Data Type. If an error occurs on one record, log it and continue processing the next one rather than stopping the entire batch. After all records are processed, send a summary notification with the count of successful and failed operations. This makes bulk operations robust and auditable.
Expected result: Batch processing continues through individual record errors with full logging and summary reporting.
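The continue-on-error pattern can be simulated in Python. The try/except loop below stands in for the error-handling steps described above; progress_log mimics the Progress Data Type and one record is deliberately made to fail:

```python
records = [{"id": i} for i in range(1, 11)]
progress_log = []  # stand-in for a Progress Data Type

def process(record):
    """Stand-in for the per-record logic; record 4 simulates bad data."""
    if record["id"] == 4:
        raise ValueError("bad data")
    record["done"] = True

succeeded = failed = 0
for record in records:
    try:
        process(record)
        progress_log.append({"id": record["id"], "status": "ok"})
        succeeded += 1
    except Exception as exc:
        # Log the failure and keep going instead of aborting the batch.
        progress_log.append(
            {"id": record["id"], "status": "error", "error": str(exc)}
        )
        failed += 1

summary = f"Processed {succeeded} records, {failed} failed"
```

The summary string corresponds to the notification step: one message with success and failure counts once the whole run completes.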
Complete working example
ITERATION PATTERNS SUMMARY
=====================================

PATTERN 1: SCHEDULE ON A LIST
  Backend workflow: process_single_record
  Parameter: record (Data Type)
  Logic: process one record
  Frontend trigger:
    Schedule API Workflow on a List
    List: Do a Search for [Data Type]
    Workflow: process_single_record
  Best for: < 1,000 records, simple logic

PATTERN 2: RECURSIVE WITH BATCHING
  Backend workflow: process_batch
  Parameters: offset (number), batch_size (number)
  1. Search: items from offset, limit batch_size
  2. Process each item in batch
  3. If batch count = batch_size:
       Schedule self (offset + batch_size)
  4. If batch count < batch_size: stop
  Best for: 1,000+ records, controlled pace

PATTERN 3: DATABASE TRIGGER
  Event: A thing of type X is modified
  Only when: specific condition
  Process the changed record
  Best for: real-time per-record processing

ERROR HANDLING:
  Log each record: ID, status, error
  Continue on individual errors
  Summary notification after completion
  Progress tracking: processed / total

PERFORMANCE:
  Batch size: 50-100 records
  Delay between batches: 1-2 seconds
  Monitor WU consumption during runs
  Schedule large jobs during off-peak hours

Common mistakes when iterating over database records in Bubble
Mistake: Not adding a termination condition to recursive workflows
Why it's a problem: The workflow keeps scheduling itself indefinitely, consuming WUs until it is manually stopped
How to avoid: Always check if the current batch returned fewer items than the batch size as the stop condition
Mistake: Processing too many records at once without batching
Why it's a problem: Long-running workflows can time out partway through, leaving the dataset only partially processed
How to avoid: Use batch processing with 50-100 records per batch and offset-based pagination
Mistake: Using frontend workflows to iterate over large lists
Why it's a problem: Frontend workflows stop when the user closes or navigates away from the page, and they tie up the user's browser while running
How to avoid: Always use backend workflows for bulk processing — they continue running even if the user closes their browser
Best practices
- Use backend workflows for all bulk processing operations
- Add termination conditions to every recursive workflow
- Process in batches of 50-100 records for reliability
- Add delays between batches to manage WU consumption
- Log progress for monitoring and error tracking
- Handle individual record errors without stopping the entire batch
- Schedule large batch jobs during low-traffic periods
Still stuck?
Copy one of these prompts to get a personalized, step-by-step explanation.
I need to update 5,000 database records in my Bubble.io app by adding a new calculated field. What is the best way to iterate through all records without causing timeouts?
Help me create a backend workflow that processes all my Product records in batches of 100, updates a calculated field on each one, and sends me a notification when complete.
Frequently asked questions
Does Bubble have a for-each loop?
Not in the traditional programming sense. The closest equivalent is Schedule API Workflow on a List, which calls a backend workflow once per item in a list.
How many records can I process in one batch?
50-100 records per batch is optimal. Larger batches may time out depending on processing complexity; smaller batches are more reliable but take longer overall.
How much do batch operations cost in WUs?
Each backend workflow execution costs WUs based on its operations. Processing 5,000 records typically costs 5,000-25,000 WUs depending on what each iteration does.
Can I cancel a running batch job?
You cannot directly cancel scheduled workflows. Add a 'should_continue' flag that the workflow checks before processing each batch. Set the flag to false to stop the next batch from running.
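The flag-check pattern can be simulated in Python; settings stands in for a hypothetical record holding the should_continue flag, checked before each batch runs:

```python
database = list(range(120))
settings = {"should_continue": True}  # stand-in for a flag record
processed = []

def process_batch(offset, batch_size=50):
    if not settings["should_continue"]:
        return  # the flag was flipped: skip this and all later batches
    batch = database[offset:offset + batch_size]
    if not batch:
        return  # normal termination: no records left
    processed.extend(batch)
    if offset + batch_size >= 100:
        # Simulate an operator cancelling the job mid-run.
        settings["should_continue"] = False
    process_batch(offset + batch_size, batch_size)

process_batch(0)
```

Here 100 of the 120 records get processed before the flag stops the third batch, mirroring how a cancelled Bubble job finishes its current batch but never starts the next one.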
How do I track progress of a batch job?
Create a BatchJob Data Type with fields for total records, processed count, and status. Update it after each batch. Display progress on an admin page.
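The bookkeeping is straightforward; this sketch simulates a hypothetical BatchJob record being updated after each batch (the field names mirror the description above but are not a Bubble API):

```python
# Stand-in for a BatchJob Data Type with total, processed, and status fields.
batch_job = {"total": 250, "processed": 0, "status": "running"}

def record_batch(count):
    """Run after each batch: bump the counter and close out when done."""
    batch_job["processed"] += count
    if batch_job["processed"] >= batch_job["total"]:
        batch_job["status"] = "complete"

# Three batches of a 250-record job (100 + 100 + 50).
for batch_count in (100, 100, 50):
    record_batch(batch_count)

progress_pct = round(100 * batch_job["processed"] / batch_job["total"])
```

An admin page can then display processed / total (or the derived percentage) and the status field, refreshing as the backend workflow updates the record.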
Can RapidDev help with bulk data processing?
Yes. RapidDev can implement efficient batch processing workflows in Bubble including progress tracking, error handling, and performance optimization for large datasets.
Talk to an Expert
Our team has built 600+ apps. Get personalized help with your project.
Book a free consultation