
How to Set Up Automatic Data Backups in FlutterFlow

Automate Firestore backups using a scheduled Cloud Function that calls the Firestore Admin export API daily at 2 AM UTC, writing snapshots to a Cloud Storage bucket organized by date. A retention function deletes backups older than 30 days. A backup_log collection tracks each backup's status, size, and duration. The FlutterFlow admin page displays backup history, lets admins trigger manual backups, and documents the restore procedure for disaster recovery.

What you'll learn

  • How to create a scheduled Cloud Function that exports Firestore to Cloud Storage
  • How to manage backup retention by automatically deleting old snapshots
  • How to build a backup monitoring UI with status logs and manual triggers
  • How to restore Firestore from a backup in case of data loss
Beginner · 8 min read · 20-25 min setup · FlutterFlow Pro+ (Cloud Functions required) · March 2026 · RapidDev Engineering Team

Setting Up Automatic Firestore Backups in FlutterFlow

Data loss can be catastrophic. Whether from accidental deletions, buggy code, or security breaches, having automated backups is essential for any production app. This tutorial implements scheduled daily Firestore exports to Cloud Storage, automatic cleanup of old backups, monitoring and alerting, and an admin interface for managing the backup process.

Prerequisites

  • A FlutterFlow project on the Pro plan or higher
  • Firebase project on the Blaze plan (pay-as-you-go, required for Cloud Functions and Storage exports)
  • Cloud Firestore Admin API enabled in Google Cloud Console
  • A Cloud Storage bucket for backups (create one at gs://your-project-backups)
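
The prerequisites above can be provisioned from the command line. A minimal setup sketch, assuming a project ID of your-project and the bucket name used throughout this tutorial (adjust both to your own values):

```shell
# Enable the Firestore API (includes the Admin export/import surface)
gcloud services enable firestore.googleapis.com --project your-project

# Create the backup bucket (ideally in a different region than Firestore)
gsutil mb -l europe-west1 gs://your-project-backups

# Allow the default App Engine service account (used by 1st-gen Cloud
# Functions) to run exports and write objects to the bucket
gcloud projects add-iam-policy-binding your-project \
  --member serviceAccount:your-project@appspot.gserviceaccount.com \
  --role roles/datastore.importExportAdmin
gsutil iam ch \
  serviceAccount:your-project@appspot.gserviceaccount.com:objectAdmin \
  gs://your-project-backups
```

Without the importExportAdmin role and bucket write access, the export call in the next step fails with a permission error.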

Step-by-step guide

1

Create the scheduled Cloud Function for daily Firestore exports

Create a Cloud Function named scheduledBackup triggered by Cloud Scheduler to run daily at 2 AM UTC. The function uses the Firestore Admin API to export all collections to a Cloud Storage bucket path organized by date: gs://your-project-backups/YYYY-MM-DD/. Use the @google-cloud/firestore module's exportDocuments method with the project's database path and the output URI. After the export completes, create a document in a `backup_log` collection with fields: timestamp, status ('success' or 'failed'), bucketPath, and durationSeconds. Wrap the export call in a try-catch so failures are logged as well.

scheduledBackup.js

// Cloud Function: scheduledBackup
const functions = require('firebase-functions');
const admin = require('firebase-admin');
const firestore = require('@google-cloud/firestore');

admin.initializeApp();

const client = new firestore.v1.FirestoreAdminClient();
const bucket = 'gs://your-project-backups';

exports.scheduledBackup = functions.pubsub
  .schedule('every day 02:00')
  .timeZone('UTC')
  .onRun(async () => {
    const projectId = process.env.GCLOUD_PROJECT;
    const db = `projects/${projectId}/databases/(default)`;
    const date = new Date().toISOString().split('T')[0];
    const outputUri = `${bucket}/${date}`;
    const startTime = Date.now();

    try {
      const [operation] = await client.exportDocuments({
        name: db,
        outputUriPrefix: outputUri,
        collectionIds: [], // empty = export all collections
      });
      await operation.promise();

      const duration = Math.round((Date.now() - startTime) / 1000);
      await admin.firestore().collection('backup_log').add({
        timestamp: admin.firestore.FieldValue.serverTimestamp(),
        status: 'success',
        bucketPath: outputUri,
        durationSeconds: duration,
      });
    } catch (err) {
      await admin.firestore().collection('backup_log').add({
        timestamp: admin.firestore.FieldValue.serverTimestamp(),
        status: 'failed',
        error: err.message,
        bucketPath: outputUri,
      });
    }
  });

Expected result: Firestore data exports to Cloud Storage daily at 2 AM UTC with a log entry recording the result.

2

Add a retention function to delete backups older than 30 days

Create a second scheduled Cloud Function named cleanupOldBackups that runs weekly. The function lists objects in the backup bucket, groups them by date folder, and deletes folders older than 30 days. Use the Google Cloud Storage SDK to list and delete objects. Also clean up corresponding backup_log documents older than 90 days to keep the log manageable. This ensures storage costs stay controlled while maintaining a rolling 30-day recovery window.

cleanupOldBackups.js

// Cloud Function: cleanupOldBackups
const functions = require('firebase-functions');
const { Storage } = require('@google-cloud/storage');

const storage = new Storage();

exports.cleanupOldBackups = functions.pubsub
  .schedule('every monday 03:00')
  .timeZone('UTC')
  .onRun(async () => {
    const bucketName = 'your-project-backups';
    const bucket = storage.bucket(bucketName);
    const cutoff = new Date();
    cutoff.setDate(cutoff.getDate() - 30);

    const [files] = await bucket.getFiles();
    const toDelete = files.filter((f) => {
      const dateStr = f.name.split('/')[0]; // top-level folder is YYYY-MM-DD
      return new Date(dateStr) < cutoff;
    });

    for (const file of toDelete) {
      await file.delete();
    }

    console.log(`Deleted ${toDelete.length} old files`);
  });

Expected result: Backup files older than 30 days are automatically deleted weekly, keeping storage costs predictable.
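
If you would rather not maintain cleanup code at all, Cloud Storage object lifecycle rules can delete objects by age natively. A sketch, assuming the same bucket name:

```shell
# lifecycle.json: delete any object older than 30 days
cat > lifecycle.json <<'EOF'
{
  "rule": [
    { "action": { "type": "Delete" }, "condition": { "age": 30 } }
  ]
}
EOF

gsutil lifecycle set lifecycle.json gs://your-project-backups
```

This replaces only the file deletion; pruning old backup_log documents would still need the scheduled function (or a Firestore TTL policy).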

3

Set up failure alerting via email notification

Extend the scheduledBackup function to send an alert when a backup fails. In the catch block, after logging the failure to backup_log, use a Cloud Function HTTP call to a notification service (SendGrid, Mailgun, or Firebase Extensions for email). Alternatively, create a simple Cloud Function triggered by Firestore onCreate on backup_log that checks if the new document has status 'failed' and sends an email to the admin address. Include the error message, timestamp, and a link to the Google Cloud Console for debugging.
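
The onCreate-trigger approach can be sketched as follows. The trigger wiring needs deployed Firebase credentials, so it appears only as comments; the pure helpers (shouldAlert and buildAlertEmail, both hypothetical names) show the decision and message-composition logic:

```javascript
// Decide whether a backup_log entry warrants an alert.
function shouldAlert(logEntry) {
  return logEntry.status === 'failed';
}

// Compose the alert email from the log entry plus a console link for debugging.
function buildAlertEmail(logEntry, consoleUrl) {
  return {
    subject: `Firestore backup failed: ${logEntry.bucketPath}`,
    body: [
      `A scheduled backup failed with error: ${logEntry.error}`,
      `Bucket path: ${logEntry.bucketPath}`,
      `Investigate: ${consoleUrl}`,
    ].join('\n'),
  };
}

// Wiring sketch (requires deployed firebase-functions, shown as comments):
// exports.alertOnBackupFailure = functions.firestore
//   .document('backup_log/{id}')
//   .onCreate(async (snap) => {
//     const entry = snap.data();
//     if (!shouldAlert(entry)) return;
//     const email = buildAlertEmail(entry,
//       'https://console.cloud.google.com/functions');
//     // send via SendGrid, Mailgun, or the Trigger Email extension here
//   });
```

Keeping the decision and formatting logic in plain functions makes this part testable without deploying anything.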

Expected result: Admins receive an email notification immediately when a backup fails so they can investigate and fix the issue.

4

Build the admin backup management page in FlutterFlow

Create a BackupAdmin page gated by role check (Conditional Visibility: currentUser.role == 'admin'). At the top, show the last backup status in a prominent Container: green with checkmark for success, red with warning icon for failure, including the timestamp and bucket path. Below, add a ListView bound to a Backend Query on backup_log ordered by timestamp descending. Each item shows the date, status badge (green/red), duration, and bucket path. Add a 'Trigger Manual Backup' Button that calls an HTTP-triggered Cloud Function (a variant of scheduledBackup without the scheduler). Add a Switch to enable/disable automatic backups by updating a config document that the scheduled function checks before running.

Expected result: Admins see a complete backup history, can trigger manual backups on demand, and can toggle automatic backups on or off.

5

Document and test the restore procedure

Restoring from a backup uses the Firestore Admin import API. Create a Cloud Function named restoreBackup that accepts a bucketPath parameter and calls client.importDocuments with the input URI set to the backup path. Gate this function behind an admin auth check. IMPORTANT: test the restore process by importing to a separate Firestore project (not production) to verify backup integrity without overwriting live data. On the admin page, add a 'Restore' Button on each backup log item that opens a confirmation dialog explaining that restoring will overwrite current data. Only enable this for backup entries with status 'success'.
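
A sketch of the guard logic for such a restore function. validateBackupPath is a hypothetical helper name; the actual importDocuments call needs deployed credentials and is shown only as comments:

```javascript
// Validate that a requested restore path is a well-formed, same-bucket
// backup folder (gs://<bucket>/YYYY-MM-DD) before touching the Admin API.
function validateBackupPath(bucketPath, expectedBucket) {
  const match = /^gs:\/\/([^/]+)\/(\d{4}-\d{2}-\d{2})$/.exec(bucketPath);
  if (!match) return { valid: false, reason: 'malformed path' };
  if (match[1] !== expectedBucket) return { valid: false, reason: 'wrong bucket' };
  return { valid: true, date: match[2] };
}

// Callable wiring sketch (comments only; assumes the admin role claim exists):
// exports.restoreBackup = functions.https.onCall(async (data, context) => {
//   if (!context.auth || context.auth.token.role !== 'admin') {
//     throw new functions.https.HttpsError('permission-denied', 'Admins only');
//   }
//   const check = validateBackupPath(data.bucketPath, 'your-project-backups');
//   if (!check.valid) {
//     throw new functions.https.HttpsError('invalid-argument', check.reason);
//   }
//   const [op] = await client.importDocuments({
//     name: `projects/${process.env.GCLOUD_PROJECT}/databases/(default)`,
//     inputUriPrefix: data.bucketPath,
//     collectionIds: [],
//   });
//   await op.promise();
//   return { success: true };
// });
```

Validating the path server-side means a compromised or buggy client cannot point the import at an arbitrary bucket.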

Expected result: Admins can restore Firestore from any successful backup. The process is tested and documented for disaster recovery.

Complete working example

Backup Cloud Functions

// Cloud Functions: Firestore Backup System
const functions = require('firebase-functions');
const admin = require('firebase-admin');
const firestore = require('@google-cloud/firestore');
const { Storage } = require('@google-cloud/storage');

admin.initializeApp();
const client = new firestore.v1.FirestoreAdminClient();
const storage = new Storage();
const db = admin.firestore();
const BUCKET = 'gs://your-project-backups';

// Daily backup at 2 AM UTC
exports.scheduledBackup = functions.pubsub
  .schedule('every day 02:00')
  .timeZone('UTC')
  .onRun(async () => {
    const config = await db.doc('config/backup').get();
    if (config.exists && !config.data().enabled) return;

    const projectId = process.env.GCLOUD_PROJECT;
    const dbPath = `projects/${projectId}/databases/(default)`;
    const date = new Date().toISOString().split('T')[0];
    const outputUri = `${BUCKET}/${date}`;
    const start = Date.now();

    try {
      const [op] = await client.exportDocuments({
        name: dbPath,
        outputUriPrefix: outputUri,
        collectionIds: [],
      });
      await op.promise();

      await db.collection('backup_log').add({
        timestamp: admin.firestore.FieldValue.serverTimestamp(),
        status: 'success',
        bucketPath: outputUri,
        durationSeconds: Math.round((Date.now() - start) / 1000),
      });
    } catch (err) {
      await db.collection('backup_log').add({
        timestamp: admin.firestore.FieldValue.serverTimestamp(),
        status: 'failed',
        error: err.message,
        bucketPath: outputUri,
      });
    }
  });

// Weekly cleanup of backups older than 30 days
exports.cleanupOldBackups = functions.pubsub
  .schedule('every monday 03:00')
  .timeZone('UTC')
  .onRun(async () => {
    const bucket = storage.bucket('your-project-backups');
    const cutoff = new Date();
    cutoff.setDate(cutoff.getDate() - 30);
    const [files] = await bucket.getFiles();

    let deleted = 0;
    for (const file of files) {
      const dateStr = file.name.split('/')[0];
      if (new Date(dateStr) < cutoff) {
        await file.delete();
        deleted++;
      }
    }
    console.log(`Cleaned up ${deleted} old files`);
  });

// Manual backup trigger (HTTP)
exports.triggerManualBackup = functions.https
  .onCall(async (data, context) => {
    if (!context.auth) {
      throw new functions.https.HttpsError('unauthenticated', 'Login required');
    }
    // Trigger the same export logic
    // ... same as scheduledBackup body
    return { success: true };
  });

Common mistakes

Mistake: Not testing the restore process until an actual disaster occurs

How to avoid: Run a test restore to a separate Firestore project quarterly. Verify that all collections, documents, and subcollections are present and intact. Document the restore steps so any team member can execute them.

Mistake: Exporting only specific collections instead of all collections

How to avoid: Pass an empty collectionIds array to exportDocuments, which tells Firestore to export ALL collections. This future-proofs the backup against schema additions.

Mistake: Not monitoring backup failures and assuming they always succeed

How to avoid: Log every backup result to backup_log. Send email alerts on failure. Set up Google Cloud Monitoring alerts on the function's error rate as a second safety net.

Best practices

  • Run backups during off-peak hours (2-4 AM UTC) to minimize impact on app performance
  • Organize backup files by date (gs://bucket/YYYY-MM-DD/) for easy identification and cleanup
  • Keep a 30-day rolling retention window to balance recovery options with storage costs
  • Store the backup bucket in a different Google Cloud region than your primary Firestore for geographic redundancy
  • Log every backup result including duration so you can track trends and detect degradation
  • Gate the restore function behind multiple confirmations and admin-only access to prevent accidental data overwrites
  • Test restores quarterly to a non-production project to verify backup integrity

Still stuck?

Copy one of these prompts to get a personalized, step-by-step explanation.

ChatGPT Prompt

I need to set up automatic daily Firestore backups for my FlutterFlow app. Show me Cloud Functions for: scheduled daily export to Cloud Storage, cleanup of backups older than 30 days, failure alerting, and a restore function. Include the Firestore Admin API setup and a backup_log collection for monitoring.

FlutterFlow Prompt

Create an admin page showing Firestore backup history with status badges (green for success, red for failed). Add a Trigger Backup button that calls a Cloud Function. Show the last backup timestamp and status prominently at the top. Add a toggle switch to enable or disable automatic daily backups.

Frequently asked questions

How much does it cost to run daily Firestore backups?

Firestore exports are billed at the same rate as reads: about $0.06 per 100K documents. Cloud Storage costs about $0.02/GB/month. For a database with 100K documents, daily backups cost roughly $2-3/month including storage.
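
As a sanity check on those figures, a back-of-the-envelope calculation (rates taken from the answer above; monthlyBackupCost is a hypothetical helper):

```javascript
// Rough monthly cost of daily exports with a rolling retention window.
// Assumed rates: $0.06 per 100K documents exported, $0.02 per GB-month stored.
function monthlyBackupCost(docCount, dbSizeGb, retentionDays = 30) {
  const exportCostPerRun = (docCount / 100000) * 0.06;
  const exportCost = exportCostPerRun * 30;            // ~30 daily exports/month
  const storageCost = dbSizeGb * retentionDays * 0.02; // snapshots kept at once
  return exportCost + storageCost;
}

// e.g. 100K documents, ~1 GB database, 30-day retention:
// 30 * $0.06 + 1 * 30 * $0.02 = $1.80 + $0.60 = $2.40/month
```

The export side scales with document count, the storage side with database size times retention, so trimming the retention window is the quickest cost lever.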

Can I back up only specific collections instead of the entire database?

Yes. Pass an array of collection IDs to the exportDocuments method. However, backing up all collections (empty array) is recommended to ensure nothing is missed when new collections are added.

How long does a Firestore export take?

It depends on database size. Small databases (under 100K documents) export in under a minute. Large databases (millions of documents) can take 10-30 minutes. The Cloud Function times out at 9 minutes, but the export continues as a background operation.

Does restoring from a backup overwrite existing data?

Yes. Firestore import overwrites documents with matching IDs and creates new ones for non-matching IDs. It does NOT delete documents that exist in the current database but not in the backup. Always test restores on a separate project first.

Can I automate backups without Cloud Functions?

Yes. Google Cloud offers a native Firestore scheduled backup feature (in preview) that can be configured directly in the console without writing any code. Check the Firebase documentation for current availability.

Can RapidDev help set up a production backup and disaster recovery system?

Yes. RapidDev can implement multi-region backup strategies, point-in-time recovery, automated restore testing, compliance-grade retention policies, and monitoring dashboards.
