Automate Firestore backups using a scheduled Cloud Function that calls the Firestore Admin export API daily at 2 AM UTC, writing snapshots to a Cloud Storage bucket organized by date. A retention function deletes backups older than 30 days. A backup_log collection tracks each backup's status, size, and duration. The FlutterFlow admin page displays backup history, lets admins trigger manual backups, and documents the restore procedure for disaster recovery.
Setting Up Automatic Firestore Backups in FlutterFlow
Data loss can be catastrophic. Whether from accidental deletions, buggy code, or security breaches, having automated backups is essential for any production app. This tutorial implements scheduled daily Firestore exports to Cloud Storage, automatic cleanup of old backups, monitoring and alerting, and an admin interface for managing the backup process.
Prerequisites
- A FlutterFlow project on the Pro plan or higher
- Firebase project on the Blaze plan (pay-as-you-go, required for Cloud Functions and Storage exports)
- Cloud Firestore Admin API enabled in Google Cloud Console
- A Cloud Storage bucket for backups (create one at gs://your-project-backups)
Step-by-step guide
Create the scheduled Cloud Function for daily Firestore exports
Create a Cloud Function named scheduledBackup, triggered by Cloud Scheduler, that runs daily at 2 AM UTC. The function uses the Firestore Admin API to export all collections to a Cloud Storage path organized by date: gs://your-project-backups/YYYY-MM-DD/. Use the FirestoreAdminClient from the @google-cloud/firestore package and its exportDocuments method with the database path and output URI. After the export completes, create a document in a `backup_log` collection with fields: timestamp, status ('success' or 'failed'), bucketPath, and duration in seconds. Wrap the export call in a try-catch so failures are logged as well.
```javascript
// Cloud Function: scheduledBackup
const functions = require('firebase-functions');
const admin = require('firebase-admin');
const firestore = require('@google-cloud/firestore');

admin.initializeApp();

const client = new firestore.v1.FirestoreAdminClient();
const bucket = 'gs://your-project-backups';

exports.scheduledBackup = functions.pubsub
  .schedule('every day 02:00')
  .timeZone('UTC')
  .onRun(async () => {
    const projectId = process.env.GCLOUD_PROJECT;
    const db = `projects/${projectId}/databases/(default)`;
    const date = new Date().toISOString().split('T')[0];
    const outputUri = `${bucket}/${date}`;
    const startTime = Date.now();

    try {
      const [response] = await client.exportDocuments({
        name: db,
        outputUriPrefix: outputUri,
        collectionIds: [], // empty = all collections
      });
      await response.promise();
      const duration = Math.round((Date.now() - startTime) / 1000);

      await admin.firestore().collection('backup_log').add({
        timestamp: admin.firestore.FieldValue.serverTimestamp(),
        status: 'success',
        bucketPath: outputUri,
        durationSeconds: duration,
      });
    } catch (err) {
      await admin.firestore().collection('backup_log').add({
        timestamp: admin.firestore.FieldValue.serverTimestamp(),
        status: 'failed',
        error: err.message,
        bucketPath: outputUri,
      });
    }
  });
```

Expected result: Firestore data exports to Cloud Storage daily at 2 AM UTC with a log entry recording the result.
Add a retention function to delete backups older than 30 days
Create a second scheduled Cloud Function named cleanupOldBackups that runs weekly. The function lists objects in the backup bucket, groups them by date folder, and deletes folders older than 30 days. Use the Google Cloud Storage SDK to list and delete objects. Also clean up corresponding backup_log documents older than 90 days to keep the log manageable. This ensures storage costs stay controlled while maintaining a rolling 30-day recovery window.
```javascript
// Cloud Function: cleanupOldBackups
const { Storage } = require('@google-cloud/storage');
const storage = new Storage();

exports.cleanupOldBackups = functions.pubsub
  .schedule('every monday 03:00')
  .timeZone('UTC')
  .onRun(async () => {
    const bucketName = 'your-project-backups';
    const bucket = storage.bucket(bucketName);
    const cutoff = new Date();
    cutoff.setDate(cutoff.getDate() - 30);

    const [files] = await bucket.getFiles();
    const toDelete = files.filter((f) => {
      const dateStr = f.name.split('/')[0];
      return new Date(dateStr) < cutoff;
    });

    for (const file of toDelete) {
      await file.delete();
    }

    console.log(`Deleted ${toDelete.length} old files`);
  });
```

Expected result: Backup files older than 30 days are automatically deleted weekly, keeping storage costs predictable.
Set up failure alerting via email notification
Extend the scheduledBackup function to send an alert when a backup fails. In the catch block, after logging the failure to backup_log, use a Cloud Function HTTP call to a notification service (SendGrid, Mailgun, or Firebase Extensions for email). Alternatively, create a simple Cloud Function triggered by Firestore onCreate on backup_log that checks if the new document has status 'failed' and sends an email to the admin address. Include the error message, timestamp, and a link to the Google Cloud Console for debugging.
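The onCreate-trigger variant described above can be sketched as follows. The message-building logic is factored into a plain helper so it can be tested without Firebase; sendAlertEmail and admin@example.com are hypothetical placeholders for whichever mail service and address you wire in.

```javascript
// Pure helper: build the alert message from a backup_log document.
// Includes the error, the backup path, and a Cloud Console link for debugging.
function buildFailureAlert(entry, projectId) {
  return {
    subject: `[${projectId}] Firestore backup FAILED`,
    body: [
      `Backup to ${entry.bucketPath} failed.`,
      `Error: ${entry.error}`,
      `Debug: https://console.cloud.google.com/functions/list?project=${projectId}`,
    ].join('\n'),
  };
}

// Trigger wiring (requires firebase-functions; shown for context only):
// exports.alertOnBackupFailure = functions.firestore
//   .document('backup_log/{logId}')
//   .onCreate(async (snap) => {
//     const entry = snap.data();
//     if (entry.status !== 'failed') return; // only alert on failures
//     const msg = buildFailureAlert(entry, process.env.GCLOUD_PROJECT);
//     await sendAlertEmail('admin@example.com', msg.subject, msg.body);
//   });
```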
Expected result: Admins receive an email notification immediately when a backup fails so they can investigate and fix the issue.
Build the admin backup management page in FlutterFlow
Create a BackupAdmin page gated by role check (Conditional Visibility: currentUser.role == 'admin'). At the top, show the last backup status in a prominent Container: green with checkmark for success, red with warning icon for failure, including the timestamp and bucket path. Below, add a ListView bound to a Backend Query on backup_log ordered by timestamp descending. Each item shows the date, status badge (green/red), duration, and bucket path. Add a 'Trigger Manual Backup' Button that calls an HTTP-triggered Cloud Function (a variant of scheduledBackup without the scheduler). Add a Switch to enable/disable automatic backups by updating a config document that the scheduled function checks before running.
Expected result: Admins see a complete backup history, can trigger manual backups on demand, and can toggle automatic backups on or off.
Document and test the restore procedure
Restoring from a backup uses the Firestore Admin import API. Create a Cloud Function named restoreBackup that accepts a bucketPath parameter and calls client.importDocuments with the input URI set to the backup path. Gate this function behind an admin auth check. IMPORTANT: test the restore process by importing to a separate Firestore project (not production) to verify backup integrity without overwriting live data. On the admin page, add a 'Restore' Button on each backup log item that opens a confirmation dialog explaining that restoring will overwrite current data. Only enable this for backup entries with status 'success'.
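A minimal sketch of that restore path, assuming the FirestoreAdminClient used in the earlier steps: the client is passed in as a parameter here so the logic can be exercised with a stub, but in a deployed function you would use the shared client instance and wrap this in an https.onCall handler that verifies the caller's admin role.

```javascript
// Guard: only backups that completed successfully should be restorable.
function canRestore(entry) {
  return entry.status === 'success' && typeof entry.bucketPath === 'string';
}

// Restore from a backup path via the Firestore Admin import API.
// `client` is a FirestoreAdminClient (injected here for testability).
async function restoreBackup(client, projectId, bucketPath) {
  const [op] = await client.importDocuments({
    name: `projects/${projectId}/databases/(default)`,
    inputUriPrefix: bucketPath, // e.g. gs://your-project-backups/2024-01-01
    collectionIds: [],          // empty = import everything in the export
  });
  return op; // long-running operation; await op.promise() to block on it
}
```

When testing against a separate project, point `projectId` at the non-production database so the import never touches live data.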
Expected result: Admins can restore Firestore from any successful backup. The process is tested and documented for disaster recovery.
Complete working example
```javascript
// Cloud Functions: Firestore Backup System
const functions = require('firebase-functions');
const admin = require('firebase-admin');
const firestore = require('@google-cloud/firestore');
const { Storage } = require('@google-cloud/storage');

admin.initializeApp();
const client = new firestore.v1.FirestoreAdminClient();
const storage = new Storage();
const db = admin.firestore();
const BUCKET = 'gs://your-project-backups';

// Daily backup at 2 AM UTC
exports.scheduledBackup = functions.pubsub
  .schedule('every day 02:00')
  .timeZone('UTC')
  .onRun(async () => {
    const config = await db.doc('config/backup').get();
    if (config.exists && !config.data().enabled) return;

    const projectId = process.env.GCLOUD_PROJECT;
    const dbPath = `projects/${projectId}/databases/(default)`;
    const date = new Date().toISOString().split('T')[0];
    const outputUri = `${BUCKET}/${date}`;
    const start = Date.now();

    try {
      const [op] = await client.exportDocuments({
        name: dbPath,
        outputUriPrefix: outputUri,
        collectionIds: [],
      });
      await op.promise();

      await db.collection('backup_log').add({
        timestamp: admin.firestore.FieldValue.serverTimestamp(),
        status: 'success',
        bucketPath: outputUri,
        durationSeconds: Math.round((Date.now() - start) / 1000),
      });
    } catch (err) {
      await db.collection('backup_log').add({
        timestamp: admin.firestore.FieldValue.serverTimestamp(),
        status: 'failed',
        error: err.message,
        bucketPath: outputUri,
      });
    }
  });

// Weekly cleanup of backups older than 30 days
exports.cleanupOldBackups = functions.pubsub
  .schedule('every monday 03:00')
  .timeZone('UTC')
  .onRun(async () => {
    const bucket = storage.bucket('your-project-backups');
    const cutoff = new Date();
    cutoff.setDate(cutoff.getDate() - 30);
    const [files] = await bucket.getFiles();

    let deleted = 0;
    for (const file of files) {
      const dateStr = file.name.split('/')[0];
      if (new Date(dateStr) < cutoff) {
        await file.delete();
        deleted++;
      }
    }
    console.log(`Cleaned up ${deleted} old files`);
  });

// Manual backup trigger (HTTP)
exports.triggerManualBackup = functions.https
  .onCall(async (data, context) => {
    if (!context.auth) {
      throw new functions.https.HttpsError('unauthenticated', 'Login required');
    }
    // Trigger the same export logic
    // ... same as scheduledBackup body
    return { success: true };
  });
```

Common mistakes
Mistake: Not testing the restore process until an actual disaster occurs
How to avoid: Run a test restore to a separate Firestore project quarterly. Verify that all collections, documents, and subcollections are present and intact. Document the restore steps so any team member can execute them.
Mistake: Exporting only specific collections instead of all collections
How to avoid: Pass an empty collectionIds array to exportDocuments, which tells Firestore to export ALL collections. This future-proofs the backup against schema additions.
Mistake: Not monitoring backup failures and assuming they always succeed
How to avoid: Log every backup result to backup_log. Send email alerts on failure. Set up Google Cloud Monitoring alerts on the function's error rate as a second safety net.
Best practices
- Run backups during off-peak hours (2-4 AM UTC) to minimize impact on app performance
- Organize backup files by date (gs://bucket/YYYY-MM-DD/) for easy identification and cleanup
- Keep a 30-day rolling retention window to balance recovery options with storage costs
- Store the backup bucket in a different Google Cloud region than your primary Firestore for geographic redundancy
- Log every backup result including duration so you can track trends and detect degradation
- Gate the restore function behind multiple confirmations and admin-only access to prevent accidental data overwrites
- Test restores quarterly to a non-production project to verify backup integrity
Still stuck?
Copy one of these prompts to get a personalized, step-by-step explanation.
I need to set up automatic daily Firestore backups for my FlutterFlow app. Show me Cloud Functions for: scheduled daily export to Cloud Storage, cleanup of backups older than 30 days, failure alerting, and a restore function. Include the Firestore Admin API setup and a backup_log collection for monitoring.
Create an admin page showing Firestore backup history with status badges (green for success, red for failed). Add a Trigger Backup button that calls a Cloud Function. Show the last backup timestamp and status prominently at the top. Add a toggle switch to enable or disable automatic daily backups.
Frequently asked questions
How much does it cost to run daily Firestore backups?
Firestore exports are billed at the same rate as reads: about $0.06 per 100K documents. Cloud Storage costs about $0.02/GB/month. For a database with 100K documents, daily backups cost roughly $2-3/month including storage.
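Using the rates quoted above, the estimate works out as a simple back-of-the-envelope calculation. Treat the rate constants as assumptions: pricing varies by region and changes over time, so check current Google Cloud pricing.

```javascript
// Rough monthly cost estimate for daily exports with a rolling retention
// window. Rate constants mirror the figures quoted in this FAQ answer.
function estimateMonthlyCostUsd(docCount, dbSizeGb, retentionDays = 30) {
  const EXPORT_RATE = 0.06 / 100000; // $ per document exported (read rate)
  const STORAGE_RATE = 0.02;         // $ per GB-month of Cloud Storage
  const exportCost = docCount * EXPORT_RATE * 30;              // 30 exports/month
  const storageCost = dbSizeGb * retentionDays * STORAGE_RATE; // retained copies
  return exportCost + storageCost;
}

// 100K documents, ~1 GB database:
// exports  = 100000 * 0.0000006 * 30 = $1.80
// storage  = 1 GB * 30 copies * 0.02 = $0.60
// total   ~= $2.40/month
```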
Can I back up only specific collections instead of the entire database?
Yes. Pass an array of collection IDs to the exportDocuments method. However, backing up all collections (empty array) is recommended to ensure nothing is missed when new collections are added.
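As a sketch, the only difference between the two modes is the collectionIds field of the export request. The helper below is illustrative, not part of the Admin API itself:

```javascript
// Build the request body passed to FirestoreAdminClient.exportDocuments.
// An empty collectionIds array exports every collection; a non-empty one
// exports only the collections named in it.
function buildExportRequest(projectId, outputUri, collectionIds = []) {
  return {
    name: `projects/${projectId}/databases/(default)`,
    outputUriPrefix: outputUri,
    collectionIds,
  };
}

// Everything:    buildExportRequest('demo', 'gs://b/2024-01-01')
// Selected only: buildExportRequest('demo', 'gs://b/2024-01-01', ['users', 'orders'])
```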
How long does a Firestore export take?
It depends on database size. Small databases (under 100K documents) export in under a minute. Large databases (millions of documents) can take 10-30 minutes. The Cloud Function times out at 9 minutes, but the export continues as a background operation.
Does restoring from a backup overwrite existing data?
Yes. Firestore import overwrites documents with matching IDs and creates new ones for non-matching IDs. It does NOT delete documents that exist in the current database but not in the backup. Always test restores on a separate project first.
Can I automate backups without Cloud Functions?
Yes. Google Cloud offers a native Firestore scheduled backup feature (in preview) that can be configured directly in the console without writing any code. Check the Firebase documentation for current availability.
Can RapidDev help set up a production backup and disaster recovery system?
Yes. RapidDev can implement multi-region backup strategies, point-in-time recovery, automated restore testing, compliance-grade retention policies, and monitoring dashboards.
Talk to an Expert
Our team has built 600+ apps. Get personalized help with your project.
Book a free consultation