Implement facial recognition authentication in FlutterFlow using a Custom Widget for camera preview with google_mlkit_face_detection for on-device detection, plus a Cloud Function that calls AWS Rekognition or Azure Face API for face matching. The registration flow captures the face, indexes it with the cloud API, and stores the faceId on the user's Firestore document. Login captures a live face, calls SearchFacesByImage via the Cloud Function, and issues a custom Firebase Auth token on match. Always add liveness detection to prevent photo spoofing.
Build app-level face authentication that works across all devices
Device-level biometrics (Face ID on iOS, face unlock on Android) are OS features managed by the platform — FlutterFlow can trigger them with the local_auth package but they do not provide a face embedding that your app controls. App-level facial recognition means your app captures the face image, extracts a facial embedding using a machine learning model, and verifies it against a stored reference. This is useful for kiosks, high-security apps, or multi-user devices where OS biometrics are not the right tool. This tutorial covers the full flow: camera capture, on-device face detection, cloud API verification via a Cloud Function, and issuing a Firebase custom token on successful match.
Prerequisites
- A FlutterFlow project with Firebase connected and Firebase Authentication enabled
- An AWS account with Rekognition enabled (or an Azure account with Face API enabled)
- FlutterFlow Standard or higher plan for Custom Code (Custom Widget, Custom Action)
Step-by-step guide
Add the camera and face detection Custom Widget
In FlutterFlow, go to Custom Code → Custom Widgets → Add Widget. Name it FaceCaptureWidget. In the pubspec dependencies, add camera: ^0.10.0 and google_mlkit_face_detection: ^0.9.0. The widget initializes the front camera, displays a live preview, and draws an oval face-guide overlay. When the user taps Capture, it takes a picture, runs google_mlkit_face_detection on the image, and checks that: (1) exactly one face is detected, (2) the face is looking forward (headEulerAngleY within about 15 degrees), and (3) the face spans at least 20% of the image width (the user is close enough). If all checks pass, it calls the widget's onFaceCaptured callback with the image file path. For liveness detection, require the user to blink: with enableClassification turned on, track leftEyeOpenProbability and rightEyeOpenProbability and require a transition from > 0.8 (eyes open) to < 0.2 (eyes closed) within a 3-second window before accepting the capture.
```dart
// FaceCaptureWidget — simplified structure
import 'package:flutter/material.dart';
import 'package:camera/camera.dart';
import 'package:google_mlkit_face_detection/google_mlkit_face_detection.dart';

class FaceCaptureWidget extends StatefulWidget {
  final Function(String imagePath) onFaceCaptured;
  const FaceCaptureWidget({required this.onFaceCaptured, Key? key})
      : super(key: key);

  @override
  State<FaceCaptureWidget> createState() => _FaceCaptureWidgetState();
}

class _FaceCaptureWidgetState extends State<FaceCaptureWidget> {
  CameraController? _controller;
  final FaceDetector _faceDetector = FaceDetector(
    options: FaceDetectorOptions(
      enableClassification: true, // enables left/rightEyeOpenProbability
      minFaceSize: 0.2,
    ),
  );
  bool _blinkDetected = false;

  Future<void> _captureAndVerify() async {
    final image = await _controller!.takePicture();
    final inputImage = InputImage.fromFilePath(image.path);
    final faces = await _faceDetector.processImage(inputImage);

    if (faces.length != 1) return; // must be exactly one face
    final face = faces.first;
    if (!_blinkDetected) return; // liveness check
    if ((face.headEulerAngleY ?? 0).abs() > 15) return; // face forward

    widget.onFaceCaptured(image.path);
  }
}
```

Expected result: A live camera preview appears with an oval overlay. The widget only calls onFaceCaptured after detecting one forward-facing face and a blink event.
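The blink requirement is a small state machine over successive eye-open readings. Below is a minimal sketch — in JavaScript rather than Dart, purely for illustration — that applies the 0.8/0.2 thresholds and 3-second window stated above. BlinkDetector is a hypothetical name; in the widget you would feed it one reading per processed frame (for example, the minimum of leftEyeOpenProbability and rightEyeOpenProbability).

```javascript
// Liveness sketch: detect an "eyes open → eyes closed" transition within a
// time window. Feed one eye-open probability reading per processed frame.
class BlinkDetector {
  constructor({ openThreshold = 0.8, closedThreshold = 0.2, windowMs = 3000 } = {}) {
    this.openThreshold = openThreshold;
    this.closedThreshold = closedThreshold;
    this.windowMs = windowMs;
    this.lastOpenAt = null; // timestamp of the most recent "eyes open" reading
    this.blinkDetected = false;
  }

  // Returns true once a blink has been observed within the window.
  addReading(eyeOpenProbability, nowMs) {
    if (eyeOpenProbability > this.openThreshold) {
      this.lastOpenAt = nowMs;
    } else if (
      eyeOpenProbability < this.closedThreshold &&
      this.lastOpenAt !== null &&
      nowMs - this.lastOpenAt <= this.windowMs
    ) {
      this.blinkDetected = true;
    }
    return this.blinkDetected;
  }
}
```

In the widget, each face-detection result would add one reading, and _blinkDetected is set once addReading returns true; readings that go closed without a recent open (a photo held up to the camera) never trigger it.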
Deploy the face registration Cloud Function
Deploy a Firebase Cloud Function named registerFace. It receives the base64-encoded face image from the authenticated user — with a callable function, the Admin SDK verifies the Firebase ID token automatically and exposes the UID as context.auth.uid. It calls AWS Rekognition IndexFaces on a collection named app-faces with the image bytes and ExternalImageId set to the user's UID. Rekognition returns a FaceId (a UUID representing this face in the collection). Store this faceId as a field on the users/{uid} Firestore document, along with faceRegisteredAt: serverTimestamp() and faceRegistrationStatus: 'active'. Return { success: true, faceId } to FlutterFlow. In FlutterFlow, add this as an API Call in the API Manager, triggered from a Register Face button in the user's profile or onboarding flow. Send the base64 image from the Custom Widget callback and the Firebase ID token in the Authorization header.
```javascript
// registerFace Cloud Function
const functions = require('firebase-functions');
const admin = require('firebase-admin');
const AWS = require('aws-sdk');

admin.initializeApp();

const rekognition = new AWS.Rekognition({
  region: process.env.AWS_REGION,
  accessKeyId: process.env.AWS_ACCESS_KEY_ID,
  secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY
});

exports.registerFace = functions.https.onCall(async (data, context) => {
  if (!context.auth) throw new functions.https.HttpsError('unauthenticated', 'Login required');

  const { imageBase64 } = data;
  const uid = context.auth.uid;
  const imageBytes = Buffer.from(imageBase64, 'base64');

  const result = await rekognition.indexFaces({
    CollectionId: 'app-faces',
    Image: { Bytes: imageBytes },
    ExternalImageId: uid,
    DetectionAttributes: ['DEFAULT'],
    MaxFaces: 1,
    QualityFilter: 'HIGH'
  }).promise();

  const faceId = result.FaceRecords[0]?.Face?.FaceId;
  if (!faceId) throw new functions.https.HttpsError('failed-precondition', 'No face detected');

  await admin.firestore().collection('users').doc(uid).update({
    faceId,
    faceRegisteredAt: admin.firestore.FieldValue.serverTimestamp(),
    faceRegistrationStatus: 'active'
  });

  return { success: true, faceId };
});
```

Expected result: After tapping Register Face and completing the liveness check, the user's Firestore document is updated with a faceId from AWS Rekognition.
Deploy the face login Cloud Function
Deploy a Cloud Function named verifyFaceLogin. This function is called BEFORE the user is authenticated (it's a pre-auth endpoint). It receives the base64 face image. It calls AWS Rekognition SearchFacesByImage on the app-faces collection with the image, requesting the top 1 match with similarity threshold 90. If a match is found, it retrieves the ExternalImageId (which is the user's UID stored in step 2). It then uses Firebase Admin SDK to generate a custom auth token for that UID with admin.auth().createCustomToken(uid). It returns the custom token to FlutterFlow. In FlutterFlow, create a login page with the FaceCaptureWidget. On capture, call this Cloud Function, receive the customToken, and use a Custom Action: FirebaseAuth.instance.signInWithCustomToken(customToken) to complete the login. This does NOT use a password — the face is the credential.
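A sketch of the core logic described above. To keep it self-contained and testable without the SDK boilerplate (requires, Rekognition client, admin.initializeApp() — identical to registerFace), the handler body is written as a plain async function that takes the configured rekognition and auth clients as parameters; in the deployed function you would wrap this body in functions.https.onCall. The collection name and 90% threshold follow the text.

```javascript
// verifyFaceLogin handler logic — dependency-injected sketch.
// `rekognition` is an AWS SDK v2 Rekognition client, `auth` is admin.auth().
async function verifyFaceLogin({ imageBase64 }, rekognition, auth) {
  const imageBytes = Buffer.from(imageBase64, 'base64');

  // Search the registered-faces collection for the best match.
  const result = await rekognition.searchFacesByImage({
    CollectionId: 'app-faces',
    Image: { Bytes: imageBytes },
    FaceMatchThreshold: 90,
    MaxFaces: 1
  }).promise();

  const match = (result.FaceMatches || [])[0];
  if (!match) return { error: 'Face not recognized' };

  // ExternalImageId was set to the Firebase UID during registration.
  const uid = match.Face.ExternalImageId;
  const customToken = await auth.createCustomToken(uid);
  return { customToken };
}
```

Returning { error } on no-match (rather than throwing) matches the FlutterFlow Action Flow in the next step, which branches on whether the response contains a customToken or an error.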
Expected result: The user opens the app, faces the camera, blinks, and is logged in automatically using their registered face — no password required.
Add the face login page and Action Flow in FlutterFlow
Create a new page named FaceLogin. Add the FaceCaptureWidget (from Custom Widgets) to the page. Add a Text widget showing 'Look at the camera and blink to sign in'. Add an onFaceCaptured callback: when the widget fires this callback with the image path, an Action Flow begins: (1) Read image file → encode to base64 using a Custom Function. (2) API Call → verifyFaceLogin Cloud Function with the base64 image. (3) If the response contains a customToken, execute a Custom Action: await FirebaseAuth.instance.signInWithCustomToken(token). (4) Navigate to the home page. (5) If the response contains an error (no match), show a Snackbar: 'Face not recognized. Please try again or use your password.' Add a fallback 'Sign in with password instead' TextButton below the camera.
Expected result: The face login page is functional — users who registered their face can sign in with a blink, and unregistered users see a clear fallback to password login.
Complete working example
```
FACIAL RECOGNITION AUTH ARCHITECTURE

FLUTTERFLOW SIDE:
├── Custom Widget: FaceCaptureWidget
│   ├── Dependencies: camera, google_mlkit_face_detection
│   ├── Features: live preview, oval overlay, blink detection
│   └── Callback: onFaceCaptured(String imagePath)
│
├── Registration Flow (authenticated user):
│   ├── Page: ProfileSettings or Onboarding
│   ├── Action: FaceCaptureWidget → encode base64
│   ├── API Call: POST /registerFace
│   │   ├── Header: Authorization Bearer [firebase_id_token]
│   │   └── Body: { imageBase64 }
│   └── On success: show 'Face registered successfully'
│
└── Login Flow (unauthenticated):
    ├── Page: FaceLogin
    ├── Action: FaceCaptureWidget → encode base64
    ├── API Call: POST /verifyFaceLogin
    │   └── Body: { imageBase64 }
    ├── On match: Custom Action signInWithCustomToken(token)
    ├── Navigate to Home
    └── On no match: Snackbar + fallback to password

CLOUD FUNCTION SIDE:
├── registerFace (callable, requires auth)
│   ├── Verify Firebase ID token
│   ├── Call Rekognition.indexFaces(CollectionId: 'app-faces')
│   ├── Store faceId in Firestore users/{uid}
│   └── Return { faceId }
│
└── verifyFaceLogin (callable, no auth required)
    ├── Call Rekognition.searchFacesByImage
    │   └── Threshold: 90%, MaxFaces: 1
    ├── Get ExternalImageId → uid
    ├── admin.auth().createCustomToken(uid)
    └── Return { customToken }

LIVENESS DETECTION (prevents photo spoofing):
├── Track eye-open probability over time
├── Require: open (>0.8) → closed (<0.2) transition
└── Only accept capture after blink confirmed

AWS REKOGNITION COLLECTION:
├── Create once: aws rekognition create-collection
│       --collection-id app-faces
└── One collection per app environment (test/prod)
```

Common mistakes
Mistake: Not implementing liveness detection — accepting a static photo as a valid face
How to avoid: Require a live action before accepting the face capture. The simplest approach: track the eye-open probabilities (leftEyeOpenProbability and rightEyeOpenProbability) from google_mlkit_face_detection and require a blink (eyes open, then closed, then open) within a 3-second window. More robust: require the user to follow a moving dot or smile.
Mistake: Calling AWS Rekognition directly from the Flutter app with embedded AWS credentials
How to avoid: All Rekognition calls must happen in a Cloud Function. The Flutter app sends the face image to a Cloud Function URL, and the Cloud Function uses AWS credentials stored in environment variables. Never put AWS keys in pubspec, Dart code, or FlutterFlow secrets that get embedded in the app.
Mistake: Using a single Rekognition collection for both test and production environments
How to avoid: Create separate Rekognition collections: app-faces-dev and app-faces-prod. Use Cloud Function environment variables to select the correct collection based on the deployment environment.
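The environment split above amounts to one line of configuration in the Cloud Function. A minimal sketch, assuming an environment variable named REKOGNITION_COLLECTION (the variable name is illustrative — set it per deployment):

```javascript
// Choose the Rekognition collection per environment so test and production
// faces never share a collection. Falls back to the dev collection when the
// variable is unset, which is the safe default.
function collectionIdForEnv(env) {
  return env.REKOGNITION_COLLECTION || 'app-faces-dev';
}

// In the function bodies: CollectionId: collectionIdForEnv(process.env)
```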
Best practices
- Always implement liveness detection (blink, head turn, or smile) before accepting a face capture for authentication — static photos must not be accepted
- Store AWS Rekognition credentials only in Cloud Function environment variables — never in FlutterFlow, Flutter code, or any client-side location
- Use a similarity threshold of at least 90% in SearchFacesByImage to balance security and usability — lower thresholds increase false positives
- Provide a clear password or email fallback login path — not all users will want or be able to use facial recognition
- Clearly disclose facial data collection in your privacy policy and app onboarding — required by GDPR, CCPA, and BIPA in Illinois
- Implement a way for users to delete their registered face from the Rekognition collection (GDPR right to erasure) — call rekognition.deleteFaces() from a Cloud Function on user request
- Test face recognition across diverse lighting conditions, face angles, and with and without glasses before launching
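The deletion practice in the list above can be sketched as a helper that an erasure Cloud Function would call. deleteRegisteredFace is a hypothetical name; the configured rekognition and Firestore db clients are passed in as parameters to keep the sketch self-contained, and the field names match the registration step.

```javascript
// GDPR erasure sketch: remove the user's face from the Rekognition collection,
// then clear the registration fields on their Firestore document.
async function deleteRegisteredFace(uid, rekognition, db) {
  const userRef = db.collection('users').doc(uid);
  const snapshot = await userRef.get();
  const faceId = snapshot.data().faceId;
  if (!faceId) return { deleted: false }; // nothing registered

  // Delete the face embedding server-side first, then update Firestore.
  await rekognition.deleteFaces({
    CollectionId: 'app-faces',
    FaceIds: [faceId]
  }).promise();

  await userRef.update({ faceId: null, faceRegistrationStatus: 'deleted' });
  return { deleted: true };
}
```

Deleting from Rekognition before clearing Firestore means a failed AWS call leaves the faceId intact, so the user can retry rather than orphaning an embedding in the collection.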
Still stuck?
Copy one of these prompts to get a personalized, step-by-step explanation.
I am building a FlutterFlow app and want to add facial recognition login as an alternative to password authentication. I plan to use AWS Rekognition. Explain the architecture: what happens during face registration, what happens during face login, how do I issue a Firebase Auth token after a successful face match, and what liveness detection should I implement to prevent photo spoofing?
Create a Custom Widget for FlutterFlow that shows a live front camera preview with an oval face guide overlay. The widget should use google_mlkit_face_detection to: (1) detect whether exactly one face is visible, (2) check that the face is looking forward (|headEulerAngleY| < 15 degrees), (3) detect a blink by tracking changes in leftEyeOpenProbability and rightEyeOpenProbability. Only fire the onFaceCaptured callback after all three conditions are met.
Frequently asked questions
How do I add facial recognition to a FlutterFlow app?
Facial recognition in FlutterFlow requires three components: (1) a Custom Widget using the camera and google_mlkit_face_detection packages to capture and validate the face on-device, (2) a Cloud Function that calls AWS Rekognition or Azure Face API to match the face against registered users, and (3) a Firebase custom token issued on match to complete the authentication. All cloud API credentials stay server-side in Cloud Functions.
Can I use device Face ID (iPhone) instead of building custom face recognition?
Yes, and for most apps, device Face ID or fingerprint is the better choice. Add the local_auth Flutter package as a Custom Action in FlutterFlow: LocalAuthentication().authenticate() triggers iOS Face ID or Android biometric prompt. This is simpler, more secure (biometric data never leaves the device), and already trusted by users. Custom face recognition is for cases where you need cross-device face matching or cannot use device biometrics.
What is the difference between on-device face detection and cloud face recognition?
On-device face detection (google_mlkit_face_detection) detects that a face is present and estimates landmarks, but cannot match faces across different images or users. Cloud face recognition (AWS Rekognition, Azure Face) creates a unique mathematical embedding of a face that can be compared against a database of registered faces. Use on-device detection for liveness checks and face quality validation, then send to the cloud for actual identity matching.
Is AWS Rekognition or Azure Face better for FlutterFlow face authentication?
Both work well. AWS Rekognition is more widely used, has a generous free tier (5,000 images/month for 12 months), and has excellent documentation. Azure Face API has strong liveness detection built into the SDK and integrates well if your team already uses Azure. The Cloud Function patterns are identical — only the SDK calls differ. For a new project with no existing cloud preferences, start with AWS Rekognition.
How do I comply with privacy laws when collecting facial data?
Facial recognition data is biometric data regulated by GDPR (EU), CCPA (California), and state biometric laws like Illinois BIPA. Required steps: (1) display explicit consent before face registration explaining what data is collected and how it is used, (2) provide a way for users to delete their facial data (call rekognition.deleteFaces() from a Cloud Function), (3) add facial data processing to your privacy policy, (4) do not sell or share facial embeddings with third parties. Consult a lawyer for jurisdiction-specific requirements before shipping.
What if the face recognition fails for legitimate users?
False rejections happen due to changes in lighting, new glasses, aging, or the user tilting their head. Always provide an accessible fallback login method (email + password, magic link, or SMS OTP). Show clear UI guidance when face recognition fails: 'Make sure your face is well-lit and looking directly at the camera.' After 3 failed attempts, automatically offer the fallback method. If you need help building a robust multi-factor auth system with facial recognition as one factor, RapidDev has implemented this pattern across several security-sensitive FlutterFlow apps.