RapidDev - Software Development Agency

How to Integrate Facial Recognition for User Authentication in FlutterFlow

What you'll learn

  • How to capture a face image using a Custom Widget with camera preview in FlutterFlow
  • How to use google_mlkit_face_detection for on-device face validation before sending to the cloud
  • How to call AWS Rekognition IndexFaces and SearchFacesByImage via a Cloud Function
  • How to implement liveness detection to prevent photo spoofing attacks
Intermediate · 11 min read · 90-120 min build time · FlutterFlow Standard+ (Custom Code required) · March 2026 · RapidDev Engineering Team
TL;DR

Implement facial recognition authentication in FlutterFlow using a Custom Widget for camera preview with google_mlkit_face_detection for on-device detection, plus a Cloud Function that calls AWS Rekognition or Azure Face API for face matching. The registration flow captures the face, indexes it with the cloud API, and stores the faceId on the user's Firestore document. Login captures a live face, calls SearchFacesByImage via the Cloud Function, and issues a custom Firebase Auth token on match. Always add liveness detection to prevent photo spoofing.

Build app-level face authentication that works across all devices

Device-level biometrics (Face ID on iOS, face unlock on Android) are OS features managed by the platform — FlutterFlow can trigger them with the local_auth package but they do not provide a face embedding that your app controls. App-level facial recognition means your app captures the face image, extracts a facial embedding using a machine learning model, and verifies it against a stored reference. This is useful for kiosks, high-security apps, or multi-user devices where OS biometrics are not the right tool. This tutorial covers the full flow: camera capture, on-device face detection, cloud API verification via a Cloud Function, and issuing a Firebase custom token on successful match.

Prerequisites

  • A FlutterFlow project with Firebase connected and Firebase Authentication enabled
  • An AWS account with Rekognition enabled (or an Azure account with Face API enabled)
  • FlutterFlow Standard or higher plan for Custom Code (Custom Widget, Custom Action)

Step-by-step guide

1

Add the camera and face detection Custom Widget

In FlutterFlow, go to Custom Code → Custom Widgets → Add Widget and name the widget FaceCaptureWidget. In the pubspec dependencies, add camera: ^0.10.0 and google_mlkit_face_detection: ^0.9.0. The widget initializes the front camera, displays a live preview, and draws an oval face-guide overlay. When the user taps Capture, it takes a picture, runs the ML Kit face detector on the image, and checks that: (1) exactly one face is detected, (2) the face is forward-facing (headEulerAngleY within about ±15 degrees), and (3) the face spans at least 20% of the image width (minFaceSize: 0.2), meaning the user is close enough. If all checks pass, the widget calls its onFaceCaptured callback with the image file path. For liveness detection, require the user to blink: with enableClassification enabled, ML Kit reports leftEyeOpenProbability and rightEyeOpenProbability, so require the eye-open probability to drop from above 0.8 to below 0.2 and recover within a 3-second window before accepting the capture.

face_capture_widget.dart
// FaceCaptureWidget — simplified structure
import 'package:flutter/material.dart';
import 'package:camera/camera.dart';
import 'package:google_mlkit_face_detection/google_mlkit_face_detection.dart';

class FaceCaptureWidget extends StatefulWidget {
  final Function(String imagePath) onFaceCaptured;
  const FaceCaptureWidget({required this.onFaceCaptured, Key? key})
      : super(key: key);

  @override
  State<FaceCaptureWidget> createState() => _FaceCaptureWidgetState();
}

class _FaceCaptureWidgetState extends State<FaceCaptureWidget> {
  CameraController? _controller;
  final FaceDetector _faceDetector = FaceDetector(
    options: FaceDetectorOptions(
      enableClassification: true, // enables left/rightEyeOpenProbability
      minFaceSize: 0.2, // face must span at least 20% of the image
    ),
  );
  // Set to true once the live preview has observed an
  // open -> closed -> open eye transition within a 3-second window.
  bool _blinkDetected = false;

  @override
  void initState() {
    super.initState();
    _initCamera();
  }

  Future<void> _initCamera() async {
    final cameras = await availableCameras();
    final front = cameras.firstWhere(
      (c) => c.lensDirection == CameraLensDirection.front,
    );
    _controller = CameraController(front, ResolutionPreset.medium);
    await _controller!.initialize();
    if (mounted) setState(() {});
  }

  Future<void> _captureAndVerify() async {
    final image = await _controller!.takePicture();
    final inputImage = InputImage.fromFilePath(image.path);
    final faces = await _faceDetector.processImage(inputImage);

    if (faces.length != 1) return; // must be exactly one face
    final face = faces.first;
    if (!_blinkDetected) return; // liveness check
    if ((face.headEulerAngleY ?? 0).abs() > 15) return; // face forward

    widget.onFaceCaptured(image.path);
  }

  @override
  void dispose() {
    _controller?.dispose();
    _faceDetector.close();
    super.dispose();
  }

  @override
  Widget build(BuildContext context) {
    if (_controller == null || !_controller!.value.isInitialized) {
      return const Center(child: CircularProgressIndicator());
    }
    // The full widget stacks an oval face-guide overlay and a Capture
    // button (calling _captureAndVerify) on top of this preview.
    return CameraPreview(_controller!);
  }
}

Expected result: A live camera preview appears with an oval overlay. The widget only calls onFaceCaptured after detecting one forward-facing face and a blink event.

2

Deploy the face registration Cloud Function

Deploy a Firebase Cloud Function named registerFace. It receives the base64-encoded face image from the authenticated user; as a callable function, Firebase verifies the ID token automatically and exposes the UID as context.auth.uid (for a plain HTTPS endpoint you would verify it yourself with admin.auth().verifyIdToken). It calls AWS Rekognition IndexFaces on a collection named app-faces with the image bytes and ExternalImageId set to the user's UID. Rekognition returns a FaceId (a UUID representing this face in the collection). Store this FaceId as the faceId field on the user's users/{uid} document, along with faceRegisteredAt: serverTimestamp() and faceRegistrationStatus: 'active'. Return {success: true, faceId} to FlutterFlow. In FlutterFlow, add this as an API Call in the API Manager, triggered from a Register Face button in the user's profile or onboarding flow. Send the base64 image from the Custom Widget callback and the Firebase ID token in the Authorization header.

register_face.js
// registerFace Cloud Function
const functions = require('firebase-functions');
const admin = require('firebase-admin');
const AWS = require('aws-sdk');

admin.initializeApp(); // once per deployment

const rekognition = new AWS.Rekognition({
  region: process.env.AWS_REGION,
  accessKeyId: process.env.AWS_ACCESS_KEY_ID,
  secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY
});

exports.registerFace = functions.https.onCall(async (data, context) => {
  if (!context.auth) throw new functions.https.HttpsError('unauthenticated', 'Login required');

  const { imageBase64 } = data;
  const uid = context.auth.uid;
  const imageBytes = Buffer.from(imageBase64, 'base64');

  const result = await rekognition.indexFaces({
    CollectionId: 'app-faces',
    Image: { Bytes: imageBytes },
    ExternalImageId: uid,
    DetectionAttributes: ['DEFAULT'],
    MaxFaces: 1,
    QualityFilter: 'HIGH'
  }).promise();

  const faceId = result.FaceRecords[0]?.Face?.FaceId;
  if (!faceId) throw new functions.https.HttpsError('failed-precondition', 'No face detected');

  await admin.firestore().collection('users').doc(uid).update({
    faceId,
    faceRegisteredAt: admin.firestore.FieldValue.serverTimestamp(),
    faceRegistrationStatus: 'active'
  });

  return { success: true, faceId };
});

Expected result: After tapping Register Face and completing the liveness check, the user's Firestore document is updated with a faceId from AWS Rekognition.

3

Deploy the face login Cloud Function

Deploy a Cloud Function named verifyFaceLogin. This function is called BEFORE the user is authenticated (it's a pre-auth endpoint). It receives the base64 face image. It calls AWS Rekognition SearchFacesByImage on the app-faces collection with the image, requesting the top 1 match with similarity threshold 90. If a match is found, it retrieves the ExternalImageId (which is the user's UID stored in step 2). It then uses Firebase Admin SDK to generate a custom auth token for that UID with admin.auth().createCustomToken(uid). It returns the custom token to FlutterFlow. In FlutterFlow, create a login page with the FaceCaptureWidget. On capture, call this Cloud Function, receive the customToken, and use a Custom Action: FirebaseAuth.instance.signInWithCustomToken(customToken) to complete the login. This does NOT use a password — the face is the credential.
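
The function described above can be sketched following the same aws-sdk v2 callable pattern as register_face.js. This is a minimal sketch, not a hardened implementation: add rate limiting and abuse protection before shipping a pre-auth endpoint, and note that admin.initializeApp() is assumed to have already run once elsewhere in the deployment.

verify_face_login.js

```javascript
// verifyFaceLogin Cloud Function: pre-auth endpoint, the face is the credential.
// admin.initializeApp() is assumed to have run once (e.g. in register_face.js).
const functions = require('firebase-functions');
const admin = require('firebase-admin');
const AWS = require('aws-sdk');

const rekognition = new AWS.Rekognition({
  region: process.env.AWS_REGION,
  accessKeyId: process.env.AWS_ACCESS_KEY_ID,
  secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY
});

exports.verifyFaceLogin = functions.https.onCall(async (data, context) => {
  // Deliberately no context.auth check: the caller is not logged in yet.
  const { imageBase64 } = data;
  const imageBytes = Buffer.from(imageBase64, 'base64');

  const result = await rekognition.searchFacesByImage({
    CollectionId: 'app-faces',
    Image: { Bytes: imageBytes },
    FaceMatchThreshold: 90, // minimum similarity percentage
    MaxFaces: 1
  }).promise();

  const match = (result.FaceMatches || [])[0];
  if (!match) {
    throw new functions.https.HttpsError('not-found', 'Face not recognized');
  }

  // ExternalImageId was set to the user's Firebase UID in registerFace (step 2).
  const uid = match.Face.ExternalImageId;
  const customToken = await admin.auth().createCustomToken(uid);
  return { customToken };
});
```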

Expected result: The user opens the app, faces the camera, blinks, and is logged in automatically using their registered face — no password required.

4

Add the face login page and Action Flow in FlutterFlow

Create a new page named FaceLogin. Add the FaceCaptureWidget (from Custom Widgets) to the page. Add a Text widget showing 'Look at the camera and blink to sign in'. Add an onFaceCaptured callback: when the widget fires this callback with the image path, an Action Flow begins: (1) Read image file → encode to base64 using a Custom Function. (2) API Call → verifyFaceLogin Cloud Function with the base64 image. (3) If the response contains a customToken, execute a Custom Action: await FirebaseAuth.instance.signInWithCustomToken(token). (4) Navigate to the home page. (5) If the response contains an error (no match), show a Snackbar: 'Face not recognized. Please try again or use your password.' Add a fallback 'Sign in with password instead' TextButton below the camera.

Expected result: The face login page is functional — users who registered their face can sign in with a blink, and unregistered users see a clear fallback to password login.

Complete working example

face_auth_architecture.txt
FACIAL RECOGNITION AUTH ARCHITECTURE

FLUTTERFLOW SIDE:
  Custom Widget: FaceCaptureWidget
    Dependencies: camera, google_mlkit_face_detection
    Features: live preview, oval overlay, blink detection
    Callback: onFaceCaptured(String imagePath)

  Registration Flow (authenticated user):
    Page: ProfileSettings or Onboarding
    Action: FaceCaptureWidget -> encode base64
    API Call: POST /registerFace
      Header: Authorization: Bearer [firebase_id_token]
      Body: { imageBase64 }
    On success: show 'Face registered successfully'

  Login Flow (unauthenticated):
    Page: FaceLogin
    Action: FaceCaptureWidget -> encode base64
    API Call: POST /verifyFaceLogin
      Body: { imageBase64 }
    On match: Custom Action signInWithCustomToken(token)
      -> Navigate to Home
    On no match: Snackbar + fallback to password

CLOUD FUNCTION SIDE:
  registerFace (callable, requires auth)
    Verify Firebase ID token
    Call Rekognition.indexFaces(CollectionId: 'app-faces')
    Store faceId in Firestore users/{uid}
    Return { faceId }

  verifyFaceLogin (callable, no auth required)
    Call Rekognition.searchFacesByImage
      Threshold: 90%, MaxFaces: 1
    Get ExternalImageId -> uid
    admin.auth().createCustomToken(uid)
    Return { customToken }

LIVENESS DETECTION (prevents photo spoofing):
  Track eye-open probability over time
  Require: open (>0.8) -> closed (<0.2) -> open transition
  Only accept capture after blink confirmed

AWS REKOGNITION COLLECTION:
  Create once: aws rekognition create-collection --collection-id app-faces
  One collection per app environment (test/prod)

Common mistakes

Mistake: Not implementing liveness detection — accepting a static photo as a valid face

How to avoid: Require a live action before accepting the face capture. The simplest approach: track leftEyeOpenProbability and rightEyeOpenProbability from google_mlkit_face_detection and require a blink (eyes open, then closed, then open) within a 3-second window. More robust: require the user to follow a moving dot or smile.
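
The open -> closed -> open rule above is a small state machine over eye-open probabilities. Here is a self-contained sketch in JavaScript (the same logic ports directly to the Dart widget); BlinkDetector is an illustrative name, and the 0.8 / 0.2 thresholds and 3-second window are the values used in this tutorial.

```javascript
// Minimal blink-liveness state machine: requires an
// open -> closed -> open eye transition within a time window.
class BlinkDetector {
  constructor({ openThreshold = 0.8, closedThreshold = 0.2, windowMs = 3000 } = {}) {
    this.openThreshold = openThreshold;
    this.closedThreshold = closedThreshold;
    this.windowMs = windowMs;
    this.state = 'waiting_open'; // -> waiting_closed -> waiting_reopen -> blink
    this.startedAt = null;       // timestamp of the first "eyes open" frame
  }

  // Feed one frame: eyeOpenProbability in [0,1], timestamp in ms.
  // Returns true once a full blink has been observed within the window.
  addFrame(eyeOpenProbability, timestampMs) {
    // Reset if the window expired before the blink completed.
    if (this.startedAt !== null && timestampMs - this.startedAt > this.windowMs) {
      this.state = 'waiting_open';
      this.startedAt = null;
    }
    if (this.state === 'waiting_open' && eyeOpenProbability > this.openThreshold) {
      this.state = 'waiting_closed';
      this.startedAt = timestampMs;
    } else if (this.state === 'waiting_closed' && eyeOpenProbability < this.closedThreshold) {
      this.state = 'waiting_reopen';
    } else if (this.state === 'waiting_reopen' && eyeOpenProbability > this.openThreshold) {
      this.state = 'blink';
    }
    return this.state === 'blink';
  }
}

// Example: open at t=0, closed at t=500, open again at t=900 -> blink detected.
const d = new BlinkDetector();
d.addFrame(0.95, 0);   // eyes open
d.addFrame(0.05, 500); // eyes closed
console.log(d.addFrame(0.95, 900)); // true
```

Feed it one eye-open probability per preview frame (for example, the minimum of leftEyeOpenProbability and rightEyeOpenProbability); once it returns true, set _blinkDetected and allow the capture.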

Mistake: Calling AWS Rekognition directly from the Flutter app with embedded AWS credentials

How to avoid: All Rekognition calls must happen in a Cloud Function. The Flutter app sends the face image to a Cloud Function URL, and the Cloud Function uses AWS credentials stored in environment variables. Never put AWS keys in pubspec, Dart code, or FlutterFlow secrets that get embedded in the app.

Mistake: Using a single Rekognition collection for both test and production environments

How to avoid: Create separate Rekognition collections: app-faces-dev and app-faces-prod. Use Cloud Function environment variables to select the correct collection based on the deployment environment.
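
One way to wire the environment switch is a one-line helper, sketched below; COLLECTION_ID is an environment variable name chosen for this example, not something Firebase or Rekognition defines.

```javascript
// Pick the Rekognition collection from the function's environment so the
// same code serves dev and prod deployments. Defaults to dev, never prod.
function collectionIdFromEnv(env) {
  return env.COLLECTION_ID || 'app-faces-dev';
}

// In the Cloud Function:
//   const collectionId = collectionIdFromEnv(process.env);
//   rekognition.indexFaces({ CollectionId: collectionId, ... })
console.log(collectionIdFromEnv({ COLLECTION_ID: 'app-faces-prod' })); // app-faces-prod
console.log(collectionIdFromEnv({}));                                  // app-faces-dev
```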

Best practices

  • Always implement liveness detection (blink, head turn, or smile) before accepting a face capture for authentication — static photos must not be accepted
  • Store AWS Rekognition credentials only in Cloud Function environment variables — never in FlutterFlow, Flutter code, or any client-side location
  • Use a similarity threshold of at least 90% in SearchFacesByImage to balance security and usability — lower thresholds increase false positives
  • Provide a clear password or email fallback login path — not all users will want or be able to use facial recognition
  • Clearly disclose facial data collection in your privacy policy and app onboarding — required by GDPR, CCPA, and BIPA in Illinois
  • Implement a way for users to delete their registered face from the Rekognition collection (GDPR right to erasure) — call rekognition.deleteFaces() from a Cloud Function on user request
  • Test face recognition across diverse lighting conditions, face angles, and with and without glasses before launching
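
The erasure requirement in the bullets above can be implemented as one more callable Cloud Function. A sketch under the same assumptions as the other functions in this tutorial: deleteUserFace is a name chosen here, the Firestore fields mirror the schema from step 2, and admin.initializeApp() is assumed to have run once in the deployment.

```javascript
// deleteUserFace Cloud Function: right-to-erasure for facial data (GDPR/BIPA).
const functions = require('firebase-functions');
const admin = require('firebase-admin');
const AWS = require('aws-sdk');

const rekognition = new AWS.Rekognition({ region: process.env.AWS_REGION });

exports.deleteUserFace = functions.https.onCall(async (data, context) => {
  if (!context.auth) {
    throw new functions.https.HttpsError('unauthenticated', 'Login required');
  }
  const uid = context.auth.uid;

  // Look up the faceId stored during registration (step 2).
  const userDoc = await admin.firestore().collection('users').doc(uid).get();
  const faceId = userDoc.get('faceId');
  if (!faceId) return { success: true }; // nothing registered

  // Remove the face embedding from the Rekognition collection.
  await rekognition.deleteFaces({
    CollectionId: 'app-faces',
    FaceIds: [faceId]
  }).promise();

  // Clear the Firestore fields so face login is disabled for this user.
  await admin.firestore().collection('users').doc(uid).update({
    faceId: admin.firestore.FieldValue.delete(),
    faceRegistrationStatus: 'deleted'
  });

  return { success: true };
});
```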

Still stuck?

Copy one of these prompts to get a personalized, step-by-step explanation.

ChatGPT Prompt

I am building a FlutterFlow app and want to add facial recognition login as an alternative to password authentication. I plan to use AWS Rekognition. Explain the architecture: what happens during face registration, what happens during face login, how do I issue a Firebase Auth token after a successful face match, and what liveness detection should I implement to prevent photo spoofing?

FlutterFlow Prompt

Create a Custom Widget for FlutterFlow that shows a live front camera preview with an oval face guide overlay. The widget should use google_mlkit_face_detection to: (1) detect if exactly one face is visible, (2) check that the face is looking forward (headEulerAngleY within ±15 degrees), (3) detect a blink by tracking the leftEyeOpenProbability and rightEyeOpenProbability values. Only fire the onFaceCaptured callback after all three conditions are met.

Frequently asked questions

How do I add facial recognition to a FlutterFlow app?

Facial recognition in FlutterFlow requires three components: (1) a Custom Widget using the camera and google_mlkit_face_detection packages to capture and validate the face on-device, (2) a Cloud Function that calls AWS Rekognition or Azure Face API to match the face against registered users, and (3) a Firebase custom token issued on match to complete the authentication. All cloud API credentials stay server-side in Cloud Functions.

Can I use device Face ID (iPhone) instead of building custom face recognition?

Yes, and for most apps, device Face ID or fingerprint is the better choice. Add the local_auth Flutter package as a Custom Action in FlutterFlow: LocalAuthentication().authenticate() triggers iOS Face ID or Android biometric prompt. This is simpler, more secure (biometric data never leaves the device), and already trusted by users. Custom face recognition is for cases where you need cross-device face matching or cannot use device biometrics.

What is the difference between on-device face detection and cloud face recognition?

On-device face detection (google_mlkit_face_detection) detects that a face is present and estimates landmarks, but cannot match faces across different images or users. Cloud face recognition (AWS Rekognition, Azure Face) creates a unique mathematical embedding of a face that can be compared against a database of registered faces. Use on-device detection for liveness checks and face quality validation, then send to the cloud for actual identity matching.

Is AWS Rekognition or Azure Face better for FlutterFlow face authentication?

Both work well. AWS Rekognition is more widely used, has a generous free tier (5,000 images/month for 12 months), and has excellent documentation. Azure Face API has strong liveness detection built into the SDK and integrates well if your team already uses Azure. The Cloud Function patterns are identical — only the SDK calls differ. For a new project with no existing cloud preferences, start with AWS Rekognition.

How do I comply with privacy laws when collecting facial data?

Facial recognition data is biometric data regulated by GDPR (EU), CCPA (California), and state biometric laws like Illinois BIPA. Required steps: (1) display explicit consent before face registration explaining what data is collected and how it is used, (2) provide a way for users to delete their facial data (call rekognition.deleteFaces() from a Cloud Function), (3) add facial data processing to your privacy policy, (4) do not sell or share facial embeddings with third parties. Consult a lawyer for jurisdiction-specific requirements before shipping.

What if the face recognition fails for legitimate users?

False rejections happen due to changes in lighting, new glasses, aging, or the user tilting their head. Always provide an accessible fallback login method (email + password, magic link, or SMS OTP). Show clear UI guidance when face recognition fails: 'Make sure your face is well-lit and looking directly at the camera.' After 3 failed attempts, automatically offer the fallback method. If you need help building a robust multi-factor auth system with facial recognition as one factor, RapidDev has implemented this pattern across several security-sensitive FlutterFlow apps.
