Integrating Siri and Google Assistant with FlutterFlow involves two different approaches. Google Assistant works via Actions on Google, which calls a Cloud Function webhook that reads from and writes to Firestore — no code export needed. Siri Shortcuts via SiriKit requires adding an Intents Extension in Xcode after exporting your FlutterFlow project, which means it is only practical on the Pro plan with code export. For most apps, in-app voice input using speech_to_text is a simpler and more achievable goal.
Google Assistant via Cloud Function webhooks, Siri Shortcuts via code export
Voice assistant integration in FlutterFlow is more limited than in native iOS/Android development, and the exact approach differs significantly depending on which platform you are targeting. Google Assistant integration does not require code export — you build a conversational Action in the Actions on Google console, create a Cloud Function as the fulfillment webhook, and the Cloud Function reads and writes Firestore data that FlutterFlow displays. Siri integration via SiriKit (which enables commands like 'Hey Siri, order coffee from MyApp') requires adding a native Intents Extension to the Xcode project, which is only possible after code export. This tutorial covers both paths and the trade-offs of each.
Prerequisites
- A FlutterFlow project with Firebase connected
- A Google account with access to the Actions on Google console at console.actions.google.com
- For Siri integration: FlutterFlow Pro plan with code export, Xcode on macOS, and an Apple Developer account
- Basic understanding of Cloud Functions for the Google Assistant webhook approach
Step-by-step guide
Understand the two integration architectures before choosing an approach
There are three levels of voice assistant integration available for FlutterFlow apps. Level 1 (easiest — works in FlutterFlow visual builder): In-app voice input using speech_to_text Custom Action. The user taps a mic button inside your app, speaks, and the text appears in a TextField. This is NOT Siri or Google Assistant — it is your app's own voice UI. Level 2 (intermediate — no code export needed): Google Assistant Actions. Users say 'Hey Google, talk to MyApp' and a conversational flow runs through your Cloud Function. Level 3 (advanced — requires code export and Xcode): Siri Shortcuts with SiriKit. Users say 'Hey Siri, [your shortcut phrase]' and the app opens to a specific state or page. Choose based on your use case and plan.
Expected result: Clear understanding of which approach is appropriate for your app's requirements and your FlutterFlow plan.
Add in-app voice input with speech_to_text (Level 1 — no code export)
In FlutterFlow, go to Custom Code → Pubspec Dependencies and add speech_to_text (version ^6.6.0). Create a Custom Action named startVoiceInput. In the action, initialize SpeechToText and request microphone permission with stt.initialize(), then start listening with stt.listen(); in the result callback, set a Page State variable searchText to the recognized words. In the UI, add an IconButton with a mic icon and, on tap, call the startVoiceInput Custom Action. Add a Text widget bound to the searchText Page State variable to display the recognized text in real time. On a second tap (stop listening), call another Custom Action that calls stt.stop() and then triggers whatever action you want (search, submit form, etc.). Add microphone permissions in Settings → Permissions: NSMicrophoneUsageDescription for iOS and android.permission.RECORD_AUDIO for Android.
```dart
import 'package:speech_to_text/speech_to_text.dart';

final SpeechToText _stt = SpeechToText();

Future<String> startVoiceInput() async {
  // initialize() triggers the microphone permission prompt on first run
  bool available = await _stt.initialize(
    onError: (error) => print('Voice error: $error'),
  );
  if (!available) return '';

  String result = '';
  await _stt.listen(
    onResult: (r) => result = r.recognizedWords,
    listenFor: Duration(seconds: 10),
    pauseFor: Duration(seconds: 3),
    localeId: 'en_US',
  );
  // Wait out the listen window, then stop and return the final transcript
  await Future.delayed(Duration(seconds: 10));
  await _stt.stop();
  return result;
}
```

Expected result: Tapping the mic button starts voice recognition. Spoken words appear as text in the bound TextField or Text widget within 1-2 seconds.
Create a Google Assistant Action with Actions on Google (Level 2)
Go to console.actions.google.com. Click New Project. Give it a name matching your FlutterFlow app. Select the type: Custom. In the Develop tab, go to Actions → Add your first action. Choose Custom intent. In the Invocation Name section, set the phrase users say to start your action (e.g., 'talk to [app name]'). In the Intents section, define what users can say after invocation — for example, 'check my balance', 'place an order', 'show my appointments'. For each intent, go to the Fulfillment tab and enable Webhook fulfillment. Enter your Cloud Function URL as the fulfillment endpoint. Your Cloud Function receives a JSON request from Google Assistant, reads from Firestore based on the user's request, and returns a response JSON with a simple or rich text response that Google Assistant speaks back to the user.
Expected result: Google Assistant responds when triggered with your invocation phrase and routes intents to your Cloud Function for fulfillment.
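The JSON request your Cloud Function receives can be unpacked with a small parser. This is a minimal sketch assuming the Actions Builder conversation webhook payload shape (handler, intent, and user objects); the sample payload and the check_balance handler name are illustrative:

```javascript
// Extract the pieces of an Actions Builder webhook request that
// fulfillment logic usually needs: the fired handler, slot parameters,
// and the user's locale.
function parseAssistantRequest(body) {
  return {
    // Name of the handler configured for the matched intent
    handler: body.handler?.name ?? 'unknown',
    // Slot/parameter values extracted from the user's utterance
    params: body.intent?.params ?? {},
    // BCP-47 locale of the user's device, e.g. 'en-US'
    locale: body.user?.locale ?? 'en-US',
  };
}

// Example payload shaped like what Google Assistant would POST
const sample = {
  handler: { name: 'check_balance' },
  intent: { params: { account: { resolved: 'savings' } } },
  user: { locale: 'en-US' },
};
const parsed = parseAssistantRequest(sample);
console.log(parsed.handler); // check_balance
```

The fallbacks make the function safe to call on malformed or partial payloads during testing in the Actions console simulator.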
Build the Google Assistant Cloud Function webhook
Create a Firebase Cloud Function named googleAssistantFulfillment as an HTTP trigger. Google Assistant sends a POST request with the intent name and any extracted parameters. Parse the request to identify which intent fired (req.body.handler.name). For each intent, perform the relevant Firestore read or write operation. Return a JSON response whose prompt carries the speech Assistant reads back: { prompt: { firstSimple: { speech: 'Your order total is $24.99' } } }. For navigating to app pages after the assistant interaction, use Firebase Dynamic Links: include a deep link in the response as a card with a URL button. When the user taps the card on their phone, the Dynamic Link opens your FlutterFlow app to the specified page. Store the Cloud Function URL in the Actions on Google fulfillment webhook configuration.
Expected result: Saying 'Hey Google, talk to [app name]' triggers your Cloud Function. Google Assistant speaks the response returned by the function.
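The response side can be sketched as a small dispatcher that maps a handler name to spoken text. The conversation webhook response format puts the spoken text under prompt.firstSimple.speech; the handler names and the data argument (standing in for a Firestore lookup result) are illustrative:

```javascript
// Build the conversation webhook response for a given handler.
// Assistant speaks prompt.firstSimple.speech aloud; `data` stands in
// for whatever Firestore document your function fetched.
function buildAssistantResponse(handlerName, data) {
  let speech;
  switch (handlerName) {
    case 'check_balance':
      speech = `Your balance is $${data.balance.toFixed(2)}`;
      break;
    case 'next_appointment':
      speech = `Your next appointment is ${data.appointment}`;
      break;
    default:
      speech = "Sorry, I can't help with that yet.";
  }
  return { prompt: { firstSimple: { speech } } };
}

const res = buildAssistantResponse('check_balance', { balance: 24.99 });
console.log(res.prompt.firstSimple.speech); // Your balance is $24.99
```

Keeping response construction in a pure function like this makes the webhook easy to unit test without involving the Assistant simulator.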
Handle deep links from voice assistant responses back into FlutterFlow
When a Google Assistant response includes a link (e.g., 'Here is your appointment — tap to open the app'), clicking it should navigate to the right page in FlutterFlow. Create Firebase Dynamic Links for each page you want reachable from Assistant: the Dynamic Link contains the target page path and content ID as query parameters. In your Google Assistant Cloud Function, include the Dynamic Link as a card button in the response object. In FlutterFlow, add the firebase_dynamic_links package to Pubspec Dependencies and create an App Start Custom Action that calls FirebaseDynamicLinks.instance.getInitialLink() and navigates to the page specified in the link parameters. Use the onLink stream listener for foreground deep link handling.
Expected result: Tapping the link card shown by Google Assistant opens the FlutterFlow app directly to the relevant page with the correct content loaded.
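The step above says to include the Dynamic Link as a card button in the Cloud Function's response object. A minimal sketch of that response, where the example.page.link domain and the targetPage/id query parameters are placeholders for your own Dynamic Link domain and routing scheme:

```javascript
// Build a conversation webhook response that pairs spoken text with a
// card whose button opens a Firebase Dynamic Link. The link domain and
// query parameters below are illustrative placeholders.
function responseWithDeepLink(speech, page, contentId) {
  const link =
    `https://example.page.link/open?targetPage=${encodeURIComponent(page)}` +
    `&id=${encodeURIComponent(contentId)}`;
  return {
    prompt: {
      firstSimple: { speech },
      content: {
        card: {
          title: 'Open in app',
          text: speech,
          button: { name: 'View details', open: { url: link } },
        },
      },
    },
  };
}

const r = responseWithDeepLink('Here is your appointment', 'AppointmentPage', 'apt_42');
console.log(r.prompt.content.card.button.open.url);
```

The query parameters in the link are what your App Start Custom Action reads from getInitialLink() to decide which page to navigate to.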
Add Siri Shortcut support via code export and Xcode (Level 3 — Pro plan)
Siri Shortcuts via SiriKit require native iOS code that cannot be added in FlutterFlow's visual builder. In FlutterFlow, go to Settings → Export Code and download the full project as a ZIP. Open the project in Xcode on macOS. In Xcode, select File → New → Target → Intents Extension. Configure the extension with a custom intent definition file (.intentdefinition): define the intent title, parameters, and response template. Implement the IntentHandler class to execute the action (read shared data from an App Group, or call the Firebase REST API; the Flutter Firebase plugins are not available inside a native extension). List each supported intent under IntentsSupported in the extension's Info.plist. In the main app target, add code to donate the shortcut (INInteraction(intent:response:).donate) when the user performs the relevant action, making it available in Siri Suggestions. Rebuild and deploy via Xcode.
Expected result: Saying 'Hey Siri, [shortcut phrase]' triggers the Intents Extension and executes the defined action without opening the app UI.
Complete working example
```swift
// Siri Shortcuts Integration (after code export)
// In the iOS Runner project, add a SiriKit Intents Extension:
// 1. Xcode → File → New → Target → Intents Extension
// 2. Define intents in the Intents.intentdefinition file

// IntentHandler.swift
import Intents

class IntentHandler: INExtension {
  override func handler(for intent: INIntent) -> Any {
    if intent is CheckBalanceIntent {
      return CheckBalanceIntentHandler()
    }
    return self
  }
}

class CheckBalanceIntentHandler: NSObject, CheckBalanceIntentHandling {
  func handle(intent: CheckBalanceIntent,
              completion: @escaping (CheckBalanceIntentResponse) -> Void) {
    // Read from shared UserDefaults (App Group) or Keychain
    let balance = UserDefaults(suiteName: "group.com.yourapp")?.double(forKey: "balance") ?? 0
    let response = CheckBalanceIntentResponse(code: .success, userActivity: nil)
    response.balance = NSDecimalNumber(value: balance)
    completion(response)
  }
}
```

```javascript
// Google Assistant - Actions on Google webhook (Cloud Function)
// Assumes firebase-admin is initialized elsewhere as `admin`
const { conversation } = require('@assistant/conversation');
const app = conversation();
app.handle('check_balance', async (conv) => {
  const uid = conv.user.params.uid;
  const userDoc = await admin.firestore().doc(`users/${uid}`).get();
  conv.add(`Your balance is $${userDoc.data().balance}`);
});
```

FlutterFlow deep link handling:
- URL scheme: yourapp://balance → opens BalancePage
- Configure in Settings → Advanced → URL Scheme
- On Page Load: read route parameters for context

Common mistakes
Mistake: Trying to implement Siri Shortcuts (SiriKit Intents Extension) without code export from FlutterFlow. Why it's a problem: the Intents Extension is a native iOS target that cannot be created in FlutterFlow's visual builder, so without code export the shortcut can never fire.
How to avoid: For Siri integration, you must export your FlutterFlow project to Xcode (Settings → Export Code, Pro plan required). After export, add the Intent Extension target in Xcode. Be aware that this is a one-way export — you cannot sync Xcode changes back to FlutterFlow.
Mistake: Expecting the Google Assistant webhook to authenticate users automatically. Why it's a problem: without account linking, the webhook has no verified identity for the speaker, so it cannot safely return account-specific or sensitive data.
How to avoid: Use Google Sign-In account linking in your Google Action to link the user's Google account with their app account. This gives your webhook a verified Google user ID. Alternatively, design the voice interaction to not return sensitive personal data, or require the user to open the app to see sensitive information.
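One way to enforce this in the webhook is a guard that serves sensitive intents only when the account is linked. This sketch assumes the conversation webhook's user object reports accountLinkingStatus; the handler list is illustrative:

```javascript
// Handlers that return account-specific data and therefore require
// a linked Google account before the webhook will serve them.
const SENSITIVE_HANDLERS = ['check_balance', 'place_order'];

// Return true when the request may be fulfilled; when false, the
// caller should prompt the user to link their account or open the app.
function canServe(handlerName, user) {
  if (!SENSITIVE_HANDLERS.includes(handlerName)) return true;
  return user.accountLinkingStatus === 'LINKED';
}

console.log(canServe('check_balance', { accountLinkingStatus: 'LINKED' }));     // true
console.log(canServe('check_balance', { accountLinkingStatus: 'NOT_LINKED' })); // false
console.log(canServe('help', {}));                                              // true
```

Centralizing the check keeps individual intent handlers from accidentally leaking data when account linking is skipped.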
Mistake: Conflating in-app speech-to-text (the speech_to_text package) with Siri or Google Assistant integration. Why it's a problem: speech_to_text only works while your app is open, whereas Siri and Google Assistant respond at the device level, so building one gives you nothing toward the other.
How to avoid: Use speech_to_text for in-app voice commands (user opens your app and taps a mic button). Use SiriKit/Actions on Google for device-level voice assistant integration that works outside the app. These are separate features requiring different implementations.
Best practices
- Start with in-app speech-to-text using speech_to_text before attempting platform voice assistant integration — 90% of voice input use cases are satisfied by in-app voice without the complexity of Siri or Google Assistant
- Design your Google Assistant intents to be conversational with clear, short responses — voice responses longer than two sentences are difficult to follow
- Test Google Assistant Actions in the simulator at console.actions.google.com before deploying, as testing on a real Google Assistant device can be slow for iteration
- Always include a deep link card in Google Assistant responses that opens your FlutterFlow app for actions that require more detail than can be conveyed via voice
- For Siri Shortcuts, donate shortcuts programmatically when users complete key actions in your app — this makes the shortcut appear in Siri Suggestions without the user needing to manually create it
- Add microphone permission request explanation text that is specific and non-scary — iOS users are more likely to grant permission when the explanation makes the benefit clear
Still stuck?
Copy one of these prompts to get a personalized, step-by-step explanation.
I am building a FlutterFlow app and want to add voice capabilities. Explain the difference between: (1) In-app voice input using speech_to_text, (2) Google Assistant Actions with a fulfillment webhook, and (3) Siri SiriKit Intents Extension. For each, tell me what code is required, whether it works without code export from FlutterFlow, and which is best for my use case of letting users check their account balance by voice.
Add a microphone icon button to my search page. When tapped, it should start voice recognition using speech_to_text, display the recognized text in the search TextField in real time, and automatically trigger the search when the user stops speaking. Request microphone permission if not already granted.
Frequently asked questions
Can I add Siri Shortcuts to my FlutterFlow app without code export?
No. Siri Shortcuts that respond to Hey Siri commands require a native Intents Extension, a separate iOS target written in Swift that cannot be configured in FlutterFlow's visual builder. You must export the project code (FlutterFlow Pro plan) and add the extension in Xcode. There is a simpler alternative: have users create a shortcut in the Shortcuts app that opens your app via a custom URL scheme (e.g., yourapp://balance) when they say the shortcut phrase; this requires only URL-scheme handling via a Custom Action, not an extension.
Does Google Assistant integration work on iOS as well as Android?
Yes. Google Assistant is available on both iOS and Android. Your Actions on Google integration works regardless of the user's device — the assistant sends the intent to your Cloud Function either way. The deep link back to your FlutterFlow app uses Firebase Dynamic Links which handle both iOS and Android routing.
How do I make the speech_to_text microphone button work in the FlutterFlow test mode?
Speech recognition does not work in FlutterFlow's web test mode preview, because the web preview has no microphone access. You must test on a physical iOS or Android device, either by running the app directly on the device or by building a debug APK/IPA. Always test voice input on a physical device.
What is the difference between speech_to_text and flutter_tts?
speech_to_text converts spoken audio to text (user speaks, app reads). flutter_tts converts text to speech (app speaks, user hears). For a complete voice interface in FlutterFlow, you might use both — speech_to_text to understand the user's voice command, and flutter_tts to speak the response back. Add both packages to Pubspec Dependencies and create separate Custom Actions for each.
How do I handle multiple languages in Google Assistant?
In the Actions on Google console, go to Deploy → Directory Information → Languages and add the languages you want to support. Define intents with training phrases in each language. Your Cloud Function receives the locale in the request (req.body.user.locale) and can return responses in the appropriate language, or route to different response strings based on the locale value.
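Routing on req.body.user.locale can be sketched as a lookup table keyed by language, with a fallback to English. The supported languages and response strings here are illustrative:

```javascript
// Localized response strings, keyed by the language part of a
// BCP-47 locale tag (the 'es' in 'es-MX').
const RESPONSES = {
  'en': 'Your order has shipped.',
  'es': 'Tu pedido ha sido enviado.',
  'de': 'Deine Bestellung wurde versandt.',
};

// Pick the response string for the locale Assistant reports,
// falling back to English for unsupported languages.
function localizedSpeech(locale) {
  const lang = (locale || 'en').split('-')[0].toLowerCase();
  return RESPONSES[lang] ?? RESPONSES['en'];
}

console.log(localizedSpeech('es-MX')); // Tu pedido ha sido enviado.
console.log(localizedSpeech('fr-FR')); // Your order has shipped.
```

Matching on the language part rather than the full tag means one string set covers all regional variants of a language.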
Can RapidDev help implement Siri or Google Assistant integration for my FlutterFlow app?
Yes. Siri integration in particular requires native Xcode work that goes beyond the FlutterFlow visual builder. RapidDev handles the full stack — code export, Xcode Intent Extension implementation, Cloud Function fulfillment, and deep link routing back to your app — for teams that want voice assistant capabilities without managing the native development complexity themselves.
Talk to an Expert
Our team has built 600+ apps. Get personalized help with your project.
Book a free consultation