FlutterFlow cannot train AI models, but it can integrate any AI API: OpenAI GPT, Anthropic Claude, Google Gemini, or a custom model endpoint. All AI API calls must go through a Firebase Cloud Function — never directly from the client app — to keep API keys secure. The Cloud Function acts as the AI gateway, and Firestore stores conversation history for context-aware multi-turn chat.
Integrating AI APIs Securely into FlutterFlow
The phrase 'create AI models in FlutterFlow' is a common search query, but it reflects a misconception: FlutterFlow is a UI and backend orchestration platform, not a machine learning training environment. What you CAN do is connect to powerful pre-trained AI models via API. This tutorial shows how to integrate four major AI providers using the secure server-side pattern, build a functional multi-turn chat interface, and store conversation history for context-aware responses.
Prerequisites
- FlutterFlow project with Firebase connected (Firestore and Cloud Functions enabled on Blaze plan)
- An API key from at least one AI provider (OpenAI, Anthropic, or Google AI Studio)
- Basic understanding of FlutterFlow Custom Actions and Firestore
- Firebase CLI installed locally for Cloud Function deployment
Step-by-step guide
Set Up the Firestore Conversation Schema
AI chat requires storing message history so the model has context for multi-turn conversations. In Firebase Console, create a 'conversations' collection. Each document has: 'userId' (String), 'title' (String), 'createdAt' (Timestamp), 'model' (String: 'gpt-4o', 'claude-3-5-sonnet', etc.), and 'updatedAt' (Timestamp). Under each conversation document, create a 'messages' subcollection. Each message document has: 'role' (String: 'user' or 'assistant'), 'content' (String), 'timestamp' (Timestamp), and 'tokenCount' (Number, optional). In FlutterFlow's Firestore schema editor, add this nested structure. Your chat UI page will have a Backend Query on the messages subcollection, ordered by timestamp ascending, with a real-time stream.
Expected result: The Firestore schema shows the conversations collection with a messages subcollection. A test conversation document is visible in Firebase Console.
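The document shape and query order described above can be sketched as plain data. This is an illustration of the schema, not code you deploy — the sample messages are hypothetical, and the numeric timestamps stand in for Firestore Timestamps:

```javascript
// Sketch: message documents as described in this step, plus the
// chronological ordering the chat UI's Backend Query applies.
const messages = [
  { role: 'assistant', content: 'Hi! How can I help?', timestamp: 2 },
  { role: 'user', content: 'Hello', timestamp: 1 },
];

// Firestore's orderBy('timestamp', ascending) is equivalent to:
const ordered = [...messages].sort((a, b) => a.timestamp - b.timestamp);
// ordered[0] is the oldest message, so the user turn renders first
```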
Deploy the AI Gateway Cloud Function
This Cloud Function acts as a secure proxy between your FlutterFlow app and any AI provider. It receives the user's message and conversation ID, loads the message history from Firestore, calls the AI API with the full context, saves the response back to Firestore, and returns the response to the client. The API keys are stored in Cloud Function environment variables — they never touch the client app. The function supports multiple providers through a 'provider' parameter. Deploy to Firebase using the CLI. Store API keys using Firebase Functions Config: 'firebase functions:config:set openai.key="sk-..." anthropic.key="sk-ant-..."'. (Note: newer Firebase CLI versions deprecate functions:config in favor of secrets, e.g. 'firebase functions:secrets:set OPENAI_KEY' — check your CLI version.)
```javascript
// Firebase Cloud Function: chatWithAI
// Callable function — invoked from a FlutterFlow Custom Action
// Deploy: firebase deploy --only functions:chatWithAI

const functions = require('firebase-functions');
const admin = require('firebase-admin');
const OpenAI = require('openai');
const Anthropic = require('@anthropic-ai/sdk');

admin.initializeApp();

exports.chatWithAI = functions.https.onCall(async (data, context) => {
  if (!context.auth) {
    throw new functions.https.HttpsError('unauthenticated', 'Login required');
  }

  const { conversationId, userMessage, provider = 'openai' } = data;
  if (!conversationId || !userMessage) {
    throw new functions.https.HttpsError('invalid-argument', 'Missing required fields');
  }

  const db = admin.firestore();

  // Save the user message to Firestore
  const messagesRef = db.collection('conversations')
    .doc(conversationId).collection('messages');
  await messagesRef.add({
    role: 'user',
    content: userMessage,
    timestamp: admin.firestore.FieldValue.serverTimestamp(),
  });

  // Load the last 20 messages for context (newest first, then reversed
  // into chronological order for the API)
  const historySnap = await messagesRef
    .orderBy('timestamp', 'desc').limit(20).get();
  const history = historySnap.docs.reverse().map((d) => ({
    role: d.data().role,
    content: d.data().content,
  }));

  let assistantMessage = '';

  if (provider === 'openai') {
    const openai = new OpenAI({ apiKey: functions.config().openai.key });
    const completion = await openai.chat.completions.create({
      model: 'gpt-4o',
      messages: [
        { role: 'system', content: 'You are a helpful assistant.' },
        ...history,
      ],
      max_tokens: 1000,
    });
    assistantMessage = completion.choices[0].message.content;

  } else if (provider === 'anthropic') {
    const anthropic = new Anthropic({ apiKey: functions.config().anthropic.key });
    const message = await anthropic.messages.create({
      model: 'claude-3-5-sonnet-20241022',
      max_tokens: 1024,
      system: 'You are a helpful assistant.',
      messages: history,
    });
    assistantMessage = message.content[0].text;

  } else if (provider === 'gemini') {
    const { GoogleGenerativeAI } = require('@google/generative-ai');
    const genAI = new GoogleGenerativeAI(functions.config().gemini.key);
    const model = genAI.getGenerativeModel({ model: 'gemini-1.5-pro' });
    const chat = model.startChat({
      // Gemini uses 'model' instead of 'assistant' and takes the new
      // message separately, so drop the just-saved user message here
      history: history.slice(0, -1).map((m) => ({
        role: m.role === 'assistant' ? 'model' : m.role,
        parts: [{ text: m.content }],
      })),
    });
    const result = await chat.sendMessage(userMessage);
    assistantMessage = result.response.text();

  } else {
    throw new functions.https.HttpsError('invalid-argument', `Unknown provider: ${provider}`);
  }

  // Save the assistant response to Firestore
  await messagesRef.add({
    role: 'assistant',
    content: assistantMessage,
    timestamp: admin.firestore.FieldValue.serverTimestamp(),
  });

  // Touch the conversation's updatedAt for the history sidebar sort
  await db.collection('conversations').doc(conversationId).update({
    updatedAt: admin.firestore.FieldValue.serverTimestamp(),
  });

  return { message: assistantMessage };
});
```

Expected result: The Cloud Function deploys successfully. Calling it from the Firebase Console's test interface with a test message returns an AI response.
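The context-window step in the function above is worth isolating: Firestore returns the 20 newest messages in descending order, and they must be reversed into chronological order before being sent to the API. A self-contained sketch with the Firestore query replaced by a plain array:

```javascript
// Sketch of chatWithAI's history-window logic, using a plain array
// as a stand-in for the Firestore messages subcollection.
function buildHistory(allMessages, limit = 20) {
  const newestFirst = [...allMessages]
    .sort((a, b) => b.timestamp - a.timestamp) // orderBy('timestamp', 'desc')
    .slice(0, limit);                          // .limit(20)
  return newestFirst.reverse().map((m) => ({   // oldest-first for the API
    role: m.role,
    content: m.content,
  }));
}

// 25 alternating messages; only the newest 20 should survive
const sample = Array.from({ length: 25 }, (_, i) => ({
  role: i % 2 === 0 ? 'user' : 'assistant',
  content: `message ${i}`,
  timestamp: i,
}));
const history = buildHistory(sample);
// → 20 entries, from 'message 5' through 'message 24'
```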
Create the SendMessage Custom Action in FlutterFlow
In Custom Code → Custom Actions → '+', create 'sendMessageToAI'. This action takes two arguments: 'userMessage' (String) and 'conversationId' (String). It calls the 'chatWithAI' Firebase callable function, waits for the response, and returns the assistant's reply as a String. The Firestore message history is updated automatically by the Cloud Function — the action does not need to write anything directly. The FlutterFlow chat UI's Repeating Group is bound to a real-time stream on the messages subcollection, so the new assistant message appears automatically when Firestore updates. Wire this action to your chat input's Send button. Pass the text field value as 'userMessage' and the current conversation ID from App State as 'conversationId'.
```dart
// Custom Action: sendMessageToAI
// Arguments: userMessage (String), conversationId (String)
// Return type: String (assistant response or error message)
// Requires: cloud_functions package; dart:async for TimeoutException

Future<String> sendMessageToAI(
  String userMessage,
  String conversationId,
) async {
  if (userMessage.trim().isEmpty) return '';

  try {
    final callable = FirebaseFunctions.instance.httpsCallable(
      'chatWithAI',
      options: HttpsCallableOptions(
        timeout: const Duration(seconds: 30),
      ),
    );

    final result = await callable.call({
      'userMessage': userMessage.trim(),
      'conversationId': conversationId,
      'provider': 'openai', // or 'anthropic', 'gemini'
    });

    return result.data['message'] as String? ?? '';
  } on FirebaseFunctionsException catch (e) {
    return 'Error: ${e.message}';
  } on TimeoutException {
    return 'Request timed out. Please try again.';
  } catch (e) {
    return 'An unexpected error occurred.';
  }
}
```

Expected result: Tapping Send calls the Custom Action, which calls the Cloud Function, which calls the AI API. The response appears in the chat UI automatically via the Firestore real-time stream.
Build the Chat UI with a Repeating Group
Create a chat page in FlutterFlow. Add a Column layout: (1) a Repeating Group widget at the top (takes all available space with Expanded) bound to a Backend Query on the messages subcollection of the current conversation, ordered by timestamp ascending, real-time stream; (2) a Row at the bottom containing a TextField widget (bound to a Page State variable 'currentMessage') and a Send IconButton. In the Repeating Group's item builder, add a conditional widget: if the message role is 'user', show a right-aligned blue bubble; if 'assistant', show a left-aligned grey bubble. Each bubble contains a Text widget bound to the message content field. Set the Repeating Group's reverse scroll to true and add a listKey so it scrolls to the bottom when new messages arrive.
Expected result: The chat page shows alternating user (right, blue) and assistant (left, grey) message bubbles. New messages scroll into view automatically.
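The conditional-bubble logic described above is configured in FlutterFlow's UI builder rather than written as code, but the mapping it implements can be sketched as a pure function (alignment and color values mirror the step above):

```javascript
// Sketch: the role → bubble-style mapping used by the conditional
// widget in the Repeating Group's item builder.
function bubbleStyle(role) {
  return role === 'user'
    ? { alignment: 'right', color: 'blue' }  // user messages
    : { alignment: 'left', color: 'grey' };  // assistant messages
}
```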
Create a New Conversation and Set System Prompt
Before the user sends their first message, create a conversation document in Firestore. In FlutterFlow, on the chat page's On Page Load action (or on a 'New Chat' button), call a Custom Action 'createConversation' that: (1) creates a new document in the conversations collection with userId, title ('New Conversation'), model, and createdAt, and (2) returns the document ID. Store this ID in App State as 'currentConversationId'. Every subsequent sendMessageToAI call passes this ID. To add a system prompt (making the AI act as a specific persona), modify the Cloud Function to accept a 'systemPrompt' parameter and pass it as the system message. Different app sections can send different system prompts to create specialized AI assistants.
Expected result: Each new chat session creates a Firestore conversation document with a unique ID. App State tracks the current conversation ID. Multiple conversations can be maintained and switched between.
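The per-section persona idea above can be sketched as a simple lookup that the client would consult before calling the Cloud Function. This assumes the Cloud Function has been extended to accept a 'systemPrompt' parameter as described; the section names and prompt wording are illustrative:

```javascript
// Sketch: choosing a system prompt per app section. In FlutterFlow
// this lookup would live in a Custom Action or App State logic.
const personas = {
  support: 'You are a friendly customer support agent for Acme.',
  recipes: 'You are a professional chef giving concise recipes.',
  default: 'You are a helpful assistant.',
};

function systemPromptFor(section) {
  return personas[section] ?? personas.default;
}
```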
Complete working example
```dart
// ============================================================
// FlutterFlow AI Chat — Complete Custom Actions
// ============================================================
// Packages: cloud_functions, cloud_firestore, firebase_auth

// Action 1: Create a new conversation document
Future<String> createConversation(String modelName) async {
  final uid = FirebaseAuth.instance.currentUser?.uid;
  if (uid == null) return '';

  final docRef = await FirebaseFirestore.instance
      .collection('conversations')
      .add({
    'userId': uid,
    'title': 'New Conversation',
    'model': modelName,
    'createdAt': FieldValue.serverTimestamp(),
    'updatedAt': FieldValue.serverTimestamp(),
  });

  return docRef.id;
}

// Action 2: Send a message to the AI via the Cloud Function
Future<String> sendMessageToAI(
  String userMessage,
  String conversationId,
  String provider,
) async {
  if (userMessage.trim().isEmpty || conversationId.isEmpty) return '';

  try {
    final callable = FirebaseFunctions.instance.httpsCallable(
      'chatWithAI',
      options: HttpsCallableOptions(
        timeout: const Duration(seconds: 30),
      ),
    );

    final result = await callable.call({
      'userMessage': userMessage.trim(),
      'conversationId': conversationId,
      'provider': provider,
    });

    return result.data['message'] as String? ?? '';
  } on FirebaseFunctionsException catch (e) {
    return 'AI error: ${e.message}';
  } on TimeoutException {
    return 'Request timed out — the AI took too long to respond.';
  } catch (_) {
    return 'Something went wrong. Please try again.';
  }
}

// Action 3: Load the conversation list for a history sidebar
Future<List<Map<String, dynamic>>> loadConversations() async {
  final uid = FirebaseAuth.instance.currentUser?.uid;
  if (uid == null) return [];

  final snap = await FirebaseFirestore.instance
      .collection('conversations')
      .where('userId', isEqualTo: uid)
      .orderBy('updatedAt', descending: true)
      .limit(20)
      .get();

  return snap.docs
      .map((d) => {
            'id': d.id,
            'title': d.data()['title'] as String? ?? 'Untitled',
            'model': d.data()['model'] as String? ?? 'unknown',
          })
      .toList();
}
```

Common mistakes when creating Custom Artificial Intelligence Models in FlutterFlow
Mistake: Calling AI APIs directly from a client-side Custom Action, exposing API keys in the app binary
How to avoid: Always proxy AI API calls through a Firebase Cloud Function. The client sends the user message to the Cloud Function (authenticated via Firebase ID token). The Cloud Function stores the API key in environment config (never in code) and calls the AI provider. The key never leaves the server.
Mistake: Sending the entire conversation history to the AI API on every message
How to avoid: Limit the history sent to the last 20 messages. In Firestore, query messages with 'orderBy timestamp descending, limit 20', then reverse the array before sending to the API. This keeps context relevant while controlling costs.
Mistake: Hardcoding the AI model name as 'gpt-4' without specifying the exact version
How to avoid: Always use fully-versioned model names like 'gpt-4o-2024-11-20' or 'claude-3-5-sonnet-20241022'. Check the provider's documentation for the latest stable versioned name and update your function config when you intentionally upgrade.
Mistake: Not setting a timeout on the Cloud Function callable, letting the UI hang indefinitely
How to avoid: Set HttpsCallableOptions(timeout: Duration(seconds: 30)) on the callable. Show a 'typing...' indicator while waiting. If the timeout fires, show a user-friendly error with a retry button rather than leaving the UI in a loading state.
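The user-friendly-error advice above generalizes beyond timeouts to other provider failures. A minimal sketch of the mapping the Cloud Function (or the Custom Action) could apply — the status codes are standard HTTP, but the wording is illustrative:

```javascript
// Sketch: mapping AI-provider HTTP status codes to messages the
// chat UI can show instead of a raw error.
function friendlyError(status) {
  switch (status) {
    case 429:             // rate limited by the provider
      return 'Try again in a moment';
    case 500:
    case 503:             // provider-side outage
      return 'AI service temporarily unavailable';
    case 408:             // request timed out
      return 'Request timed out. Please try again.';
    default:
      return 'Something went wrong. Please try again.';
  }
}
```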
Best practices
- Keep all AI API keys in Cloud Function environment config (firebase functions:config:set) — never in app code, FlutterFlow App State, or Firestore.
- Show a 'typing' animation (three animated dots) in the chat UI while waiting for the AI response — users expect visual feedback during the 5-30 second wait.
- Implement a cost safeguard: store per-user message counts in Firestore and reject requests in the Cloud Function once a daily limit is exceeded.
- Design your system prompt carefully — the system prompt determines the AI's persona, constraints, and tone for your specific app use case.
- Handle AI API error codes gracefully: 429 (rate limit) should show 'Try again in a moment', 500 (server error) should show 'AI service temporarily unavailable'.
- For production deployments, consider streaming responses: have the Cloud Function consume the provider's streaming API and write partial chunks to Firestore as they arrive, giving users a real-time typing effect through the same Firestore stream the chat UI already listens to.
- Store the AI model used alongside each message — this helps you understand costs and allows you to migrate conversations to new models without confusion about response quality differences.
- Contact RapidDev for help architecting multi-agent systems, RAG pipelines with vector embeddings, or fine-tuning integrations — these require architecture beyond the basic call pattern shown here.
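The cost-safeguard practice above amounts to a per-user daily counter checked before each AI call. A self-contained sketch with the Firestore read replaced by a plain object — the field naming and the limit of 50 are assumptions, and production code would use a Firestore transaction for the increment:

```javascript
// Sketch: per-user daily message limit, checked in the Cloud Function
// before the AI API is called.
const DAILY_LIMIT = 50; // illustrative limit

function checkAndIncrement(usage, uid, today) {
  const key = `${uid}_${today}`;          // e.g. 'abc123_2025-06-01'
  const count = usage[key] ?? 0;
  if (count >= DAILY_LIMIT) {
    return { allowed: false, count };     // reject the request
  }
  usage[key] = count + 1;                 // in production: a Firestore
  return { allowed: true, count: count + 1 }; // transaction increment
}
```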
Still stuck?
Copy one of these prompts to get a personalized, step-by-step explanation.
I'm building an AI chat feature in a FlutterFlow app backed by Firebase. I need a Firebase Cloud Function that: accepts a conversationId and userMessage, loads the last 20 messages from a Firestore messages subcollection, calls the OpenAI chat completions API with the history as context, saves the response to Firestore, and returns it to the client. Write the complete Node.js Cloud Function.
Write a FlutterFlow Custom Action in Dart called sendMessageToAI that calls a Firebase callable function named chatWithAI with userMessage, conversationId, and provider parameters. Handle FirebaseFunctionsException, TimeoutException, and generic errors with user-friendly return strings. Set a 30-second callable timeout.
Frequently asked questions
Can FlutterFlow train a custom AI model?
No. FlutterFlow is a UI and backend integration platform — it cannot train machine learning models. What you can do is integrate pre-trained models (GPT-4o, Claude, Gemini) via API calls, or call custom model endpoints that you have trained and deployed elsewhere (e.g., on AWS SageMaker or Google Vertex AI).
Which AI provider should I use for my FlutterFlow app?
OpenAI GPT-4o is the safest choice for most use cases — widest documentation, largest community, and good general-purpose quality. Claude (Anthropic) has a longer context window and is better for tasks requiring nuanced understanding of long documents. Gemini is cost-effective for high-volume apps and integrates naturally with Google Cloud. For cost-sensitive apps, GPT-4o-mini or Claude 3 Haiku are strong budget options.
How much do AI API calls cost for a FlutterFlow app?
As of March 2026: GPT-4o costs about $2.50 per 1M input tokens and $10 per 1M output tokens; Claude 3.5 Sonnet costs $3 per 1M input and $15 per 1M output. A typical chat turn — the new message plus a 20-message history — runs roughly 1,500 input tokens and 500 output tokens, or about $0.009 on GPT-4o. At 1,000 messages per day across all users, that works out to roughly $250-300/month on GPT-4o, or around $15-20/month on a budget model like GPT-4o-mini. Always implement per-user usage limits to prevent runaway costs.
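The estimate above is simple arithmetic you can adapt to your own traffic. A sketch using the GPT-4o prices quoted here; the token counts per message are assumptions you should replace with your own measurements:

```javascript
// Sketch: monthly API cost from per-token prices.
function monthlyCostUSD(messagesPerDay, inputTokens, outputTokens,
                        inputPricePerM, outputPricePerM) {
  const perMessage =
    (inputTokens / 1e6) * inputPricePerM +   // prompt + history context
    (outputTokens / 1e6) * outputPricePerM;  // assistant reply
  return messagesPerDay * 30 * perMessage;   // 30-day month
}

// 1,000 messages/day, ~1,500 input + ~500 output tokens per message,
// at GPT-4o prices ($2.50 / $10 per 1M tokens)
const gpt4o = monthlyCostUSD(1000, 1500, 500, 2.5, 10); // → 262.5
```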
Can I use AI directly in FlutterFlow without Cloud Functions using the API Manager?
Technically yes — FlutterFlow's API Manager can call any REST endpoint, including OpenAI's API. But this requires putting your API key in the API Manager as a header value, which gets compiled into the app binary and is accessible to anyone who inspects the app. This is a security vulnerability. Always use a Cloud Function to proxy AI API calls.
How do I handle different AI personas for different parts of my app?
Pass a 'systemPrompt' parameter from the FlutterFlow client to the Cloud Function alongside the user message. The customer support section sends 'You are a friendly customer support agent for [Brand]...', the recipe assistant sends 'You are a professional chef...', and so on. Different App State variables or conversation metadata fields can store which persona is active.
What is the difference between calling AI from a Custom Action vs the API Manager in FlutterFlow?
The API Manager makes client-side HTTP calls directly from the app. Custom Actions run Dart code in the app. Both expose your API key if used directly. A Cloud Function (callable from a Custom Action) runs server-side and is the correct architecture for AI API calls. Use the Custom Action to CALL the Cloud Function, not to call the AI provider directly.
Talk to an Expert
Our team has built 600+ apps. Get personalized help with your project.
Book a free consultation