FlutterFlow cannot train machine learning models — that happens in Python tools like TensorFlow or PyTorch outside the app. Once trained, you use models in FlutterFlow through three paths: Cloud ML APIs (Google Vision, OpenAI, custom REST endpoints called via API Manager), TensorFlow Lite on-device models (bundled as app assets via Custom Action), or your own model server (REST API endpoint called from FlutterFlow's API Manager). Start with Cloud ML APIs — they require no custom code.
Using Machine Learning in FlutterFlow: What is Possible and How
The phrase 'create machine learning models in FlutterFlow' is a common search, but it contains a misconception worth clarifying immediately. FlutterFlow is a mobile and web app builder — it does not include tools for training, fine-tuning, or evaluating ML models. Model training happens in Python environments using TensorFlow, PyTorch, scikit-learn, or hosted platforms like Google Vertex AI, Hugging Face, or OpenAI. What FlutterFlow excels at is consuming trained models through APIs and on-device inference. There are three practical paths: Cloud ML APIs (fastest to implement, pay-per-call), TensorFlow Lite on-device (no network needed, but requires model optimization and app asset management), and custom model server (your own trained model behind a REST API endpoint). This guide covers all three, starting with the simplest.
Prerequisites
- FlutterFlow project with Firebase connected
- A Cloud ML API key (Google Cloud, OpenAI, or Hugging Face) for the Cloud API path
- A trained TFLite model file (.tflite) for the on-device path
- Basic understanding of REST APIs and FlutterFlow's API Manager
- FlutterFlow Pro plan (needed for the code export and Custom Actions used in the TFLite path)
Step-by-step guide
Understand what FlutterFlow can and cannot do with ML
FlutterFlow cannot: train ML models, fine-tune neural networks, evaluate model performance, or run Python-based ML libraries. These operations require Python, significant compute resources, and specialized frameworks. FlutterFlow can:
- Send image or text data to a Cloud ML API and display the result
- Run a pre-compiled TensorFlow Lite model on the device for real-time inference
- Call any custom REST API endpoint (including your own model server) and parse the response
- Display ML results in any widget (Text, Image, chart, etc.)
Before choosing an implementation path, decide whether your use case requires real-time on-device inference (privacy-sensitive data, offline use) or can tolerate cloud API latency (0.5-3 seconds per call).
Expected result: You have chosen one of the three ML paths based on your app's requirements: Cloud API, TFLite on-device, or custom model server.
Path A — Call a Cloud ML API via FlutterFlow's API Manager
Open FlutterFlow → left sidebar → API Calls (the antenna icon). Tap '+' to create a new API call. Name it 'analyzeImage'. Set method to POST. Set the URL to the Cloud ML API endpoint — for Google Vision AI: https://vision.googleapis.com/v1/images:annotate?key=YOUR_API_KEY. Set the request body to JSON with the base64-encoded image and the requested feature types (LABEL_DETECTION, TEXT_DETECTION, OBJECT_LOCALIZATION). Tap 'Test API Call' with a sample image to verify the response structure. Then add the API call to a Button's Action Flow: first, take a photo with the Image Picker action, then call analyzeImage with the selected image as a base64 string. Parse the JSON response and bind the results to Text widgets or a list.
// API Manager request body for Google Vision AI
// Use in FlutterFlow API Manager → Body → JSON
{
  "requests": [
    {
      "image": {
        "content": "[BASE64_IMAGE_STRING]"
      },
      "features": [
        {
          "type": "LABEL_DETECTION",
          "maxResults": 10
        },
        {
          "type": "TEXT_DETECTION"
        }
      ]
    }
  ]
}

Expected result: The API Manager shows a successful test response with label detections and confidence scores. The API call is wired to a button that captures an image and displays the ML results.
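Before pasting the body into the API Manager, it can help to sanity-check its shape outside FlutterFlow. A minimal sketch in plain Python (standard library only; the image bytes below are a stand-in for a real file read):

```python
# Build the same Vision AI request body in Python to verify its structure.
import base64
import json

def build_vision_body(image_bytes: bytes, max_results: int = 10) -> str:
    """Return the JSON body for label + text detection, matching the
    structure shown above."""
    body = {
        "requests": [
            {
                "image": {"content": base64.b64encode(image_bytes).decode("ascii")},
                "features": [
                    {"type": "LABEL_DETECTION", "maxResults": max_results},
                    {"type": "TEXT_DETECTION"},
                ],
            }
        ]
    }
    return json.dumps(body)

# Any bytes work for checking the structure.
payload = json.loads(build_vision_body(b"\x89PNG fake bytes"))
print(payload["requests"][0]["features"][0]["type"])  # LABEL_DETECTION
```

The `content` field must be the raw base64 string with no `data:image/...` prefix; that prefix is a common cause of Vision API 400 errors.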
Path B — Run a TensorFlow Lite model on-device
On-device inference with TFLite requires: a trained and converted .tflite model file, the tflite_flutter Flutter package, and a Custom Action that loads the model and runs inference. Add 'tflite_flutter: ^0.10.4' and the 'image' package (used for preprocessing below) to pubspec.yaml in the Custom Code panel and tap Get Packages. Add your .tflite model file to the project assets by exporting the FlutterFlow project code (Pro plan), placing the file in the assets/ folder, and registering it in pubspec.yaml under flutter.assets. Create a Custom Action named 'runImageClassification' that loads the interpreter from assets, preprocesses the input image, runs interpreter.run(), and returns the top classification result as a String.
import 'dart:io';
import 'dart:typed_data';

import 'package:flutter/foundation.dart'; // for debugPrint
import 'package:image/image.dart' as img; // image ^3.x (getRed/getGreen/getBlue were removed in 4.x)
import 'package:tflite_flutter/tflite_flutter.dart';

// Custom Action: runImageClassification
// Parameters: imagePath (String — local file path from Image Picker)
// Returns: String (top classification label)
Future<String> runImageClassification(String imagePath) async {
  try {
    // Load interpreter from app assets
    final interpreter =
        await Interpreter.fromAsset('assets/models/classifier.tflite');
    debugPrint('runImageClassification: interpreter loaded');

    // Preprocess the image
    final imageFile = File(imagePath);
    final rawImage = img.decodeImage(imageFile.readAsBytesSync());
    if (rawImage == null) return 'Error: could not decode image';

    final resized = img.copyResize(rawImage, width: 224, height: 224);
    final input = Float32List(1 * 224 * 224 * 3);
    int pixelIndex = 0;
    for (int y = 0; y < 224; y++) {
      for (int x = 0; x < 224; x++) {
        final pixel = resized.getPixel(x, y);
        input[pixelIndex++] = img.getRed(pixel) / 255.0;
        input[pixelIndex++] = img.getGreen(pixel) / 255.0;
        input[pixelIndex++] = img.getBlue(pixel) / 255.0;
      }
    }
    final inputTensor = input.reshape([1, 224, 224, 3]);

    // Run inference
    final output = List.filled(1 * 1000, 0.0).reshape([1, 1000]);
    interpreter.run(inputTensor, output);

    // Find top result
    final scores = (output[0] as List).cast<double>();
    final maxIndex =
        scores.indexOf(scores.reduce((a, b) => a > b ? a : b));
    debugPrint('runImageClassification: top class index $maxIndex');
    interpreter.close();
    return 'Class $maxIndex (${(scores[maxIndex] * 100).toStringAsFixed(1)}%)';
  } catch (e) {
    debugPrint('runImageClassification error: $e');
    return 'Error: $e';
  }
}

Expected result: The Custom Action loads the TFLite model from assets and returns a classification result string when given a local image file path.
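The nested loop above fills the input buffer in NHWC order (row by row, three channels per pixel). If your model returns garbage, the flattening order is the first thing to check. A small Python mirror of that indexing, using the same 224×224×3 constants:

```python
# Mirror of the Dart preprocessing loop: pixel (x, y) channel c lands at
# flat index ((y * W) + x) * 3 + c, with values normalized to [0, 1].
W = H = 224

def flat_index(x: int, y: int, c: int, width: int = W) -> int:
    return ((y * width) + x) * 3 + c

def normalize(channel_value: int) -> float:
    # Same normalization as the Dart code (value / 255.0).
    return channel_value / 255.0

# First pixel's red channel is index 0; pixel (1, 0) starts at index 3.
assert flat_index(0, 0, 0) == 0
assert flat_index(1, 0, 0) == 3
# The last channel of the last pixel fills the final slot of the
# 1 * 224 * 224 * 3 buffer.
assert flat_index(W - 1, H - 1, 2) == 1 * W * H * 3 - 1
assert normalize(255) == 1.0
```

Note that some models expect [-1, 1] normalization or BGR channel order instead; the model's training pipeline determines both.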
Path C — Connect to a custom model server via REST API
If you have trained your own ML model and deployed it as a REST API (using FastAPI, Flask, TensorFlow Serving, or a hosted service like Hugging Face Inference API, Replicate, or Render), FlutterFlow's API Manager can call it just like any other REST endpoint. In FlutterFlow's API Manager, create a new API call. Set the URL to your model server endpoint (e.g., https://your-model-server.com/predict). Set the request body structure to match what your model server expects (typically a JSON object with an 'input' field). Test the call in the API Manager. Wire it to a button Action Flow: capture user input (image, text, or structured data), call the API, and display the result. This path gives you full control over the model without any mobile-side ML code.
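To make the contract concrete, here is a minimal stand-in server using only the Python standard library. The `predict()` function and the `{"input": ...}` / `{"label": ..., "confidence": ...}` field names are illustrative placeholders; in practice you would use FastAPI, Flask, or TensorFlow Serving with real model inference:

```python
# Minimal sketch of a /predict endpoint matching the request/response shape
# a FlutterFlow API call would use. predict() is a placeholder "model".
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def predict(text: str) -> dict:
    # Stand-in for real inference so the endpoint shape can be tested.
    label = "positive" if "good" in text.lower() else "negative"
    return {"label": label, "confidence": 0.9}

class PredictHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        if self.path != "/predict":
            self.send_error(404)
            return
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length) or b"{}")
        result = predict(str(payload.get("input", "")))
        body = json.dumps(result).encode("utf-8")
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep request logging quiet
        pass

def serve(port: int = 8000) -> None:
    """Run the server; FlutterFlow's API Manager then POSTs to /predict."""
    HTTPServer(("0.0.0.0", port), PredictHandler).serve_forever()
```

FlutterFlow's API Manager would POST `{"input": "<user text>"}` to `https://your-model-server.com/predict` and bind the `label` and `confidence` fields from the response.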
Expected result: The FlutterFlow API Manager successfully calls your custom model server and returns predictions. The response is bound to widgets in the app.
Display ML results in the FlutterFlow UI
After your ML API call or Custom Action returns results, you need to store and display them. For Cloud API results (JSON response), parse the fields using FlutterFlow's JSON path extraction in the API response binding. Store the top result in App State (e.g., 'mlResult' as String). For TFLite Custom Action results (returned as String), store the output directly in App State. Add a Results section to your UI: a Text widget bound to the mlResult App State variable showing the top prediction, a CircularProgressIndicator visible while the analysis is running (use an 'isAnalyzing' Boolean App State variable), and an error message visible when the result contains 'Error:'. For multi-label results (like image classification with 10 labels), use a ListView bound to a List App State variable.
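The JSON-path extraction described above can be mirrored in plain Python to see what the binding resolves to. The sample response below is fabricated for illustration; its shape follows the Vision API's `responses[0].labelAnnotations` structure:

```python
# Extract and format top labels from a Vision-style response, matching the
# 'Name (NN%)' display string used in the UI. Sample data is fabricated.
import json

SAMPLE_RESPONSE = json.dumps({
    "responses": [
        {
            "labelAnnotations": [
                {"description": "Dog", "score": 0.97},
                {"description": "Pet", "score": 0.91},
            ]
        }
    ]
})

def top_labels(response_json: str, limit: int = 5) -> str:
    """Resolve responses[0].labelAnnotations and format for display."""
    data = json.loads(response_json)
    responses = data.get("responses", [])
    if not responses:
        return "No results"
    labels = responses[0].get("labelAnnotations") or []
    return ", ".join(
        f"{l['description']} ({l['score'] * 100:.0f}%)" for l in labels[:limit]
    ) or "No results"

print(top_labels(SAMPLE_RESPONSE))  # Dog (97%), Pet (91%)
```

The empty-response fallbacks matter: Vision can return a response object with no `labelAnnotations` key at all, which is exactly the case that breaks naive JSON-path bindings.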
Expected result: The app shows a loading spinner while the ML call runs, then displays the prediction result(s) with confidence scores after the analysis completes.
Complete working example
// ─── Custom Action: analyzeImageWithCloudVision ──────────────────────────────
// Calls Google Vision AI to analyze an image.
// Parameters: imageBase64 (String — base64-encoded image data)
// Returns: String (comma-separated label list)
//
// Alternative: configure this via FlutterFlow API Manager without Custom Code.
// This Custom Action approach is useful if you need preprocessing or retry logic.

import 'dart:convert';

import 'package:flutter/foundation.dart'; // for debugPrint
import 'package:http/http.dart' as http;

Future<String> analyzeImageWithCloudVision(String imageBase64) async {
  // Store your API key in FlutterFlow environment variables, not here
  const apiKey = 'YOUR_GOOGLE_VISION_API_KEY'; // Replace with env variable
  const endpoint =
      'https://vision.googleapis.com/v1/images:annotate?key=$apiKey';

  final requestBody = jsonEncode({
    'requests': [
      {
        'image': {'content': imageBase64},
        'features': [
          {'type': 'LABEL_DETECTION', 'maxResults': 5},
        ],
      }
    ]
  });

  try {
    debugPrint('analyzeImageWithCloudVision: sending request...');
    final response = await http.post(
      Uri.parse(endpoint),
      headers: {'Content-Type': 'application/json'},
      body: requestBody,
    );

    if (response.statusCode != 200) {
      debugPrint('Vision API error: ${response.statusCode} ${response.body}');
      return 'Error: API returned ${response.statusCode}';
    }

    final json = jsonDecode(response.body) as Map<String, dynamic>;
    final responses = json['responses'] as List;
    if (responses.isEmpty) return 'No results';

    final labels = responses[0]['labelAnnotations'] as List? ?? [];
    final topLabels = labels
        .take(5)
        .map((l) =>
            '${l['description']} (${(l['score'] * 100).toStringAsFixed(0)}%)')
        .join(', ');

    debugPrint('analyzeImageWithCloudVision: $topLabels');
    return topLabels;
  } catch (e) {
    debugPrint('analyzeImageWithCloudVision error: $e');
    return 'Error: $e';
  }
}

// ─── API Manager configuration (alternative to Custom Action) ─────────────────
// Name: analyzeImage
// Method: POST
// URL: https://vision.googleapis.com/v1/images:annotate?key=[API_KEY]
// Headers: Content-Type: application/json
// Body: (see vision_api_body.json)
// Response path: responses[0].labelAnnotations[0].description

Common mistakes when using custom machine learning models in FlutterFlow
Mistake: Bundling a 200MB+ TFLite model in the app's assets folder
How to avoid: Use model quantization to compress TFLite models — INT8 quantization typically reduces model size by 4x with minimal accuracy loss. Alternatively, download the model at first launch from Firebase Storage and cache it locally, rather than bundling it in the app assets.
Mistake: Expecting FlutterFlow to train or fine-tune a machine learning model
How to avoid: Train your model in a Python environment (Google Colab is free, Vertex AI is managed). Convert to TFLite with tf.lite.TFLiteConverter or export as ONNX. Deploy as an API endpoint or bundle as a .tflite file. Then use one of the three paths in this guide to consume the trained model in FlutterFlow.
Mistake: Hardcoding Cloud ML API keys in the FlutterFlow API Manager URL or request body
How to avoid: Store API keys in FlutterFlow's environment variable system and reference them as variables in the API Manager configuration. For sensitive keys, route all ML API calls through a Firebase Cloud Function that adds the key server-side.
Mistake: Not handling TFLite interpreter loading failures gracefully
How to avoid: Wrap all TFLite interpreter operations in try/catch blocks. Return a descriptive error string from the Custom Action on failure. Show the error in the UI so users (and you in testing) understand what went wrong.
Best practices
- Start with a Cloud ML API before investing in TFLite on-device implementation — validate the use case works before optimizing for latency and cost.
- Never embed API keys for Cloud ML services in client-side code — always route through a Firebase Cloud Function or backend proxy.
- Quantize TFLite models to INT8 before bundling in apps — this typically reduces model size by 75% with less than 1% accuracy degradation for common vision models.
- Download large TFLite models from Firebase Storage at first launch and cache them on-device rather than bundling them in the app binary.
- Show clear loading and error states in the ML results UI — ML inference can take 0.5-5 seconds, and users need feedback that the app is working.
- Test ML features on real device hardware representative of your lowest-spec target device — inference speed on a flagship phone can be 10x faster than a budget phone.
- For privacy-sensitive use cases (medical images, personal documents), use on-device TFLite inference so user data never leaves the device.
- Log ML inference results (model name, inference time, top result confidence) to Firestore for monitoring model performance in production — degraded accuracy is often silent without logging.
Still stuck?
Copy one of these prompts to get a personalized, step-by-step explanation.
I want to add image classification to my FlutterFlow app. I have a trained TensorFlow Lite model (.tflite file) that classifies product images into 10 categories. The model expects 224x224 RGB images normalized to [0,1]. Please write a FlutterFlow Custom Action in Dart called 'classifyProductImage' that: loads the model from assets using tflite_flutter, preprocesses a local image file path (from FlutterFlow's Image Picker) to 224x224 RGB Float32, runs inference, and returns the top category name as a String. Include error handling and debugPrint logging.
I want to add machine learning features to my FlutterFlow app using a Cloud ML API. I have a Google Vision AI API key. Walk me through setting up the API call in FlutterFlow's API Manager — the URL, method, headers, request body structure for label detection, and how to parse the response to get the top label. Also explain how to wire this API call to a button that first captures an image with the Image Picker, then calls the API, then displays the result in a Text widget. Use only FlutterFlow visual builder steps.
Frequently asked questions
Can FlutterFlow train machine learning models?
No. FlutterFlow is a mobile and web app builder with no model training capabilities. Training requires Python, ML frameworks (TensorFlow, PyTorch), and compute infrastructure. Train your model in a Python environment (Google Colab, Vertex AI, Hugging Face), then consume the trained model in FlutterFlow via a Cloud API, TFLite on-device inference, or your own REST API endpoint.
What is the easiest way to add AI/ML features to a FlutterFlow app?
The easiest path is calling a Cloud ML API via FlutterFlow's API Manager — no custom Dart code required. Google Vision AI, OpenAI GPT, and Hugging Face Inference API all work this way. Set up the API call in the API Manager, test the response, and wire it to a button Action Flow. You can have a working ML feature in under 30 minutes.
What is TensorFlow Lite and when should I use it?
TensorFlow Lite (TFLite) is a compact version of TensorFlow designed to run on mobile devices. Models run entirely on-device — no internet connection needed and user data never leaves the phone. Use TFLite when: latency requirements are strict (real-time camera inference), the app must work offline, or user privacy prevents sending data to cloud APIs. The tradeoff is more complex setup and model optimization work.
How do I convert my TensorFlow or PyTorch model to TFLite format?
For TensorFlow/Keras models: use tf.lite.TFLiteConverter.from_keras_model() or from_saved_model() in Python to export a .tflite file. For PyTorch: export to ONNX first (torch.onnx.export()), convert the ONNX model to a TensorFlow SavedModel with the onnx-tf library, then run that SavedModel through the TFLite Converter. Apply INT8 quantization during conversion for smaller file sizes. Google Colab provides free GPU compute for the conversion process.
Can I use OpenAI's GPT API for text-based ML features in FlutterFlow?
Yes. Add the OpenAI API as a POST call in FlutterFlow's API Manager: URL is https://api.openai.com/v1/chat/completions, add an Authorization header with your Bearer token, and set the request body with the model and messages array. Store your API key in FlutterFlow environment variables, not hardcoded. Wire the API call to a text input and a submit button to build a chat or analysis feature.
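The headers and body described in that answer can be sketched in plain Python to verify the shape before configuring the API Manager (standard library only; the model name is an example, and the key is read from an environment variable rather than hardcoded):

```python
# Build the OpenAI chat completions request shape described above.
import json
import os

def build_openai_request(user_message: str, model: str = "gpt-4o-mini"):
    """Return (headers, json_body) for POST to /v1/chat/completions."""
    api_key = os.environ.get("OPENAI_API_KEY", "YOUR_KEY_HERE")
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
    })
    return headers, body

headers, body = build_openai_request("Summarize this review: great app!")
print(json.loads(body)["messages"][0]["role"])  # user
```

In FlutterFlow, the same three pieces map to the API Manager's URL field, Headers tab, and JSON Body tab, with the user message passed in as an API call variable.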
What happens if the Cloud ML API is slow or unavailable?
Cloud ML APIs typically respond in 0.5-3 seconds. Show a loading indicator (CircularProgressIndicator or shimmer) while waiting. Add timeout handling in your API Manager settings or Custom Action (e.g., 10 second timeout). On timeout or API error, show a user-friendly error message and offer a retry button. For mission-critical features, consider implementing a fallback (simpler heuristic rule or cached result) when the API is unavailable.