RapidDev - Software Development Agency

How to Implement AI-Powered Search in FlutterFlow

What you'll learn

  • How vector embeddings turn text into searchable numbers
  • Setting up a Cloud Function to index documents with OpenAI Embeddings
  • Storing and querying vectors in Supabase pgvector
  • Wiring the semantic search endpoint to a FlutterFlow search bar
Level: Beginner · Reading time: 10 min · Build time: 45-60 min · Requires: FlutterFlow Pro+ (code export required for custom functions) · Updated: March 2026 · Author: RapidDev Engineering Team
TL;DR

AI-powered search uses vector embeddings to understand natural language queries instead of matching exact keywords. You create a Cloud Function that calls OpenAI's Embeddings API, stores vectors in Supabase pgvector, and queries by cosine similarity. Users can search 'affordable red shoes' and find results even if your data says 'budget crimson sneakers'.

Why keyword search fails — and how AI fixes it

Standard Firestore queries match exact text. If a user types 'cheap hotel near airport' but your data says 'budget accommodation close to terminal', nothing matches. AI-powered search converts both the query and your documents into high-dimensional vectors — numerical representations of meaning. A similarity search finds vectors that are close in meaning, not just identical in spelling. The architecture has two parts: an indexing pipeline that runs once per document (embed → store vector) and a search pipeline that runs per query (embed query → find nearest vectors → return ranked results). Because you only embed the query once per search, API costs stay low.
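To make "close in meaning" concrete, here is a minimal cosine-similarity sketch. The toy 3-dimensional vectors are stand-ins for real 1536-dimensional embeddings; in production pgvector performs the equivalent computation inside PostgreSQL via its `<=>` distance operator, so you would never compute this in application code.

```typescript
// Cosine similarity between two embedding vectors: 1 = same direction
// (same meaning), near 0 = unrelated. pgvector's <=> operator returns the
// cosine *distance* (1 - similarity), which is why the SQL later in this
// guide computes `1 - (embedding <=> query_embedding)`.
function cosineSimilarity(a: number[], b: number[]): number {
  if (a.length !== b.length) throw new Error('dimension mismatch');
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Toy 3-dimensional "embeddings" (real ones have 1536 dimensions):
const cheapHotel = [0.9, 0.1, 0.2];
const budgetAccommodation = [0.85, 0.15, 0.25];
const quantumPhysics = [0.1, 0.9, 0.8];

console.log(cosineSimilarity(cheapHotel, budgetAccommodation)); // high (~1)
console.log(cosineSimilarity(cheapHotel, quantumPhysics));      // low
```

The two phrases about hotels point in nearly the same direction, so their similarity is close to 1 even though they share no words — exactly the property keyword search lacks.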

Prerequisites

  • FlutterFlow project with a Supabase backend connected
  • OpenAI account with an API key (platform.openai.com)
  • Supabase project with pgvector extension enabled (Database → Extensions → vector)
  • Basic understanding of FlutterFlow Custom Actions
  • Firebase or Supabase project with Edge Functions or Cloud Functions enabled

Step-by-step guide

Step 1: Enable pgvector and create the documents table in Supabase

Open your Supabase project dashboard and navigate to Database → Extensions. Search for 'vector' and enable the pgvector extension — this adds vector similarity search to PostgreSQL. Next, open the SQL editor and create a table to store your documents and their embeddings. The vector(1536) column matches the output dimension of OpenAI's text-embedding-ada-002 model. Run the SQL below. After running, go to Table Editor to confirm the documents table appears with an embedding column.

supabase_setup.sql
-- Enable pgvector (if not already done via UI)
create extension if not exists vector;

-- Documents table with embedding column
create table documents (
  id uuid primary key default gen_random_uuid(),
  title text not null,
  body text not null,
  embedding vector(1536),
  created_at timestamptz default now()
);

-- Index for fast similarity search
create index on documents
  using ivfflat (embedding vector_cosine_ops)
  with (lists = 100);

-- RLS: enable and allow public reads (tighten to authenticated users in production)
alter table documents enable row level security;
create policy "Public read" on documents
  for select using (true);

Expected result: A 'documents' table visible in Supabase Table Editor with id, title, body, embedding, and created_at columns.

Step 2: Create a Supabase Edge Function to embed and index documents

This function runs once when you add a new document. It calls the OpenAI Embeddings API to convert the document text into a 1536-dimension vector, then stores it alongside the document in Supabase. In your Supabase dashboard, go to Edge Functions → Create a new function named 'embed-document' and paste the code below. Store your OpenAI API key as a Supabase secret named OPENAI_API_KEY — in the dashboard under Edge Functions → Secrets, or via the CLI with supabase secrets set OPENAI_API_KEY=sk-... Deploy the function by clicking Save.

embed-document/index.ts
// supabase/functions/embed-document/index.ts
import { serve } from 'https://deno.land/std@0.168.0/http/server.ts';
import { createClient } from 'https://esm.sh/@supabase/supabase-js@2';

serve(async (req) => {
  const { title, body } = await req.json();

  // Get embedding from OpenAI
  const embeddingRes = await fetch('https://api.openai.com/v1/embeddings', {
    method: 'POST',
    headers: {
      'Authorization': `Bearer ${Deno.env.get('OPENAI_API_KEY')}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({
      input: `${title} ${body}`,
      model: 'text-embedding-ada-002',
    }),
  });

  const { data } = await embeddingRes.json();
  const embedding = data[0].embedding;

  // Store in Supabase
  const supabase = createClient(
    Deno.env.get('SUPABASE_URL')!,
    Deno.env.get('SUPABASE_SERVICE_ROLE_KEY')!
  );

  const { error } = await supabase
    .from('documents')
    .insert({ title, body, embedding });

  if (error) return new Response(JSON.stringify({ error }), { status: 500 });
  return new Response(JSON.stringify({ success: true }), { status: 200 });
});

Expected result: Edge Function deployed and visible in Supabase dashboard. Test it via the 'Invoke' button with a sample payload like {"title": "Red sneakers", "body": "Lightweight running shoes in red"}.

Step 3: Create a search Edge Function that embeds the query and returns ranked results

Create a second Edge Function named 'semantic-search'. This one accepts a user's search query, converts it to a vector using OpenAI, then calls a Supabase RPC function to find the most similar documents by cosine similarity. First, add the RPC function in your SQL editor. Then create the Edge Function. This separation means you pay for one embedding per search, not per document returned.

match_documents.sql

-- Run this in the Supabase SQL editor first
create or replace function match_documents(
  query_embedding vector(1536),
  match_count int default 5
)
returns table (id uuid, title text, body text, similarity float)
language sql stable as $$
  select documents.id, documents.title, documents.body,
         1 - (documents.embedding <=> query_embedding) as similarity
  from documents
  order by documents.embedding <=> query_embedding
  limit match_count;
$$;

semantic-search/index.ts

// supabase/functions/semantic-search/index.ts
import { serve } from 'https://deno.land/std@0.168.0/http/server.ts';
import { createClient } from 'https://esm.sh/@supabase/supabase-js@2';

serve(async (req) => {
  const { query } = await req.json();

  // Embed the user's query — one embedding call per search
  const embeddingRes = await fetch('https://api.openai.com/v1/embeddings', {
    method: 'POST',
    headers: {
      'Authorization': `Bearer ${Deno.env.get('OPENAI_API_KEY')}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({ input: query, model: 'text-embedding-ada-002' }),
  });

  const { data } = await embeddingRes.json();
  const queryEmbedding = data[0].embedding;

  const supabase = createClient(
    Deno.env.get('SUPABASE_URL')!,
    Deno.env.get('SUPABASE_SERVICE_ROLE_KEY')!
  );

  // Find the nearest documents via the match_documents RPC
  const { data: results, error } = await supabase
    .rpc('match_documents', {
      query_embedding: queryEmbedding,
      match_count: 10,
    });

  if (error) return new Response(JSON.stringify({ error }), { status: 500 });
  return new Response(JSON.stringify({ results }), { status: 200 });
});

Expected result: Invoking the semantic-search function with {"query": "affordable shoes"} returns a ranked JSON array of documents ordered by relevance.

Step 4: Add the API call as a Custom Action in FlutterFlow

In FlutterFlow, go to Custom Code → Custom Actions → Add Action. Name it 'semanticSearch'. This action calls your semantic-search Edge Function with the user's query text and returns a list of result maps. Paste the Dart code below, replace the placeholder Supabase URL and anon key with your project's values (found under Project Settings → API in Supabase), and click Save. The action uses the http package, which FlutterFlow includes by default. FlutterFlow validates the code and shows a green checkmark when it compiles.

semantic_search_action.dart
// Custom Action: semanticSearch
import 'dart:convert';
import 'package:http/http.dart' as http;

Future<List<dynamic>> semanticSearch(String query) async {
  const supabaseUrl = 'https://YOUR_PROJECT.supabase.co';
  const anonKey = 'YOUR_ANON_KEY';

  final response = await http.post(
    Uri.parse('$supabaseUrl/functions/v1/semantic-search'),
    headers: {
      'Content-Type': 'application/json',
      'Authorization': 'Bearer $anonKey',
    },
    body: jsonEncode({'query': query}),
  );

  if (response.statusCode != 200) return [];
  final data = jsonDecode(response.body);
  return data['results'] as List<dynamic>;
}

Expected result: Custom Action appears in FlutterFlow with return type List<dynamic>. No compilation errors shown in the editor.

Step 5: Build the search UI with a TextField and ListView in FlutterFlow

On your search page, add a Column widget. Inside it, add a TextField widget for the search input — set its label to 'Search...' and give it a unique widget name like 'searchField'. Below it, add a ListView widget. Set the ListView's data source to an App State variable named 'searchResults' (type: JSON, list: true). Add a Card widget inside the ListView with two Text widgets: one bound to currentItem.title and one to currentItem.body. Now add the search trigger: select the TextField → Actions → On Submit → Custom Action → semanticSearch, passing the TextField's value. In the Action output, update the searchResults App State variable with the returned list.

Expected result: A search bar and results list are visible on the page. Typing a query and pressing Enter triggers the action flow.

Step 6: Test the full search flow in Run Mode

Click Run Mode (top right) to launch your app in the browser. First, populate some test documents by calling your embed-document Edge Function directly from the Supabase dashboard Invoke panel with a few sample records. Then return to your app's search page, type a natural language query like 'how do I reset my password', and submit. You should see semantically relevant results appear within 1-2 seconds. Test edge cases: misspelled words, synonyms, and queries in different phrasing than your source documents. All should return relevant results if the embeddings are working correctly.

Expected result: Natural language queries return a ranked list of semantically relevant documents, even when exact keywords don't match.

Complete working example

semantic_search_action.dart
// FlutterFlow Custom Action: semanticSearch
// Dependencies: http (already included in FlutterFlow)
import 'dart:convert';
import 'package:flutter/foundation.dart'; // for debugPrint
import 'package:http/http.dart' as http;

/// Sends a natural-language query to the Supabase semantic-search
/// Edge Function and returns a ranked list of matching documents.
///
/// Each result map contains: id, title, body, similarity (0-1 float)
Future<List<dynamic>> semanticSearch(String query) async {
  // Replace with your actual Supabase project URL and anon key
  const String supabaseUrl = 'https://YOUR_PROJECT_ID.supabase.co';
  const String anonKey = 'YOUR_SUPABASE_ANON_KEY';
  const String functionUrl = '$supabaseUrl/functions/v1/semantic-search';

  if (query.trim().isEmpty) return [];

  try {
    final response = await http
        .post(
          Uri.parse(functionUrl),
          headers: {
            'Content-Type': 'application/json',
            'Authorization': 'Bearer $anonKey',
          },
          body: jsonEncode({'query': query.trim()}),
        )
        .timeout(const Duration(seconds: 10));

    if (response.statusCode != 200) {
      debugPrint('Search error: ${response.statusCode} ${response.body}');
      return [];
    }

    final Map<String, dynamic> data = jsonDecode(response.body);
    final List<dynamic> results = data['results'] ?? [];

    // Filter out low-confidence results. JSON numbers may decode as int,
    // so cast through num rather than double.
    return results
        .where((r) => ((r['similarity'] as num?) ?? 0) > 0.70)
        .toList();
  } catch (e) {
    debugPrint('semanticSearch exception: $e');
    return [];
  }
}

Common mistakes when implementing AI-Powered Search in FlutterFlow

Mistake: Re-computing embeddings for every search query without caching

How to avoid: Debounce the search input by at least 500ms. Cache results in App State for the same query string. Only call the API when the query changes.

Mistake: Calling the Embeddings API directly from the Flutter app

How to avoid: Always route embedding calls through a server-side function (Supabase Edge Function or Firebase Cloud Function) where the API key is stored as a secret environment variable.

Mistake: Skipping the ivfflat index on the embedding column

How to avoid: Add the ivfflat index as shown in Step 1 before loading production data. Rebuild the index after bulk inserts with REINDEX INDEX.

Mistake: Using the wrong embedding model dimension

How to avoid: Match the vector() size to the model: ada-002 = 1536, text-embedding-3-small = 1536, text-embedding-3-large = 3072. Define the column size before inserting any data.
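The debounce-and-cache fix for the first mistake above can be sketched as a plain TypeScript wrapper; the same pattern translates to Dart inside a FlutterFlow Custom Action. Here `searchFn` is an injected stand-in for whatever actually calls the semantic-search endpoint — the names and 500ms default are illustrative, not from any library.

```typescript
// Debounce + per-query cache around a search function. Rapid keystrokes
// reset the timer, so only the final query triggers a (paid) API call;
// repeated queries are answered from the cache with no call at all.
function makeDebouncedSearch(
  searchFn: (q: string) => Promise<unknown[]>,
  delayMs = 500,
) {
  const cache = new Map<string, unknown[]>();
  let timer: ReturnType<typeof setTimeout> | undefined;

  return (query: string): Promise<unknown[]> =>
    new Promise((resolve) => {
      const key = query.trim().toLowerCase();
      if (cache.has(key)) return resolve(cache.get(key)!); // cache hit: no API call
      if (timer) clearTimeout(timer);                      // reset the debounce window
      timer = setTimeout(async () => {
        const results = await searchFn(key);
        cache.set(key, results);
        resolve(results);
      }, delayMs);
    });
}
```

One simplification to note: promises for keystrokes that get superseded never settle in this sketch; a production version would resolve them with the latest results or reject them explicitly.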

Best practices

  • Index documents at write time, never at read time — keep the search path fast and cheap
  • Combine semantic search with a keyword filter for precision: semantic for intent, keyword for exact product codes
  • Store the original text alongside the embedding so you can display results without a second query
  • Set a minimum similarity threshold (0.70-0.75) to avoid surfacing irrelevant results for obscure queries
  • Log search queries and zero-result searches to identify gaps in your content
  • Batch-embed new documents using a queue rather than one API call per insert under high write load
  • Use text-embedding-3-small for cost savings — it performs comparably to ada-002 at lower cost
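The batching tip above can be sketched like this. OpenAI's Embeddings API accepts an array of inputs, so a queue worker can embed many pending documents in one request. The batch size of 100 and the `embedBatch` function are assumptions for illustration — in practice `embedBatch` would POST `{ input: texts, model: '...' }` to the Embeddings API and return one vector per text.

```typescript
// Split pending items into fixed-size batches.
function chunk<T>(items: T[], size: number): T[][] {
  const out: T[][] = [];
  for (let i = 0; i < items.length; i += size) out.push(items.slice(i, i + size));
  return out;
}

// Sketch of a queue worker: one API call per batch instead of one per document.
// `embedBatch` is a hypothetical helper standing in for the real API call.
async function drainQueue(
  pending: { id: string; text: string }[],
  embedBatch: (texts: string[]) => Promise<number[][]>,
) {
  for (const batch of chunk(pending, 100)) {
    const vectors = await embedBatch(batch.map((d) => d.text));
    // ...upsert each { id, vector } pair into the documents table here
    console.log(`embedded ${vectors.length} documents in one request`);
  }
}
```

With 250 queued documents this makes 3 API calls instead of 250, which matters most under high write load.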

Still stuck?

Copy one of these prompts to get a personalized, step-by-step explanation.

ChatGPT Prompt

I have a FlutterFlow app with a Supabase backend. I want to add AI-powered semantic search using OpenAI Embeddings and pgvector. Can you write me a Supabase Edge Function (Deno/TypeScript) that accepts a search query, calls text-embedding-ada-002 to get a vector, then uses a Supabase RPC function to return the top 10 most similar documents by cosine similarity? Also include the SQL to create the documents table and the match_documents function.

FlutterFlow Prompt

Generate a FlutterFlow Custom Action in Dart that calls a Supabase Edge Function URL with a search query string, handles the JSON response containing a results array of {id, title, body, similarity} objects, filters out results below 0.75 similarity, and returns the filtered list. Include error handling and a 10-second timeout.

Frequently asked questions

How much does AI search cost with OpenAI Embeddings?

text-embedding-ada-002 costs $0.0001 per 1,000 tokens. A typical search query is about 10 tokens, so 10,000 searches cost roughly $0.01. Indexing a 200-word document (roughly 270 tokens) costs about $0.00003 — a tiny fraction of a cent. Costs stay negligible unless you handle millions of searches per day.
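As a quick sanity check of that arithmetic (the price and the ~1.33 tokens-per-word ratio are the figures quoted in this answer; verify against OpenAI's current price list):

```typescript
// ada-002 pricing quoted above: $0.0001 per 1,000 tokens.
const PRICE_PER_TOKEN = 0.0001 / 1000;

// 10,000 searches at ~10 tokens each:
const searchCost = 10_000 * 10 * PRICE_PER_TOKEN;
console.log(searchCost.toFixed(2)); // "0.01" — 10,000 searches ≈ one cent

// A 200-word document is roughly 270 tokens (~1.33 tokens per word):
const docCost = 270 * PRICE_PER_TOKEN;
console.log(docCost.toFixed(6)); // "0.000027" — a tiny fraction of a cent
```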

Can I use this with Firebase instead of Supabase?

Yes. Replace the pgvector storage with Pinecone or Google Cloud Vertex AI Vector Search. Use a Firebase Cloud Function instead of a Supabase Edge Function. The embedding step (calling OpenAI) is identical — only the vector storage and retrieval layer changes.

What is the difference between semantic search and keyword search?

Keyword search matches exact words. Semantic search understands meaning — 'car' and 'automobile' are treated as similar. Semantic search handles synonyms, paraphrasing, and intent. It performs better for natural language queries but is slower and costs more than a simple Firestore query.

Do I need the Pro plan to use Custom Actions in FlutterFlow?

Yes. Custom Actions (which run Dart code) require the FlutterFlow Pro plan ($70/mo). On the Free and Standard plans you can use API calls via the API Manager, but you cannot write custom Dart logic.

How do I keep the search index up to date when documents change?

Trigger the embed-document Edge Function from a Firestore/Supabase trigger whenever a document is created or updated. For Supabase, use a database webhook on the documents table INSERT and UPDATE events pointing to your Edge Function.

What happens if OpenAI is down — will search break completely?

Yes, if OpenAI's Embeddings API is unavailable, new searches will fail. Add a fallback: if the Edge Function returns an error, fall back to a standard Supabase full-text search (using the @@ operator) so users always get some results.
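The fallback described above can be sketched as a wrapper that tries semantic search first and drops to keyword search on any failure. Both search functions are injected here to keep the sketch self-contained; in practice the keyword path would run a Supabase full-text query (supabase-js exposes a .textSearch() filter for this).

```typescript
// Try semantic search; on any error (e.g. the Embeddings API is down),
// fall back to keyword search so users always get some results.
async function searchWithFallback<T>(
  query: string,
  semantic: (q: string) => Promise<T[]>,
  keyword: (q: string) => Promise<T[]>,
): Promise<{ results: T[]; mode: 'semantic' | 'keyword' }> {
  try {
    return { results: await semantic(query), mode: 'semantic' };
  } catch {
    return { results: await keyword(query), mode: 'keyword' };
  }
}
```

Returning the `mode` alongside the results lets the UI optionally signal degraded search quality, and makes fallback rates easy to log.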
