Build an API integration hub in Lovable that manages encrypted third-party credentials, defines data flows between sources and destinations, logs every sync with success and error status, and exposes webhook receiver Edge Functions. Health monitoring with automatic retry logic keeps integrations running reliably.
What you're building
An integration hub connects external APIs and automates data movement between them. The core entities are connectors (definitions of how to reach an API), credentials (encrypted keys per connector per user), integrations (a source connector → destination connector pairing with field mapping), and sync_logs (the record of every execution).
Credentials are the most sensitive piece. Supabase Vault (available on all plans, including Free) encrypts secrets at the database level using pgsodium. The credentials table stores only the vault secret ID, never the plaintext key. When an Edge Function needs a credential, it reads the vault.decrypted_secrets view to decrypt the value at query time — typically through a security-definer SQL function, since the vault schema is not exposed over the Supabase API by default.
Data flows run on a schedule via pg_cron or are triggered by incoming webhooks. Each run creates a sync_log entry before executing. The Edge Function fetches data from the source connector, applies field mappings (a JSON transform config), and posts to the destination connector. On failure, it updates the sync_log with error details and queues a retry. Retries use exponential backoff: 5 minutes, 15 minutes, 1 hour.
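The backoff schedule above can be shared between the retry job and the Edge Function as a small helper. This is an illustrative sketch — the function name and the null-means-give-up convention are assumptions, not part of the guide's schema:

```typescript
// Backoff schedule from the guide: 5 minutes, 15 minutes, 1 hour.
const RETRY_SCHEDULE_MINUTES = [5, 15, 60]

// Returns the delay before the next retry in milliseconds, or null once the
// schedule is exhausted so the caller can mark the sync_log permanently failed.
export function nextRetryDelayMs(retryCount: number): number | null {
  if (retryCount >= RETRY_SCHEDULE_MINUTES.length) return null
  return RETRY_SCHEDULE_MINUTES[retryCount] * 60_000
}
```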
Final result
A production-ready integration hub where users can connect APIs, define data flows, monitor sync health, and receive incoming webhooks — all without writing connector-specific code.
Prerequisites
- Lovable Pro account for multiple Edge Functions
- Supabase project with Vault for encrypted credential storage (Vault is available on all plans, including Free)
- Supabase URL and service role key saved in the Cloud tab → Secrets
- At least two API keys from services you want to connect (e.g. Airtable + Slack, or Shopify + a database)
Build steps
Create the integration hub schema with Vault credentials
Set up the database schema with secure credential storage. This is the most important step — credentials must never be stored in plaintext in application tables.
Build an integration hub app. Create these Supabase tables:

- connectors: id, name, logo_url, base_url, auth_type (api_key|oauth2|basic|bearer), config_schema (jsonb, describes required fields for this connector type), is_builtin (bool), created_at
- credentials: id, user_id, connector_id (FK connectors), name (user label like 'My Shopify Store'), vault_secret_id (uuid, references vault.secrets), status (active|invalid|expired), last_validated_at, created_at
- integrations: id, user_id, name, source_credential_id (FK credentials), destination_credential_id (FK credentials), field_mapping (jsonb), filter_config (jsonb, optional data filtering), schedule (cron string, e.g. '0 * * * *'), is_active (bool default true), last_run_at, next_run_at, created_at
- sync_logs: id, integration_id (FK integrations), status (running|success|error|retrying), records_processed (int), error_message (text), started_at, completed_at, retry_count (int default 0), next_retry_at

For credential creation: use the Supabase Vault SQL function select vault.create_secret(secret_value, name) to store the credential. Insert the returned UUID as vault_secret_id in the credentials table.

RLS: all tables require user_id = auth.uid(). sync_logs is accessible via the integration_id FK chain. Never expose vault.secrets directly to the frontend.

Pro tip: Ask Lovable to create 5 built-in connector definitions (Slack, Airtable, Google Sheets, Shopify, and a generic REST API) with their config_schema already filled in. These seeded connectors appear in the connector Select when users create credentials.
Expected result: All four tables are created. The Vault is set up on the Supabase project. The app loads with a connector gallery and a credentials management page.
Build the credential vault UI
Create the credential management page where users can add new API keys for supported connectors. Keys are never shown after being saved.
Build the credentials management page at src/pages/Credentials.tsx:

1. Credential Cards: show existing credentials as Cards with connector logo, credential name, status Badge (active=green, invalid=red, expired=yellow), last_validated_at, and an Edit/Delete actions menu
2. 'Add Credential' Dialog:
   - Connector Select: shows all connectors with logos
   - Credential name Input (user labels this, e.g. 'Production Shopify')
   - Dynamic credential fields: based on the selected connector's config_schema, render the appropriate Input fields (api_key, shop_url, etc.)
   - On submit: call an add-credential Edge Function that calls vault.create_secret() server-side and returns only the new credential record ID. Never send the plaintext key back to the frontend.
   - Show a success Alert: 'Credential saved securely. The key cannot be retrieved after this.'
3. 'Validate' Button per Card: calls a validate-credential Edge Function that fetches the credential from Vault, makes a test API call to the connector, and updates credentials.status based on the result
4. Revoke (delete) Button: prompts with a Dialog warning that all integrations using this credential will stop. Deletes the vault secret and the credential row.

Expected result: The credentials page shows existing credentials as Cards. Adding a new Slack API key saves it to Vault via the Edge Function. The 'Validate' button tests the key and updates the status Badge.
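Before the Dialog calls the add-credential Edge Function, the dynamic fields can be validated client-side against the connector's config_schema. This is a hypothetical helper — the config_schema shape ({ fields: [{ key, label, required }] }) is an assumption; adapt it to whatever your seeded connectors actually store:

```typescript
// Assumed shape of a connector's config_schema jsonb column.
interface ConfigField { key: string; label: string; required?: boolean }
interface ConfigSchema { fields: ConfigField[] }

// Returns the labels of required fields the user left empty, so the Dialog
// can block submission and highlight them before any network call.
export function missingCredentialFields(
  schema: ConfigSchema,
  values: Record<string, string>,
): string[] {
  return schema.fields
    .filter((f) => f.required !== false)
    .filter((f) => !values[f.key] || values[f.key].trim() === '')
    .map((f) => f.label)
}
```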
Build the data flow execution Edge Function
Create the core Edge Function that executes a data flow: fetches from source, applies field mapping, posts to destination, and logs the result.
```typescript
// supabase/functions/run-integration/index.ts
import { serve } from 'https://deno.land/std@0.168.0/http/server.ts'
import { createClient } from 'https://esm.sh/@supabase/supabase-js@2'

const corsHeaders = {
  'Access-Control-Allow-Origin': '*',
  'Access-Control-Allow-Headers': 'authorization, apikey, content-type',
}

serve(async (req: Request) => {
  if (req.method === 'OPTIONS') return new Response('ok', { headers: corsHeaders })

  const supabase = createClient(Deno.env.get('SUPABASE_URL') ?? '', Deno.env.get('SUPABASE_SERVICE_ROLE_KEY') ?? '')
  const { integrationId } = await req.json()

  const { data: integration } = await supabase
    .from('integrations')
    .select('*, source_credential:credentials!source_credential_id(connector_id, vault_secret_id), destination_credential:credentials!destination_credential_id(connector_id, vault_secret_id)')
    .eq('id', integrationId)
    .single()

  if (!integration) return new Response(JSON.stringify({ error: 'Integration not found' }), { status: 404, headers: corsHeaders })

  const { data: logRow } = await supabase
    .from('sync_logs')
    .insert({ integration_id: integrationId, status: 'running', started_at: new Date().toISOString() })
    .select()
    .single()
  const logId = logRow?.id

  try {
    // The vault schema is not exposed through the API, so wrap the lookup in a
    // security-definer SQL function (assumed here to be named get_decrypted_secret)
    // that selects from vault.decrypted_secrets, and call it via RPC.
    const { data: srcSecret } = await supabase.rpc('get_decrypted_secret', { secret_id: integration.source_credential.vault_secret_id })
    const { data: dstSecret } = await supabase.rpc('get_decrypted_secret', { secret_id: integration.destination_credential.vault_secret_id })

    const sourceData = await fetchFromSource(integration.source_credential.connector_id, srcSecret, integration.filter_config)
    const transformed = applyFieldMapping(sourceData, integration.field_mapping)
    const count = await postToDestination(integration.destination_credential.connector_id, dstSecret, transformed)

    await supabase.from('sync_logs').update({ status: 'success', records_processed: count, completed_at: new Date().toISOString() }).eq('id', logId)
    await supabase.from('integrations').update({ last_run_at: new Date().toISOString() }).eq('id', integrationId)

    return new Response(JSON.stringify({ success: true, records: count }), { headers: corsHeaders })
  } catch (err) {
    const msg = err instanceof Error ? err.message : 'Unknown error'
    if (logId) await supabase.from('sync_logs').update({ status: 'error', error_message: msg, completed_at: new Date().toISOString() }).eq('id', logId)
    return new Response(JSON.stringify({ error: msg }), { status: 500, headers: corsHeaders })
  }
})

function applyFieldMapping(data: any[], mapping: Record<string, string>): any[] {
  return data.map((row) => Object.fromEntries(Object.entries(mapping).map(([dest, src]) => [dest, row[src]])))
}

async function fetchFromSource(connectorId: string, secret: string, filters: any): Promise<any[]> {
  // Implement per-connector fetch logic based on connectorId
  return []
}

async function postToDestination(connectorId: string, secret: string, data: any[]): Promise<number> {
  // Implement per-connector post logic based on connectorId
  return data.length
}
```

Expected result: The run-integration Edge Function deploys. Calling it with an integration ID creates a sync_log entry. The function fetches credentials from Vault without exposing plaintext keys.
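The stubbed fetchFromSource and postToDestination functions can be organized as a registry keyed by connector, so adding a connector never touches the runner. Everything below is an illustrative sketch — the handler signatures and names are assumptions, not part of any Supabase API:

```typescript
// A fetch handler pulls rows from a source API; a post handler writes rows to
// a destination API and returns how many it processed.
type FetchHandler = (secret: string, filters: unknown) => Promise<Record<string, unknown>[]>
type PostHandler = (secret: string, rows: Record<string, unknown>[]) => Promise<number>

const fetchHandlers = new Map<string, FetchHandler>()
const postHandlers = new Map<string, PostHandler>()

export function registerConnector(name: string, fetch: FetchHandler, post: PostHandler): void {
  fetchHandlers.set(name, fetch)
  postHandlers.set(name, post)
}

export async function fetchFromSource(connector: string, secret: string, filters: unknown) {
  const handler = fetchHandlers.get(connector)
  if (!handler) throw new Error(`No fetch handler registered for connector "${connector}"`)
  return handler(secret, filters)
}

export async function postToDestination(connector: string, secret: string, rows: Record<string, unknown>[]) {
  const handler = postHandlers.get(connector)
  if (!handler) throw new Error(`No post handler registered for connector "${connector}"`)
  return handler(secret, rows)
}
```

With this shape, each built-in connector registers itself once at module load, and the runner stays a thin dispatch layer.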
Add webhook receiver and the health monitoring dashboard
Create a generic webhook receiver Edge Function that accepts inbound payloads and routes them to matching integrations, plus the health monitoring dashboard that shows integration status.
Build two features:

1. Webhook receiver Edge Function at supabase/functions/webhook-receiver/index.ts:
   - Accept POST requests at URL pattern /functions/v1/webhook-receiver?integration_id=UUID
   - Validate the integration_id exists and is active
   - Look up the integration's source connector and validate an optional webhook_secret header (HMAC-SHA256)
   - Insert the raw payload into a webhooks_queue table: id, integration_id, payload (jsonb), received_at, processed (bool default false)
   - Return 200 immediately so the sender doesn't time out
   - A separate pg_cron job processes webhooks_queue every minute: for each unprocessed row, call run-integration with the queued payload

2. Health monitoring dashboard at src/pages/IntegrationHealth.tsx:
   - Summary Cards at top: Total Active Integrations, Syncs Today, Error Rate (%), Average Duration
   - Integration list as a DataTable: name, source → destination, last run (relative time), status Badge (healthy|warning|error|idle), error rate last 24h, 'Run Now' Button
   - Clicking a row opens a Sheet with: the last 50 sync_logs for that integration as a mini DataTable, and a Recharts AreaChart showing daily sync counts and error counts over 30 days
   - 'Pause/Resume' Toggle per integration (sets is_active)
   - Error-state integrations show at the top of the list with a red left border

Pro tip: Add a retry handler: a pg_cron job that runs every 5 minutes and queries sync_logs WHERE status = 'error' AND retry_count < 3 AND next_retry_at <= now(). For each row, call run-integration and update retry_count and next_retry_at with exponential backoff (5 min, 15 min, 1 hr).
Expected result: The webhook receiver URL is live. Sending a POST to it creates a webhooks_queue entry. The health dashboard shows integration status with colored Badges and last-run times.
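The HMAC-SHA256 check mentioned in the webhook receiver step can be sketched as follows. The header and secret names are assumptions — match them to whatever the source service documents. This version uses node:crypto, which Supabase Edge Functions (Deno) also support via the node: specifier:

```typescript
import { createHmac, timingSafeEqual } from 'node:crypto'

// Verifies that signatureHex is the hex-encoded HMAC-SHA256 of the raw request
// body under the integration's webhook secret. Uses a constant-time comparison
// so the check doesn't leak timing information.
export function verifyWebhookSignature(rawBody: string, secret: string, signatureHex: string): boolean {
  const expected = createHmac('sha256', secret).update(rawBody).digest()
  const received = Buffer.from(signatureHex, 'hex')
  // timingSafeEqual throws on length mismatch, so check length first.
  if (received.length !== expected.length) return false
  return timingSafeEqual(received, expected)
}
```

Important: verify against the raw body bytes exactly as received, before any JSON parsing, or the digest won't match.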
Complete code
```typescript
import { useQuery } from '@tanstack/react-query'
import { supabase } from '@/integrations/supabase/client'

export interface SyncLog {
  id: string
  integration_id: string
  status: 'running' | 'success' | 'error' | 'retrying'
  records_processed: number | null
  error_message: string | null
  started_at: string
  completed_at: string | null
  retry_count: number
}

export interface IntegrationHealth {
  integrationId: string
  totalRuns: number
  successRate: number
  lastStatus: SyncLog['status'] | null
  lastRunAt: string | null
  avgDurationMs: number
}

export function useSyncLogs(integrationId: string, limit = 50) {
  return useQuery({
    // Include limit in the key so callers with different limits don't share a cache entry.
    queryKey: ['sync-logs', integrationId, limit],
    queryFn: async () => {
      const { data, error } = await supabase
        .from('sync_logs')
        .select('*')
        .eq('integration_id', integrationId)
        .order('started_at', { ascending: false })
        .limit(limit)
      if (error) throw error
      return data as SyncLog[]
    },
    refetchInterval: 30_000,
  })
}

export function useIntegrationHealth(integrationId: string): IntegrationHealth {
  const { data: logs = [] } = useSyncLogs(integrationId, 100)
  const completed = logs.filter((l) => l.status !== 'running')
  const successes = completed.filter((l) => l.status === 'success').length
  const durations = completed
    .filter((l) => l.completed_at)
    .map((l) => new Date(l.completed_at!).getTime() - new Date(l.started_at).getTime())

  return {
    integrationId,
    totalRuns: completed.length,
    successRate: completed.length > 0 ? Math.round((successes / completed.length) * 100) : 100,
    lastStatus: logs[0]?.status ?? null,
    lastRunAt: logs[0]?.started_at ?? null,
    avgDurationMs: durations.length > 0 ? Math.round(durations.reduce((a, b) => a + b, 0) / durations.length) : 0,
  }
}
```

Customization ideas
OAuth2 connector support
Add OAuth2 flow support for connectors like Google Sheets, Salesforce, and HubSpot. Store the access_token and refresh_token in Vault. Before each integration run, check if the access_token is expired (using expires_at column in credentials) and refresh it automatically using the connector's token endpoint.
Field mapping visual editor
Replace the raw JSON field_mapping config with a visual drag-and-drop field mapper. Show source connector fields on the left and destination connector fields on the right. Users connect fields by drawing lines. Store the mapping as JSON but build a visual interface over it. Fetch available fields from each connector using their respective schema/field APIs.
Data transformation rules
Add a transform_rules JSONB column to integrations that defines per-field transformations: string operations (uppercase, trim, regex replace), type conversions (string to number), and conditional logic (if field is empty, use default value). Apply transformations in the Edge Function between fetching and posting data.
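A minimal interpreter for the transform_rules idea could look like this. The rule vocabulary (op names and the default fallback) is an assumption — extend it to whatever operations you expose in the UI:

```typescript
// One transform_rules entry per destination field; rules run left to right.
type TransformRule =
  | { op: 'uppercase' }
  | { op: 'trim' }
  | { op: 'toNumber' }
  | { op: 'default'; value: unknown }

export function applyTransformRules(
  row: Record<string, unknown>,
  rules: Record<string, TransformRule[]>,
): Record<string, unknown> {
  const out = { ...row }
  for (const [field, fieldRules] of Object.entries(rules)) {
    for (const rule of fieldRules) {
      const v = out[field]
      switch (rule.op) {
        case 'uppercase': out[field] = typeof v === 'string' ? v.toUpperCase() : v; break
        case 'trim': out[field] = typeof v === 'string' ? v.trim() : v; break
        case 'toNumber': out[field] = typeof v === 'string' ? Number(v) : v; break
        // Fill in a default only when the value is missing or empty.
        case 'default': out[field] = v === null || v === undefined || v === '' ? rule.value : v; break
      }
    }
  }
  return out
}
```

Because the rules are plain data, they can be stored in the transform_rules JSONB column and edited through a UI, in line with the declarative-mapping best practice below.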
Slack alert notifications
Add a notifications table with webhook URLs per user (Slack, Discord, or generic webhook). When a sync fails and all retries are exhausted, send an alert to the user's configured notification channel via an Edge Function. Include the integration name, error message, and a direct link to the integration's health page.
Common pitfalls
Pitfall: Storing credentials in a regular table column
How to avoid: Use Supabase Vault (pgsodium) for all third-party credentials. Call vault.create_secret() to store and get back a UUID reference. Access decrypted values only in Edge Functions via vault.decrypted_secrets — never return decrypted values to the frontend.
Pitfall: Not returning 200 immediately from webhook receivers
How to avoid: Return 200 immediately after validating the payload and inserting it into webhooks_queue. Process the queue asynchronously via pg_cron. This pattern decouples receipt from processing and prevents timeouts.
Pitfall: Logging credential values in sync_log error messages
How to avoid: Sanitize error messages before storing them. Strip any Authorization headers, Bearer tokens, or api_key query parameters from error strings using a regex: message.replace(/Bearer [a-zA-Z0-9._-]+/g, 'Bearer [REDACTED]').
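An expanded version of that sanitizer can also redact api_key-style query parameters. The parameter-name list is illustrative — add whatever key names your connectors use:

```typescript
// Strips bearer tokens and common secret-bearing query parameters from an
// error message before it is written to sync_logs.error_message.
export function redactSecrets(message: string): string {
  return message
    .replace(/Bearer\s+[A-Za-z0-9._-]+/g, 'Bearer [REDACTED]')
    .replace(/([?&](?:api_key|apikey|access_token|token)=)[^&\s"']+/gi, '$1[REDACTED]')
}
```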
Pitfall: Running all integrations without rate limiting
How to avoid: Spread integration schedules. When a user creates an integration with an hourly schedule, automatically offset it by a few minutes based on the integration ID hash. Also add connector-level rate limiting: maximum N calls per minute per credential.
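One way to implement that offset is to hash the integration ID into a deterministic minute value, so hourly integrations created by different users don't all fire at minute 0. The hash and helper names are illustrative:

```typescript
// Deterministically maps an integration ID to a minute offset in [0, 60)
// using a simple 32-bit rolling hash (no crypto needed — this is scheduling,
// not security).
export function scheduleOffsetMinutes(integrationId: string): number {
  let hash = 0
  for (let i = 0; i < integrationId.length; i++) {
    hash = (hash * 31 + integrationId.charCodeAt(i)) >>> 0
  }
  return hash % 60
}

// Turns the default hourly cron '0 * * * *' into an offset one, e.g. '37 * * * *'.
export function offsetHourlyCron(integrationId: string): string {
  return `${scheduleOffsetMinutes(integrationId)} * * * *`
}
```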
Best practices
- Always start each integration run by inserting a sync_log row with status 'running'. If the Edge Function crashes or times out, you can detect stale 'running' entries and mark them as errors with a cleanup job.
- Store field mappings as a declarative JSON object, not as imperative code. This makes mappings auditable, editable through a UI, and portable. Never store JavaScript eval-able expressions in field_mapping.
- Implement idempotency for integration runs. When processing data, track processed record IDs in a deduplication table so re-running an integration (due to retry) doesn't create duplicate destination records.
- Test each connector with a dry_run mode: the Edge Function fetches and transforms data but does not post to the destination. Add a 'Dry Run' button in the integration dashboard that shows a preview of what would be sent.
- Add a sync_log.checksum column that stores a hash of the processed data. On the next run, compare checksums — if nothing changed, skip the write to the destination. This prevents unnecessary API calls for unchanged data.
- Monitor Edge Function execution time. If a sync regularly takes more than 25 seconds (Supabase's Edge Function timeout is 150 seconds but calls can be cut off by load balancers), split the integration into paginated batches and chain calls.
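The checksum idea from the list above can be implemented without any dependency — any stable hash works, as long as identical data always produces an identical checksum. Here is a sketch using FNV-1a over the JSON-serialized payload; the helper names are illustrative:

```typescript
// Dependency-free FNV-1a hash of the serialized rows, returned as hex.
export function dataChecksum(rows: unknown[]): string {
  const json = JSON.stringify(rows)
  let hash = 0x811c9dc5 // FNV offset basis
  for (let i = 0; i < json.length; i++) {
    hash ^= json.charCodeAt(i)
    hash = Math.imul(hash, 0x01000193) >>> 0 // FNV prime, kept in 32 bits
  }
  return hash.toString(16)
}

// Compare against the checksum stored on the previous sync_log row: if the
// data hasn't changed, the destination write can be skipped entirely.
export function shouldSkipWrite(rows: unknown[], previousChecksum: string | null): boolean {
  return previousChecksum !== null && dataChecksum(rows) === previousChecksum
}
```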
AI prompts to try
Copy these prompts to build this project faster.
I'm building an integration hub where data flows between APIs. Each flow has a source connector and a destination connector with a field mapping configuration stored as JSON. Help me design a TypeScript function applyFieldMapping(sourceData: Record<string, unknown>[], mapping: Record<string, string>): Record<string, unknown>[] that maps source field names to destination field names. Also design a transform_rules extension that supports: string.toUpperCase(), string.trim(), number.toString(), and conditional defaults when a field is null or empty.
Add an integration template gallery to the hub. Create a prebuilt_integrations table with name, description, source_connector_name, destination_connector_name, default_field_mapping (jsonb), and thumbnail_url. Show these as Cards on the integrations page with a 'Use Template' Button. Clicking one creates a new integration pre-filled with the template's field mapping. Include templates for: Shopify Orders to Slack, Airtable to Google Sheets, and generic Webhook to REST API.
In Supabase, create a pg_cron job that runs every 5 minutes and processes failed integrations with retry logic. The job should: 1) find sync_logs WHERE status = 'error' AND retry_count < 3 AND (next_retry_at IS NULL OR next_retry_at <= now()), 2) for each, call the run-integration Edge Function via net.http_post, 3) update sync_logs SET retry_count = retry_count + 1, status = 'retrying', next_retry_at = now() + (CASE retry_count WHEN 0 THEN interval '5 minutes' WHEN 1 THEN interval '15 minutes' ELSE interval '1 hour' END). Show the complete SQL.
Frequently asked questions
Is Supabase Vault available on the Free plan?
Yes — Supabase Vault (pgsodium encryption) is available on all plans, including Free. The vault.secrets table and vault.decrypted_secrets view are created by default in every Supabase project, and vault.create_secret() works on the Free plan. The Pro plan adds additional security features and dedicated infrastructure, but Vault encryption itself is not a paid feature.
How do I add a new connector type for an API I use?
Add a row to the connectors table with the new connector's name, config_schema (the fields needed, like api_key or base_url), and auth_type. Then add a case to the fetchFromSource and postToDestination functions in the run-integration Edge Function. For complex connectors, create a separate dedicated Edge Function and call it from the main runner. The connector definition in the table drives the credential form UI automatically.
What is the maximum data volume an integration can handle per run?
Supabase Edge Functions have a 150-second execution timeout and a 6MB response size limit. For large data sets, implement pagination in the fetchFromSource function: fetch in pages of 100-500 records and process incrementally. Store the last processed cursor (page token, ID, or timestamp) in the integrations table as last_sync_cursor so each run starts from where the previous run ended.
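The cursor loop described here can be written generically, with the connector-specific page fetch injected as a function. This is a sketch — the Page shape and helper name are assumptions; the maxPages cap keeps a bad cursor from looping until the Edge Function times out:

```typescript
// A page of results plus the cursor for the next page (null when done).
interface Page<T> { rows: T[]; nextCursor: string | null }

// Fetches pages starting from startCursor until the source reports no more
// pages or maxPages is hit. Returns the accumulated rows and the cursor to
// persist as last_sync_cursor: null when fully drained, otherwise the next
// unfetched cursor so the following run resumes where this one stopped.
export async function fetchAllPages<T>(
  fetchPage: (cursor: string | null) => Promise<Page<T>>,
  startCursor: string | null = null,
  maxPages = 50,
): Promise<{ rows: T[]; lastCursor: string | null }> {
  const rows: T[] = []
  let cursor = startCursor
  for (let i = 0; i < maxPages; i++) {
    const page = await fetchPage(cursor)
    rows.push(...page.rows)
    if (page.nextCursor === null) return { rows, lastCursor: null }
    cursor = page.nextCursor
  }
  return { rows, lastCursor: cursor }
}
```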
Can multiple users share the same connector credential?
No — credentials are scoped to user_id by RLS. Each user must add their own credentials for each connector. This is intentional: sharing credentials would mean multiple users have access to the same API account, which is a security risk. For team use cases, you'd need to add an organization layer and update RLS to allow organization-level credential sharing with access control.
How do I test a webhook integration locally before deploying?
Lovable does not provide a local development environment. To test webhook receivers, publish your app first (or create a staging deployment) and configure the external service to send webhooks to your deployed Edge Function URL. Use the webhooks_queue table as a debug log — inspect received payloads in the Supabase table viewer to verify the webhook is being received and stored correctly before processing.
What happens if the destination API is down when an integration runs?
The run-integration Edge Function catches the HTTP error from the destination API, updates the sync_log with status 'error' and the error message, and returns a 500 response. The retry handler (described in the Step 4 pro tip) will automatically retry up to 3 times with exponential backoff. After 3 failures, the integration stays in error state and the health dashboard shows it with a red status Badge.
Can I build a fully custom connector for an internal API?
Yes. Add a row to the connectors table with is_builtin = false and set base_url to your internal API's URL. Configure config_schema to define the fields users need to enter (like an API key or auth token). In the fetchFromSource function, add a case for this connector ID that uses the stored base_url and credential to make authenticated requests to your internal API. Internal APIs work as long as they're reachable from Supabase's edge network.
How do I get help building integrations for specific APIs my business uses?
RapidDev builds production Lovable apps including custom integration hubs with connectors for specific business tools, OAuth flows, and enterprise data pipelines. Reach out if your use case requires connectors or data transformation beyond what this guide covers.