To set up real-time listeners in Supabase, first add your table to the realtime publication with ALTER PUBLICATION supabase_realtime ADD TABLE your_table. Then subscribe to changes using supabase.channel().on('postgres_changes', ...).subscribe(). You can listen for INSERT, UPDATE, DELETE, or all events, and filter by specific column values. Always clean up subscriptions with removeChannel() when the component unmounts to prevent memory leaks.
Subscribing to Real-Time Database Changes in Supabase
Supabase Realtime lets your application receive database changes as they happen, without polling. When a row is inserted, updated, or deleted, all connected clients receive the change instantly via WebSocket. This tutorial covers enabling real-time on your tables, subscribing to specific events, filtering by column values, and properly managing subscription lifecycles in React and other frameworks.
Prerequisites
- A Supabase project with at least one table
- @supabase/supabase-js v2+ installed in your project
- RLS enabled with SELECT policies on the table (real-time respects RLS)
- Basic understanding of WebSockets (conceptual, no direct WebSocket code needed)
Step-by-step guide
Add your table to the realtime publication
Supabase does not stream changes for all tables by default. You must explicitly add each table to the supabase_realtime publication. This is a one-time setup per table. You can do this in the SQL Editor in the Dashboard. Without this step, subscriptions will connect but never receive any events.
```sql
-- Add a single table to the realtime publication
ALTER PUBLICATION supabase_realtime ADD TABLE messages;

-- Add multiple tables at once
ALTER PUBLICATION supabase_realtime ADD TABLE messages, notifications, chat_rooms;

-- Verify which tables are in the publication
SELECT * FROM pg_publication_tables WHERE pubname = 'supabase_realtime';
```

Expected result: The table appears in the supabase_realtime publication and changes will be streamed to connected clients.
Subscribe to all changes on a table
Create a channel and subscribe to postgres_changes events. Use event: '*' to listen for all change types (INSERT, UPDATE, DELETE). The callback receives a payload object containing the event type, the new row (for INSERT/UPDATE), and the old row (for DELETE). The schema parameter should be 'public' unless your table is in a different schema.
```typescript
import { createClient } from '@supabase/supabase-js'

const supabase = createClient(
  process.env.NEXT_PUBLIC_SUPABASE_URL!,
  process.env.NEXT_PUBLIC_SUPABASE_ANON_KEY!
)

// Subscribe to all changes on the messages table
const channel = supabase
  .channel('messages-all-changes')
  .on(
    'postgres_changes',
    { event: '*', schema: 'public', table: 'messages' },
    (payload) => {
      console.log('Event type:', payload.eventType)
      console.log('New data:', payload.new)
      console.log('Old data:', payload.old)
    }
  )
  .subscribe((status) => {
    console.log('Subscription status:', status)
    // SUBSCRIBED, CHANNEL_ERROR, TIMED_OUT, CLOSED
  })
```

Expected result: The callback fires whenever any row in the messages table is inserted, updated, or deleted.
Subscribe to specific event types
Instead of listening to all events, you can subscribe to specific operations. Chain multiple .on() calls on the same channel to handle each event type differently. This is useful when INSERT and DELETE require different UI updates.
```typescript
const channel = supabase
  .channel('messages-specific')
  .on(
    'postgres_changes',
    { event: 'INSERT', schema: 'public', table: 'messages' },
    (payload) => {
      console.log('New message:', payload.new)
      // Append to message list
    }
  )
  .on(
    'postgres_changes',
    { event: 'UPDATE', schema: 'public', table: 'messages' },
    (payload) => {
      console.log('Updated message:', payload.new)
      // Replace in message list
    }
  )
  .on(
    'postgres_changes',
    { event: 'DELETE', schema: 'public', table: 'messages' },
    (payload) => {
      console.log('Deleted message ID:', payload.old.id)
      // Remove from message list
    }
  )
  .subscribe()
```

Expected result: Each event type triggers its own callback with the appropriate payload.
Filter events by column values
For tables with many rows, you often want to receive changes only for specific records — such as messages in a particular chat room. Use the filter parameter to subscribe only to changes where a column matches a specific value. This reduces bandwidth and processing on the client.
```typescript
// Only receive messages for a specific chat room
const roomId = 'room-123'

const channel = supabase
  .channel(`room-${roomId}`)
  .on(
    'postgres_changes',
    {
      event: '*',
      schema: 'public',
      table: 'messages',
      filter: `room_id=eq.${roomId}`,
    },
    (payload) => {
      console.log('Message in room:', payload.new)
    }
  )
  .subscribe()

// Filter supports eq, neq, gt, gte, lt, lte, in
// Examples:
// filter: 'status=eq.active'
// filter: 'priority=gt.5'
// filter: 'type=in.(message,notification)'
```

Expected result: The client only receives real-time events for rows matching the filter condition.
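If you construct filters in several places, a small helper keeps the syntax consistent. This is a hypothetical convenience function, not part of supabase-js:

```typescript
type FilterOp = 'eq' | 'neq' | 'gt' | 'gte' | 'lt' | 'lte' | 'in'

// Build a postgres_changes filter string,
// e.g. buildFilter('room_id', 'eq', 'room-123') → 'room_id=eq.room-123'
function buildFilter(
  column: string,
  op: FilterOp,
  value: string | number | Array<string | number>
): string {
  if (op === 'in') {
    // `in` takes a parenthesized, comma-separated list
    const list = Array.isArray(value) ? value : [value]
    return `${column}=in.(${list.join(',')})`
  }
  return `${column}=${op}.${value}`
}
```

You would then pass the result as the `filter` value, e.g. `filter: buildFilter('priority', 'gt', 5)`.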
Use real-time listeners in a React component
In React, set up subscriptions inside useEffect and clean them up in the cleanup function. This ensures the subscription is created when the component mounts and removed when it unmounts. Always call supabase.removeChannel() in the cleanup to prevent memory leaks and duplicate event handlers.
```typescript
import { useEffect, useState } from 'react'
// Assumes a shared `supabase` client created elsewhere in your app

function ChatRoom({ roomId }: { roomId: string }) {
  const [messages, setMessages] = useState<any[]>([])

  // Fetch initial messages
  useEffect(() => {
    supabase
      .from('messages')
      .select('*')
      .eq('room_id', roomId)
      .order('created_at', { ascending: true })
      .then(({ data }) => { if (data) setMessages(data) })
  }, [roomId])

  // Subscribe to real-time changes
  useEffect(() => {
    const channel = supabase
      .channel(`room-${roomId}`)
      .on(
        'postgres_changes',
        {
          event: 'INSERT',
          schema: 'public',
          table: 'messages',
          filter: `room_id=eq.${roomId}`,
        },
        (payload) => {
          setMessages((prev) => [...prev, payload.new])
        }
      )
      .subscribe()

    // Cleanup on unmount or roomId change
    return () => {
      supabase.removeChannel(channel)
    }
  }, [roomId])

  return (
    <div>
      {messages.map((msg) => (
        <p key={msg.id}>{msg.content}</p>
      ))}
    </div>
  )
}
```

Expected result: The chat room displays messages in real time. Switching rooms cleans up the old subscription and creates a new one.
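The INSERT, UPDATE, and DELETE callbacks all transform the same message list, so the state updates can be factored into a pure reducer that is easy to unit-test. This is a hypothetical helper (the `Message` shape and `applyChange` name are illustrative, not part of supabase-js):

```typescript
interface Message {
  id: string
  content: string
}

// Discriminated union mirroring the realtime payload shapes we care about:
// INSERT/UPDATE carry the new row; DELETE carries only the old row's key
type ChangeEvent =
  | { eventType: 'INSERT'; new: Message }
  | { eventType: 'UPDATE'; new: Message }
  | { eventType: 'DELETE'; old: { id: string } }

// Pure function: apply one realtime event to the current message list
function applyChange(messages: Message[], payload: ChangeEvent): Message[] {
  switch (payload.eventType) {
    case 'INSERT':
      return [...messages, payload.new]
    case 'UPDATE':
      return messages.map((m) => (m.id === payload.new.id ? payload.new : m))
    case 'DELETE':
      return messages.filter((m) => m.id !== payload.old.id)
  }
}
```

Inside the subscription callback, each handler then reduces to `setMessages((prev) => applyChange(prev, payload))`.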
Complete working example
```typescript
import { useEffect, useState, useCallback } from 'react'
import { createClient, RealtimeChannel } from '@supabase/supabase-js'

const supabase = createClient(
  process.env.NEXT_PUBLIC_SUPABASE_URL!,
  process.env.NEXT_PUBLIC_SUPABASE_ANON_KEY!
)

interface Message {
  id: string
  room_id: string
  user_id: string
  content: string
  created_at: string
}

export function useRealtimeMessages(roomId: string) {
  const [messages, setMessages] = useState<Message[]>([])
  const [status, setStatus] = useState<string>('connecting')

  // Fetch existing messages
  useEffect(() => {
    supabase
      .from('messages')
      .select('*')
      .eq('room_id', roomId)
      .order('created_at', { ascending: true })
      .then(({ data, error }) => {
        if (data) setMessages(data)
        if (error) console.error('Fetch error:', error.message)
      })
  }, [roomId])

  // Subscribe to real-time changes
  useEffect(() => {
    const channel: RealtimeChannel = supabase
      .channel(`room-${roomId}`)
      .on('postgres_changes',
        { event: 'INSERT', schema: 'public', table: 'messages',
          filter: `room_id=eq.${roomId}` },
        (payload) => {
          setMessages((prev) => [...prev, payload.new as Message])
        }
      )
      .on('postgres_changes',
        { event: 'UPDATE', schema: 'public', table: 'messages',
          filter: `room_id=eq.${roomId}` },
        (payload) => {
          setMessages((prev) =>
            prev.map((m) => m.id === payload.new.id ? payload.new as Message : m)
          )
        }
      )
      .on('postgres_changes',
        { event: 'DELETE', schema: 'public', table: 'messages',
          filter: `room_id=eq.${roomId}` },
        (payload) => {
          setMessages((prev) => prev.filter((m) => m.id !== payload.old.id))
        }
      )
      .subscribe((s) => setStatus(s))

    return () => {
      supabase.removeChannel(channel)
    }
  }, [roomId])

  const sendMessage = useCallback(async (content: string, userId: string) => {
    const { error } = await supabase
      .from('messages')
      .insert({ room_id: roomId, user_id: userId, content })
    if (error) console.error('Send error:', error.message)
  }, [roomId])

  return { messages, status, sendMessage }
}
```

Common mistakes when setting up Real-Time Listeners in Supabase
Mistake: Not adding the table to the supabase_realtime publication
Why it's a problem: Subscriptions connect but never receive any events.
How to avoid: Run ALTER PUBLICATION supabase_realtime ADD TABLE your_table in the SQL Editor before subscribing.
Mistake: Not cleaning up subscriptions when a component unmounts
Why it's a problem: Stale channels cause duplicate event handlers and memory leaks.
How to avoid: Always call supabase.removeChannel(channel) in the useEffect cleanup function.
Mistake: Missing an RLS SELECT policy on the table
Why it's a problem: Real-time respects RLS, so events for rows the user cannot read are silently dropped.
How to avoid: Add a SELECT policy that allows the subscribed (e.g. authenticated) user to read the relevant rows.
Mistake: Expecting payload.new to contain data on DELETE events
Why it's a problem: For DELETE events, payload.new is empty; the deleted row is in payload.old, which by default contains only the primary key.
How to avoid: Read the deleted row from payload.old, and set REPLICA IDENTITY FULL on the table if you need all columns.
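The REPLICA IDENTITY change is a one-line statement; the verification query below is a sketch using the Postgres catalog (assuming a `messages` table):

```sql
-- Include all columns in the old-row payload for DELETE (and UPDATE) events
ALTER TABLE messages REPLICA IDENTITY FULL;

-- Verify: relreplident should be 'f' (full)
SELECT relname, relreplident FROM pg_class WHERE relname = 'messages';
```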
Best practices
- Always add tables to the supabase_realtime publication before subscribing to changes
- Use descriptive, unique channel names that include the table and filter context
- Clean up subscriptions in useEffect cleanup functions to prevent memory leaks
- Use server-side filters (filter parameter) instead of client-side filtering to reduce WebSocket traffic
- Set REPLICA IDENTITY FULL on tables where you need the full row data in DELETE event payloads
- Coordinate the initial fetch with the subscription: events that arrive between the fetch and the SUBSCRIBED status can be missed, so refetch or deduplicate once the subscription is active
- Handle the CHANNEL_ERROR and TIMED_OUT subscription statuses to implement reconnection logic
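The last practice can be sketched as follows. `backoffDelayMs` and `shouldRetry` are hypothetical helpers, and the commented wiring assumes a `supabase` client and a `handleChange` callback defined elsewhere in your app:

```typescript
type SubStatus = 'SUBSCRIBED' | 'CHANNEL_ERROR' | 'TIMED_OUT' | 'CLOSED'

// Exponential backoff, capped at 30 seconds
function backoffDelayMs(attempt: number, baseMs = 1000, maxMs = 30000): number {
  return Math.min(baseMs * 2 ** attempt, maxMs)
}

// Only error and timeout statuses warrant tearing down and resubscribing
function shouldRetry(status: SubStatus): boolean {
  return status === 'CHANNEL_ERROR' || status === 'TIMED_OUT'
}

// Usage sketch (assumes `supabase` and `handleChange` exist):
//
// function subscribeWithRetry(roomId: string, attempt = 0) {
//   const channel = supabase
//     .channel(`room-${roomId}`)
//     .on('postgres_changes',
//         { event: '*', schema: 'public', table: 'messages' },
//         handleChange)
//     .subscribe((status) => {
//       if (shouldRetry(status as SubStatus)) {
//         supabase.removeChannel(channel)
//         setTimeout(() => subscribeWithRetry(roomId, attempt + 1),
//                    backoffDelayMs(attempt))
//       }
//     })
//   return channel
// }
```

Note that the supabase-js client already reconnects the underlying WebSocket on its own; this pattern only covers re-establishing a channel after an error or timeout status.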
Still stuck?
Copy one of these prompts to get a personalized, step-by-step explanation.
I have a Supabase messages table and I want to build a real-time chat. Show me how to enable real-time on the table, subscribe to INSERT events filtered by room_id, display messages as they arrive, and clean up the subscription in React.
Create a React hook that subscribes to real-time INSERT, UPDATE, and DELETE events on a Supabase table, filtered by a room_id column. Include initial data fetch, proper cleanup on unmount, and a sendMessage function that inserts a new row.
Frequently asked questions
Does Supabase real-time work with RLS?
Yes. Real-time respects Row Level Security. The user's JWT is used to evaluate RLS policies, and only events for rows the user has SELECT access to are delivered. If real-time is not delivering events, check your SELECT policy.
How many real-time connections can I have?
Supabase allows up to 200 concurrent connections on the Free plan, 500 on Pro, and more on higher plans. Each browser tab with a subscription counts as one connection.
Can I subscribe to changes across multiple tables in one channel?
Yes. Chain multiple .on('postgres_changes', ...) calls on the same channel, each with a different table. All events flow through the single WebSocket connection.
Why am I not receiving DELETE events with full row data?
By default, DELETE events only include the primary key in payload.old. To receive all columns, set REPLICA IDENTITY FULL on the table: ALTER TABLE messages REPLICA IDENTITY FULL. This increases WAL size slightly.
What happens if the WebSocket connection drops?
The Supabase client automatically attempts to reconnect. During the disconnection, events are lost. Fetch the latest data when the subscription status changes back to SUBSCRIBED to catch up on missed events.
Is real-time suitable for high-frequency updates?
For very high-frequency updates (hundreds per second), use Broadcast instead of Postgres Changes. Broadcast is client-to-client via the Realtime server and does not go through the database, making it significantly faster.
Can RapidDev help build real-time features with Supabase?
Yes. RapidDev can architect and implement real-time features including chat systems, live dashboards, collaborative editing, and notification systems using Supabase Realtime.
Talk to an Expert
Our team has built 600+ apps. Get personalized help with your project.
Book a free consultation