To delete files from Supabase Storage, use the storage.from('bucket').remove(['path/to/file.png']) method from the JavaScript client. You can delete single files or pass an array of file paths for bulk deletion. Make sure your RLS policies on the storage.objects table allow DELETE operations for the authenticated user, and always verify the file path matches exactly what was used during upload.
Deleting Files from Supabase Storage Buckets
Supabase Storage is an S3-compatible object storage system integrated with PostgreSQL Row Level Security. Deleting files requires both the correct API call and proper RLS policies on the storage.objects table. This tutorial covers single-file and bulk deletion, writing DELETE policies, handling user-scoped folder patterns, and verifying that files were successfully removed.
Prerequisites
- A Supabase project with a storage bucket containing files
- The Supabase JS client installed and initialized
- RLS enabled on the storage.objects table (enabled by default)
- Files previously uploaded to the bucket with known paths
Step-by-step guide
Delete a single file from a storage bucket
Use the storage.from('bucket-name').remove() method with an array containing the file path. The path must exactly match the path used during upload, including any folder prefixes. The remove method always takes an array of paths, even for a single file. If the file does not exist, the operation succeeds silently without errors.
```typescript
import { createClient } from '@supabase/supabase-js';

const supabase = createClient(
  process.env.NEXT_PUBLIC_SUPABASE_URL!,
  process.env.NEXT_PUBLIC_SUPABASE_ANON_KEY!
);

// Delete a single file
const { data, error } = await supabase.storage
  .from('avatars')
  .remove(['public/avatar-123.png']);

if (error) {
  console.error('Delete failed:', error.message);
} else {
  console.log('Deleted files:', data);
}
```

Expected result: The file is deleted from storage. The data array contains objects describing the deleted files.
Delete multiple files in a single request
Pass multiple file paths in the array to delete several files at once. This is more efficient than making separate API calls for each file. All files must be in the same bucket. If any file in the array does not exist, the rest are still deleted — there is no rollback on partial failure.
```typescript
// Bulk delete multiple files from the same bucket
const filesToDelete = [
  'user-123/document-a.pdf',
  'user-123/document-b.pdf',
  'user-123/photos/vacation.jpg',
];

const { data, error } = await supabase.storage
  .from('documents')
  .remove(filesToDelete);

if (error) {
  console.error('Bulk delete failed:', error.message);
} else {
  console.log(`Successfully deleted ${data.length} files`);
}
```

Expected result: All specified files are removed from the bucket. The response data array includes an entry for each deleted file.
Write an RLS policy to allow users to delete their own files
Supabase Storage uses RLS on the storage.objects table to control access. Without a DELETE policy, remove() either returns an error or, more confusingly, reports success while the file stays in place. The most common pattern is user-scoped folders where each user's files are stored under their user ID. The policy below checks that the authenticated user owns the folder by comparing auth.uid() to the first segment of the file path.
```sql
-- Allow users to delete files in their own folder
-- File paths follow the pattern: {user_id}/filename.ext
CREATE POLICY "Users can delete their own files"
ON storage.objects FOR DELETE
TO authenticated
USING (
  bucket_id = 'documents'
  AND (SELECT auth.uid())::text = (storage.foldername(name))[1]
);

-- If you also need a policy for public bucket cleanup
CREATE POLICY "Users can delete their own uploads from public bucket"
ON storage.objects FOR DELETE
TO authenticated
USING (
  bucket_id = 'avatars'
  AND (SELECT auth.uid())::text = (storage.foldername(name))[1]
);
```

Expected result: Authenticated users can delete files in their own folder but cannot delete files belonging to other users.
Delete all files in a user's folder
To delete all files for a specific user or folder, first list the files in that folder, then pass all paths to remove(). The list method returns file metadata including the name property, which you combine with the folder prefix to construct the full path for deletion.
```typescript
async function deleteAllUserFiles(userId: string, bucket: string) {
  // Step 1: List all files in the user's folder
  const { data: files, error: listError } = await supabase.storage
    .from(bucket)
    .list(userId, { limit: 1000 });

  if (listError) {
    console.error('Failed to list files:', listError.message);
    return;
  }

  if (!files || files.length === 0) {
    console.log('No files to delete');
    return;
  }

  // Step 2: Build full paths and delete
  const filePaths = files.map((file) => `${userId}/${file.name}`);

  const { data, error } = await supabase.storage
    .from(bucket)
    .remove(filePaths);

  if (error) {
    console.error('Delete failed:', error.message);
  } else {
    console.log(`Deleted ${data.length} files from ${userId}/`);
  }
}

// Usage
await deleteAllUserFiles('user-uuid-here', 'documents');
```

Expected result: All files in the specified user folder are deleted. The folder itself is virtual and disappears when empty.
Handle deletion errors and verify removal
After deleting files, verify they are gone by attempting to generate a URL or list the folder contents. Common error causes include missing RLS policies, incorrect file paths, and trying to delete from a non-existent bucket. If the remove() call returns no error but the file still exists, the most likely cause is an RLS policy blocking the operation silently.
```typescript
async function deleteAndVerify(bucket: string, path: string) {
  // Delete the file
  const { data, error } = await supabase.storage
    .from(bucket)
    .remove([path]);

  if (error) {
    console.error(`Delete error: ${error.message}`);
    return false;
  }

  // Verify deletion by trying to get a signed URL
  const { data: urlData, error: urlError } = await supabase.storage
    .from(bucket)
    .createSignedUrl(path, 10);

  if (urlError) {
    console.log('File confirmed deleted (cannot generate URL)');
    return true;
  }

  console.warn('File may still exist — check RLS policies');
  return false;
}
```

Expected result: The file is deleted and verification confirms it no longer exists in storage.
Complete working example
```typescript
// Supabase Storage deletion helper
// Handles single, bulk, and folder-level file deletion

import { createClient } from '@supabase/supabase-js';

const supabase = createClient(
  process.env.NEXT_PUBLIC_SUPABASE_URL!,
  process.env.NEXT_PUBLIC_SUPABASE_ANON_KEY!
);

// Delete a single file
export async function deleteFile(bucket: string, path: string) {
  const { data, error } = await supabase.storage
    .from(bucket)
    .remove([path]);

  if (error) throw new Error(`Delete failed: ${error.message}`);
  return data;
}

// Delete multiple files
export async function deleteFiles(bucket: string, paths: string[]) {
  const { data, error } = await supabase.storage
    .from(bucket)
    .remove(paths);

  if (error) throw new Error(`Bulk delete failed: ${error.message}`);
  return data;
}

// Delete all files in a folder (user-scoped pattern)
export async function deleteFolder(bucket: string, folderPath: string) {
  const allFiles: string[] = [];
  let offset = 0;
  const batchSize = 100;

  // Paginate through all files in the folder
  while (true) {
    const { data: files, error } = await supabase.storage
      .from(bucket)
      .list(folderPath, { limit: batchSize, offset });

    if (error) throw new Error(`List failed: ${error.message}`);
    if (!files || files.length === 0) break;

    allFiles.push(...files.map((f) => `${folderPath}/${f.name}`));
    offset += batchSize;

    if (files.length < batchSize) break;
  }

  if (allFiles.length === 0) return [];

  const { data, error } = await supabase.storage
    .from(bucket)
    .remove(allFiles);

  if (error) throw new Error(`Folder delete failed: ${error.message}`);
  return data;
}

// RLS policy SQL — run this in the Supabase SQL Editor:
//
// CREATE POLICY "Users can delete own files"
// ON storage.objects FOR DELETE
// TO authenticated
// USING (
//   bucket_id = 'your-bucket'
//   AND (SELECT auth.uid())::text = (storage.foldername(name))[1]
// );
```

Common mistakes when deleting files from Supabase Storage
Mistake: Passing a single file path as a string instead of wrapping it in an array.
How to avoid: The remove() method always expects an array of paths, even for a single file: remove(['path/to/file.png']), not remove('path/to/file.png').
Mistake: Including the bucket name in the file path when calling remove().
How to avoid: The bucket is specified in storage.from('bucket'). The path array should only contain paths within that bucket, not the bucket name itself.
Mistake: Having INSERT and SELECT RLS policies but forgetting to add a DELETE policy on storage.objects.
How to avoid: Each operation needs its own policy. Add a named DELETE policy: CREATE POLICY "Users can delete own files" ON storage.objects FOR DELETE TO authenticated USING (...).
Mistake: Assuming remove() will return an error if the file does not exist.
How to avoid: remove() succeeds silently for non-existent files. If you need to confirm a file exists before deleting, use list() or createSignedUrl() first.
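If you need to distinguish "deleted" from "never existed", one option is to confirm the file via list() before removing it. The sketch below uses a minimal structural type in place of the real client type (an assumption; the actual types come from @supabase/supabase-js), and `removeIfExists` is a hypothetical helper, not part of the Supabase API:

```typescript
// Minimal structural type for the parts of the Storage client we use
// (an assumption; the real type comes from @supabase/supabase-js).
interface StorageClient {
  from(bucket: string): {
    list(
      folder: string,
      options?: { search?: string }
    ): Promise<{ data: { name: string }[] | null; error: { message: string } | null }>;
    remove(paths: string[]): Promise<{ error: { message: string } | null }>;
  };
}

// Split a full object path into folder and file name so list() can search it.
function splitPath(path: string): { folder: string; name: string } {
  const parts = path.split('/');
  const name = parts.pop() ?? '';
  return { folder: parts.join('/'), name };
}

// Hypothetical helper: only call remove() when list() confirms the file exists.
async function removeIfExists(
  storage: StorageClient,
  bucket: string,
  path: string
): Promise<boolean> {
  const { folder, name } = splitPath(path);
  const { data: files, error } = await storage.from(bucket).list(folder, { search: name });
  if (error || !files?.some((f) => f.name === name)) return false;

  const { error: removeError } = await storage.from(bucket).remove([path]);
  return !removeError;
}
```

This adds one extra request per deletion, so reserve it for flows where the distinction actually matters.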
Best practices
- Always write explicit DELETE RLS policies on storage.objects for each bucket that needs deletion support
- Use user-scoped folder patterns (user_id/filename) and match them in RLS policies for secure per-user file management
- Paginate through files with list() before bulk deletion when a folder may contain more than 100 files
- Verify deletion in critical workflows by attempting to access the file after removal
- Delete associated database records and storage files together to keep your data consistent
- Log deletion operations for audit purposes, especially in multi-user applications
- Use the service_role key in server-side admin tools only when you need to delete files that bypass RLS
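The database-consistency practice above can be sketched as a single helper. The `documents` table and its `storage_path` column are assumptions about your schema; there is no transaction spanning storage and the database, so a partial failure is still possible, but deleting the storage object first means a failure leaves at most a dangling metadata row rather than an unreferenced file:

```typescript
// Sketch: delete a file's storage object and its metadata row together.
// Assumes a `documents` table with `id` and `storage_path` columns, and a
// `documents` storage bucket. The client is typed as `any` for brevity.
async function deleteDocument(supabase: any, docId: string) {
  // Look up the storage path recorded at upload time
  const { data: doc, error: fetchError } = await supabase
    .from('documents')
    .select('storage_path')
    .eq('id', docId)
    .single();
  if (fetchError || !doc) throw new Error('Document not found');

  // Remove the storage object first...
  const { error: storageError } = await supabase.storage
    .from('documents')
    .remove([doc.storage_path]);
  if (storageError) throw new Error(`Storage delete failed: ${storageError.message}`);

  // ...then the metadata row, so a failure never strands an unreferenced file
  const { error: dbError } = await supabase.from('documents').delete().eq('id', docId);
  if (dbError) throw new Error(`Row delete failed: ${dbError.message}`);
}
```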
Still stuck?
Copy one of these prompts to get a personalized, step-by-step explanation.
I need to delete files from Supabase Storage. Show me how to delete single files, bulk delete multiple files, and delete all files in a user's folder. Include the RLS policies needed on storage.objects to allow authenticated users to delete only their own files.
Write a Supabase Storage deletion utility that handles single file removal, bulk removal, and folder cleanup with pagination. Include the SQL for RLS DELETE policies using the user-scoped folder pattern with storage.foldername().
Frequently asked questions
Does deleting a file from Supabase Storage also delete it from CDN caches?
Supabase invalidates the CDN cache when a file is deleted, but propagation may take a few seconds. If you need immediate cache invalidation, use a cache-busting query parameter in your file URLs.
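The cache-busting approach can be as simple as appending a version query parameter when you build file URLs. A minimal sketch; the version value (timestamp, content hash, updated_at column) is up to your application:

```typescript
// Append a cache-busting version parameter to a file URL.
// Any value that changes when the file changes works.
function withCacheBuster(url: string, version: string | number): string {
  const separator = url.includes('?') ? '&' : '?';
  return `${url}${separator}v=${version}`;
}
```

For example, `withCacheBuster(publicUrl, Date.now())` forces browsers and CDNs to treat the URL as new after each change.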
Can I recover a deleted file from Supabase Storage?
No. Supabase Storage does not have a recycle bin or versioning feature. Once a file is deleted, it is permanently removed. Implement your own soft-delete pattern by moving files to an archive bucket before permanent deletion.
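A soft-delete can be sketched with the client's move() method. Here the file is moved under a 'trash/' prefix in the same bucket; the prefix is just a naming convention (an assumption of this sketch, not a Supabase feature), and the move is still governed by RLS policies on storage.objects:

```typescript
// Sketch: soft-delete by moving the file under a 'trash/' prefix instead of
// removing it. A later cleanup job can permanently remove() old trash entries.
async function softDelete(supabase: any, bucket: string, path: string) {
  const { error } = await supabase.storage
    .from(bucket)
    .move(path, `trash/${path}`);
  if (error) throw new Error(`Soft delete failed: ${error.message}`);
}
```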
Why does remove() succeed but the file is still accessible?
This usually means the DELETE RLS policy is not matching the file. The remove() call may appear to succeed while RLS silently blocks the actual deletion. Check that your policy matches the correct bucket_id and file path pattern.
Can I delete an entire bucket at once?
You cannot delete a non-empty bucket. First remove all files using the list-and-delete pattern, then delete the bucket itself using supabase.storage.deleteBucket('bucket-name').
Is there a limit to how many files I can delete in one remove() call?
There is no documented hard limit, but for reliability, delete in batches of 100-500 files at a time. Very large arrays may time out on slower connections.
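The batching advice can be sketched with a small helper. The batch size of 200 is an arbitrary value inside the suggested range, and the client is typed as `any` for brevity:

```typescript
// Split a list into fixed-size batches.
function chunk<T>(items: T[], size: number): T[][] {
  const batches: T[][] = [];
  for (let i = 0; i < items.length; i += size) {
    batches.push(items.slice(i, i + size));
  }
  return batches;
}

// Delete a large list of paths in sequential batches so no single
// remove() call carries an oversized payload.
async function removeInBatches(
  supabase: any,
  bucket: string,
  paths: string[],
  batchSize = 200
) {
  for (const batch of chunk(paths, batchSize)) {
    const { error } = await supabase.storage.from(bucket).remove(batch);
    if (error) throw new Error(`Batch delete failed: ${error.message}`);
  }
}
```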
How do I delete files from a public bucket?
Public buckets still require RLS policies for write and delete operations. The 'public' setting only affects read access. Write a DELETE policy on storage.objects just as you would for a private bucket.
Can RapidDev help build a file management system with Supabase Storage?
Yes. RapidDev can build complete file management solutions with Supabase Storage including upload, deletion, access control, folder organization, and admin interfaces with proper RLS policy design.
Talk to an Expert
Our team has built 600+ apps. Get personalized help with your project.
Book a free consultation