RapidDev - Software Development Agency

How to Upload Files to Supabase Storage

What you'll learn

  • How to create storage buckets and configure public or private access
  • How to upload files using the Supabase JavaScript client
  • How to write RLS policies on storage.objects for secure uploads
  • How to retrieve public URLs and signed URLs for uploaded files
Intermediate · 8 min read · 15-20 min · Supabase (all plans), @supabase/supabase-js v2+ · March 2026 · RapidDev Engineering Team
TL;DR

To upload files to Supabase Storage, create a storage bucket in the Dashboard or via the client, then use supabase.storage.from('bucket').upload() to send files. Set buckets as public or private, configure RLS policies on storage.objects to control who can upload, and use getPublicUrl() or createSignedUrl() to retrieve file URLs. Always validate file type and size before uploading.

Uploading Files to Supabase Storage with Access Control

Supabase Storage is an S3-compatible object storage system integrated with Supabase Auth and Row Level Security. This tutorial covers creating buckets, uploading files from the client, securing uploads with RLS policies, and generating URLs to access your files. You will build a complete file upload flow that works with authenticated users and respects access control rules.

Prerequisites

  • A Supabase project (free tier or above)
  • The @supabase/supabase-js library installed in your project
  • Basic knowledge of JavaScript/TypeScript and file handling
  • Supabase Auth configured with at least email/password login

Step-by-step guide

1. Create a storage bucket in the Dashboard

Open your Supabase Dashboard and click Storage in the left sidebar. Click New Bucket and enter a name like 'uploads'. Choose whether the bucket should be public (anyone with the URL can read files) or private (requires authentication). For most applications, start with a private bucket and use signed URLs for controlled access. You can also set allowed MIME types and a file size limit at creation time.

Expected result: A new storage bucket appears in the Dashboard Storage section.

2. Set up RLS policies for upload access

Once you create a bucket, you need RLS policies on the storage.objects table to control who can upload files. Without policies, all uploads will be denied. The most common pattern is to let authenticated users upload to a folder named after their user ID. This keeps files organized and prevents users from overwriting each other's files.

sql
-- Allow authenticated users to upload files to their own folder
create policy "Users can upload to own folder"
on storage.objects for insert
to authenticated
with check (
  bucket_id = 'uploads'
  and (storage.foldername(name))[1] = auth.uid()::text
);

-- Allow authenticated users to read their own files
create policy "Users can read own files"
on storage.objects for select
to authenticated
using (
  bucket_id = 'uploads'
  and (storage.foldername(name))[1] = auth.uid()::text
);

-- Allow authenticated users to delete their own files
create policy "Users can delete own files"
on storage.objects for delete
to authenticated
using (
  bucket_id = 'uploads'
  and (storage.foldername(name))[1] = auth.uid()::text
);

Expected result: Authenticated users can upload, read, and delete files only within their own user-ID folder.

3. Upload a file from the client

Use the Supabase client to upload files. The upload() method takes the file path within the bucket and the File object. The path should include the user's ID as the first folder segment to match the RLS policy. Set cacheControl for browser caching and upsert to control whether existing files should be overwritten.

typescript
import { createClient } from '@supabase/supabase-js'

const supabase = createClient(
  process.env.NEXT_PUBLIC_SUPABASE_URL!,
  process.env.NEXT_PUBLIC_SUPABASE_ANON_KEY!
)

async function uploadFile(file: File) {
  const { data: { user } } = await supabase.auth.getUser()
  if (!user) throw new Error('Not authenticated')

  const filePath = `${user.id}/${Date.now()}-${file.name}`

  const { data, error } = await supabase.storage
    .from('uploads')
    .upload(filePath, file, {
      cacheControl: '3600',
      upsert: false,
    })

  if (error) throw error
  return data
}

Expected result: The file is uploaded to the uploads bucket under the user's folder and the upload returns the file path.

4. Retrieve file URLs for display

After uploading, you need a URL to display or share the file. For public buckets, use getPublicUrl() which returns a permanent URL. For private buckets, use createSignedUrl() which generates a temporary URL that expires after a specified number of seconds. Signed URLs are more secure because they automatically expire.

typescript
// For public buckets — permanent URL
const { data } = supabase.storage
  .from('public-bucket')
  .getPublicUrl('user-id/photo.jpg')
console.log(data.publicUrl)

// For private buckets — temporary signed URL (1 hour)
const { data: signedData, error } = await supabase.storage
  .from('uploads')
  .createSignedUrl('user-id/document.pdf', 3600)
console.log(signedData?.signedUrl)

Expected result: You get a usable URL that can be embedded in HTML img tags or shared as download links.

5. Handle upload errors and validate input

Always validate files before uploading: check the file size against your bucket limit, verify the MIME type is allowed, and handle errors from the upload response. Common errors include 413 (file too large), 403 (RLS policy denied), and 409 (file already exists when upsert is false). Provide clear error messages to help users fix the issue.

typescript
const ALLOWED_TYPES = ['image/png', 'image/jpeg', 'image/webp', 'application/pdf']
const MAX_SIZE = 10 * 1024 * 1024 // 10 MB

async function safeUpload(file: File) {
  if (!ALLOWED_TYPES.includes(file.type)) {
    return { error: `File type ${file.type} is not allowed.` }
  }
  if (file.size > MAX_SIZE) {
    return { error: `File is too large. Maximum size is 10 MB.` }
  }

  const { data: { user } } = await supabase.auth.getUser()
  if (!user) return { error: 'Please sign in to upload files.' }

  const { data, error } = await supabase.storage
    .from('uploads')
    .upload(`${user.id}/${Date.now()}-${file.name}`, file, {
      cacheControl: '3600',
      upsert: false,
    })

  if (error) return { error: error.message }
  return { data }
}

Expected result: Invalid files are rejected before upload, and server errors are caught and reported clearly.

Complete working example

file-upload.ts
import { createClient } from '@supabase/supabase-js'

const supabase = createClient(
  process.env.NEXT_PUBLIC_SUPABASE_URL!,
  process.env.NEXT_PUBLIC_SUPABASE_ANON_KEY!
)

const ALLOWED_TYPES = ['image/png', 'image/jpeg', 'image/webp', 'application/pdf']
const MAX_FILE_SIZE = 10 * 1024 * 1024 // 10 MB
const BUCKET = 'uploads'

interface UploadResult {
  success: boolean
  path?: string
  url?: string
  error?: string
}

export async function uploadFile(file: File): Promise<UploadResult> {
  // Validate file type
  if (!ALLOWED_TYPES.includes(file.type)) {
    return { success: false, error: `File type ${file.type} is not allowed.` }
  }

  // Validate file size
  if (file.size > MAX_FILE_SIZE) {
    const sizeMB = (file.size / (1024 * 1024)).toFixed(1)
    return { success: false, error: `File is ${sizeMB} MB. Max is 10 MB.` }
  }

  // Verify authentication
  const { data: { user } } = await supabase.auth.getUser()
  if (!user) {
    return { success: false, error: 'You must be signed in to upload files.' }
  }

  // Upload to user-scoped folder
  const filePath = `${user.id}/${Date.now()}-${file.name}`
  const { data, error } = await supabase.storage
    .from(BUCKET)
    .upload(filePath, file, {
      cacheControl: '3600',
      upsert: false,
    })

  if (error) {
    return { success: false, error: error.message }
  }

  // Generate a signed URL (1 hour expiry)
  const { data: urlData } = await supabase.storage
    .from(BUCKET)
    .createSignedUrl(data.path, 3600)

  return {
    success: true,
    path: data.path,
    url: urlData?.signedUrl,
  }
}

Common mistakes when uploading files to Supabase Storage

Mistake: Uploading files without RLS policies, resulting in silent 403 denials.

How to avoid: Always create INSERT and SELECT policies on storage.objects for your bucket. Without policies, RLS blocks all operations and returns empty results or 403 errors.

Mistake: Using the same file path for every upload, so later uploads collide with or replace earlier files.

How to avoid: Prepend a timestamp or UUID to each filename to ensure uniqueness: `${Date.now()}-${file.name}`. Set upsert: false to get an error if a path collision occurs.

Mistake: Exposing the service role key in client-side code for storage operations.

How to avoid: Always use the anon key on the client side; it respects RLS policies, which is the correct behavior. The service role key bypasses all security and belongs only on the server.

Mistake: Not validating file type and size before upload, wasting bandwidth on files the server will reject.

How to avoid: Check file.type against an allowed list and file.size against your bucket limit before calling storage.upload().

Best practices

  • Always create RLS policies on storage.objects before uploading files — without them, all operations are denied
  • Use user-scoped folders (userId/filename) to keep files organized and simplify RLS policies
  • Validate file type and size on the client before uploading to save bandwidth and provide instant feedback
  • Use signed URLs for private files instead of making entire buckets public
  • Set cacheControl on uploads to control browser caching behavior for static assets
  • Prepend timestamps or UUIDs to filenames to prevent naming conflicts
  • Never expose the SUPABASE_SERVICE_ROLE_KEY in client-side code — use the anon key which respects RLS

Still stuck?

Copy one of these prompts to get a personalized, step-by-step explanation.

ChatGPT Prompt

I want to upload files to Supabase Storage from a React app. Users should only be able to upload images (PNG, JPEG, WebP) up to 10 MB, and each user should only see their own files. Show me the bucket setup, RLS policies, and upload code.

Supabase Prompt

Create a private storage bucket called uploads with a 10 MB limit in Supabase. Add RLS policies so authenticated users can upload, read, and delete only files in their own user-ID folder. Write a TypeScript upload function with client-side file validation.

Frequently asked questions

What is the maximum file size I can upload to Supabase Storage?

The default limit is 50 MB on the free plan. Pro and Team plans support up to 5 GB per file. You can set a lower custom limit per bucket in the Dashboard.

Do I need RLS policies for public buckets?

Public buckets allow anyone to read files without authentication. However, you still need INSERT policies to control who can upload, and DELETE policies to control who can remove files.
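To illustrate, a minimal write policy for a hypothetical public bucket named 'avatars' might look like the following (the bucket name is an assumption; reads need no policy because the bucket is public, but inserts still go through RLS):

```sql
-- Anyone signed in may upload to the public 'avatars' bucket;
-- reads are already open because the bucket itself is public.
create policy "Authenticated users can upload avatars"
on storage.objects for insert
to authenticated
with check (bucket_id = 'avatars');
```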

Can I upload files without authenticating the user?

Yes, if you create an RLS policy that allows the anon role to insert into storage.objects. However, this is not recommended for production as it allows anyone to upload files to your bucket.

How do I upload multiple files at once?

The Supabase client uploads one file at a time. To upload multiple files, use Promise.all() to run multiple upload() calls in parallel. Each file needs its own unique path.

Why does my upload return a 403 error?

A 403 error means your RLS policy is blocking the upload. Check that you have an INSERT policy on storage.objects for your bucket and that the authenticated user's ID matches the folder path in the policy.

Can I overwrite an existing file?

Set the upsert option to true in the upload() call to overwrite a file at the same path. When upsert is false (the default), uploading to an existing path returns a 409 conflict error.

Can RapidDev help set up file storage with proper access controls?

Yes, RapidDev can configure your Supabase storage buckets, write RLS policies, build upload components with validation, and set up CDN caching for optimal file delivery.
