RapidDev - Software Development Agency
How to Create Custom Video Editing Experiences in FlutterFlow

Build video editing features in FlutterFlow using the video_editor package for a trim UI, Cloud Run FFmpeg for server-side clip merging and filters, and Flutter Stack with DragTarget for sticker overlays. Never try to merge multiple video clips client-side using Dart — mobile devices lack the processing power and the operation will block the UI thread.

What you'll learn

  • How to build a trim editor UI using the video_editor package in a FlutterFlow Custom Widget
  • How to send clips to Cloud Run for FFmpeg-based server-side merging and processing
  • How to build a text and sticker overlay editor using Stack, Positioned, and DragTarget widgets
  • How to implement brightness and contrast filter sliders that apply CSS-style effects to the video preview
Beginner · 11 min read · 60-80 min build time · FlutterFlow Pro+ (code export required for custom packages) · March 2026 · RapidDev Engineering Team
Building Professional Video Editing UIs in FlutterFlow

Video editing is one of the most UI-intensive features in any mobile app. FlutterFlow's built-in VideoPlayer plays video but provides no editing capabilities, so all editing UI must be built as Custom Widgets or Custom Actions backed by specialised Dart packages. This tutorial covers four editing experiences that together cover the most common use cases: a frame-level trim editor, a multi-clip merge workflow, a text and sticker overlay system, and a filter adjustment panel. Each is implemented differently: trim runs on-device, merge runs on Cloud Run, overlays are pure Flutter UI, and filters apply a ColorFiltered matrix over the live preview. Understanding which approach to use for which feature will save you from major architectural mistakes.

Prerequisites

  • FlutterFlow Pro plan with code export enabled
  • video_editor: ^3.0.1 and ffmpeg_kit_flutter packages added to pubspec.yaml
  • A Cloud Run instance or Cloud Function with FFmpeg installed for server-side processing
  • Firebase Storage configured for uploading source and output video files

Step-by-step guide

Step 1: Build a trim editor with the video_editor package

Create a Custom Widget named VideoTrimEditor that accepts a videoPath parameter (String, a local file path). Import video_editor, initialise a VideoEditorController with VideoEditorController.file(File(videoPath)), and call controller.initialize() in initState. The package provides ready-made widgets: CropGridViewer.preview for the video preview and TrimSlider for the draggable start/end handles. Arrange them in a Column: preview at the top, TrimSlider below, and a Save button at the bottom. When Save is tapped, build an FFmpegVideoEditorExecute config from the controller and run its command with ffmpeg_kit_flutter, which executes FFmpeg on-device (a trim is a simple cut that can use stream copy with no re-encoding, so it does not require server-side processing). Return the exported file path to FlutterFlow via a callback parameter.

video_trim_editor_widget.dart
// Custom Widget: VideoTrimEditor (simplified)
// Requires: video_editor: ^3.0.1 and ffmpeg_kit_flutter (for export)
import 'dart:io';

import 'package:ffmpeg_kit_flutter/ffmpeg_kit.dart';
import 'package:ffmpeg_kit_flutter/return_code.dart';
import 'package:flutter/material.dart';
import 'package:video_editor/video_editor.dart';

class VideoTrimEditor extends StatefulWidget {
  final String videoPath;
  final Function(String) onExported;

  const VideoTrimEditor({
    Key? key,
    required this.videoPath,
    required this.onExported,
  }) : super(key: key);

  @override
  State<VideoTrimEditor> createState() => _VideoTrimEditorState();
}

class _VideoTrimEditorState extends State<VideoTrimEditor> {
  late final VideoEditorController _controller;
  bool _isExporting = false;

  @override
  void initState() {
    super.initState();
    _controller = VideoEditorController.file(
      File(widget.videoPath),
      minDuration: const Duration(seconds: 1),
      maxDuration: const Duration(minutes: 3),
    );
    _controller.initialize().then((_) => setState(() {}));
  }

  @override
  void dispose() {
    _controller.dispose();
    super.dispose();
  }

  // In video_editor 3.x the controller no longer exports directly: it builds
  // an FFmpeg command that you execute with ffmpeg_kit_flutter. Class names
  // below follow the video_editor 3.x export API; verify against the package
  // README for your exact version.
  Future<void> _exportTrimmed() async {
    setState(() => _isExporting = true);
    final config = VideoFFmpegVideoEditorConfig(_controller);
    final execute = await config.getExecuteConfig();
    await FFmpegKit.executeAsync(execute.command, (session) async {
      final returnCode = await session.getReturnCode();
      if (ReturnCode.isSuccess(returnCode)) {
        widget.onExported(execute.outputPath);
      }
      if (mounted) setState(() => _isExporting = false);
    });
  }

  @override
  Widget build(BuildContext context) {
    if (!_controller.initialized) {
      return const Center(child: CircularProgressIndicator());
    }
    return Column(
      children: [
        AspectRatio(
          aspectRatio: _controller.video.value.aspectRatio,
          child: CropGridViewer.preview(controller: _controller),
        ),
        TrimSlider(controller: _controller, height: 60),
        ElevatedButton(
          onPressed: _isExporting ? null : _exportTrimmed,
          child: _isExporting
              ? const CircularProgressIndicator()
              : const Text('Export Trimmed Video'),
        ),
      ],
    );
  }
}

Expected result: A Custom Widget appears in FlutterFlow showing the video preview, a trim slider with draggable handles, and an export button. The exported clip is a shorter version of the original.

Step 2: Merge multiple clips via Cloud Run with FFmpeg

Multi-clip merging requires server-side processing because re-encoding multiple video files into one is a CPU-intensive operation that would freeze the UI and could take minutes on a mobile device. Set up a Cloud Run container with FFmpeg installed. Write a Cloud Run service that accepts an array of Firebase Storage URLs in the request body, downloads each file, runs FFmpeg with the concat demuxer to join them, re-encodes to H.264/AAC at a consistent resolution and frame rate, and uploads the result back to Firebase Storage. In FlutterFlow, after users select multiple clips, show an upload progress indicator, POST the clip URLs to your Cloud Run endpoint, poll for completion, and then display the merged video URL.

merge_service.js
// Cloud Run service handler (Node.js + Express)
// Dockerfile: FROM node:18, RUN apt-get update && apt-get install -y ffmpeg
const express = require('express');
const { execFileSync } = require('child_process');
const fs = require('fs');
const path = require('path');
const fetch = require('node-fetch'); // v2 API (response.buffer)

const app = express();
app.use(express.json());

app.post('/merge', async (req, res) => {
  const { clipUrls, outputName } = req.body;
  const tmpDir = '/tmp/' + Date.now();
  fs.mkdirSync(tmpDir);

  // Download all clips
  const localPaths = [];
  for (let i = 0; i < clipUrls.length; i++) {
    const filePath = path.join(tmpDir, `clip_${i}.mp4`);
    const response = await fetch(clipUrls[i]);
    const buffer = await response.buffer();
    fs.writeFileSync(filePath, buffer);
    localPaths.push(filePath);
  }

  // Write concat list
  const listPath = path.join(tmpDir, 'list.txt');
  fs.writeFileSync(listPath, localPaths.map((p) => `file '${p}'`).join('\n'));

  // Run FFmpeg concat. execFileSync passes arguments directly (no shell), and
  // path.basename strips any directory components from the user-supplied name.
  const outputPath = path.join(tmpDir, path.basename(outputName || 'merged.mp4'));
  execFileSync('ffmpeg', [
    '-f', 'concat', '-safe', '0', '-i', listPath,
    '-c:v', 'libx264', '-c:a', 'aac', '-movflags', '+faststart',
    outputPath,
  ]);

  // Upload to Firebase Storage and return URL (implementation omitted for brevity)
  res.json({ success: true, outputPath });
});

app.listen(8080);

Expected result: Posting an array of clip URLs to the Cloud Run endpoint returns a merged video URL in Firebase Storage within 30-90 seconds depending on clip length.
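On the FlutterFlow side, a Custom Action can post the clip URLs and read back the output path. A minimal sketch using only dart:io (the service URL is a placeholder for your own Cloud Run endpoint, and in an exported FlutterFlow project you would more typically use the bundled http package):

```dart
// Custom Action sketch: call the /merge Cloud Run endpoint.
import 'dart:convert';
import 'dart:io';

/// Builds the JSON body the /merge handler above expects.
String buildMergeRequestBody(List<String> clipUrls, String outputName) =>
    jsonEncode({'clipUrls': clipUrls, 'outputName': outputName});

Future<String?> mergeClips(List<String> clipUrls) async {
  final client = HttpClient();
  try {
    final request = await client.postUrl(
        Uri.parse('https://YOUR-SERVICE-run.app/merge')); // placeholder URL
    request.headers.contentType = ContentType.json;
    request.write(buildMergeRequestBody(clipUrls, 'merged.mp4'));
    final response = await request.close();
    if (response.statusCode != 200) return null;
    final body = await response.transform(utf8.decoder).join();
    return (jsonDecode(body) as Map<String, dynamic>)['outputPath'] as String?;
  } finally {
    client.close();
  }
}
```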

Step 3: Build a text and sticker overlay editor with Stack and Positioned

Create a Custom Widget named OverlayEditor that displays the video using FlutterFlow's VideoPlayer widget inside a Stack. Add a second layer to the Stack containing all overlay widgets. Each overlay (text label or sticker image) is a Positioned widget whose x and y coordinates are stored in a List of overlay objects in Page State. Wrap each Positioned child in a GestureDetector with onPanUpdate to move the overlay, updating its x and y coordinates in Page State as the user drags. For text overlays, include a TextField and a font size Slider. For sticker overlays, show a bottom-sheet sticker picker with emoji or image assets. When the user saves, wrap the overlay layer (not the video) in a RepaintBoundary, capture it as a transparent PNG via RenderRepaintBoundary.toImage(), and send it to your Cloud Run service along with the video URL so FFmpeg's overlay filter can composite it onto the video.
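A sketch of one draggable overlay inside that Stack. OverlayItem, the bounds parameter, and the onMoved callback are illustrative names for this example, not FlutterFlow builtins:

```dart
// Sketch: a single draggable text overlay for the OverlayEditor Stack.
import 'package:flutter/material.dart';

class OverlayItem {
  OverlayItem({required this.x, required this.y, required this.text});
  double x, y;
  final String text;
}

/// Keeps an overlay's top-left corner inside the preview bounds.
double clampAxis(double value, double max) => value.clamp(0.0, max).toDouble();

class DraggableOverlay extends StatelessWidget {
  const DraggableOverlay({
    Key? key,
    required this.item,
    required this.bounds,   // size of the video preview area
    required this.onMoved,  // write the new position back to Page State
  }) : super(key: key);

  final OverlayItem item;
  final Size bounds;
  final void Function(double x, double y) onMoved;

  @override
  Widget build(BuildContext context) {
    return Positioned(
      left: item.x,
      top: item.y,
      child: GestureDetector(
        // Accumulate the drag delta, clamped so the overlay stays on screen.
        onPanUpdate: (details) => onMoved(
          clampAxis(item.x + details.delta.dx, bounds.width),
          clampAxis(item.y + details.delta.dy, bounds.height),
        ),
        child: Text(item.text,
            style: const TextStyle(color: Colors.white, fontSize: 24)),
      ),
    );
  }
}
```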

Expected result: Users can drag text and sticker overlays to any position on the video preview. The overlay positions persist in Page State while the user continues editing.

Step 4: Add brightness and contrast filter sliders

For real-time filter preview in the FlutterFlow video player, use Flutter's ColorFiltered widget. Wrap the VideoPlayer in a ColorFiltered widget with a custom ColorFilter matrix. Brightness is controlled by adding a constant to the RGB channels; contrast is controlled by scaling the RGB channels around the midpoint. Add two Slider widgets below the video: one for brightness (-0.5 to 0.5, default 0) and one for contrast (0.5 to 2.0, default 1.0). Bind each slider to a Page State double variable and rebuild the ColorFilter matrix in the Custom Widget whenever the slider values change. When the user exports, pass the brightness and contrast values to Cloud Run, where FFmpeg applies the same adjustments using the eq video filter: -vf eq=brightness={value}:contrast={value}.

color_filter_helper.dart
// Brightness + contrast ColorFilter computation
import 'package:flutter/material.dart';

ColorFilter buildColorFilter(double brightness, double contrast) {
  // contrast: scale factor pivoting around the 0.5 midpoint
  // brightness: additive offset to all channels
  final b = brightness;
  final c = contrast;
  final t = (1.0 - c) / 2.0 + b; // translation term

  return ColorFilter.matrix([
    c, 0, 0, 0, t * 255,
    0, c, 0, 0, t * 255,
    0, 0, c, 0, t * 255,
    0, 0, 0, 1, 0,
  ]);
}

Expected result: Moving the brightness and contrast sliders updates the video preview in real-time. The video preview shows a darker, lighter, or higher contrast version without any export delay.
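To keep preview and export roughly in sync, derive the eq filter string from the same slider values. A small helper sketch; note that FFmpeg's eq parameter semantics only approximately match the ColorFiltered matrix above, so preview and export may differ slightly:

```dart
/// Builds the -vf argument for server-side export with FFmpeg's eq filter.
/// eq accepts brightness in [-1, 1] (default 0) and contrast and saturation
/// around 1.0 (default 1), which lines up with the slider ranges in this step.
String buildEqFilter(double brightness, double contrast, double saturation) {
  return 'eq=brightness=${brightness.toStringAsFixed(2)}'
      ':contrast=${contrast.toStringAsFixed(2)}'
      ':saturation=${saturation.toStringAsFixed(2)}';
}
```

Pass the resulting string to your Cloud Run service, which splices it into the FFmpeg command as `-vf <filter>`.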

Complete working example

filtered_video_wrapper.dart
// Brightness, contrast, and saturation filter wrapper for FlutterFlow video editor
// Use as a Custom Widget wrapper around VideoPlayer
import 'package:flutter/material.dart';

class FilteredVideoWrapper extends StatefulWidget {
  final Widget videoPlayer;
  final double initialBrightness;
  final double initialContrast;
  final double initialSaturation;

  const FilteredVideoWrapper({
    Key? key,
    required this.videoPlayer,
    this.initialBrightness = 0.0,
    this.initialContrast = 1.0,
    this.initialSaturation = 1.0,
  }) : super(key: key);

  @override
  State<FilteredVideoWrapper> createState() => _FilteredVideoWrapperState();
}

class _FilteredVideoWrapperState extends State<FilteredVideoWrapper> {
  late double _brightness;
  late double _contrast;
  late double _saturation;

  @override
  void initState() {
    super.initState();
    _brightness = widget.initialBrightness;
    _contrast = widget.initialContrast;
    _saturation = widget.initialSaturation;
  }

  ColorFilter _buildColorFilter() {
    final c = _contrast;
    final b = _brightness;
    final t = (1.0 - c) / 2.0 + b;

    // Luminance-weighted saturation matrix (Rec. 709 weights)
    final sr = (1 - _saturation) * 0.2126;
    final sg = (1 - _saturation) * 0.7152;
    final sb = (1 - _saturation) * 0.0722;

    return ColorFilter.matrix([
      (sr + _saturation) * c, sg * c, sb * c, 0, t * 255,
      sr * c, (sg + _saturation) * c, sb * c, 0, t * 255,
      sr * c, sg * c, (sb + _saturation) * c, 0, t * 255,
      0, 0, 0, 1, 0,
    ]);
  }

  Widget _sliderRow(String label, double value, double min, double max,
      ValueChanged<double> onChanged) {
    return Row(children: [
      SizedBox(width: 80, child: Text(label)),
      Expanded(
        child: Slider(value: value, min: min, max: max, onChanged: onChanged),
      ),
    ]);
  }

  @override
  Widget build(BuildContext context) {
    return Column(
      children: [
        ColorFiltered(
          colorFilter: _buildColorFilter(),
          child: widget.videoPlayer,
        ),
        Padding(
          padding: const EdgeInsets.symmetric(horizontal: 16, vertical: 8),
          child: Column(
            children: [
              _sliderRow('Brightness', _brightness, -0.5, 0.5,
                  (v) => setState(() => _brightness = v)),
              _sliderRow('Contrast', _contrast, 0.5, 2.0,
                  (v) => setState(() => _contrast = v)),
              _sliderRow('Saturation', _saturation, 0.0, 2.0,
                  (v) => setState(() => _saturation = v)),
            ],
          ),
        ),
      ],
    );
  }
}

Common mistakes when creating Custom Video Editing Experiences in FlutterFlow

Mistake: Merging multiple video clips client-side in Dart.

Why it's a problem: Re-encoding several clips on a phone is CPU-intensive, can take minutes, and blocks the UI thread.

How to avoid: Upload the source clips to Firebase Storage, POST their URLs to a Cloud Run service with FFmpeg installed, process server-side (typically 10-30 seconds on a small Cloud Run instance), and return the merged output URL. Show a progress indicator while the server processes.

Mistake: Applying video filters client-side at export time by re-processing the video in Dart.

Why it's a problem: Frame-by-frame re-processing in Dart is far too slow for video and freezes the app.

How to avoid: Use ColorFiltered only for the live preview in the UI. At export time, pass the filter values (brightness, contrast, saturation) to your Cloud Run FFmpeg service and apply them using FFmpeg's -vf eq=brightness=X:contrast=Y filter, which processes in a single pass.

Mistake: Not initialising the VideoEditorController before building the trim slider widgets.

Why it's a problem: The trim widgets read duration and thumbnail data from the controller and fail if it has not finished initialising.

How to avoid: In the Custom Widget's build() method, check controller.initialized and show a CircularProgressIndicator until it is true. Only render the editor widgets after initialisation is confirmed.

Best practices

  • Use on-device FFmpeg (video_editor package) only for trim operations — it uses stream copy and is fast
  • Always process multi-clip merging and complex filters on Cloud Run with FFmpeg, not on the device
  • Apply ColorFiltered wrapper only for live preview — use FFmpeg eq filter for actual export quality
  • Add -movflags +faststart to all FFmpeg export commands so output videos stream progressively
  • Clean up temporary files in the device's temp directory after successful Firebase Storage uploads
  • Show a server-side processing progress indicator (animated or percentage-based) during Cloud Run jobs
  • Test video editing features on a physical device — iOS simulator and Android emulator do not accurately represent performance

Still stuck?

Copy one of these prompts to get a personalized, step-by-step explanation.

ChatGPT Prompt

I am building a video editing feature in a FlutterFlow app. I need to merge 2-5 short video clips (each 10-30 seconds) that users record within the app. The clips are uploaded to Firebase Storage. How do I set up a Cloud Run container with FFmpeg that accepts an array of Firebase Storage URLs, downloads and concatenates the clips using FFmpeg's concat demuxer, and uploads the merged video back to Firebase Storage?

FlutterFlow Prompt

Create a FlutterFlow Custom Widget called FilteredVideoWrapper that wraps a VideoPlayer widget in a ColorFiltered widget. It should have three Slider widgets below the video for Brightness (-0.5 to 0.5), Contrast (0.5 to 2.0), and Saturation (0.0 to 2.0). Changing any slider should update the ColorFilter matrix applied to the video preview in real-time.

Frequently asked questions

Can I use FFmpegKit on-device for merging instead of Cloud Run?

Technically yes — ffmpeg_kit_flutter runs FFmpeg natively on the device. For two 10-second clips at low resolution, on-device merging might take 20-30 seconds. For longer clips or higher resolution, it takes minutes and blocks the UI thread. Cloud Run processing is typically 10-30 seconds regardless of clip count and does not affect the app's responsiveness.

How do I show real-time FFmpeg processing progress to the user?

FFmpegKit supports a statistics callback that fires periodically with the current encoded frame number and time. Divide the current time by the total video duration to get a progress percentage. Pass this to a Dart StreamController and display it in a LinearProgressIndicator. On Cloud Run, you can use Server-Sent Events or periodic HTTP polling to send progress back to the app.
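As a sketch, the progress math reduces to a clamped ratio; the FFmpegKit wiring is left in comments because the exact statistics callback signature and getTime() unit depend on your ffmpeg_kit_flutter version:

```dart
// Sketch: convert FFmpegKit statistics into a 0-1 progress value.
import 'dart:async';

/// Pure helper: processed time vs total duration, clamped to [0, 1].
double progressFraction(int processedMs, int totalMs) {
  if (totalMs <= 0) return 0.0;
  return (processedMs / totalMs).clamp(0.0, 1.0).toDouble();
}

// Broadcast so multiple widgets (e.g. a LinearProgressIndicator) can listen.
final progressController = StreamController<double>.broadcast();

// Inside your export Custom Action (check your ffmpeg_kit_flutter version):
// FFmpegKit.executeAsync(command, (session) { /* completed */ }, null, (stats) {
//   progressController.add(progressFraction(stats.getTime(), totalDurationMs));
// });
```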

What video format should I export to for maximum compatibility?

Use H.264 video codec with AAC audio in an MP4 container. Add -movflags +faststart to enable progressive streaming. This combination works on all modern iOS, Android, and web browsers. Avoid HEVC/H.265 for broad compatibility — while smaller, it is not supported on all Android devices below API 26.

Can I add music or audio tracks to a video in FlutterFlow?

Yes, but only server-side. Upload the video and the chosen audio track to Firebase Storage, send both URLs to your Cloud Run FFmpeg service, and use FFmpeg's amix or amerge filter to combine the audio tracks. The command would be: ffmpeg -i video.mp4 -i music.mp3 -filter_complex amix=inputs=2:duration=first -c:v copy output.mp4.

How much does Cloud Run cost for video processing?

Cloud Run charges for CPU and memory usage during processing time. Processing a 60-second clip merge using a 2-CPU Cloud Run instance typically takes 15-30 seconds and costs approximately $0.002-0.004 per job. For an app processing 1,000 videos per month, the Cloud Run cost is around $2-4 per month — significantly less than the cost of customer complaints about a frozen UI.

Is the video_editor package compatible with all versions of FlutterFlow?

The video_editor package requires code export (FlutterFlow Pro plan). Since it is a Custom Widget backed by a pub.dev package, compatibility depends on the Flutter SDK version FlutterFlow uses, not the FlutterFlow version itself. As of early 2026, FlutterFlow uses Flutter 3.x and video_editor 3.x is compatible with Flutter 3.10+. Always check the package's pubspec.yaml for the minimum Flutter SDK requirement.
