Build audio editing in FlutterFlow at two levels: simple record-and-playback using the record package with a Custom Action, or advanced waveform-display-and-trim using audio_waveforms for visualization plus ffmpeg_kit_flutter for cutting segments. The simple approach adds about 30 lines of Dart. The advanced approach gives you a draggable trim UI with waveform rendering, but adds 20-40MB to your app due to the FFmpeg binary. Choose based on whether you need trimming or just recording.
Two approaches to audio editing in FlutterFlow
Audio editing ranges from simple record-and-play to full waveform trimming. This tutorial covers both. The simple path uses the record package in a Custom Action — call start(), stop(), and get a file path you can play with just_audio. The advanced path adds the audio_waveforms package for waveform visualization and ffmpeg_kit_flutter for non-destructive audio trimming. You will build a Custom Widget with a waveform display, draggable start/end trim markers, a preview button to hear the trimmed segment, and an export button that runs FFmpeg to produce the trimmed file.
Prerequisites
- FlutterFlow Pro plan or higher (Custom Code required)
- Firebase Storage enabled for uploading audio files
- A physical device for testing (microphone required)
- Basic understanding of Custom Actions and Custom Widgets in FlutterFlow
Step-by-step guide
Add the record package and create a recording Custom Action
Go to Custom Code → Pubspec Dependencies and add the record package (version ^5.0.4). Create a Custom Action named startAudioRecording. Declare a single shared AudioRecorder instance at the top level of the file so that both actions operate on the same recorder object. Inside startAudioRecording, check hasPermission(), then call start() with RecordConfig(encoder: AudioEncoder.aacLc) and an output path from getTemporaryDirectory(). Create a second Custom Action stopAudioRecording that calls stop() on that same instance and returns the file path as a String. The parent page calls startAudioRecording on mic-button tap and stopAudioRecording on stop-button tap, storing the returned path in Page State.
```dart
import 'package:record/record.dart';
import 'package:path_provider/path_provider.dart';

// One shared instance so start and stop act on the same recorder.
final AudioRecorder _recorder = AudioRecorder();

Future<String> startAudioRecording() async {
  if (!await _recorder.hasPermission()) return '';
  final dir = await getTemporaryDirectory();
  final path =
      '${dir.path}/recording_${DateTime.now().millisecondsSinceEpoch}.m4a';
  await _recorder.start(
    const RecordConfig(encoder: AudioEncoder.aacLc),
    path: path,
  );
  return path;
}

Future<String> stopAudioRecording() async {
  final path = await _recorder.stop();
  return path ?? '';
}
```

Expected result: Tapping the mic button starts recording; tapping stop returns a file path to the recorded .m4a file.
Add audio_waveforms package and build the waveform Custom Widget
Add audio_waveforms (version ^1.0.5) to Pubspec Dependencies. Create a Custom Widget named AudioWaveformEditor with parameters: audioFilePath (String, required) and onTrimComplete (Action Parameter). In initState(), create a PlayerController and call preparePlayer(path: widget.audioFilePath, shouldExtractWaveform: true). In build(), return AudioFileWaveforms(playerController: _controller, size: Size(width, 100)), which renders the waveform from the file data. The waveform is interactive — tapping it seeks to that position.
```dart
import 'package:audio_waveforms/audio_waveforms.dart';
import 'package:flutter/material.dart';

class AudioWaveformEditor extends StatefulWidget {
  const AudioWaveformEditor({
    super.key,
    this.width,
    this.height,
    required this.audioFilePath,
    this.onTrimComplete,
  });

  final double? width;
  final double? height;
  final String audioFilePath;
  final Future Function(String trimmedPath)? onTrimComplete;

  @override
  State<AudioWaveformEditor> createState() => _AudioWaveformEditorState();
}

class _AudioWaveformEditorState extends State<AudioWaveformEditor> {
  late final PlayerController _controller;
  double _trimStart = 0.0; // fraction of total duration, 0.0 to 1.0
  double _trimEnd = 1.0;

  @override
  void initState() {
    super.initState();
    _controller = PlayerController();
    _controller.preparePlayer(
      path: widget.audioFilePath,
      shouldExtractWaveform: true,
    );
  }

  @override
  void dispose() {
    _controller.dispose();
    super.dispose();
  }

  @override
  Widget build(BuildContext context) {
    return AudioFileWaveforms(
      playerController: _controller,
      size: Size(widget.width ?? 300, 100),
    );
  }
}
```

Expected result: The Custom Widget displays the audio file's waveform as an interactive bar chart.
Add draggable trim handles on the waveform
Wrap the waveform in a Stack. Add two Positioned GestureDetector widgets — one for the left trim handle and one for the right. Each handle is a Container (8px wide, full height, colored accent) that the user drags horizontally. Track positions as _trimStart and _trimEnd (0.0-1.0 fractions of total duration). On drag update, call setState with the new fraction, calculated as the drag delta dx divided by the widget width. Draw a semi-transparent overlay outside the trim region to visually indicate the excluded portions. Add a Play Preview button that seeks to the _trimStart position and plays until _trimEnd.
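A minimal sketch of the left handle and its dimmed overlay inside the Stack — the right handle mirrors it with the clamp reversed. The names _controller, _trimStart, and _trimEnd come from the previous step; w stands for the widget width:

```dart
Stack(children: [
  AudioFileWaveforms(playerController: _controller, size: Size(w, 100)),
  // Dim everything left of the trim start.
  Positioned(
    left: 0, width: _trimStart * w, top: 0, bottom: 0,
    child: Container(color: Colors.black38),
  ),
  // Draggable left handle: convert the horizontal drag delta into a
  // 0.0-1.0 fraction, clamped so it never crosses the right handle.
  Positioned(
    left: _trimStart * w - 4, top: 0, bottom: 0,
    child: GestureDetector(
      onHorizontalDragUpdate: (d) => setState(() {
        _trimStart =
            (_trimStart + d.delta.dx / w).clamp(0.0, _trimEnd - 0.05);
      }),
      child: Container(width: 8, color: Theme.of(context).primaryColor),
    ),
  ),
])
```

The 0.05 clamp margin keeps a minimum gap between the handles so the selection can never collapse to zero.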
Expected result: Two draggable handles appear on the waveform. The area outside the handles is dimmed. Preview plays only the selected segment.
Cut the audio segment with ffmpeg_kit_flutter
Add ffmpeg_kit_flutter (version ^6.0.3) to Pubspec Dependencies. WARNING: this adds 20-40MB to your app binary. Create a Custom Action named trimAudio that takes inputPath, startSeconds, and endSeconds. Run FFmpegKit.execute('-i $inputPath -ss $startSeconds -to $endSeconds -c copy $outputPath'). The -c copy flag does a lossless trim without re-encoding. Return the output path. Wire the Export button on the Custom Widget to call this action with the calculated start/end times (fraction × total duration in seconds).
```dart
import 'package:ffmpeg_kit_flutter/ffmpeg_kit.dart';
import 'package:ffmpeg_kit_flutter/return_code.dart';
import 'package:path_provider/path_provider.dart';

Future<String> trimAudio(
    String inputPath, double startSec, double endSec) async {
  final dir = await getTemporaryDirectory();
  final output =
      '${dir.path}/trimmed_${DateTime.now().millisecondsSinceEpoch}.m4a';
  final cmd = '-i "$inputPath" -ss $startSec -to $endSec -c copy "$output"';
  final session = await FFmpegKit.execute(cmd);
  // Return an empty string if FFmpeg failed, so the caller can react.
  final rc = await session.getReturnCode();
  if (!ReturnCode.isSuccess(rc)) return '';
  return output;
}
```

Expected result: The trimmed audio file is saved to a temporary path, containing only the segment between the handles.
Upload the trimmed audio to Firebase Storage
Create a Custom Action named uploadAudio that takes a file path, reads the file, and uploads to Firebase Storage under audio/{userId}/{timestamp}.m4a. Get the download URL and return it. In the parent page, wire the full flow: user records → waveform displays → user drags trim handles → taps Export → trimAudio runs → uploadAudio uploads → save download URL to a Firestore document field. Show a CircularProgressIndicator during the upload with Conditional Visibility.
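A sketch of such an uploadAudio action. It assumes the firebase_storage and firebase_auth packages, which Firebase-enabled FlutterFlow projects typically already include — verify this in your project before relying on it:

```dart
import 'dart:io';
import 'package:firebase_auth/firebase_auth.dart';
import 'package:firebase_storage/firebase_storage.dart';

Future<String> uploadAudio(String filePath) async {
  final uid = FirebaseAuth.instance.currentUser?.uid ?? 'anonymous';
  final ts = DateTime.now().millisecondsSinceEpoch;
  // Matches the audio/{userId}/{timestamp}.m4a layout described above.
  final ref = FirebaseStorage.instance.ref('audio/$uid/$ts.m4a');
  await ref.putFile(File(filePath));
  return ref.getDownloadURL();
}
```

Store the returned URL in Page State, then use a standard Firestore action to write it to your document field.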
Expected result: The trimmed audio file uploads to Firebase Storage and the download URL is saved in Firestore.
Complete working example
```dart
import 'package:audio_waveforms/audio_waveforms.dart';
import 'package:ffmpeg_kit_flutter/ffmpeg_kit.dart';
import 'package:flutter/material.dart';
import 'package:path_provider/path_provider.dart';

class AudioWaveformEditor extends StatefulWidget {
  const AudioWaveformEditor({
    super.key,
    this.width,
    this.height,
    required this.audioFilePath,
    this.onTrimComplete,
  });

  final double? width;
  final double? height;
  final String audioFilePath;
  final Future Function(String trimmedPath)? onTrimComplete;

  @override
  State<AudioWaveformEditor> createState() => _State();
}

class _State extends State<AudioWaveformEditor> {
  late final PlayerController _ctrl;
  double _trimStart = 0.0;
  double _trimEnd = 1.0;
  int _durationMs = 0;
  bool _exporting = false;

  @override
  void initState() {
    super.initState();
    _ctrl = PlayerController();
    _ctrl
        .preparePlayer(
          path: widget.audioFilePath,
          shouldExtractWaveform: true,
        )
        .then((_) {
      _durationMs = _ctrl.maxDuration;
      setState(() {});
    });
  }

  Future<void> _preview() async {
    final startMs = (_trimStart * _durationMs).round();
    await _ctrl.seekTo(startMs);
    _ctrl.startPlayer();
    // Stop at the trim end once the segment's duration has elapsed.
    final playMs = ((_trimEnd - _trimStart) * _durationMs).round();
    Future.delayed(Duration(milliseconds: playMs), () => _ctrl.pausePlayer());
  }

  Future<void> _export() async {
    setState(() => _exporting = true);
    final startSec = (_trimStart * _durationMs / 1000).toStringAsFixed(2);
    final endSec = (_trimEnd * _durationMs / 1000).toStringAsFixed(2);
    final dir = await getTemporaryDirectory();
    final out = '${dir.path}/trim_${DateTime.now().millisecondsSinceEpoch}.m4a';
    await FFmpegKit.execute(
      '-i "${widget.audioFilePath}" -ss $startSec -to $endSec -c copy "$out"',
    );
    setState(() => _exporting = false);
    widget.onTrimComplete?.call(out);
  }

  @override
  void dispose() {
    _ctrl.dispose();
    super.dispose();
  }

  @override
  Widget build(BuildContext context) {
    final w = widget.width ?? 300;
    return SizedBox(
      width: w,
      height: widget.height ?? 180,
      child: Column(children: [
        Expanded(
          child: Stack(children: [
            AudioFileWaveforms(playerController: _ctrl, size: Size(w, 100)),
            // Left trim region (dimmed)
            Positioned(
                left: 0,
                width: _trimStart * w,
                top: 0,
                bottom: 0,
                child: Container(color: Colors.black38)),
            // Right trim region (dimmed)
            Positioned(
                left: _trimEnd * w,
                right: 0,
                top: 0,
                bottom: 0,
                child: Container(color: Colors.black38)),
            // Left handle
            Positioned(
                left: _trimStart * w - 4,
                top: 0,
                bottom: 0,
                child: GestureDetector(
                  onHorizontalDragUpdate: (d) => setState(() {
                    _trimStart = (_trimStart + d.delta.dx / w)
                        .clamp(0.0, _trimEnd - 0.05);
                  }),
                  child: Container(
                      width: 8, color: Theme.of(context).primaryColor),
                )),
            // Right handle
            Positioned(
                left: _trimEnd * w - 4,
                top: 0,
                bottom: 0,
                child: GestureDetector(
                  onHorizontalDragUpdate: (d) => setState(() {
                    _trimEnd = (_trimEnd + d.delta.dx / w)
                        .clamp(_trimStart + 0.05, 1.0);
                  }),
                  child: Container(
                      width: 8, color: Theme.of(context).primaryColor),
                )),
          ]),
        ),
        Row(mainAxisAlignment: MainAxisAlignment.spaceEvenly, children: [
          IconButton(icon: const Icon(Icons.play_arrow), onPressed: _preview),
          _exporting
              ? const SizedBox(
                  width: 24,
                  height: 24,
                  child: CircularProgressIndicator(strokeWidth: 2))
              : IconButton(
                  icon: const Icon(Icons.content_cut), onPressed: _export),
        ]),
      ]),
    );
  }
}
```

Common mistakes when creating custom audio editing experiences in FlutterFlow
Mistake: Including ffmpeg_kit_flutter when you only need record and playback. Why it's a problem: the FFmpeg binary adds 20-40MB to your app for features you never use.
How to avoid: Use only the record package for recording and just_audio for playback. Only add ffmpeg_kit_flutter if you actually need audio trimming or format conversion.
Mistake: Not disposing the PlayerController when leaving the page. Why it's a problem: the controller holds native audio resources that are never released, leaking memory.
How to avoid: Call _controller.dispose() in the widget's dispose() method, just like any other media controller.
Mistake: Trying to trim while the audio is still playing.
How to avoid: Call _controller.pausePlayer() before starting the FFmpeg trim operation. Resume only after the trim completes.
Best practices
- Use the record package for simple record/playback — it is lightweight and well-maintained
- Only add FFmpeg if you need trimming, format conversion, or audio merging
- Show waveform extraction progress — preparePlayer with shouldExtractWaveform can take 1-3 seconds for long files
- Clamp trim handles so left never crosses right and minimum segment is 0.5 seconds
- Use -c copy in FFmpeg for lossless trimming — avoids re-encoding and is near-instant
- Compress audio before uploading — use AAC encoder (not WAV) for 5-10x smaller files
- Request microphone permission before recording and handle denial with a settings redirect
Still stuck?
Copy one of these prompts to get a personalized, step-by-step explanation.
Write Flutter Custom Widget code for an audio waveform editor that displays the waveform using audio_waveforms package, has draggable trim handles for selecting a segment, a preview button to play the selected segment, and an export button that trims the audio using ffmpeg_kit_flutter.
Create a page with a recording button, a container for the audio waveform editor Custom Widget, and an upload button below. The recording button should toggle between record and stop states.
Frequently asked questions
What audio formats does the record package support?
The record package supports AAC (recommended, small file size), WAV (lossless, large files), and OPUS. Use AAC (AudioEncoder.aacLc) for most cases — it produces files 5-10x smaller than WAV with negligible quality loss.
How much does FFmpeg add to app size?
The ffmpeg_kit_flutter full package adds 20-40MB to your APK/IPA. Use the min-gpl variant for a smaller binary (10-15MB) if you only need basic audio operations. If app size is critical and you only need trimming, consider the audio_trimmer package as a lighter alternative.
Can I display a live waveform while recording?
Yes. Use audio_waveforms with RecorderController instead of PlayerController. Call recorderController.record() and the AudioWaveforms widget shows live amplitude bars in real time as the user speaks.
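A minimal sketch of the live-recording variant, assuming the same audio_waveforms package used above:

```dart
import 'package:audio_waveforms/audio_waveforms.dart';
import 'package:flutter/material.dart';

// RecorderController feeds real-time amplitude data to the
// AudioWaveforms widget while recording is in progress.
final recorderController = RecorderController();

Widget liveWaveform(BuildContext context) {
  return AudioWaveforms(
    recorderController: recorderController,
    size: Size(MediaQuery.of(context).size.width, 80),
  );
}

// Start recording elsewhere in your action flow, e.g.:
// await recorderController.record();
```

Remember to dispose the RecorderController in dispose(), just like the PlayerController.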
Does audio recording work on web?
The record package has web support using the browser's MediaRecorder API. However, audio_waveforms does not support web for waveform display. For web, show a simple duration timer instead of a waveform during recording.
How do I limit recording duration?
Start a Timer when recording begins that calls stopAudioRecording() after your maximum duration (e.g., 60 seconds). Show a countdown in the UI bound to Page State secondsRemaining.
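A sketch of that countdown using dart:async. The onTick and onLimitReached callbacks are hypothetical placeholders for your Page State update and the stopAudioRecording call:

```dart
import 'dart:async';

Timer? _recordingTimer;

// Call this right after startAudioRecording succeeds.
void startRecordingCountdown({
  required void Function(int secondsLeft) onTick, // update secondsRemaining
  required void Function() onLimitReached, // call stopAudioRecording here
  int maxSeconds = 60,
}) {
  var secondsRemaining = maxSeconds;
  _recordingTimer?.cancel(); // avoid stacking timers on repeated taps
  _recordingTimer = Timer.periodic(const Duration(seconds: 1), (t) {
    secondsRemaining--;
    onTick(secondsRemaining);
    if (secondsRemaining <= 0) {
      t.cancel();
      onLimitReached();
    }
  });
}
```

Cancel the timer in the stop-button action too, so a manual stop does not leave the countdown running.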
Can RapidDev help with advanced audio features?
Yes. Multi-track mixing, real-time audio effects, and background recording require complex Custom Code beyond this tutorial. RapidDev can build production-ready audio editing experiences for podcast, music, or education apps.
Talk to an Expert
Our team has built 600+ apps. Get personalized help with your project.
Book a free consultation