
Flutter Livestreaming Tutorial

The following tutorial shows you how to quickly build a livestreaming app leveraging Stream's Video API and the Stream Video Flutter components. The underlying API is very flexible and allows you to build nearly any type of video experience.


In this tutorial, we will cover the steps to quickly build a low-latency live-streaming experience in Flutter using Stream's Video SDK. The livestream is broadcast using Stream's edge network of servers around the world.

For this tutorial, we will cover the following topics:

  • Ultra low latency streaming
  • Multiple streams and co-hosts
  • RTMP in and WebRTC input
  • Exporting to HLS
  • Reactions, custom events and chat
  • Recording and Transcriptions

UI components are fully customizable, as demonstrated in the Flutter Video Cookbook.

You can find the full code for this livestreaming tutorial on the Flutter Video Tutorials repository.

Let's dive in! If you have any questions or need to provide feedback along the way, don't hesitate to use the feedback button - we're here to help!

Step 1 - Create a new project and add configuration

Let's begin by creating a new Flutter project and adding the required dependencies.

You can use the following command to achieve this:

bash
flutter create livestream_tutorial --empty
cd livestream_tutorial
flutter pub add stream_video stream_video_flutter

You should now have the dependencies in your pubspec.yaml file with the latest version:

yaml
dependencies:
  flutter:
    sdk: flutter
  stream_video: ^latest
  stream_video_flutter: ^latest

Stream has several packages that you can use to integrate video into your application.

In this tutorial, we will use the stream_video_flutter package which contains pre-built UI elements for you to use.

You can also use the stream_video package directly if you need direct access to the low-level client.

The stream_video_push_notification package helps in adding push notifications and an end-to-end call flow (CallKit).
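If you later want to add push notifications and a CallKit-style ringing flow, you would add that package the same way (optional, and not required for this tutorial):

bash
flutter pub add stream_video_push_notification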

Before you go ahead, you need to add the required permissions for video calling to your app.

In your AndroidManifest.xml file, add these permissions:

xml
<manifest xmlns:android="http://schemas.android.com/apk/res/android">
  <uses-permission android:name="android.permission.INTERNET"/>
  <uses-feature android:name="android.hardware.camera"/>
  <uses-feature android:name="android.hardware.camera.autofocus"/>
  <uses-permission android:name="android.permission.CAMERA"/>
  <uses-permission android:name="android.permission.RECORD_AUDIO"/>
  <uses-permission android:name="android.permission.ACCESS_NETWORK_STATE"/>
  <uses-permission android:name="android.permission.CHANGE_NETWORK_STATE"/>
  <uses-permission android:name="android.permission.MODIFY_AUDIO_SETTINGS"/>
  <uses-permission android:name="android.permission.BLUETOOTH" android:maxSdkVersion="30"/>
  <uses-permission android:name="android.permission.BLUETOOTH_ADMIN" android:maxSdkVersion="30"/>
  <uses-permission android:name="android.permission.BLUETOOTH_CONNECT"/>
  ...
</manifest>

For the corresponding iOS permissions, open the Info.plist file and add:

xml
<key>NSCameraUsageDescription</key>
<string>$(PRODUCT_NAME) needs access to your camera for video calls.</string>
<key>NSMicrophoneUsageDescription</key>
<string>$(PRODUCT_NAME) needs access to your microphone for voice and video calls.</string>
<key>UIApplicationSupportsIndirectInputEvents</key>
<true/>
<key>UIBackgroundModes</key>
<array>
  <string>audio</string>
  <string>fetch</string>
  <string>processing</string>
  <string>remote-notification</string>
  <string>voip</string>
</array>

Finally, you need to set the platform to iOS 14.0 or higher in your Podfile:

ruby
platform :ios, '14.0'

Step 2 - Setting up the Stream Video client

To actually run this sample, we need a valid user token. The user token is typically generated by your server-side API. When a user logs in to your app, you return the user token that gives them access to the call. To make this tutorial easier to follow, we'll generate a user token for you:

Please update REPLACE_WITH_API_KEY, REPLACE_WITH_USER_ID, REPLACE_WITH_TOKEN, and REPLACE_WITH_CALL_ID with the actual values:

Here are credentials to try out the app with:

Property   Value
API Key    Waiting for an API key ...
Token      Token is generated ...
User ID    Loading ...
Call ID    Creating random call ID ...
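In a production app, the token would come from your own backend rather than from hardcoded values. As a rough sketch only, assuming a Dart backend and the third-party dart_jsonwebtoken package (your stack may differ): a Stream user token is a JWT containing a user_id claim, signed with your API secret. Never embed the API secret in the Flutter app itself.

dart
// Server-side sketch (do NOT ship this code or the secret in your Flutter app).
// Assumes the third-party `dart_jsonwebtoken` package.
import 'package:dart_jsonwebtoken/dart_jsonwebtoken.dart';

String createUserToken({
  required String userId,
  required String apiSecret, // from your Stream dashboard, kept on the server
}) {
  // A Stream user token is a JWT with a `user_id` claim, signed with the API secret.
  final jwt = JWT({'user_id': userId});
  return jwt.sign(SecretKey(apiSecret));
}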

First, let's import the package into the project and then initialize the client with the credentials you received:

dart
import 'package:flutter/material.dart';
import 'package:stream_video_flutter/stream_video_flutter.dart';

import 'home_screen.dart'; // Created in Step 3

Future<void> main() async {
  // Ensure Flutter is able to communicate with plugins
  WidgetsFlutterBinding.ensureInitialized();

  // Initialize Stream video and set the API key for our app.
  StreamVideo(
    'REPLACE_WITH_API_KEY',
    user: const User(
      info: UserInfo(
        name: 'John Doe',
        id: 'REPLACE_WITH_USER_ID',
      ),
    ),
    userToken: 'REPLACE_WITH_TOKEN',
  );

  runApp(
    const MaterialApp(
      home: HomeScreen(),
    ),
  );
}

Step 3 - Building the home screen

To keep things simple, our sample application will only consist of two screens: a landing page that lets users create a livestream, and another page to view and control the livestream.

Let's start by creating a new file called home_screen.dart. We'll implement a simple home screen that displays a button in the center - when pressed, this button will create and start a new livestream:

dart
import 'package:flutter/material.dart';

class HomeScreen extends StatefulWidget {
  const HomeScreen({
    super.key,
  });

  @override
  State<HomeScreen> createState() => _HomeScreenState();
}

class _HomeScreenState extends State<HomeScreen> {
  @override
  Widget build(BuildContext context) {
    return Scaffold(
      body: Center(
        child: ElevatedButton(
          onPressed: () => _createLivestream(),
          child: const Text('Create a Livestream'),
        ),
      ),
    );
  }

  Future<void> _createLivestream() async {}
}

Now, we can fill in the functionality to create a livestream whenever the button is pressed.

To do this, we have to do a few things:

  1. Create a call with a type of livestream and pass in an ID for the call.
  2. Set any connect options required and call call.getOrCreate() to create the livestream.
  3. If the call is successfully created, join the call and use call.goLive() to start the livestream immediately.
  4. Navigate to the page for displaying the livestream once everything is created properly.

⚠️ If you do not call call.goLive(), a livestream call is started in backstage mode, meaning the call hosts can join and see each other but the call will be invisible to others.

Here is what all of the above looks like in code:

dart
import 'package:stream_video/stream_video.dart';

import 'livestream_screen.dart'; // Created in Step 4

Future<void> _createLivestream() async {
  // Set up our call object
  final call = StreamVideo.instance.makeCall(
    callType: StreamCallType.liveStream(),
    id: 'REPLACE_WITH_CALL_ID',
  );

  // Call object is created
  final result = await call.getOrCreate();

  if (result.isSuccess) {
    // Set some default behaviour for how our devices should be
    // configured once we join a call
    final connectOptions = CallConnectOptions(
      camera: TrackOption.enabled(),
      microphone: TrackOption.enabled(),
    );

    // Our local app user can join and receive events
    await call.join(connectOptions: connectOptions);

    // Allow others to see and join the call (exit backstage mode)
    await call.goLive();

    Navigator.of(context).push(
      MaterialPageRoute(
        builder: (context) => LiveStreamScreen(livestreamCall: call),
      ),
    );
  } else {
    debugPrint('Not able to create a call.');
  }
}
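If you would rather keep the call in backstage mode while the hosts get ready, you can skip the call.goLive() step here and trigger it later, for example from a button on the livestream screen. A minimal sketch (where you place the button is up to you):

dart
// Hypothetical "Go Live" button for the host: until it is tapped, the call
// stays in backstage mode and remains invisible to viewers.
ElevatedButton(
  onPressed: () async {
    await livestreamCall.goLive();
  },
  child: const Text('Go Live'),
)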

Step 4 - Building the livestream screen

Let's build a livestream screen that shows the live video feed and tracks viewer count in real-time. The screen will also include functionality to end the livestream.

To implement this, we'll create a widget that takes a livestream call object as a parameter. By subscribing to call.state.valueStream, our widget can react to any changes in the livestream state, such as viewers joining or leaving.

Let's create a new file called livestream_screen.dart and add the following code to implement our livestream screen:

dart
import 'package:flutter/material.dart';
import 'package:stream_video_flutter/stream_video_flutter.dart';

class LiveStreamScreen extends StatelessWidget {
  const LiveStreamScreen({
    super.key,
    required this.livestreamCall,
  });

  final Call livestreamCall;

  @override
  Widget build(BuildContext context) {
    return SafeArea(
      child: StreamBuilder(
        stream: livestreamCall.state.valueStream,
        initialData: livestreamCall.state.value,
        builder: (context, snapshot) {
          if (!snapshot.hasData) {
            return const Center(
              child: CircularProgressIndicator(),
            );
          }

          final callState = snapshot.data!;
          final participant = callState.callParticipants.firstWhere(
            (e) => e.isVideoEnabled,
          );

          // ...
        },
      ),
    );
  }
}

Now, let's add the main video display to our StreamBuilder. We'll use a Stack widget to layer UI elements, with the StreamVideoRenderer showing the livestream feed as the base layer. We'll also add a status check to display a message when the stream is disconnected:

dart
return Scaffold(
  body: Stack(
    children: [
      StreamVideoRenderer(
        call: livestreamCall,
        videoTrackType: SfuTrackType.video,
        participant: participant,
      ),
      if (callState.status.isDisconnected)
        const Center(
          child: Text('Stream not live'),
        ),
    ],
  ),
);

Let's enhance the UI by adding two key elements to the Stack:

  1. A viewer count display in the top-left corner that shows the total number of participants
  2. An "End Call" button in the top-right corner to terminate the livestream

To implement these features, we'll add a Positioned widget to the Stack for each element:

dart
Stack(
  children: [
    // previous code...
    Positioned(
      top: 12.0,
      left: 12.0,
      child: Material(
        shape: RoundedRectangleBorder(
          borderRadius: BorderRadius.circular(24),
        ),
        color: Colors.red,
        child: Center(
          child: Padding(
            padding: const EdgeInsets.all(8.0),
            child: Text(
              'Viewers: ${callState.callParticipants.length}',
              style: const TextStyle(
                fontSize: 14,
                color: Colors.white,
                fontWeight: FontWeight.bold,
              ),
            ),
          ),
        ),
      ),
    ),
    Positioned(
      top: 12.0,
      right: 12.0,
      child: Material(
        shape: RoundedRectangleBorder(
          borderRadius: BorderRadius.circular(24),
        ),
        color: Colors.black,
        child: GestureDetector(
          onTap: () {
            livestreamCall.end();
            Navigator.pop(context);
          },
          child: const Center(
            child: Padding(
              padding: EdgeInsets.all(8.0),
              child: Text(
                'End Call',
                style: TextStyle(
                  fontSize: 14,
                  color: Colors.white,
                  fontWeight: FontWeight.bold,
                ),
              ),
            ),
          ),
        ),
      ),
    ),
  ],
)

Here's the complete implementation of the livestream page:

dart
import 'package:flutter/material.dart';
import 'package:stream_video_flutter/stream_video_flutter.dart';

class LiveStreamScreen extends StatelessWidget {
  const LiveStreamScreen({super.key, required this.livestreamCall});

  final Call livestreamCall;

  @override
  Widget build(BuildContext context) {
    return SafeArea(
      child: StreamBuilder(
        stream: livestreamCall.state.valueStream,
        initialData: livestreamCall.state.value,
        builder: (context, snapshot) {
          if (!snapshot.hasData) {
            return const Center(child: CircularProgressIndicator());
          }

          final callState = snapshot.data!;
          final participant = callState.callParticipants.firstWhere(
            (e) => e.isVideoEnabled,
          );

          return Scaffold(
            body: Stack(
              children: [
                StreamVideoRenderer(
                  call: livestreamCall,
                  videoTrackType: SfuTrackType.video,
                  participant: participant,
                ),
                if (callState.status.isDisconnected)
                  const Center(child: Text('Stream not live')),
                Positioned(
                  top: 12.0,
                  left: 12.0,
                  child: Material(
                    shape: RoundedRectangleBorder(
                      borderRadius: BorderRadius.circular(24),
                    ),
                    color: Colors.red,
                    child: Center(
                      child: Padding(
                        padding: const EdgeInsets.all(8.0),
                        child: Text(
                          'Viewers: ${callState.callParticipants.length}',
                          style: const TextStyle(
                            fontSize: 14,
                            color: Colors.white,
                            fontWeight: FontWeight.bold,
                          ),
                        ),
                      ),
                    ),
                  ),
                ),
                Positioned(
                  top: 12.0,
                  right: 12.0,
                  child: Material(
                    shape: RoundedRectangleBorder(
                      borderRadius: BorderRadius.circular(24),
                    ),
                    color: Colors.black,
                    child: GestureDetector(
                      onTap: () {
                        livestreamCall.end();
                        Navigator.pop(context);
                      },
                      child: const Center(
                        child: Padding(
                          padding: EdgeInsets.all(8.0),
                          child: Text(
                            'End Call',
                            style: TextStyle(
                              fontSize: 14,
                              color: Colors.white,
                              fontWeight: FontWeight.bold,
                            ),
                          ),
                        ),
                      ),
                    ),
                  ),
                ),
              ],
            ),
          );
        },
      ),
    );
  }
}

If everything works as intended, we will be able to create a livestream from the first device.

Step 5 - Viewing a livestream (WebRTC)

Stream uses a technology called SFU cascading to replicate your livestream over different SFUs around the world. This makes it possible to reach a large audience in real time.

To view the livestream for testing, click Create livestream in the Flutter app and click the link below to watch the video in your browser:

For testing you can join the call on our web-app: Join Call

This will work if you used the token snippet above. You might need to update the URL with the call ID you used in your code.

If you want to view the livestream through a Flutter application, you can use the LivestreamPlayer widget that is built into the Flutter SDK.

Let's add a second button to the home screen to allow users to view the livestream:

dart
@override
Widget build(BuildContext context) {
  return Scaffold(
    body: Center(
      child: Column(
        mainAxisAlignment: MainAxisAlignment.center,
        spacing: 16,
        children: [
          ElevatedButton(
            onPressed: () => _createLivestream(),
            child: const Text('Create a Livestream'),
          ),
          ElevatedButton(
            onPressed: () => _viewLivestream(),
            child: const Text('View a Livestream'),
          ),
        ],
      ),
    ),
  );
}

And implement the _viewLivestream method:

dart
Future<void> _viewLivestream() async {
  // Set up our call object
  final call = StreamVideo.instance.makeCall(
    callType: StreamCallType.liveStream(),
    id: 'REPLACE_WITH_CALL_ID',
  );

  // Call object is created
  final result = await call.getOrCreate();

  if (result.isSuccess) {
    // Set default behaviour for a livestream viewer
    final connectOptions = CallConnectOptions(
      camera: TrackOption.disabled(),
      microphone: TrackOption.disabled(),
    );

    // Our local app user can join and receive events
    await call.join(connectOptions: connectOptions);

    Navigator.of(context).push(
      MaterialPageRoute(
        builder: (context) => Scaffold(
          appBar: AppBar(
            title: const Text('Livestream'),
            leading: IconButton(
              icon: const Icon(Icons.arrow_back),
              onPressed: () {
                call.leave();
                Navigator.of(context).pop();
              },
            ),
          ),
          body: LivestreamPlayer(call: call),
        ),
      ),
    );
  } else {
    debugPrint('Not able to create a call.');
  }
}

With this implementation, the user will see our default UI for a livestream viewer, along with a back button to return to the home screen. The LivestreamPlayer widget includes most of the controls and information needed for viewing a livestream, making it effortless to build a livestream viewing interface.

Step 6 (Optional) - Start HLS stream

Stream offers two flavors of livestreaming: WebRTC-based livestreaming and RTMP-based livestreaming. WebRTC-based livestreaming allows users to easily start a livestream directly from their phone and benefit from ultra-low latency. The final piece of livestreaming with Stream is support for HLS, or HTTP Live Streaming. HLS, unlike WebRTC-based streaming, tends to have a 10 to 20 second delay, but offers video buffering under poor network conditions. To enable HLS support, your call must first be placed into "broadcasting" mode using the call.startHLS() method.

We can then obtain the HLS URL by reading hlsPlaylistUrl from call.state:

dart
final result = await call.startHLS();

if (result.isSuccess) {
  final url = call.state.value.egress.hlsPlaylistUrl;
  //...
}

With the HLS URL, your call can be broadcast to most livestreaming platforms.
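Viewers could also watch the HLS stream inside a Flutter app with any player that supports HLS playlists. A minimal sketch, assuming the video_player package (which is not part of the Stream SDK):

dart
// Sketch: play an HLS playlist URL with the `video_player` package.
// HLS playback is supported natively on iOS and via ExoPlayer on Android.
import 'package:video_player/video_player.dart';

Future<VideoPlayerController> playHls(String hlsPlaylistUrl) async {
  final controller = VideoPlayerController.networkUrl(Uri.parse(hlsPlaylistUrl));
  await controller.initialize();
  await controller.play();
  // Render it in the widget tree with: VideoPlayer(controller)
  return controller;
}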

RTMP Livestreaming

For more advanced livestreaming configurations such as cases where multiple cameras may be required or different scenes and animations, streaming tools like OBS can be used together with Stream video using RTMP (Real Time Messaging Protocol).

By default, when a call is created, it is given a dedicated RTMP URL which can be used by most common streaming platforms to inject video into the call. To configure RTMP and OBS with Stream, two things are required:

  1. The RTMP URL of the call
  2. A "streaming key" made up of your application's API key and a user token in the format apikey/usertoken

With these two pieces of information, we can update the settings in OBS then select the "Start Streaming" option to view our livestream in the application.

⚠️ The user whose name and token you provide to OBS will appear in the call, so it is worth creating a dedicated user object for OBS streaming.
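For example, once you have your API key and a token minted for that dedicated OBS user, the streaming key is just the two values joined with a slash. A small helper sketch (the RTMP URL itself comes from the call, for example via the Stream dashboard):

dart
// Sketch: compose the OBS "stream key" in the apikey/usertoken format
// described above, using a token for a dedicated OBS user.
String obsStreamKey({
  required String apiKey,
  required String obsUserToken,
}) =>
    '$apiKey/$obsUserToken';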

Recap

Find the complete code for this tutorial on the Flutter Video Tutorials Repository.

Stream Video allows you to quickly build in-app, low-latency livestreaming in Flutter. Our team is happy to review your UI designs and offer recommendations on how to achieve them with the Stream SDKs.

To recap what we've learned:

  • WebRTC is optimal for latency, while HLS is slower, but buffers better for users with poor connections.
  • You set up a call with var call = client.makeCall(callType: StreamCallType.liveStream(), id: callID).
  • The call type livestream controls which features are enabled and how permissions are set up.
  • The livestream call has backstage mode enabled by default. This allows you and your co-hosts to set up your mic and camera before letting people in.
  • When you join a call, real-time communication is set up for audio and video: call.join().
  • Data in call.state and call.state.value.participants make it easy to build your own UI.

Calls run on Stream's global edge network of video servers. Being closer to your users improves the latency and reliability of calls. The SDKs enable you to build livestreaming, audio rooms and video calling in days.

We hope you've enjoyed this tutorial and please feel free to reach out if you have any suggestions or questions.

Final Thoughts

In this video app tutorial we built a fully functioning Flutter livestreaming app with our Flutter SDK component library. We also showed how easy it is to customize the behavior and the style of the Flutter video app components with minimal code changes.

Both the video SDK for Flutter and the API have plenty more features available to support more advanced use-cases.

Give us feedback!

Did you find this tutorial helpful in getting you up and running with your project? Whether good or bad, we're looking for your honest feedback so we can improve.
