Flutter Livestreaming Tutorial
In this tutorial, we will cover the steps to quickly build a low-latency livestreaming experience in Flutter using Stream’s Video SDK. The livestream is broadcast using Stream's edge network of servers around the world.
For this tutorial, we will cover the following topics:
- Ultra low latency streaming
- Multiple streams & co-hosts
- RTMP-in and WebRTC input
- Exporting to HLS
- Reactions, custom events and chat
- Recording & Transcriptions
Create a new project
Let’s begin by creating a new Flutter project and adding the stream_video_flutter
package to the project.
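Assuming the Flutter tooling is installed, both steps can be done from the command line (the project name here is just an example):

```shell
# Create a new Flutter project and add Stream's Video SDK to it
flutter create livestream_tutorial
cd livestream_tutorial
flutter pub add stream_video_flutter
```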
Next, you can create a project on Stream’s dashboard to obtain an API key for your project.
For detailed instructions on how to create a project on the Stream Dashboard, please see our blog post.
Setting up the client
First, let’s import the package into the project we created earlier:
Next, we can configure the Stream client and create a room for our call:
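As a sketch of what this setup might look like (the constructor arguments, the `User`/`UserInfo` shape, and the credential placeholders are assumptions that can differ between SDK versions, so check the current stream_video_flutter API):

```dart
import 'package:flutter/material.dart';
import 'package:stream_video_flutter/stream_video_flutter.dart';

Future<void> main() async {
  WidgetsFlutterBinding.ensureInitialized();

  // Initialize the Stream client once, before runApp.
  // Replace the placeholders with the API key from your dashboard
  // and a valid token generated for this user.
  StreamVideo(
    'REPLACE_WITH_API_KEY',
    user: const User(
      info: UserInfo(id: 'tutorial-host', name: 'Tutorial Host'),
    ),
    userToken: 'REPLACE_WITH_USER_TOKEN',
  );

  // `LivestreamApp` is a hypothetical root widget for this sample.
  runApp(const LivestreamApp());
}
```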
Setting up the UI
To keep things simple, our sample application will consist of only two screens: a landing page where users can create a livestream, and a second option that lets users join an existing livestream.
Home page:
Livestream Page:
Viewing a livestream (WebRTC)
Stream offers two flavors of livestreaming: WebRTC-based and RTMP-based.
WebRTC-based livestreaming lets users start a livestream directly from their phone and benefit from ultra-low latency.
To set up WebRTC-based livestreaming, we first create a call using makeCall, setting the call type to livestream and providing a room ID (this can also be left blank).
Next, we can set some default behaviour for our livestream such as configuring whether the camera and microphone should be enabled by default.
Finally, we can create the call by invoking getOrCreate on the object we just created. By default, a livestream call starts in backstage mode, meaning the call hosts can join and see each other while the call remains invisible to others. When the hosts are ready, they can make the call “live” by calling goLive.
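Putting those steps together, a minimal sketch might look like this. makeCall, getOrCreate, and goLive are the methods described above; `CallConnectOptions` and `TrackOption` are assumptions about the camera/microphone-default API and may differ in your SDK version:

```dart
// Create a call of type `livestream` with a room ID of our choosing.
final call = StreamVideo.instance.makeCall(
  callType: StreamCallType.liveStream(),
  id: 'my-first-livestream', // can also be left blank
);

// Create the call on Stream's backend. It starts in backstage mode,
// so only hosts can see each other at this point.
await call.getOrCreate();

// Join as a host with camera and microphone enabled by default.
// (CallConnectOptions / TrackOption are assumptions; check your SDK version.)
await call.join(
  connectOptions: const CallConnectOptions(
    camera: TrackOption.enabled(),
    microphone: TrackOption.enabled(),
  ),
);

// When the hosts are ready, make the call visible to viewers.
await call.goLive();
```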
To join the call, we can run another instance of the app on a second device and copy the call ID from the console. When the app is ready, we can click “Join a livestream” and paste in the ID we copied.
If all works as intended, we will be able to view the video feed from the first device and observe the view count increase by one.
RTMP Livestreaming
For more advanced livestreaming configurations, such as setups requiring multiple cameras or different scenes and animations, streaming tools such as OBS can be used together with Stream Video over RTMP (Real-Time Messaging Protocol).
By default, when a call is created, it is given a dedicated RTMP URL which can be used by most common streaming platforms to inject video into the call. To configure RTMP and OBS with Stream, two things are required:
- The RTMP URL of the call
- A “streaming key” consisting of your application’s API key and user token in the format apikey/usertoken
With these two pieces of information, we can update the settings in OBS then select the “Start Streaming” option to view our livestream in the application!
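For example, the values to paste into OBS could be assembled like this. The `apikey/usertoken` streaming-key format comes from the steps above; `rtmpIngestUrl` is a hypothetical placeholder for the call's dedicated RTMP URL:

```dart
// Credentials from the Stream dashboard. Consider a dedicated user
// object (and token) just for OBS streaming.
const apiKey = 'REPLACE_WITH_API_KEY';
const userToken = 'REPLACE_WITH_USER_TOKEN';

// The streaming key uses the documented `apikey/usertoken` format.
final streamKey = '$apiKey/$userToken';

// `rtmpIngestUrl` is a hypothetical placeholder for the call's dedicated
// RTMP URL; paste it into OBS as the "Server" and `streamKey` as the key.
const rtmpIngestUrl = 'rtmps://REPLACE_WITH_CALL_RTMP_URL';
```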
:::note A user with the name associated with the user token provided to OBS will appear in the call. It is worth creating a dedicated user object for OBS streaming. :::
Viewing a livestream (HLS)
The final piece of livestreaming using Stream is support for HLS, or HTTP Live Streaming. Unlike WebRTC-based streaming, HLS tends to have a 10- to 20-second delay, but it offers better video buffering under poor network conditions.
To enable HLS support, your call must first be placed into “broadcasting” mode using the call.startHLS() method.
Next, we can obtain the HLS URL by reading hlsPlaylistURL from call.state:
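A minimal sketch of this step (startHLS and hlsPlaylistURL are the members named above; the `call.state.value` accessor and the null check are assumptions, so consult your SDK version):

```dart
// Switch the call into broadcasting mode.
await call.startHLS();

// Read the playlist URL from the call state.
// (`call.state.value` is an assumption; check your SDK version.)
final hlsUrl = call.state.value.hlsPlaylistURL;
if (hlsUrl != null) {
  debugPrint('Viewers can watch at: $hlsUrl');
}
```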
With the HLS URL, your call can be broadcast to most livestreaming platforms, such as YouTube.
Recap
In just a few minutes, we were able to create our first livestreaming experience for our app. Please let us know if you ran into any issues during the process. Our team constantly reviews feedback and applies changes to improve the overall experience.
This is just a small example of what’s possible with Stream and livestreaming.
We hope you enjoyed this tutorial and look forward to hearing your suggestions and feedback.
Final Thoughts
In this video app tutorial we built a fully functioning Flutter livestreaming app with our Flutter SDK component library. We also showed how easy it is to customize the behavior and the style of the Flutter video app components with minimal code changes.
Both the video SDK for Flutter and the API have plenty more features available to support more advanced use-cases.
Give us Feedback!
Did you find this tutorial helpful in getting you up and running with Flutter for adding video to your project? Either good or bad, we’re looking for your honest feedback so we can improve.