Livestreaming

In this guide, we'll learn how to broadcast and watch a livestream using the Stream Video Flutter SDK. We will show you how to implement common livestream features, such as displaying the number of watchers, allowing users to wait before the livestream starts, handling different states, and much more.

For optimal mobile broadcasting performance, see our Mobile Livestreaming Broadcasting Guide, which covers crucial mobile-specific considerations including thermal management, battery optimization, network connectivity, and app resource management during streaming.

You can find a step-by-step tutorial for building a livestream app here and a working project here.

Watching a livestream

In this guide we will see how to watch a WebRTC livestream. We also support HLS and RTMP-out.

Let's do a quick overview of these three technologies:

- WebRTC is ideal for real-time, low-latency streaming such as video calls or live auctions.
- HLS (HTTP Live Streaming) is great for large-scale distribution, offering broad compatibility and adaptive bitrate streaming. However, it typically has higher latency (5–30 seconds), making it less suitable for interactive use cases.
- RTMP (Real-Time Messaging Protocol) was once the standard for low-latency streaming to platforms like YouTube or Twitch. While it's being phased out in favor of newer protocols, it's still commonly used for ingesting streams due to its reliability and low latency (~2–5 seconds).

We will show you how to watch the WebRTC livestream and implement some common livestreaming features.

We offer a default component, LivestreamPlayer, that comes with a predefined UI, in case it fits your use case.

Using the default livestream player is simple — you just need the Call object:

```dart
final call = StreamVideo.instance.makeCall(
  callType: StreamCallType.liveStream(),
  id: 'your_call_id',
);
await call.getOrCreate();

LivestreamPlayer(call: call);
```

You can find more details about the built-in LivestreamPlayer on the Watching a Livestream page.
The rest of the guide will be focused on building your own livestream player UI.
Livestream states
Creating the call from the host's side is similar, with the difference that you might want to make the user creating the call a host:

```dart
final call = StreamVideo.instance.makeCall(
  callType: StreamCallType.liveStream(),
  id: 'your_call_id',
);

await call.getOrCreate(
  members: [MemberRequest(userId: 'your_user_id', role: 'host')],
);

await call.join();
```

Now let's define the possible states of the UI based on the state of the call. We will create a widget with the following code:
```dart
class LivestreamScreen extends StatelessWidget {
  const LivestreamScreen({super.key, required this.call});

  final Call call;

  @override
  Widget build(BuildContext context) {
    return PartialCallStateBuilder(
      call: call,
      selector: (state) => (
        isBackstage: state.isBackstage,
        endedAt: state.endedAt,
      ),
      builder: (context, callState) {
        if (callState.endedAt != null) {
          return CallEndedContent(call: call);
        }

        if (callState.isBackstage) {
          return BackstageContent(call: call);
        }

        return CallLiveContent(call: call);
      },
    );
  }
}
```

This won't compile yet since we still need to define the inner widgets, but first let's describe the different states:
- When `state.isBackstage` is `true`, the call has not started yet (it's in backstage). By default, only hosts with the `join-backstage` capability can join a call in this state. You can also specify the `joinAheadTimeSeconds` parameter when creating a call to allow any type of user to join the livestream before it's started.
- If `state.endedAt` is not `null`, it means that the livestream has already finished.
- If the call is neither ended nor in backstage, then the livestream is live. In this state, we will show the host's video and other relevant data.
Backstage mode
While in backstage, you can show a countdown, or the start date. By listening to the isBackstage property from the CallState, you can also automatically render the video track as soon as it's available.
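The countdown itself is plain Dart. Here's a minimal sketch of a formatter you could feed from the call's startsAt value (the helper name and output format are our own, not part of the SDK):

```dart
/// Formats the time remaining until [startsAt] as e.g. '1h 5m 30s'.
/// Returns 'Starting now' once the start time has passed.
String formatCountdown(DateTime startsAt, DateTime now) {
  final remaining = startsAt.difference(now);
  if (remaining.isNegative || remaining == Duration.zero) {
    return 'Starting now';
  }
  final hours = remaining.inHours;
  final minutes = remaining.inMinutes % 60;
  final seconds = remaining.inSeconds % 60;
  return [
    if (hours > 0) '${hours}h',
    '${minutes}m',
    '${seconds}s',
  ].join(' ');
}
```

You would typically drive this from a periodic Timer (or Stream.periodic) so the label updates every second while the call is in backstage.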
Here's an example of how to handle backstage mode by showing the start date and the number of participants waiting to join the livestream:
```dart
class BackstageContent extends StatelessWidget {
  const BackstageContent({super.key, required this.call});

  final Call call;

  @override
  Widget build(BuildContext context) {
    return PartialCallStateBuilder(
      call: call,
      selector: (state) => (
        startsAt: state.startsAt,
        waitingCount: state.callParticipants
            .where((p) => !p.roles.contains('host'))
            .length,
      ),
      builder: (context, data) {
        return Center(
          child: Column(
            mainAxisAlignment: MainAxisAlignment.center,
            children: [
              Text(
                data.startsAt != null
                    ? 'Livestream starting at ${data.startsAt!.toLocal()}'
                    : 'Livestream starting soon',
                style: Theme.of(context).textTheme.titleLarge,
              ),
              if (data.waitingCount > 0) ...[
                const SizedBox(height: 16),
                Text('${data.waitingCount} participants waiting'),
              ],
              const SizedBox(height: 32),
              ElevatedButton(
                onPressed: () => call.goLive(),
                child: const Text('Go Live'),
              ),
            ],
          ),
        );
      },
    );
  }
}
```

To transition from backstage to live, call call.goLive(). You can also start HLS broadcasting, recording, or transcription simultaneously:
```dart
await call.goLive(
  startHls: true,
  startRecording: true,
  startTranscription: true,
);
```

To allow users to join the call before it starts, set the joinAheadTimeSeconds parameter when updating the call:
```dart
await call.update(
  startsAt: DateTime.now().toUtc().add(const Duration(minutes: 5)),
  backstage: const StreamBackstageSettings(
    enabled: true,
    joinAheadTimeSeconds: 300,
  ),
);
```

Call Ended
When a livestream has ended, its endedAt property is updated with the date and time when the livestream finished. You can use this to show a message to the users, and additionally show them recordings of the call, if available.
```dart
class CallEndedContent extends StatefulWidget {
  const CallEndedContent({super.key, required this.call});

  final Call call;

  @override
  State<CallEndedContent> createState() => _CallEndedContentState();
}

class _CallEndedContentState extends State<CallEndedContent> {
  late Future<Result<List<CallRecording>>> _recordingsFuture;

  @override
  void initState() {
    super.initState();
    _recordingsFuture = widget.call.listRecordings();
  }

  @override
  Widget build(BuildContext context) {
    return Center(
      child: Column(
        mainAxisAlignment: MainAxisAlignment.center,
        children: [
          const Text('Livestream has ended'),
          const SizedBox(height: 24),
          FutureBuilder(
            future: _recordingsFuture,
            builder: (context, snapshot) {
              if (!snapshot.hasData) {
                return const CircularProgressIndicator();
              }

              final recordings = snapshot.data!.getDataOrNull();
              if (recordings == null || recordings.isEmpty) {
                return const Text('No recordings available');
              }

              return Column(
                children: [
                  const Text('Watch recordings:'),
                  const SizedBox(height: 8),
                  ...recordings.map(
                    (recording) => TextButton(
                      onPressed: () {
                        // Open recording URL
                      },
                      child: Text(recording.filename),
                    ),
                  ),
                ],
              );
            },
          ),
        ],
      ),
    );
  }
}
```

Call Live
Let's provide an example implementation for the widget used when the livestream is active:
```dart
class CallLiveContent extends StatelessWidget {
  const CallLiveContent({super.key, required this.call});

  final Call call;

  @override
  Widget build(BuildContext context) {
    return PartialCallStateBuilder(
      call: call,
      selector: (state) {
        final participants = state.callParticipants;
        final hostIds = participants
            .where((m) => m.roles.contains('host'))
            .map((m) => m.userId)
            .toSet();

        return (
          host: participants
              .where((p) => hostIds.contains(p.userId))
              .firstOrNull,
          totalParticipants: participants.length,
        );
      },
      builder: (context, data) {
        final host = data.host;

        return Stack(
          children: [
            if (host != null)
              StreamCallParticipant(
                call: call,
                participant: host,
                showConnectionQualityIndicator: false,
                showParticipantLabel: false,
                showSpeakerBorder: false,
              )
            else
              const Center(
                child: Text("The host's video is not available"),
              ),
            Positioned(
              top: 16,
              left: 16,
              child: StreamBuilder<Duration>(
                stream: call.callDurationStream,
                builder: (context, snapshot) {
                  final duration = snapshot.data ?? Duration.zero;
                  final minutes = duration.inMinutes;
                  final seconds = duration.inSeconds % 60;
                  return Text(
                    '${minutes}m ${seconds}s',
                    style: const TextStyle(color: Colors.white),
                  );
                },
              ),
            ),
            Positioned(
              top: 16,
              right: 16,
              child: Text(
                'Viewers: ${data.totalParticipants}',
                style: const TextStyle(color: Colors.white),
              ),
            ),
          ],
        );
      },
    );
  }
}
```

Let's break down the implementation:
Rendering the livestream track
Finding and rendering the participant whose video track is shown depends on your use-case — whether it supports only one streamer, whether you want to limit this functionality by role, or any other special requirements. In the example above, we're trying to find a user with the host role.
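If your use case doesn't rely on an explicit host role, one common fallback is to feature the first participant that is publishing video. Here's a minimal sketch of that selection logic using a simplified stand-in type (the real SDK's participant type differs, and pickFeatured is our own helper, not an SDK API):

```dart
/// Simplified stand-in for the SDK's participant state, for illustration only.
class Participant {
  Participant(this.userId, this.roles, {this.isVideoEnabled = false});

  final String userId;
  final List<String> roles;
  final bool isVideoEnabled;
}

/// Picks the participant to feature: prefers an explicit 'host' role,
/// otherwise falls back to the first participant publishing video.
Participant? pickFeatured(List<Participant> participants) {
  for (final p in participants) {
    if (p.roles.contains('host')) return p;
  }
  for (final p in participants) {
    if (p.isVideoEnabled) return p;
  }
  return null;
}
```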
This example uses the StreamCallParticipant widget for rendering. You can also use the lower-level StreamVideoRenderer for full control over rendering, as described in our Video Renderer docs.
Livestream information
You can show various information about the livestream, such as the duration and the participant count:
- You can read the total participant count from `state.callParticipants.length`. For anonymous users, you can also use the session participant count.
- The call duration is available via `call.callDurationStream`, which provides a stream of the elapsed time since the call went live.
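Formatting that Duration for display is plain Dart; here's a small sketch (the function name and 'h:mm:ss' format are our own choices):

```dart
/// Formats an elapsed duration as 'm:ss', or 'h:mm:ss' once the
/// stream passes the one-hour mark.
String formatElapsed(Duration d) {
  String two(int n) => n.toString().padLeft(2, '0');
  final h = d.inHours;
  final m = d.inMinutes % 60;
  final s = d.inSeconds % 60;
  return h > 0 ? '$h:${two(m)}:${two(s)}' : '$m:${two(s)}';
}
```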
You can also watch queried calls, as explained here. This allows you to present the participant count (and other call data) even without joining a call.
Error states
Livestreaming depends on many factors, such as network conditions on both the publishing and viewer side.
Proper error handling is needed to be transparent about the potential issues the user might be facing.
When the network drops, the SDK tries to reconnect the user to the call. However, if it fails to do that, the status in the CallState becomes CallStatusDisconnected. This gives you the chance to show an alert to the user and provide some custom handling (e.g. a message to check the network connection and try again).
Here's an example on how to do that:
```dart
call.partialState((state) => state.status).listen((status) {
  if (status is CallStatusReconnecting) {
    // The default SDK components will already show some indicators,
    // but you can also react to this status on your end.
  } else if (status is CallStatusReconnectionFailed) {
    // Show a "Cannot join livestream. Try again later" message.
  } else if (status is CallStatusDisconnected) {
    // Show a "You are disconnected" message.
  }
});
```

Additionally, you should wrap your call actions (creating a call, joining a call, going live, etc.) with proper error handling to visually notify the user if there's a failure:
```dart
final result = await call.goLive();

if (result.isFailure) {
  // Show error to the user
}
```

Handling the camera and microphone

To offer the host the possibility to enable or disable the camera and microphone, use the call.setCameraEnabled() and call.setMicrophoneEnabled() methods:
```dart
await call.setCameraEnabled(enabled: true);
await call.setMicrophoneEnabled(enabled: true);
```

You can observe the current state of devices to update your UI:
```dart
PartialCallStateBuilder(
  call: call,
  selector: (state) => (
    isCameraEnabled: state.localParticipant?.isVideoEnabled ?? false,
    isMicrophoneEnabled: state.localParticipant?.isAudioEnabled ?? false,
  ),
  builder: (context, data) {
    return Row(
      children: [
        IconButton(
          icon: Icon(
            data.isCameraEnabled ? Icons.videocam : Icons.videocam_off,
          ),
          onPressed: () =>
              call.setCameraEnabled(enabled: !data.isCameraEnabled),
        ),
        IconButton(
          icon: Icon(
            data.isMicrophoneEnabled ? Icons.mic : Icons.mic_off,
          ),
          // Toggle by passing the negated current state.
          onPressed: () => call.setMicrophoneEnabled(
            enabled: !data.isMicrophoneEnabled,
          ),
        ),
      ],
    );
  },
);
```

For more details on camera and microphone configuration, see the Camera and Microphone docs.
Broadcasting (HLS)
The Stream Video Flutter SDK has support for HLS broadcasting. This allows you to broadcast a livestream over HLS, which can then be watched using any HLS-compatible player.
Start and stop HLS broadcasting
```dart
// Start HLS broadcasting
final result = await call.startHLS();

if (result.isSuccess) {
  final hlsUrl = result.getDataOrNull();
  // hlsUrl contains the HLS playlist URL
}

// Stop HLS broadcasting
await call.stopHLS();
```

After a few seconds of setup, broadcasting will start and the state of the call will be updated: the isBroadcasting flag will become true.
```dart
PartialCallStateBuilder(
  call: call,
  selector: (state) => state.isBroadcasting,
  builder: (context, isBroadcasting) {
    return Text(isBroadcasting ? 'Broadcasting' : 'Not broadcasting');
  },
);
```

Listening to broadcasting events
You can listen to broadcasting-related events via the callEvents stream on the Call object:
```dart
call.callEvents.on<StreamCallBroadcastingStartedEvent>((event) {
  // Broadcasting has started
});

call.callEvents.on<StreamCallBroadcastingStoppedEvent>((event) {
  // Broadcasting has stopped
});

call.callEvents.on<StreamCallBroadcastingFailedEvent>((event) {
  // Broadcasting failed
});
```

Retrieving the broadcast URL
The HLS playlist URL can be retrieved from the call state after broadcasting has started:
```dart
final hlsUrl = call.state.value.egress.hlsPlaylistUrl;
```

This URL can be used by others to watch the broadcast in any HLS-compatible player.
Displaying HLS
To display an HLS stream in your Flutter app, you can use a video player package that supports HLS playback, such as video_player or media_kit:
```dart
import 'package:video_player/video_player.dart';

final controller = VideoPlayerController.networkUrl(
  Uri.parse(hlsUrl),
);

await controller.initialize();
controller.play();
```

Starting HLS when going live
You can also start HLS broadcasting automatically when the call goes live by passing startHls: true to the goLive method:

```dart
await call.goLive(startHls: true);
```

This is a convenient way to ensure HLS broadcasting starts alongside the WebRTC stream.
RTMP-In
You can use RTMP streams as input for a call. This is useful for integrating professional broadcasting tools like OBS.
The RTMP address and stream key are available from the call state:

```dart
final rtmpAddress = call.state.value.rtmpIngress;
```

The streaming key for OBS and other tools uses the format: apiKey/userToken.
To configure OBS with Stream:
- Set the RTMP URL to the `rtmpAddress` from the call state
- Set the streaming key to `your_api_key/user_token`
- Start streaming from OBS
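The key format above is simple concatenation. As a tiny sketch (the helper name is ours, and both arguments are placeholders you supply from your own configuration):

```dart
/// Builds the OBS streaming key in the apiKey/userToken format
/// described above. Both arguments are placeholders you supply.
String obsStreamKey({required String apiKey, required String userToken}) {
  return '$apiKey/$userToken';
}
```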
A user with the name and token provided to OBS will appear in the call. Consider creating a dedicated user for OBS streaming.
We plan to add support for other livestreaming protocols in the future. If something is missing, be sure to let us know.
Going live and stopping
To start and stop the livestream programmatically:
```dart
// Go live (exits backstage mode)
final result = await call.goLive();

// Go live with additional options
await call.goLive(
  startHls: true,
  startRecording: true,
  startTranscription: true,
);

// Stop the livestream (returns to backstage mode)
await call.stopLive();

// End the call entirely
await call.end();
```

The difference between stopLive() and end():

- `stopLive()` returns the call to backstage mode — the host can go live again later.
- `end()` ends the call permanently — no one can rejoin.
User roles and permissions
For livestreaming, it's important to understand how roles and permissions work:
- Each user has a role that is scoped per call.
- The default role is `user`. You can assign a different role when creating a call or adding members via `getOrCreate()`.
- In the Stream Dashboard under Roles & Permissions, permissions are configured per call type and per role. Review the settings for the `livestream` call type to ensure they align with your use case.
- By default, the `user` role may not have the CreateCall permission, so users who should create/start livestreams need the `host` role (or the permission needs to be granted to `user`).
- The same applies to the Join Backstage permission: by default only hosts can join when the call is not live yet. For regular users, `join()` will fail in that state.
For example, to create a livestream call where the current user is added as a host:

```dart
final call = StreamVideo.instance.makeCall(
  callType: StreamCallType.liveStream(),
  id: 'my_livestream',
);

await call.getOrCreate(
  members: [
    MemberRequest(
      userId: StreamVideo.instance.currentUser.id,
      role: 'host',
    ),
  ],
);
```