Livestreaming
In this guide, we’re going to see how we can watch a livestream using Stream Video’s iOS SDK. We will also show you how to implement other common livestream features, such as displaying the number of watchers, letting users wait before the livestream starts, handling different states, and much more.
You can find a working project that uses the examples below here.
Watching a livestream
In this guide, we will see how to watch a WebRTC livestream. We also support HLS and RTMP-out.
Let’s do a quick overview of the three technologies:
- WebRTC is ideal for real-time, low-latency streaming such as video calls or live auctions.
- HLS (HTTP Live Streaming) is great for large-scale distribution, offering broad compatibility and adaptive bitrate streaming. However, it typically has higher latency (5–30 seconds), making it less suitable for interactive use cases.
- RTMP (Real-Time Messaging Protocol) was once the standard for low-latency streaming to platforms like YouTube or Twitch. While it’s being phased out in favor of newer protocols, it’s still commonly used for ingesting streams due to its reliability and low latency (~2–5 seconds).
We will show you how to watch the WebRTC livestream and implement some common livestreaming features.
We also offer a default LivestreamPlayer component that comes with a predefined UI, in case it fits your use case.
Integrating the default livestream player is very simple; you just need the call type and id:
LivestreamPlayer(type: "livestream", id: "123")
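For instance, here’s a minimal sketch of embedding the player in one of your own screens (the LivestreamScreen name is illustrative):
import StreamVideoSwiftUI
import SwiftUI

// Illustrative container view that embeds the SDK's predefined livestream UI.
struct LivestreamScreen: View {
    var body: some View {
        LivestreamPlayer(type: "livestream", id: "123")
    }
}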
You can find more details about the built-in LivestreamPlayer on the following page.
The rest of the guide will be focused on building your own livestream player view.
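Before building the view, you need a call to watch. Here’s a minimal, hedged sketch of joining a livestream call as a viewer; it assumes an already configured StreamVideo client named streamVideo, and the call id is a placeholder:
// Create the call object for an existing livestream and join it as a viewer.
let call = streamVideo.call(callType: "livestream", callId: "livestream_123")

Task {
    do {
        try await call.join()
    } catch {
        print("Error joining the livestream: \(error)")
    }
}
The call’s state (call.state) is what will drive the custom view we build below.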
Livestream states
We will create a SwiftUI view called LivestreamView. Let’s define the possible states of the view, based on the state of a call. In the body of the view, add the following code:
@StateObject var state: CallState

var body: some View {
    VStack {
        if state.backstage {
            backstageView
        } else if state.endedAt != nil {
            callEndedView
        } else {
            livestreamInfoView
            videoRendererView
        }
    }
}
This won’t compile for now, since we also need to define the helper views.
First, let’s describe the different states:
- When state.backstage is true, the call has not started yet (it’s in backstage). By default, only hosts with the join-backstage capability can join a call in this state. You can also set the joinAheadTimeSeconds setting to allow any user to join the livestream before it starts.
- If state.endedAt is not nil, the livestream has already finished.
- If the call is neither ended nor in backstage, the livestream is live. In this state, we will show the host’s video and other relevant data.
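For context, the host typically takes the call out of backstage when the stream should start. Here’s a minimal, hedged sketch of that host-side action (it assumes a call object the host has already joined):
// Host side: move the call out of backstage so viewers start receiving the stream.
Task {
    do {
        try await call.goLive()
    } catch {
        print("Error going live: \(error)")
    }
}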
Backstage mode
While in backstage, you can show a countdown, or the start date. By listening to the backstage property in the CallState, you can also automatically render the video track as soon as it’s available.
Here’s an example of how to handle the backstage mode, by showing the start date and the number of participants waiting to join the livestream:
@ViewBuilder
var backstageView: some View {
    if let startsAt = state.startsAt {
        Text("Livestream starting at \(startsAt.formatted())")
    } else {
        Text("Livestream starting soon")
    }
    if let session = state.session {
        let waitingCount = session.participants.filter({ $0.role != "host" }).count
        if waitingCount > 0 {
            Text("\(waitingCount) participants waiting")
                .font(.headline)
                .padding(.horizontal)
        }
    }
}
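If you prefer a live countdown instead of a static date, plain SwiftUI date formatting is enough. A small, illustrative variant for the backstage view:
// Illustrative: a self-updating countdown to the scheduled start time.
if let startsAt = state.startsAt, startsAt > Date() {
    Text("Livestream starts in ") + Text(startsAt, style: .relative)
}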
Call Ended
When a livestream has ended, its endedAt property is updated with the date and time when the livestream finished.
You can use this to show a message to the users, and additionally show them recordings of the call, if available.
@State var recordings: [CallRecording]?

@ViewBuilder
var callEndedView: some View {
    Text("Call ended")
        .onAppear {
            if recordings == nil {
                Task {
                    do {
                        recordings = try await call?.listRecordings()
                    } catch {
                        print("Error fetching recordings: \(error)")
                        recordings = []
                    }
                }
            }
        }
    if let recordings, recordings.count > 0 {
        Text("Watch recordings:")
        ForEach(recordings, id: \.self) { recording in
            Button {
                if let url = URL(string: recording.url), UIApplication.shared.canOpenURL(url) {
                    UIApplication.shared.open(url)
                }
            } label: {
                Text(recording.url)
            }
        }
    }
}
In the example above, we are using the call.listRecordings API method to fetch any recordings available (if the call was recorded).
Call live view
Next, let’s provide an implementation of the views shown when the livestream is active. As shown in the initial code above, this state consists of two views: livestreamInfoView and videoRendererView.
Let’s first show some information about the livestream, such as the duration and the participant count.
@Injected(\.formatters.mediaDuration) private var formatter: MediaDurationFormatter

@ViewBuilder
var livestreamInfoView: some View {
    HStack {
        if let duration = formatter.format(state.duration) {
            Text("Live for \(duration)")
                .font(.headline)
                .padding(.horizontal)
        }
        Spacer()
        Text("Live \(state.participantCount)")
            .bold()
            .padding(.all, 4)
            .foregroundColor(.white)
            .background(Color.blue)
            .cornerRadius(8)
            .opacity(state.backstage ? 0 : 1)
            .padding(.horizontal)
    }
}
As shown above, you can access the total participant count (including anonymous users) via the participantCount property in the CallState. You can also watch queried calls, as explained here. This allows you to present the participant count (and other call data) even without joining a call.
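As an illustration, one way to read call data before joining is to fetch it explicitly. A hedged sketch, assuming the get() method is available on the call object in your SDK version:
// Assumed API: fetch the call's current data (e.g. backstage flag, start time) without joining it.
Task {
    do {
        _ = try await call.get()
        print("Backstage: \(call.state.backstage)")
    } catch {
        print("Error fetching call data: \(error)")
    }
}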
Next, let’s see how to implement the view that will render the livestream.
@ViewBuilder
var videoRendererView: some View {
    GeometryReader { reader in
        if let first = state.participants.first(where: { hostIds.contains($0.userId) }) {
            VideoRendererView(id: first.id, size: reader.size) { renderer in
                renderer.handleViewRendering(for: first) { size, participant in }
            }
        } else {
            Text("The host's video is not available")
        }
    }
    .padding()
}

var hostIds: [String] {
    state.members.filter { $0.role == "host" }.map(\.id)
}
Finding and rendering the participant whose video track is shown depends on your use case: whether it supports only one streamer, whether you want to limit this functionality by role, or whether you have other special requirements. In the example above, we’re looking for a user with the “host” role.
This example uses our lower-level VideoRendererView for rendering tracks. You can also use the higher-level VideoCallParticipantView (which also comes with a fallback background). You can find an example of its usage in our livestream tutorial.
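If roles don’t fit your setup, another option is to render whoever is currently publishing video. A small, illustrative variant, assuming the hasVideo flag on CallParticipant:
// Illustrative alternative: pick the first participant that currently has a video track.
var streamingParticipant: CallParticipant? {
    state.participants.first(where: { $0.hasVideo })
}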
Error states
Livestreaming depends on many factors, such as the network conditions of both the user publishing the stream and the viewers.
Proper error handling is needed to be transparent about the issues the user might be facing.
When the network drops, the SDK tries to reconnect the user to the call. However, if it fails to do that, the reconnectionStatus in the CallState becomes disconnected. This gives you the chance to show an alert to the user and provide some custom handling (e.g. a message to check the network connection and try again).
Here’s an example of how to do that, by attaching an onChange modifier to your view:
.onChange(of: state.reconnectionStatus) { oldValue, newValue in
    if oldValue == .reconnecting && newValue == .disconnected {
        errorShown = true
    }
}
.alert("You were disconnected from the call", isPresented: $errorShown, actions: {
    // add a custom error handling behaviour
})
Additionally, you should add a catch block to all the actions in a Task (creating a call, joining a call, going live, etc.), to visually notify the user in case any of these actions fail.
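For instance, here’s a hedged sketch of surfacing a failed join attempt, reusing the errorShown state from the alert above:
// Wrap call actions in do/catch so failures reach the UI instead of being silently dropped.
Task {
    do {
        try await call.join()
    } catch {
        print("Error joining the call: \(error)")
        errorShown = true
    }
}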
Handling Volume
The SDK respects the volume controls on the device. One note: if you are sharing either video or audio, you can’t fully disable the audio, because of the video chat audio session mode.
However, the SDK dynamically updates the audio session type when you are sharing neither video nor audio, allowing viewers in a livestream to completely disable the audio via the hardware buttons.
In order to support that behaviour, you should use the OwnCapabilitiesAudioSessionPolicy, as described here.
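As a rough sketch of applying the policy (treat the updateAudioSessionPolicy method name as an assumption and check the page linked above for the exact API):
// Assumed API: switch the call to the capabilities-based audio session policy.
Task {
    do {
        try await call.updateAudioSessionPolicy(OwnCapabilitiesAudioSessionPolicy())
    } catch {
        print("Error updating the audio session policy: \(error)")
    }
}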
Additionally, you can disable the audio with a UI element in your app:
Button(
    action: {
        callSettings = callSettings.withUpdatedSpeakerState(!callSettings.speakerOn)
        Task {
            try await call.speaker.toggleSpeakerPhone()
        }
    },
    label: {
        CallIconView(
            icon: callSettings.speakerOn
                ? images.speakerOn
                : images.speakerOff,
            size: size,
            iconStyle: callSettings.speakerOn
                ? .primary
                : .transparent
        )
    }
)
Similarly, you can mute yourself with the following code:
Button(
    action: {
        callSettings = callSettings.withUpdatedAudioState(!callSettings.audioOn)
        Task {
            try await call.microphone.toggle()
        }
    },
    label: {
        CallIconView(
            icon: callSettings.audioOn
                ? images.micTurnOn
                : images.micTurnOff,
            size: size,
            iconStyle: callSettings.audioOn
                ? .transparent
                : .disabled
        )
    }
)