Livestreaming

In this guide, we’re going to learn how to broadcast and watch a livestream using the Stream Video Android SDK. We will also show you how to implement other common livestream features, such as displaying the number of watchers, allowing users to wait before the livestream starts, and handling different states.

You can find a working project that uses the examples below here.

Watching a livestream

In this guide we will see how to watch a WebRTC livestream. We also support HLS and RTMP-out.

Let’s do a quick overview of these three technologies:

  • WebRTC is ideal for real-time, low-latency streaming such as video calls or live auctions.
  • HLS (HTTP Live Streaming) is great for large-scale distribution, offering broad compatibility and adaptive bitrate streaming. However, it typically has higher latency (5–30 seconds), making it less suitable for interactive use cases.
  • RTMP (Real-Time Messaging Protocol) was once the standard for low-latency streaming to platforms like YouTube or Twitch. While it’s being phased out in favor of newer protocols, it’s still commonly used for ingesting streams due to its reliability and low latency (~2–5 seconds).

We will show you how to watch the WebRTC livestream and implement some common livestreaming features.

We offer a default component, LivestreamPlayer, that comes with a predefined UI, in case it fits your use-case.

Using the default livestream player is very simple: you just need the call object.

val call = streamVideo.call("livestream", "your_call_id")

LivestreamPlayer(call = call)

You can find more details about the built-in LivestreamPlayer on the following page.

The rest of the guide will be focused on building your own livestream player UI.

Livestream states

Let’s define the possible states of the UI based on the state of the call. We will create a composable with the following code:

@Composable
fun LivestreamScreenContent(call: Call) {
    val isCallInBackstage by call.state.backstage.collectAsStateWithLifecycle()
    val endedAt by call.state.endedAt.collectAsStateWithLifecycle()

    Box(
        modifier = Modifier.fillMaxSize(),
        contentAlignment = Alignment.Center
    ) {
        if (endedAt != null) {
            CallEndedContent(call)
        } else if (isCallInBackstage) {
            Backstage(call)
        } else {
            CallLiveContent(call)
        }
    }
}

This won’t compile yet, since we still need to define the inner composables. First, let’s describe the different states:

  • When call.state.backstage is true, the call is not started yet (it’s in backstage). By default, only hosts with the join-backstage capability can join a call in this state. You can also specify the joinAheadTimeSeconds parameter when creating a call to allow any type of user to join the livestream before it’s started.
  • If call.state.endedAt is not null, it means that the livestream has already finished.
  • If it’s not ended or in backstage, then the livestream is live. In this state, we will show the host’s video and other relevant data.
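To make the joinAheadTimeSeconds semantics concrete, here is a small self-contained sketch (plain Kotlin, not SDK code; canViewerJoin is a hypothetical helper) that computes whether a viewer may join, given the scheduled start time and the join-ahead window:

```kotlin
import java.time.Duration
import java.time.Instant

// Hypothetical helper illustrating the joinAheadTimeSeconds semantics:
// a regular viewer may join once the current time is within the
// join-ahead window before the scheduled start. Hosts with the
// join-backstage capability can always join while in backstage.
fun canViewerJoin(now: Instant, startsAt: Instant, joinAheadTimeSeconds: Long): Boolean {
    val earliestJoin = startsAt.minus(Duration.ofSeconds(joinAheadTimeSeconds))
    return !now.isBefore(earliestJoin)
}
```

For example, with joinAheadTimeSeconds set to 900 (15 minutes), viewers can join from 15 minutes before the scheduled start onward.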

Backstage mode

While in backstage, you can show a countdown, or the start date. By listening to the backstage property from the CallState, you can also automatically render the video track as soon as it’s available.

Here’s an example of how to handle the backstage mode by showing the start date and the number of participants that are waiting to join the livestream:

@Composable
fun Backstage(call: Call) {
    Column {
        val startsAt by call.state.startsAt.collectAsStateWithLifecycle()
        val formattedStartsAt = startsAt?.format(
            DateTimeFormatter.ofPattern("MMM dd HH:mm")
        )

        if (formattedStartsAt != null) {
            Text("Livestream will start at $formattedStartsAt")
        } else {
            Text("Livestream will start soon")
        }

        val waitingCount by call.state.session.map { session ->
            session?.participants?.count { participant -> participant.role != "host" }
        }.collectAsStateWithLifecycle(null)

        waitingCount?.let {
            Spacer(Modifier.height(16.dp))
            Text("$it users waiting")
        }
    }
}

Call Ended

When a livestream has ended, its endedAt property is updated with the date and time when the livestream has finished. You can use this to show a message to the users, and additionally show them recordings of the call, if available.

In the example below, we are using the call.listRecordings API method to fetch any recordings available (if the call was recorded).

@Composable
fun CallEndedContent(call: Call) {
    Column {
        Text("Livestream ended")
        Spacer(Modifier.height(24.dp))
        Recordings(call)
    }
}

@Composable
fun Recordings(call: Call) {
    var recordings by remember { mutableStateOf(emptyList<CallRecording>()) }
    val context = LocalContext.current

    LaunchedEffect(Unit) {
        call.listRecordings()
            .onSuccess { recordings = it.recordings }
            .onError { recordings = emptyList<CallRecording>() }
    }

    if (recordings.isEmpty()) {
        Text("No recordings available")
    } else {
        Text("Recordings available")
        Spacer(Modifier.height(8.dp))

        recordings.forEach {
            Text(
                text = it.filename,
                modifier = Modifier.clickable {
                    try {
                        val intent = Intent(Intent.ACTION_VIEW, it.url.toUri())
                        context.startActivity(intent)
                    } catch (e: Exception) {
                        Log.e(TAG, "Error opening recording: $e")
                    }
                }
            )
            Spacer(Modifier.height(8.dp))
        }
    }
}

Call Live

Let’s provide an example implementation for the composables used when the livestream is active. As shown in LivestreamScreenContent above, this state is represented by the CallLiveContent composable:

@Composable
fun CallLiveContent(call: Call) {
    val members by call.state.members.collectAsStateWithLifecycle()
    val hostIds = members.filter { it.role == "host" }.map { it.user.id }
    val participants by call.state.participants.collectAsStateWithLifecycle()
    val host = participants.firstOrNull { it.userId.value in hostIds }
    val videoTrack = host?.video?.collectAsStateWithLifecycle()?.value

    val totalParticipants by call.state.totalParticipants.collectAsStateWithLifecycle()
    val viewers = max(0, totalParticipants - 1)
    val duration by call.state.duration.collectAsStateWithLifecycle()

    Box(modifier = Modifier.fillMaxSize()) {
        VideoRenderer(call = call, video = videoTrack)
        Text(
            text = "${duration ?: "0m 0s"}",
            modifier = Modifier
                .align(Alignment.TopStart)
                .padding(24.dp)
        )
        Text(
            text = "Viewers: $viewers",
            modifier = Modifier
                .align(Alignment.TopEnd)
                .padding(24.dp)
        )
    }
}

Next, let’s break down the implementation details of the code snippet above.

Rendering the livestream track

Finding and rendering the participant whose video track is shown depends on your use case: whether you support only one streamer, whether you want to limit this functionality by role, or whether you have other special requirements. In the example above, we’re trying to find a user with the host role.

This example uses our low level VideoRenderer component (docs here) for rendering the track. You can also use the higher level ParticipantVideo (that also contains a label and a connection quality indicator). You can find an example of its usage in our docs.

Livestream Information

You can show various information about the livestream, such as the duration and the participant count:

  • You can read the total participant count (including anonymous users) from the totalParticipants property of CallState.
  • Frequently, the call duration is also presented in a livestream. This information is available with the call.state.duration property, as you can see above.
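If you want a custom label instead of the raw duration value, a minimal formatter might look like the following sketch (plain Kotlin; formatDuration is a hypothetical helper, assuming the duration is available as a kotlin.time.Duration). It produces the "Xm Ys" shape used as the placeholder in CallLiveContent above:

```kotlin
import kotlin.time.Duration

// Hypothetical formatter: renders a Duration as "Xm Ys",
// e.g. 125 seconds becomes "2m 5s".
fun formatDuration(duration: Duration): String {
    val totalSeconds = duration.inWholeSeconds
    return "${totalSeconds / 60}m ${totalSeconds % 60}s"
}
```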

You can also watch queried calls, as explained here. This allows you to present participant count (and other call data), even without joining a call.

Error states

Livestreaming depends on many factors, such as network conditions on both the publishing and viewer side.

Proper error handling is needed, and it’s good to be transparent about the potential issues the user might be facing.

When the network drops, the SDK tries to reconnect the user to the call. However, if it fails to do that, the connection flow in the CallState becomes Disconnected. This gives you the chance to show an alert to the user and provide some custom handling (e.g. a message to check the network connection and try again).

Here’s an example of how to do that:

val connection by call.state.connection.collectAsStateWithLifecycle()

Text(
    text = when (connection) {
        is RealtimeConnection.Reconnecting -> "Reconnecting, please wait"
        is RealtimeConnection.Disconnected -> "You are disconnected"
        is RealtimeConnection.Failed -> "Cannot join livestream. Try again later"
        else -> "A connection error occurred"
    }
)

Additionally, you should wrap actions like creating a call, joining a call, or going live in a try-catch block, so you can visually notify the user if there’s a failure.
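One way to structure this is a small wrapper that converts a thrown exception into a user-visible message. The sketch below is plain Kotlin, not SDK code; ActionResult and runCallAction are hypothetical names, and the action lambda stands in for calls like joining a call or going live:

```kotlin
// Hypothetical sketch: run a call action and surface a failure
// as a message the UI can display, instead of crashing.
sealed class ActionResult {
    object Success : ActionResult()
    data class Failure(val message: String) : ActionResult()
}

inline fun runCallAction(action: () -> Unit): ActionResult =
    try {
        action()
        ActionResult.Success
    } catch (e: Exception) {
        ActionResult.Failure(e.message ?: "Something went wrong. Please try again.")
    }
```

In a Compose screen, you could store the returned Failure message in state and render it as an alert or snackbar.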

Handling the Camera and Microphone

To offer the host the possibility to enable or disable the camera and microphone, use the call.camera and call.microphone properties, together with the ToggleCameraAction and ToggleMicrophoneAction UI components.

val isCameraEnabled by call.camera.isEnabled.collectAsStateWithLifecycle()
val isMicrophoneEnabled by call.microphone.isEnabled.collectAsStateWithLifecycle()

ToggleCameraAction(isCameraEnabled = isCameraEnabled) {
    call.camera.setEnabled(it.isEnabled)
}

ToggleMicrophoneAction(isMicrophoneEnabled = isMicrophoneEnabled) {
    call.microphone.setEnabled(it.isEnabled)
}

Handling the Volume

The SDK respects the device volume controls. In order to let viewers completely mute the livestream audio, you have to instruct the SDK to use the AudioAttributes.USAGE_MEDIA audio usage type. You can do this when building the SDK by passing the livestreamGuestCall configuration in the callServiceConfigRegistry parameter:

val registry = CallServiceConfigRegistry()
registry.register(
    CallType.Livestream.name, DefaultCallConfigurations.livestreamGuestCall
)

StreamVideoBuilder(
    context = context,
    apiKey = "your_api_key",
    user = User(id = "your_user_id", name = "your_user_name"),
    token = "your_user_token",
    notificationConfig = /* ... */,
    callServiceConfigRegistry = registry,
).build()

You can find more configuration options in our docs.

© Getstream.io, Inc. All Rights Reserved.