Livestreaming

In this guide, we’re going to see how we can watch a livestream using Stream Video’s React Native SDK. We will also show you how to implement other common livestream features, such as displaying the number of watchers, allowing users to wait before the livestream starts, handling the different call states, and much more.

You can find a working project that uses the examples below here.

Watching a livestream

In this guide we will see how to watch a WebRTC livestream. We also support HLS and RTMP-out.

Let’s do a quick overview of these three technologies:

  • WebRTC is ideal for real-time, low-latency streaming such as video calls or live auctions.
  • HLS (HTTP Live Streaming) is great for large-scale distribution, offering broad compatibility and adaptive bitrate streaming. However, it typically has higher latency (5–30 seconds), making it less suitable for interactive use cases.
  • RTMP (Real-Time Messaging Protocol) was once the standard for low-latency streaming to platforms like YouTube or Twitch. While it’s being phased out in favor of newer protocols, it’s still commonly used for ingesting streams due to its reliability and low latency (~2–5 seconds).
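
If you go with HLS instead, viewers typically play an HLS playlist URL in any HLS-capable player. As a minimal sketch (assuming you have a call object and the host has already started HLS broadcasting on it, for example via call.startHLS()), the playlist URL can be read from the call state:

// A sketch, not a full player: the playlist URL is only available once
// HLS broadcasting has been started on the call.
const hlsPlaylistUrl = call.state.egress?.hls?.playlist_url;
if (hlsPlaylistUrl) {
  // Pass hlsPlaylistUrl to the HLS-capable player of your choice.
}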

We will show you how to watch the WebRTC livestream and implement some common livestreaming features.

We also offer a default component, LivestreamPlayer, that comes with a predefined UI, in case it fits your use-case.

Integrating the default livestream player is very simple, you just need the call type and id:

import { LivestreamPlayer } from "@stream-io/video-react-native-sdk";

// Use the component in your app
<LivestreamPlayer callType="livestream" callId="your_call_id" />;
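
Note that, like the other SDK components, LivestreamPlayer needs to be rendered inside a StreamVideo provider with a connected client. A minimal sketch, assuming client is an already initialized StreamVideoClient in scope:

import {
  LivestreamPlayer,
  StreamVideo,
} from "@stream-io/video-react-native-sdk";

export const App = () => (
  // `client` is assumed to be an already initialized StreamVideoClient
  <StreamVideo client={client}>
    <LivestreamPlayer callType="livestream" callId="your_call_id" />
  </StreamVideo>
);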

You can find more details about the built-in LivestreamPlayer on the following page.

The rest of the guide will be focused on building your own livestream player view.

Livestream states

A livestream can be in different states, which your UI needs to handle appropriately:

  • Backstage - The livestream is created but not yet started
  • Live - The livestream is active and viewers can watch
  • Ended - The livestream has finished

The React Native SDK provides hooks to detect these states:

import React, { useEffect } from "react";
import { View } from "react-native";
import {
  CallingState,
  useCall,
  useCallStateHooks,
} from "@stream-io/video-react-native-sdk";

export const LivestreamContent = () => {
  const { useCallEndedAt, useIsCallLive, useCallCallingState } =
    useCallStateHooks();
  const endedAt = useCallEndedAt();
  const isLive = useIsCallLive();
  const callingState = useCallCallingState();
  const call = useCall();

  // to immediately join the call as soon as it is possible
  useEffect(() => {
    const handleJoinCall = async () => {
      try {
        await call?.join();
      } catch (error) {
        console.error("Failed to join call", error);
      }
    };

    if (call && isLive && callingState === CallingState.IDLE) {
      handleJoinCall();
    }
  }, [call, callingState, isLive]);

  return (
    <View>
      {endedAt != null ? (
        <CallEnded />
      ) : isLive ? (
        <CallLiveContent />
      ) : (
        <Backstage />
      )}
    </View>
  );
};

This won’t compile for now, since we also need to define the helper views.

First, let’s describe the different states:

  • when isLive is false, the call has not started yet (it’s in backstage). By default, only users with the join-backstage capability (typically hosts) can join a call in this state. You can also set up the joinAheadTimeSeconds setting to allow any user to join the livestream before it starts (see the sketch after this list).
  • if endedAt is not null, it means that the livestream has already finished.
  • if it’s not ended or in backstage, then the livestream is live. In this state, we will show the host’s video and other relevant data.
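
As mentioned in the list above, here’s a minimal sketch of enabling a join-ahead window when creating the call. The call id and the 300-second window are example values; in the request payload the setting uses its snake_case API name:

// Allows regular users to join up to 5 minutes before the stream goes live.
const call = client.call("livestream", "your_call_id");
await call.getOrCreate({
  data: {
    settings_override: {
      backstage: {
        enabled: true,
        join_ahead_time_seconds: 300,
      },
    },
  },
});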

Backstage mode

While in backstage, you can show a countdown or the start date. By observing the useIsCallLive hook from useCallStateHooks, you can also automatically render the video track as soon as it becomes available.

Here’s an example of how to handle the backstage mode, showing the start date and the number of participants waiting to join the livestream:

import React from "react";
import { View, Text, StyleSheet } from "react-native";
import { useCallStateHooks } from "@stream-io/video-react-native-sdk";

export const Backstage = () => {
  const { useCallSession, useCallStartsAt } = useCallStateHooks();
  const startsAt = useCallStartsAt();
  const session = useCallSession();

  // participants who are waiting
  const waitingCount = session?.participants_count_by_role["user"] || 0;

  const formattedStartsAt =
    startsAt &&
    new Date(startsAt).toLocaleDateString(undefined, {
      month: "short",
      day: "2-digit",
      hour: "2-digit",
      minute: "2-digit",
      hour12: false,
    });

  return (
    <View style={styles.container}>
      {startsAt ? (
        <Text style={styles.title}>
          Livestream starting at {formattedStartsAt}
        </Text>
      ) : (
        <Text style={styles.title}>Livestream starting soon</Text>
      )}

      {waitingCount > 0 && (
        <Text style={styles.waitingCount}>
          {waitingCount} participants waiting
        </Text>
      )}
    </View>
  );
};

const styles = StyleSheet.create({
  container: {
    flex: 1,
    justifyContent: "center",
    alignItems: "center",
  },
  title: {
    fontSize: 18,
    fontWeight: "bold",
    marginBottom: 8,
  },
  waitingCount: {
    fontSize: 16,
    paddingHorizontal: 16,
  },
});

Call Ended

When a livestream has ended, its endedAt property is set to the date and time when the livestream finished. You can use this to show a message to the users and, if available, show them recordings of the call.

import React, { useEffect, useState } from "react";
import {
  FlatList,
  Linking,
  Pressable,
  StyleSheet,
  Text,
  View,
} from "react-native";
import {
  ListRecordingsResponse,
  useCall,
} from "@stream-io/video-react-native-sdk";

export const CallEnded = () => {
  const call = useCall();
  const [recordingsResponse, setRecordingsResponse] = useState<
    ListRecordingsResponse | undefined
  >(undefined);

  useEffect(() => {
    const fetchRecordings = async () => {
      if (recordingsResponse == null) {
        try {
          const callRecordingsResponse = await call?.queryRecordings();
          setRecordingsResponse(callRecordingsResponse);
        } catch (error) {
          console.log("Error fetching recordings:", error);
          setRecordingsResponse(undefined);
        }
      }
    };

    fetchRecordings();
  }, [call, recordingsResponse]);

  const openUrl = (url: string) => {
    Linking.canOpenURL(url).then((supported) => {
      if (supported) {
        Linking.openURL(url);
      } else {
        console.log("Cannot open URL:", url);
      }
    });
  };

  const showRecordings =
    recordingsResponse && recordingsResponse.recordings.length > 0;

  return (
    <View style={styles.container}>
      <Text style={styles.title}>The livestream has ended.</Text>

      {showRecordings && (
        <>
          <Text style={styles.subtitle}>Watch recordings:</Text>
          <View style={styles.recordingsContainer}>
            <FlatList
              data={recordingsResponse.recordings}
              keyExtractor={(item) => item.session_id}
              renderItem={({ item }) => (
                <Pressable
                  style={styles.recordingButton}
                  onPress={() => openUrl(item.url)}
                >
                  <Text style={styles.recordingText}>{item.url}</Text>
                </Pressable>
              )}
            />
          </View>
        </>
      )}
    </View>
  );
};

const styles = StyleSheet.create({
  container: {
    flex: 1,
    justifyContent: "center",
    alignItems: "center",
  },
  title: {
    fontSize: 18,
    fontWeight: "bold",
    marginBottom: 8,
  },
  subtitle: {
    fontSize: 16,
    marginBottom: 8,
  },
  recordingsContainer: {
    maxHeight: 240,
  },
  recordingButton: {
    paddingVertical: 8,
  },
  recordingText: {
    fontSize: 14,
    textDecorationLine: "underline",
  },
});

Call Live View

Let’s provide an example implementation for the component used when the livestream is active. As shown in LivestreamContent above, this state is represented by the CallLiveContent component:

import React, { useEffect, useState } from 'react';
import {
  useCallStateHooks,
  VideoRenderer,
} from '@stream-io/video-react-native-sdk';
import { View, Text, StyleSheet } from 'react-native';

export const CallLiveContent = () => {
  const { useParticipants, useCallSession, useParticipantCount } =
    useCallStateHooks();
  const participants = useParticipants();
  const hosts = participants.filter((p) => p.roles.includes('host'));

  const session = useCallSession();
  const [duration, setDuration] = useState(() => {
    if (!session || !session.live_started_at) {
      return 0;
    }
    const liveStartTime = new Date(session.live_started_at);
    const now = new Date();
    return Math.floor((now.getTime() - liveStartTime.getTime()) / 1000);
  });

  const totalParticipants = useParticipantCount();
  const viewers = Math.max(0, totalParticipants - 1);

  const formatDuration = (durationInSeconds: number) => {
    const days = Math.floor(durationInSeconds / 86400);
    const hours = Math.floor((durationInSeconds % 86400) / 3600);
    const minutes = Math.floor((durationInSeconds % 3600) / 60);
    const seconds = durationInSeconds % 60;

    return `${days ? days + ' ' : ''}${hours ? hours + ':' : ''}${
      minutes < 10 ? '0' : ''
    }${minutes}:${seconds < 10 ? '0' : ''}${seconds}`;
  };

  useEffect(() => {
    let intervalId: ReturnType<typeof setInterval>;
    const handleLiveStarted = () => {
      intervalId = setInterval(() => {
        setDuration((d) => d + 1);
      }, 1000);
    };

    handleLiveStarted();

    return () => {
      if (intervalId) {
        clearInterval(intervalId);
      }
    };
  }, []);

  return (
    <View style={styles.container}>
      {hosts.length > 0 && (
        <VideoRenderer participant={hosts[0]} trackType="videoTrack" />
      )}
      <Text style={styles.durationText}>{formatDuration(duration)}</Text>
      <Text style={styles.viewersText}>Viewers: {viewers}</Text>
    </View>
  );
};

const styles = StyleSheet.create({
  container: {
    flex: 1,
    position: 'relative',
  },
  videoRenderer: {
    flex: 1,
  },
  durationText: {
    position: 'absolute',
    bottom: 24,
    left: 24,
    color: 'red',
  },
  viewersText: {
    position: 'absolute',
    bottom: 24,
    right: 24,
    color: 'red',
  },
});

Next, let’s break down the implementation details of the code snippet above.

Rendering the livestream track

Finding and rendering the participant whose video track is shown depends on your use-case: whether you support only one streamer, whether you want to limit this functionality by role, or whether you have other special requirements. In the example above, we look for a user with the host role.

This example uses our low level VideoRenderer component (docs here) for rendering the host participant. You can also use the higher level ParticipantView (that also contains a label, connection quality indicator and a fallback background). You can find an example of its usage in our docs.
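
For reference, here’s a minimal sketch of rendering the host with ParticipantView instead (the HostView component name is just illustrative):

import React from "react";
import {
  ParticipantView,
  useCallStateHooks,
} from "@stream-io/video-react-native-sdk";

// Renders the first participant with the `host` role using the
// higher-level ParticipantView component.
export const HostView = () => {
  const { useParticipants } = useCallStateHooks();
  const participants = useParticipants();
  const host = participants.find((p) => p.roles.includes("host"));
  if (!host) {
    return null;
  }
  return <ParticipantView participant={host} trackType="videoTrack" />;
};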

Livestream Information

You can show various information about the livestream, such as the duration and the participant count:

  • To calculate the total participant count (including anonymous users):
const { useParticipants, useAnonymousParticipantCount } = useCallStateHooks();
const participants = useParticipants();
const anonymousParticipantCount = useAnonymousParticipantCount();
const totalParticipantCount = participants.length + anonymousParticipantCount;
  • Frequently, the call duration is also presented in a livestream. This information can be calculated from the call session using the useCallSession() hook, as shown in the CallLiveContent snippet above.

You can also watch queried calls, as explained here. This allows you to present participant count (and other call data), even without joining a call.
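
Here’s a rough sketch of querying and watching livestream calls without joining them (the filter and sort shown are example values, assuming an initialized client):

// Watched calls keep their state in sync in real time, so you can read
// data such as the participant count without joining.
const { calls } = await client.queryCalls({
  filter_conditions: { type: "livestream" },
  sort: [{ field: "starts_at", direction: -1 }],
  watch: true,
});

if (calls.length > 0) {
  console.log(calls[0].state.participantCount);
}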

Error states

Livestreaming depends on many factors, such as the network conditions of both the user publishing the stream and the viewers.

Proper error handling is needed to be transparent about the issues the user might be facing.

When the network drops, the SDK tries to reconnect the user to the call. However, if it fails to do that, the callingState in the CallState becomes RECONNECTING_FAILED. This gives you the chance to show an alert to the user and provide some custom handling (e.g. a message to check the network connection and try again).

Here’s an example of how to do that:

import React from "react";
import { Text } from "react-native";
import {
  CallingState,
  useCallStateHooks,
} from "@stream-io/video-react-native-sdk";

const ConnectionStatus = () => {
  const { useCallCallingState } = useCallStateHooks();
  const callingState = useCallCallingState();

  let statusMessage: string | null = null;

  switch (callingState) {
    case CallingState.RECONNECTING:
      statusMessage = "Reconnecting, please wait";
      break;
    case CallingState.RECONNECTING_FAILED:
      statusMessage = "Cannot join livestream. Try again later";
      break;
    case CallingState.OFFLINE:
      statusMessage = "You are disconnected";
      break;
    default:
      // no message needed while connected
      statusMessage = null;
  }

  if (!statusMessage) {
    return null;
  }

  return <Text>{statusMessage}</Text>;
};

Handling Volume

The SDK respects the volume controls on the device. One note: if you are sharing either video or audio, you can’t fully disable the audio, because of the audio session mode used for video chat.

However, the SDK dynamically updates the audio session type when you are not sharing video or audio, allowing viewers in a livestream to completely disable the audio via the hardware buttons.

We do not support controlling the volume of specific audio elements or individual participants through our React Native SDK, as React Native WebRTC doesn’t support the setVolume and setParticipantVolume methods from our SpeakerManager.

For more info on this you can check here.

On the other hand, you can mute yourself with the code below, with more info here:

import React from "react";
import {
  CallControlsButton,
  useCallStateHooks,
} from "@stream-io/video-react-native-sdk";
// Note: the Mic, MicOff and IconWrapper components, as well as the
// colors, defaults and toggleAudioPublishingButton style values below,
// are app-specific; replace them with your own icons and theme.

export const ToggleAudioButton = () => {
  const { useMicrophoneState } = useCallStateHooks();
  const { optimisticIsMute, microphone } = useMicrophoneState();

  const onPress = async () => {
    await microphone.toggle();
  };

  return (
    <CallControlsButton
      onPress={onPress}
      color={!optimisticIsMute ? colors.buttonSecondary : colors.buttonWarning}
      style={toggleAudioPublishingButton}
    >
      <IconWrapper>
        {!optimisticIsMute ? (
          <Mic color={colors.iconPrimary} size={defaults.iconSize} />
        ) : (
          <MicOff color={colors.iconPrimary} size={defaults.iconSize} />
        )}
      </IconWrapper>
    </CallControlsButton>
  );
};