Android Livestreaming Tutorial


In this tutorial, we'll guide you through creating a low-latency in-app livestreaming experience using Stream's video SDKs. The livestream leverages Stream's robust global edge network to ensure high performance and reliability. We'll explore key topics including setup, broadcasting, viewer interaction, and advanced features:

  • Ultra low latency streaming
  • Multiple streams & co-hosts
  • RTMP in and WebRTC input
  • Exporting to HLS
  • Reactions, custom events and chat
  • Recording & Transcriptions

UI components are fully customizable, as demonstrated in the Android Video Cookbook. This tutorial uses the Compose Video SDK, but you can also use the core library without Compose if you prefer.

⚠️ The livestreaming tutorial is a bit different from the usual video call tutorials, since there are two types of users: hosts and guests. For the purposes of this tutorial we'll create two separate apps, called LivestreamingHost and LivestreamingGuest, that you'll run on two devices (or emulators).

Let's dive in! If you have any questions or need to provide feedback along the way, don't hesitate to use the feedback button - we're here to help!

Livestreaming host (broadcaster)

Step 1 - Create a new project in Android Studio

Please note that this tutorial was crafted using Android Studio Giraffe. While the setup steps outlined here are generally applicable across different versions, there may be slight variations. For the best experience, we recommend using Android Studio Giraffe or a newer version.

  1. Create a new project
  2. Select Phone & Tablet -> Empty Activity
  3. Name your project LivestreamingHost.

Step 2 - Install the SDK & Setup the client

The Stream Video SDK has two main artifacts:

  • Core Client: io.getstream:stream-video-android-core - includes only the core part of the SDK.
  • Compose UI Components: io.getstream:stream-video-android-ui-compose - includes the core + Compose UI components.

For this tutorial, we'll use the Compose UI Components.

Add the Video Compose SDK dependency to the app/build.gradle.kts file. If you're new to Android, note that there are two build.gradle.kts files; open the one located in the app folder.
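In the dependencies block, that looks like this (a sketch; keep the <latest-version> placeholder until you've looked up the current release):

```kotlin
dependencies {
    // Stream Video Compose SDK: core client + Compose UI components
    implementation("io.getstream:stream-video-android-ui-compose:<latest-version>")
}
```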

⚠️ Replace <latest-version> with the latest version of the Stream Video Compose SDK, which you can find on the Releases page.

⚠️ Make sure compileSdk (or compileSdkVersion - if you're using the older syntax) is set to 34 or newer in your app/build.gradle.kts file.

⚠️ If you get Compose-related errors when building your project, follow the steps below (a sketch of the relevant build file changes follows this list):
  1. Add the missing Compose dependencies to the app/build.gradle.kts file, if needed.
  2. Enable Compose and set kotlinCompilerExtensionVersion in the app/build.gradle.kts file, if needed.
  3. Make sure you're using a Kotlin version that's compatible with the Compose compiler version specified by kotlinCompilerExtensionVersion. Check this page to see which Kotlin version is compatible with your Compose compiler version. If needed, change the Kotlin version in your project's build.gradle.kts file.
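A sketch of what these changes can look like; the version numbers below are examples, not pinned requirements, so align them with your project:

```kotlin
// app/build.gradle.kts
android {
    buildFeatures {
        compose = true
    }
    composeOptions {
        // Example only - must be compatible with your Kotlin version
        kotlinCompilerExtensionVersion = "1.5.3"
    }
}

dependencies {
    // Example Compose BOM version - align with your project
    implementation(platform("androidx.compose:compose-bom:2023.10.01"))
    implementation("androidx.compose.ui:ui")
    implementation("androidx.compose.material3:material3")
}

// Project-level build.gradle.kts: a Kotlin version compatible with the
// Compose compiler version above (1.9.10 pairs with 1.5.3)
plugins {
    id("org.jetbrains.kotlin.android") version "1.9.10" apply false
}
```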

⚠️ Make sure you sync the project after doing these changes. Click on the Sync Now button above the file contents.

Step 3 - Setup the app for host functionality

To keep this tutorial short and easy to understand, we'll place all the code in MainActivity.kt. For a production app you'd want to initialize the client in your Application class or DI module. You'd also want to use a ViewModel.

Open the MainActivity.kt file and replace the MainActivity class with the code below.
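Here's a minimal sketch of what the host activity can look like. The REPLACE_WITH_* values are placeholders, the UI is deliberately simplified, and exact component signatures may differ slightly between SDK versions, so treat this as a starting point:

```kotlin
class MainActivity : ComponentActivity() {
    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)

        val userToken = "REPLACE_WITH_TOKEN"
        val userId = "REPLACE_WITH_USER_ID"
        val callId = "REPLACE_WITH_CALL_ID"

        // Step 1 - set up the user
        val user = User(
            id = userId,              // any string
            name = "Tutorial host",   // name and image are used in the UI
        )

        // Step 2 - initialize the client. In a production app, do this in
        // your Application class or DI module instead.
        val client = StreamVideoBuilder(
            context = applicationContext,
            apiKey = "REPLACE_WITH_API_KEY",
            geo = GEO.GlobalEdgeNetwork,
            user = user,
            token = userToken,
        ).build()

        // Step 3 - create and join a call of type "livestream"
        val call = client.call("livestream", callId)
        lifecycleScope.launch {
            val result = call.join(create = true)
            result.onError {
                Toast.makeText(applicationContext, it.message, Toast.LENGTH_LONG).show()
            }
        }

        setContent {
            // Request the camera & microphone runtime permissions
            LaunchCallPermissions(call = call)

            VideoTheme {
                val backstage by call.state.backstage.collectAsState()
                val localParticipant by call.state.localParticipant.collectAsState()
                val video = localParticipant?.video?.collectAsState()?.value
                val scope = rememberCoroutineScope()

                Column(modifier = Modifier.fillMaxSize().padding(8.dp)) {
                    // Render the host's own camera feed
                    VideoRenderer(
                        modifier = Modifier.weight(1f),
                        call = call,
                        video = video,
                    )
                    // Toggle backstage mode on and off
                    Button(onClick = {
                        scope.launch { if (backstage) call.goLive() else call.stopLive() }
                    }) {
                        Text(text = if (backstage) "Go Live" else "Stop Broadcast")
                    }
                }
            }
        }
    }
}
```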

You can also find the import statements used throughout this tutorial below. Replace the existing import lines so you can follow along more easily.

You can delete the other functions that were created by Android Studio.

Import statements
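These are the imports the sketch above needs; the io.getstream package paths follow the SDK's usual layout, so double-check them against your IDE's suggestions:

```kotlin
import android.os.Bundle
import android.widget.Toast
import androidx.activity.ComponentActivity
import androidx.activity.compose.setContent
import androidx.compose.foundation.layout.Column
import androidx.compose.foundation.layout.fillMaxSize
import androidx.compose.foundation.layout.padding
import androidx.compose.material3.Button
import androidx.compose.material3.Text
import androidx.compose.runtime.collectAsState
import androidx.compose.runtime.getValue
import androidx.compose.runtime.rememberCoroutineScope
import androidx.compose.ui.Modifier
import androidx.compose.ui.unit.dp
import androidx.lifecycle.lifecycleScope
import io.getstream.video.android.compose.permission.LaunchCallPermissions
import io.getstream.video.android.compose.theme.VideoTheme
import io.getstream.video.android.compose.ui.components.video.VideoRenderer
import io.getstream.video.android.core.GEO
import io.getstream.video.android.core.StreamVideoBuilder
import io.getstream.video.android.model.User
import kotlinx.coroutines.launch
```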

Review the code

We set up the user:
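From the sketch above:

```kotlin
val user = User(
    id = userId,              // any string
    name = "Tutorial host",   // name and image are used in the UI
)
```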

You can also use a guest or anonymous user. For most apps it's convenient to match your own system of users to grant and remove permissions.

Next we create the client:
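Again from the sketch above:

```kotlin
val client = StreamVideoBuilder(
    context = applicationContext,
    apiKey = "REPLACE_WITH_API_KEY",
    geo = GEO.GlobalEdgeNetwork,
    user = user,
    token = userToken,   // a development token in this tutorial
).build()
```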

Notice the userToken variable. Your backend typically generates the user token on signup or login. For this tutorial we're using a development token. See this page for more details.

The most important step to review is how we create the call. Stream uses the same call object for livestreaming, audio rooms and video calling. Have a look at the code snippet below:
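From the host sketch above:

```kotlin
val call = client.call("livestream", callId)
lifecycleScope.launch {
    // create = true creates the call on the server if it doesn't exist yet
    val result = call.join(create = true)
    result.onError {
        Toast.makeText(applicationContext, it.message, Toast.LENGTH_LONG).show()
    }
}
```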

To create the first call object, specify the call type as livestream and provide a unique callId. The livestream call type comes with default settings that are usually suitable for livestreams, but you can customize features, permissions, and settings in the Dashboard. Additionally, the Dashboard allows you to create new call types as required.

Finally, using call.join(create = true) will not only create the call object on our servers but also initiate the real-time transport for audio and video. This allows for seamless and immediate engagement in the livestream.

Note that you can also add members to a call and assign them different roles. For more information, see the call creation docs.

Stream uses a technology called SFU cascading to replicate your livestream over different servers around the world. This makes it possible to reach a large audience in realtime.

State & Participants

Call.state exposes all the StateFlow objects you need. The participant state docs show all the available fields.

In this example we use:

  • call.state.connection - to show if we're connected to the realtime video. You can use this for implementing a loading interface.
  • call.state.backstage - a boolean that indicates whether the call is in backstage mode
  • call.state.duration - how long the call has been running
  • call.state.totalParticipants - the number of participants watching the livestream
  • call.state.localParticipant - the state of the participant on this device

Also, call.state.participants can be used to access all participant-related state. It contains more information about who's watching the stream. If you have multiple people broadcasting video, it also contains the video tracks.

  • participant.user: the user's name, image and custom data
  • participant.video: the video for this user
  • participant.roles: the participant's roles, which enable you to have co-hosts, etc.

There are many possibilities and the participant state docs explain this in more detail.
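In Compose, you can observe these StateFlow objects with collectAsState. A small sketch (the status strings and the LivestreamStats name are our own, not SDK components):

```kotlin
import androidx.compose.runtime.Composable
import io.getstream.video.android.core.Call
import io.getstream.video.android.core.RealtimeConnection

@Composable
fun LivestreamStats(call: Call) {
    val connection by call.state.connection.collectAsState()
    val backstage by call.state.backstage.collectAsState()
    val duration by call.state.duration.collectAsState()
    val totalParticipants by call.state.totalParticipants.collectAsState()

    Text(
        text = when {
            connection != RealtimeConnection.Connected -> "Connecting..."
            backstage -> "Backstage - the livestream hasn't started yet"
            else -> "Live: $totalParticipants watching, running for $duration"
        }
    )
}
```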

Backstage mode

In the example above you might have noticed the call.goLive() method and the call.state.backstage StateFlow. The backstage functionality is enabled by default on the livestream call type. It makes it easy to build a flow where you and your co-hosts can set up your camera and equipment before going live. Only after you call call.goLive() will regular users be allowed to join the livestream.

This is convenient for many livestreaming and audio-room use cases. If you want calls to start immediately when you join them, that's also possible. Simply go to the Stream Dashboard, click the livestream call type, and disable backstage mode.
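As in the host sketch above, a Go Live toggle is a thin wrapper around these calls (call.stopLive() is the counterpart to call.goLive()):

```kotlin
val backstage by call.state.backstage.collectAsState()
val scope = rememberCoroutineScope()

Button(onClick = {
    // goLive() opens the stream to viewers; stopLive() returns to backstage
    scope.launch { if (backstage) call.goLive() else call.stopLive() }
}) {
    Text(text = if (backstage) "Go Live" else "Stop Broadcast")
}
```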

Run the app

To actually run this sample, we need a call ID. Update REPLACE_WITH_CALL_ID in the code with a unique call ID of your choice (any unique string works).


Run the app and accept the required permissions. You will see an interface that looks similar to this:

[Screenshot: the livestream host interface]

We'll start the livestream later, after we create the guest app.

Livestreaming guest (viewer)

Step 1 - Create a new project in Android Studio

Follow Step 1 above, but set the name of the app to LivestreamingGuest.

Step 2 - Install the SDK & Setup the client

Follow Step 2 above.

Step 3 - Setup the app for guest functionality

Open the MainActivity.kt file and replace the MainActivity class with the code below.
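As with the host app, here's a minimal sketch of the guest activity; the same caveats apply, and LivestreamPlayer and runForegroundServiceForCalls are discussed below:

```kotlin
class MainActivity : ComponentActivity() {
    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)

        val userToken = "REPLACE_WITH_TOKEN"
        val callId = "REPLACE_WITH_CALL_ID"

        // A viewer - guest or anonymous users also work here
        val user = User(
            id = "REPLACE_WITH_USER_ID",
            name = "Tutorial viewer",
        )

        val client = StreamVideoBuilder(
            context = applicationContext,
            apiKey = "REPLACE_WITH_API_KEY",
            geo = GEO.GlobalEdgeNetwork,
            user = user,
            token = userToken,
            // Viewers don't broadcast, so no foreground service is needed
            runForegroundServiceForCalls = false,
        ).build()

        // Join the call the host created - note there's no create = true here
        val call = client.call("livestream", callId)
        lifecycleScope.launch {
            call.join()
        }

        setContent {
            VideoTheme {
                // Renders the remote livestream, or backstage info if the
                // host hasn't gone live yet
                LivestreamPlayer(call = call)
            }
        }
    }
}
```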

You can also find the import statements used in the guest app below. Replace the existing import lines so you can follow along more easily.

You can delete the other functions that were created by Android Studio.

Import statements
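The guest imports are largely a subset of the host's; the notable addition is LivestreamPlayer (again, verify the exact package paths in your IDE):

```kotlin
import android.os.Bundle
import androidx.activity.ComponentActivity
import androidx.activity.compose.setContent
import androidx.lifecycle.lifecycleScope
import io.getstream.video.android.compose.theme.VideoTheme
import io.getstream.video.android.compose.ui.components.livestream.LivestreamPlayer
import io.getstream.video.android.core.GEO
import io.getstream.video.android.core.StreamVideoBuilder
import io.getstream.video.android.model.User
import kotlinx.coroutines.launch
```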

Review the code

Watching a livestream is even easier than broadcasting. Compared to the code in the host app, you:

  • Don't need to request permissions or enable the camera.
  • Pass runForegroundServiceForCalls = false to the StreamVideoBuilder.
  • Don't render the local video; instead, render the remote video (the livestream).
  • Typically include some small UI elements, like a viewer count and a mute button.

The livestream layout is built using standard Jetpack Compose. The LivestreamPlayer component is provided by our SDK; it renders the incoming video, or fallback content while the call is in backstage mode. If you want to learn more about building UIs, check out the Android Video Cookbook.

Run the app

Before running the app, update REPLACE_WITH_CALL_ID with the same call ID you used in the host app.


Now run the app on a different device or emulator. You should have the host app running on device 1 and the guest app on device 2.

You will see a Call is in backstage mode message in the guest app (device 2). Now press Go Live in the host app (device 1) to exit backstage mode and start seeing the livestream.

(Optional) Publishing RTMP using OBS

The example above showed how to publish your phone's camera to the livestream. Almost all livestreaming software and hardware supports RTMP(S). OBS is one of the most popular livestreaming packages, and we'll use it to show how to publish an RTMP stream into your call.

A. Log the URL & Stream Key

In the host app, add the following code right after defining the call variable:
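A sketch, assuming the RTMP ingress info is exposed via call.state.ingress (add import android.util.Log; the logged object contains the server URL and stream key):

```kotlin
lifecycleScope.launch {
    // The ingress info becomes available once the call exists on the server
    val rtmp = call.state.ingress.value?.rtmp
    Log.i("Tutorial", "RTMP url and stream key: $rtmp")
}
```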

B. Open OBS and go to Settings -> Stream

  • Select Custom service
  • Server: enter the RTMP URL from the log output
  • Stream key: enter the RTMP Stream Key from the log output

C. Start Streaming

  • Press Start Streaming in OBS. The RTMP stream should now show up in the guest app.
  • Make sure you turn off backstage mode in the host app (press Go Live).

(Optional) Viewing a livestream with HLS

Another way to watch a livestream is using HLS. HLS tends to have a 10-20 second delay, while the WebRTC approach above is realtime. The benefit HLS offers is better buffering under poor network conditions.

So HLS can be a good option when:

  • A 10-20 second delay is acceptable
  • Your users want to watch the stream under poor network conditions

This is how you broadcast your call to HLS:
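A sketch, assuming call.startHLS() starts the broadcast and the playlist URL is exposed via the call's egress state (add this in the host app):

```kotlin
lifecycleScope.launch {
    // Start broadcasting the call to HLS
    call.startHLS()
    // The playlist URL appears in the call's egress state shortly after
    val url = call.state.egress.value?.hls?.playlistUrl
    Log.i("Tutorial", "HLS playlist url: $url")
}
```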

You can play the HLS video feed using any HLS capable video player, such as ExoPlayer.
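For example, with Media3 ExoPlayer; a sketch where hlsUrl is the playlist URL logged above, and the Media3 artifact versions are assumptions:

```kotlin
// build.gradle.kts (assumed versions):
// implementation("androidx.media3:media3-exoplayer:1.2.0")
// implementation("androidx.media3:media3-exoplayer-hls:1.2.0")
import android.content.Context
import androidx.media3.common.MediaItem
import androidx.media3.exoplayer.ExoPlayer

fun playHls(context: Context, hlsUrl: String): ExoPlayer =
    ExoPlayer.Builder(context).build().apply {
        // Media3 picks the HLS media source automatically for .m3u8 URLs
        setMediaItem(MediaItem.fromUri(hlsUrl))
        prepare()
        playWhenReady = true
    }
```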

Advanced Features

This tutorial covered broadcasting and watching a livestream. It also went into more detail about HLS & RTMP-in.

There are several advanced features that can improve the livestreaming experience:

  • Co-hosts: you can add members to your livestream with elevated permissions, so you can have co-hosts, moderators, etc.
  • Custom events: you can use custom events on the call to share any additional data. Think about showing the score for a game, or any other realtime use case.
  • Reactions & Chat: users can react to the livestream, and you can add chat. This makes for a more engaging experience.
  • Notifications: you can notify users via push notifications when the livestream starts.
  • Recording: the call recording functionality allows you to record the call with various options and layouts.

Recap

It was fun to see just how quickly you can build low-latency in-app livestreaming. Please let us know if you ran into any issues. Our team is also happy to review your UI designs and offer recommendations on how to achieve them with the Stream SDKs.

To recap what we've learned:

  • WebRTC is optimal for latency, while HLS is slower, but buffers better for users with poor connections.
  • You set up a call with val call = client.call("livestream", callId).
  • The call type livestream controls which features are enabled and how permissions are set up.
  • The livestream call has backstage mode enabled by default. This allows you and your co-hosts to set up your mic and camera before letting people in.
  • When you join a call, realtime communication for audio & video is set up: call.join().
  • StateFlow objects in call.state and call.state.participants make it easy to build your own UI.

Calls run on Stream's global edge network of video servers. Being closer to your users improves the latency and reliability of calls. The SDKs enable you to build livestreaming, audio rooms and video calling in days.

We hope you've enjoyed this tutorial and please feel free to reach out if you have any suggestions or questions.

Final Thoughts

In this video app tutorial we built a fully functioning Android livestreaming app with our Android SDK component library. We also showed how easy it is to customize the behavior and the style of the Android video app components with minimal code changes.

Both the video SDK for Android and the API have plenty more features available to support more advanced use-cases.

Give us Feedback!

Did you find this tutorial helpful in getting you up and running with Android for adding video to your project? Either good or bad, we’re looking for your honest feedback so we can improve.

Next Steps

Create your free Stream account to start building with our Video & Audio SDKs, or contact our team if you have additional questions.
