Android Livestreaming Tutorial
In this tutorial, we'll guide you through creating a low-latency in-app livestreaming experience using Stream's video SDKs. The livestream leverages Stream's robust global edge network to ensure high performance and reliability. We'll explore key topics including setup, broadcasting, viewer interaction, and advanced features:
- Ultra low latency streaming
- Multiple streams & co-hosts
- RTMP-in and WebRTC input
- Exporting to HLS
- Reactions, custom events and chat
- Recording & Transcriptions
Let's dive in! If you have any questions or need to provide feedback along the way, don't hesitate to use the feedback button - we're here to help!
Step 1 - Create a new project in Android Studio
Please note that this tutorial was crafted using Android Studio Giraffe. While the setup steps outlined here are generally applicable across different versions, there may be slight variations. For the best experience, we recommend using Android Studio Giraffe or a newer version.
- Create a new project
- Select Phone & Tablet -> No Activity (we'll create the activities later)
- Name your project Livestream.
Step 2 - Install the SDK & Setup the client
Add the Video Compose SDK and Jetpack Compose dependencies to your app's build.gradle.kts file, found at app/build.gradle.kts. If you're new to Android, note that there are two build.gradle files; you want to open the one in the app folder.
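As a sketch, the dependency block could look like the following. The Compose artifact coordinate io.getstream:stream-video-android-ui-compose and the Compose versions shown are assumptions; check the Releases page for the current values, and keep the `<latest-version>` placeholder until you've looked it up.

```kotlin
dependencies {
    // Stream Video Compose SDK (replace <latest-version> as noted below)
    implementation("io.getstream:stream-video-android-ui-compose:<latest-version>")

    // Jetpack Compose (versions are illustrative; align them with your project)
    implementation(platform("androidx.compose:compose-bom:2023.08.00"))
    implementation("androidx.activity:activity-compose:1.7.2")
    implementation("androidx.compose.ui:ui")
    implementation("androidx.compose.ui:ui-tooling")
    implementation("androidx.compose.foundation:foundation")
    implementation("androidx.compose.material:material")
}
```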
:exclamation: Replace <latest-version> with the version number indicated below (e.g. 0.5.7). You can also check the Releases page.
:exclamation: Make sure compileSdk (or compileSdkVersion if you're using the older syntax) is set to 34 in your app/build.gradle.kts file.
:exclamation: Make sure you sync the project after making these changes: click the Sync Now button above the file contents.
This tutorial demonstrates the Compose Video SDK, but you have the option to use the core library without Compose based on your preference.
Step 3 - Setup the tutorial for host & guest functionality
The livestreaming tutorial is a bit different from the usual video calls, since there are two types of users: host and guest.
For the purpose of our tutorial we are going to create two separate activities, called MainActivityHost and MainActivityGuest. The default activity that runs when you start the project in Android Studio is MainActivityHost; however, you can find both app icons in the main application drawer on the device.
Create two files:
- MainActivityHost.kt
- MainActivityGuest.kt
To declare the two activities, simply copy the following code into the <application> tag of your AndroidManifest.xml.
:exclamation: Note that both activities have the LAUNCHER category; this prompts the system to add two icons to the application drawer.
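As a sketch, the two declarations could look like this (android:exported="true" is required on Android 12+ for activities with intent filters):

```xml
<activity
    android:name=".MainActivityHost"
    android:exported="true">
    <intent-filter>
        <action android:name="android.intent.action.MAIN" />
        <category android:name="android.intent.category.LAUNCHER" />
    </intent-filter>
</activity>

<activity
    android:name=".MainActivityGuest"
    android:exported="true">
    <intent-filter>
        <action android:name="android.intent.action.MAIN" />
        <category android:name="android.intent.category.LAUNCHER" />
    </intent-filter>
</activity>
```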
Step 4 - Broadcast a livestream from your phone
The following code shows how to publish from your phone's camera.
Let's open MainActivityHost.kt and replace the MainActivityHost class with the following code:
You can find below the import statements that are used throughout this tutorial. Paste them into your MainActivityHost.kt file to follow along easily. Replace the credentials with your own values if you wish to use your own API key and users.
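Putting the pieces together, a minimal host activity might look like the sketch below. The credentials are placeholders, and the class and package names follow the Stream Video Compose SDK at the time of writing; verify them against your SDK version.

```kotlin
import android.os.Bundle
import androidx.activity.ComponentActivity
import androidx.activity.compose.setContent
import androidx.lifecycle.lifecycleScope
import io.getstream.video.android.compose.permission.LaunchCallPermissions
import io.getstream.video.android.compose.theme.VideoTheme
import io.getstream.video.android.core.GEO
import io.getstream.video.android.core.StreamVideoBuilder
import io.getstream.video.android.model.User
import kotlinx.coroutines.launch

class MainActivityHost : ComponentActivity() {
    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)

        // Placeholders: swap in your own API key, token, user id and call id.
        val apiKey = "REPLACE_WITH_API_KEY"
        val userToken = "REPLACE_WITH_TOKEN"
        val userId = "REPLACE_WITH_USER_ID"
        val callId = "REPLACE_WITH_CALL_ID"

        // Step 1: set up the user.
        val user = User(id = userId, name = "Tutorial Host")

        // Step 2: create the client.
        val client = StreamVideoBuilder(
            context = applicationContext,
            apiKey = apiKey,
            geo = GEO.GlobalEdgeNetwork,
            user = user,
            token = userToken,
        ).build()

        // Step 3: create the call and join it, creating it server-side if needed.
        val call = client.call("livestream", callId)
        lifecycleScope.launch {
            call.join(create = true)
        }

        setContent {
            // Ask for camera & microphone permissions before publishing.
            LaunchCallPermissions(call = call)
            VideoTheme {
                // Livestream UI goes here (see the state section below).
            }
        }
    }
}
```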
Review the code so far
In the first step we set up the user:
If you don't have an authenticated user, you can also use a guest or anonymous user. For most apps it's convenient to match your own system of users to grant and remove permissions.
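For illustration, a user object can be as small as this (the id and name are placeholders you'd map to your own user system):

```kotlin
val user = User(
    id = "tutorial-host",   // placeholder: use your own user id
    name = "Tutorial Host", // display name shown to viewers
)
```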
Next we create the client:
You'll see the userToken variable. Your backend typically generates the user token on signup or login.
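Client creation might look like this (the parameter names follow the SDK's StreamVideoBuilder at the time of writing; in production the token comes from your backend rather than a hard-coded string):

```kotlin
val client = StreamVideoBuilder(
    context = applicationContext,
    apiKey = apiKey,              // from the Stream dashboard
    geo = GEO.GlobalEdgeNetwork,  // use Stream's global edge network
    user = user,
    token = userToken,            // generated by your backend on signup/login
).build()
```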
The most important step to review is how we create the call. Stream uses the same call object for livestreaming, audio rooms and video calling. Have a look at the code snippet below:
To create the first call object, specify the call type as livestream and provide a unique callId. The livestream call type comes with default settings that are usually suitable for livestreams, but you can customize features, permissions, and settings in the dashboard. Additionally, the dashboard allows you to create new call types as required.
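As a sketch, creating and joining the call could look like this (the call type string and the create flag are as described above; the surrounding coroutine scope is an assumption):

```kotlin
// The same Call object powers livestreams, audio rooms and video calls;
// the "livestream" type determines the default permissions and settings.
val call = client.call("livestream", callId)

lifecycleScope.launch {
    // create = true creates the call on Stream's servers if it doesn't
    // exist yet, then starts the real-time transport for audio and video.
    val result = call.join(create = true)
}
```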
Finally, calling call.join(create = true) will not only create the call object on our servers but also initiate the real-time transport for audio and video. This allows for seamless and immediate engagement in the livestream.
Note that you can also add members to a call and assign them different roles. For more information, see the call creation docs.
Upon running your app, you will be greeted with an interface that looks like this:
Stream uses a technology called SFU cascading to replicate your livestream over different servers around the world. This makes it possible to reach a large audience in realtime.
Now let's press Go live in the Android app and click the link below to watch the video in your browser.
State & Participants
Let's take a moment to review the Compose code above. Call.state exposes all the StateFlow objects you need. The participant state docs show all the available fields.
In this example we use:
- call.state.connection: shows whether we're connected to the realtime video; you can use this to implement a loading interface
- call.state.backstage: a boolean that indicates whether the call is in backstage mode
- call.state.duration: how long the call has been running
- call.state.totalParticipants: the number of participants watching the livestream
- call.state.participants: the list of participants
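In Compose, each of these can be observed with collectAsState; for example (a sketch, assuming a call object is in scope inside a composable):

```kotlin
val connection by call.state.connection.collectAsState()
val backstage by call.state.backstage.collectAsState()
val duration by call.state.duration.collectAsState()
val totalParticipants by call.state.totalParticipants.collectAsState()

Text(
    text = if (backstage) "Backstage" else "Live: $totalParticipants viewers"
)
```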
The call.state.participants can optionally contain more information about who's watching the stream. If you have multiple people broadcasting video, this also contains the video tracks.
- participant.user: the user's name, image and custom data
- participant.video: the video for this user
- participant.roles: the roles for the participant; this enables you to have co-hosts etc.
There are many possibilities and the participant state docs explain this in more detail.
Backstage mode
In the example above you might have noticed the call.goLive() method and the call.state.backstage StateFlow.
The backstage functionality is enabled by default on the livestream call type. It makes it easy to build a flow where you and your co-hosts can set up your camera and equipment before going live. Only after you call call.goLive() will regular users be allowed to join the livestream.
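Wired to a Compose button, this could look like the following sketch. goLive is a suspend function, so it runs inside a coroutine scope; stopLive as its counterpart is an assumption to verify against your SDK version.

```kotlin
val backstage by call.state.backstage.collectAsState()
val scope = rememberCoroutineScope()

Button(onClick = {
    scope.launch {
        // goLive() opens the stream to viewers; stopLive() returns to backstage.
        if (backstage) call.goLive() else call.stopLive()
    }
}) {
    Text(text = if (backstage) "Go Live" else "Stop Livestream")
}
```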
This is convenient for many livestreaming and audio-room use cases. If you want calls to start immediately when you join them, that's also possible: simply go to the Stream dashboard, click the livestream call type and disable backstage mode.
Step 5 - Watching the livestream
Watching a livestream is even easier than broadcasting. Compared to the current code in MainActivityHost.kt, you:
- Don't need to request permissions or enable the camera
- Don't render the local video, but instead render the remote video
- Typically include some small UI elements like a viewer count, a button to mute, etc.
While MainActivityHost will publish the host stream, we need a way to enter the call as a 'guest' and watch the stream.
The livestream layout is built using standard Jetpack Compose. The VideoRenderer component, provided by Stream, renders the video along with a fallback. You can use it for rendering both the local and remote video.
If you want to learn more about building an advanced UI for watching a livestream, check out Cookbook: Watching a livestream.
Copy the following code into the MainActivityGuest.kt file that we've created.
:exclamation: Note that you have to replace the API keys and users here as well, and the guest user must be different from the host.
To better follow the example, you may wish to copy the import statements for the MainActivityGuest class.
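A minimal guest activity might look like the sketch below. The credentials are placeholders, and the VideoRenderer parameter names are illustrative; check both against your SDK version.

```kotlin
import android.os.Bundle
import androidx.activity.ComponentActivity
import androidx.activity.compose.setContent
import androidx.compose.foundation.layout.fillMaxSize
import androidx.compose.runtime.collectAsState
import androidx.compose.runtime.getValue
import androidx.compose.ui.Modifier
import androidx.lifecycle.lifecycleScope
import io.getstream.video.android.compose.theme.VideoTheme
import io.getstream.video.android.compose.ui.components.video.VideoRenderer
import io.getstream.video.android.core.GEO
import io.getstream.video.android.core.StreamVideoBuilder
import io.getstream.video.android.model.User
import kotlinx.coroutines.launch

class MainActivityGuest : ComponentActivity() {
    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)

        // The guest must be a different user than the host, with its own token.
        val user = User(id = "tutorial-guest", name = "Guest")
        val client = StreamVideoBuilder(
            context = applicationContext,
            apiKey = "REPLACE_WITH_API_KEY",
            geo = GEO.GlobalEdgeNetwork,
            user = user,
            token = "REPLACE_WITH_GUEST_TOKEN",
        ).build()

        // Join the same call the host created; no camera or mic needed.
        val call = client.call("livestream", "REPLACE_WITH_CALL_ID")
        lifecycleScope.launch { call.join() }

        setContent {
            VideoTheme {
                val participants by call.state.participants.collectAsState()
                // Render the remote (host) video instead of the local camera.
                participants.firstOrNull()?.let { participant ->
                    val video by participant.video.collectAsState()
                    VideoRenderer(
                        call = call,
                        video = video,
                        modifier = Modifier.fillMaxSize(),
                    )
                }
            }
        }
    }
}
```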
Step 6 - (Optional) Publishing RTMP using OBS
The example above showed how to publish your phone's camera to the livestream. Almost all livestream software and hardware supports RTMPS. OBS is one of the most popular livestreaming software packages, so we'll use it to explain how to publish via RTMPS.
A. Log the URL & Stream Key
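One way to surface these values is through the call's ingress state. This is a sketch; the ingress/rtmp fields and their names are assumptions based on the SDK at the time of writing and should be verified against your version.

```kotlin
lifecycleScope.launch {
    // The RTMP ingress endpoint lives on the call state once the call exists.
    val rtmp = call.state.ingress.value?.rtmp
    Log.i("Livestream", "RTMP server URL: ${rtmp?.address}")
    Log.i("Livestream", "RTMP stream key: ${rtmp?.streamKey}")
}
```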
B. Open OBS and go to settings -> stream
- Select "custom" service
- Server: equal to the server URL from the log
- Stream key: equal to the stream key from the log
Press Start Streaming in OBS. The RTMP stream will now show up in your call just like a regular video participant. Now that we've learned how to publish using WebRTC or RTMP, let's talk about watching the livestream.
Step 7 - (Optional) Viewing a livestream with HLS
Another way to watch a livestream is using HLS. HLS tends to have a 10 to 20 second delay, while the WebRTC approach above is realtime. The benefit HLS offers is better buffering under poor network conditions. So HLS can be a good option when:
- A 10-20 second delay is acceptable
- Your users want to watch the stream in poor network conditions
Let's show how to broadcast your call to HLS:
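As a sketch (startHLS and the egress fields follow the SDK at the time of writing; verify them against your version):

```kotlin
lifecycleScope.launch {
    // Start broadcasting the call as an HLS feed.
    call.startHLS()

    // The playlist URL appears in the call's egress state shortly after.
    val playlistUrl = call.state.egress.value?.hls?.playlistUrl
    Log.i("Livestream", "HLS playlist: $playlistUrl")
}
```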
You can play the HLS video feed using any HLS capable video player, such as ExoPlayer.
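For example, with Media3 ExoPlayer the playlist can be played back like this (this assumes the androidx.media3:media3-exoplayer and media3-exoplayer-hls dependencies; replace the placeholder URL with the one logged above):

```kotlin
import androidx.media3.common.MediaItem
import androidx.media3.exoplayer.ExoPlayer

val player = ExoPlayer.Builder(context).build()
player.setMediaItem(MediaItem.fromUri("REPLACE_WITH_HLS_PLAYLIST_URL"))
player.prepare()
player.playWhenReady = true
```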
Advanced Features
This tutorial covered broadcasting and watching a livestream. It also went into more details about HLS & RTMP-in.
There are several advanced features that can improve the livestreaming experience:
- Co-hosts: You can add members to your livestream with elevated permissions, so you can have co-hosts, moderators etc.
- Custom events: You can use custom events on the call to share any additional data. Think about showing the score for a game, or any other realtime use case.
- Reactions & Chat: Users can react to the livestream, and you can add chat. This makes for a more engaging experience.
- Notifications: You can notify users via push notifications when the livestream starts.
- Recording: The call recording functionality allows you to record the call with various options and layouts.
Recap
It was fun to see just how quickly you can build low-latency in-app livestreaming. Please do let us know if you ran into any issues. Our team is also happy to review your UI designs and offer recommendations on how to achieve them with Stream.
To recap what we've learned:
- WebRTC is optimal for latency; HLS is slower but buffers better for users with poor connections
- You set up a call: (val call = client.call("livestream", callId))
- The call type "livestream" controls which features are enabled and how permissions are set up
- The livestream call type enables "backstage" mode by default. This allows you and your co-hosts to set up your mic and camera before letting people in
- When you join a call, realtime communication is set up for audio & video: (call.join())
- StateFlow objects in call.state and call.state.participants make it easy to build your own UI
- For a livestream the most important one is call.state.???
Calls run on Stream's global edge network of video servers. Being closer to your users improves the latency and reliability of calls. The SDKs enable you to build livestreaming, audio rooms and video calling in days.
We hope you've enjoyed this tutorial and please do feel free to reach out if you have any suggestions or questions.
Final Thoughts
In this video app tutorial we built a fully functioning Android livestreaming app with our Android SDK component library. We also showed how easy it is to customize the behavior and the style of the Android video app components with minimal code changes.
Both the video SDK for Android and the API have plenty more features available to support more advanced use-cases.
Give us Feedback!
Did you find this tutorial helpful in getting you up and running with Android for adding video to your project? Either good or bad, we’re looking for your honest feedback so we can improve.