Live Streaming With Mux, Stream, and Flutter

13 min read
Nash R.
Published January 4, 2021 · Updated January 8, 2021

Livestreams are everywhere these days, from popular gaming sites such as Twitch to more casual everyday apps like Instagram. Apps use live streaming and live video to help connect users and add another level of interactivity to their platform.

This post is still useful, but out of date. Stream now offers a Live Video Streaming API!

If you think about modern live streaming, there are many different details to consider when building your application. For instance, most popular streaming platforms also include live chats in addition to live videos. For developers, this adds more overhead in terms of time and cost.

For this article, we will briefly look at technologies powering live video and build a small streaming application in Flutter 💙.

If you're in a hurry, here is the TL;DR:

Live streaming is complicated: it involves many moving pieces that all have to work together. Before we start writing code, it's essential to understand some of the higher-level concepts behind how live streaming works. Raw video is converted, compressed, and transcoded before it reaches your computer.

Background

Live streaming can be traced back to the late 1990s and early 2000s. The protocol used for live streaming, Real-Time Messaging Protocol or RTMP, was developed by Macromedia, a company that was later acquired by Adobe in late 2005.

In the early days, the protocol was proprietary and intended for streaming video and audio data between a Flash server and multiple Flash clients over the internet. This early version of RTMP had relatively low latency, usually 3 - 5 seconds from server to client.

As the years progressed, Flash's dominance started to fade, and new devices such as smartphones and smart TVs started gaining popularity. Eventually, Adobe decided to open up the RTMP specification, which gave the protocol new life.

Its ubiquity among web browsers (via Flash) helped it become the gold standard for streaming servers, but with the rise of newer delivery protocols like HTTP Live Streaming (HLS), RTMP's client-side popularity started to fade.

If you are curious to learn more about RTMP, the full specification can be found on Adobe's website at the following URL:

Real-Time Messaging Protocol (RTMP) specification

Here is a summary of RTMP. The protocol sits on top of the internet's Transmission Control Protocol, commonly referred to as TCP. By default, it uses port 1935 for communication. As the years passed, updates to the protocol added Transport Layer Security (TLS) support (RTMPS) and a proprietary form of encryption developed in-house by Adobe (RTMPE).

Today, RTMP is not as dominant as it used to be. It is still widely used by streaming servers to ingest raw video from a source at the beginning of the live stream cycle. However, the stream is then encoded, compressed, and repackaged to suit the end device better.

Basic Streaming Pipeline

Live streaming lifecycle

Pipeline Steps

Before you can enjoy the latest episode of your favorite TV show or watch your favorite sport as it happens, there are a few steps raw video must go through before it reaches your device.

At the start, raw video is captured from a camera or digital recorder. The raw input can be massive and is not suitable for transfer over the internet. To reduce the content's size and make it easier to deliver, it is encoded and compressed using a codec such as H.264 or VP8. The choice can vary depending on your needs, but H.264 is the most common video encoding option.

Next, the encoded video is distributed to media servers using a streaming protocol. These days, the most popular streaming protocol is RTMP. Other protocols can be used, such as MPEG-DASH or Secure Reliable Transport (SRT).

Streams created by these protocols are then sent to a media server, where they are transcoded, resized, and split into different resolutions and formats for delivery to the end-user. In most cases, the stream is also repackaged into different container formats and bitrate renditions to better serve users on different internet connections, a process known as "transmuxing".

Finally, the stream is sent to the end-user using a method such as MPEG-DASH or Apple's HLS protocol. These are two of the most widely used and most compatible delivery methods for live streaming. It is not uncommon for streams to be distributed via a content delivery network or CDN to reduce latency and load to the streaming server.

Streaming in Practice

If you can't already tell, building an end-to-end streaming platform is no easy task. On top of the technical complexities involved in creating and maintaining an end-to-end pipeline, there is the added cost and time of developing and scaling your pipeline and maintaining servers in different regions to achieve low-latency playback.

Luckily, some companies and services excel at doing just that. Streaming platforms such as Mux allow developers and businesses to integrate live video into their app while reducing their time to market and the technical/financial hurdles traditionally associated with building live video in-house.

For our streaming app, we are going to use Mux and Stream for integrating live video and chat into our application.

Before we begin, let's outline some of the goals for our app:

  • Play back custom HLS and RTMP streams
  • Show an on-demand archive of past streams
  • Live messaging and chat under videos

Project Setup

As I mentioned previously, we will be using Mux and Stream to handle video streaming and live messaging. We can create free accounts on both services to obtain our API keys. In the case of Mux, enter your email, then follow the instructions sent to your inbox.

The process is similar for Stream: Pick a username, then enter your email and password.

Next, we can create a live test stream on Mux to verify everything is working correctly. Navigate to the side menu on the left of the screen and select the "Live Streams" subcategory under the video option.

Here, we can view ongoing streams or create new streams directly from the dashboard. For our test, let's select "Create New Live Stream" in the top right.

We are now presented with a console that allows us to create a new live stream. Since we are using a free account, there is a warning message informing us that our stream will be limited to 5 minutes. However, this limit just applies to the length of the video and doesn't limit access to other features. We can customize different aspects of the live stream, such as the stream and asset's privacy settings, once the stream concludes. There are lots of options and configurations you can set for a stream. To learn more about the different options, I highly recommend checking out the getting started guide on the Mux docs.

After running the request, you'll see a response similar to the image below. The most notable keys for us are the stream_key and playback_ids.id. These will be used later to publish our stream and view our stream, respectively.

πŸ“ Note: You should never publish your stream key; this value should always be kept private.

Finally, we can view our newly created stream's details either by clicking the live streams option in the side menu or the "View Live Stream" option at the bottom.

This page contains information about the current stream. In our case, it shows our stream as "idle" since we are not broadcasting. We can also see the unique live stream id, a thumbnail preview on the side, and playback ids for the stream from our live stream overview.

To quickly test our stream, we can use a broadcasting app such as Larix to broadcast from our mobile device.

Larix on Google Play: https://play.google.com/store/apps/details?id=com.wmspanel.larix_broadcaster&hl=en

Larix in Apple App Store: https://apps.apple.com/us/app/larix-broadcaster/id1042474385

In Larix, we can create a new connection using the URL rtmps://global-live.mux.com:443/app/<YOUR-STREAM-KEY>. I am also enabling RTMP authorization with my Mux credentials; however, this step is optional.

Once the connection is saved, we can start broadcasting using the red "record" button on Larix.

To verify the stream is working, try refreshing the live stream preview page or visit the URL https://stream.mux.com/YOUR-PLAYBACK-ID.m3u8.

Congratulations 🎉, you've taken the first steps in building a live video app. Next comes building the skeleton of our application!

Building Our App Layout

Let's start by creating a new Flutter project. Feel free to name it anything you like. Once the project is generated, add the following dependencies to your pubspec.yaml:

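Roughly, the dependency list looks like the following. The version numbers are illustrative, so grab the latest releases from pub.dev:

dependencies:
  flutter:
    sdk: flutter
  # Chat SDK from Stream
  stream_chat_flutter: ^1.0.0
  # Video player with HLS support and a built-in UI
  yoyo_player: ^0.1.0
  # Cubit/Bloc state management
  flutter_bloc: ^6.1.1
  # HTTP client for talking to our Mux proxy API
  http: ^0.12.2
  meta: ^1.3.0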

To keep things simple, our app is going to have three screens. The first screen will be a landing page shown to all users when they launch the application. Here, users can either enter a custom URL and nickname to watch live videos with friends or go directly to the app's home page to view a list of current and past live streams.

For our video playback screen, we will divide the screen into two parts, a video player at the top and a live chat at the bottom.

Coding the Landing Page

Our landing page layout consists of a few widgets: a Column containing two TextFields, an Icon, and an action button used for navigation. If the user enters a custom URL, we change the button from a rectangular ElevatedButton to a circular button with an icon.

The code for this page will look similar to this:

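Here is a rough sketch of that layout. The widget name and hint texts are illustrative, and the button's callback is left empty for now (we wire it up to onContinueToHomePressed below):

import 'package:flutter/material.dart';

class LandingPage extends StatelessWidget {
  const LandingPage({Key key}) : super(key: key);

  @override
  Widget build(BuildContext context) {
    return Scaffold(
      body: Padding(
        padding: const EdgeInsets.all(24),
        child: Column(
          mainAxisAlignment: MainAxisAlignment.center,
          children: [
            // ShaderMask paints a gradient over the (white) icon below it.
            ShaderMask(
              shaderCallback: (bounds) => const LinearGradient(
                colors: [Colors.blue, Colors.purple],
              ).createShader(bounds),
              child: const Icon(Icons.live_tv, size: 96, color: Colors.white),
            ),
            const TextField(
              decoration: InputDecoration(hintText: 'Nickname'),
            ),
            const TextField(
              decoration: InputDecoration(hintText: 'Custom stream URL (optional)'),
            ),
            ElevatedButton(
              // In the full app this calls onContinueToHomePressed (see below).
              onPressed: () {},
              child: const Text('Continue'),
            ),
          ],
        ),
      ),
    );
  }
}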

💡 Did you know you can use a ShaderMask over a widget to apply a gradient? Notice in the code above, we use a ShaderMask to apply a gradient on our landing page icon.

Next, we can move on to creating the layout of the home page.

Home Page Layout

The home page is a little complex. To keep things manageable, I chose to use two individual widgets.

The first widget contains a static method used to simplify our navigation and initialization of our page controller. Later, we will be integrating cubit to handle our state changes.

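A sketch of that wrapper is shown below, assuming a HomePage widget with a static route helper and a PageController created in initState (the names are illustrative):

import 'package:flutter/material.dart';

class HomePage extends StatefulWidget {
  const HomePage({Key key}) : super(key: key);

  // Static helper so other screens can push this page without repeating
  // MaterialPageRoute boilerplate.
  static Route get route =>
      MaterialPageRoute(builder: (_) => const HomePage());

  @override
  _HomePageState createState() => _HomePageState();
}

class _HomePageState extends State<HomePage> {
  PageController _pageController;

  @override
  void initState() {
    super.initState();
    // A slightly smaller viewport so the next card peeks in from the edge.
    _pageController = PageController(viewportFraction: 0.9);
  }

  @override
  void dispose() {
    _pageController.dispose();
    super.dispose();
  }

  @override
  Widget build(BuildContext context) {
    return Scaffold(body: _HomePageContent(controller: _pageController));
  }
}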

Our home page's actual content will consist of a CustomScrollView, PageView, and SliverGrid. The page view displays a list of current streams while past live streams are displayed in the sliver grid. To help differentiate between the different sections, we can use a CupertinoSliverNavigationBar with a large title.

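A sketch of that scroll view follows, with placeholder item counts until we load real data (the title, heights, and counts are illustrative):

import 'package:flutter/cupertino.dart';
import 'package:flutter/material.dart';

class _HomePageContent extends StatelessWidget {
  const _HomePageContent({Key key, this.controller}) : super(key: key);

  final PageController controller;

  @override
  Widget build(BuildContext context) {
    return CustomScrollView(
      slivers: [
        // Large iOS-style title for the screen.
        CupertinoSliverNavigationBar(largeTitle: const Text('Live')),
        // Horizontally swipeable list of current streams.
        SliverToBoxAdapter(
          child: SizedBox(
            height: 220,
            child: PageView.builder(
              controller: controller,
              itemCount: 3, // placeholder until we load real streams
              itemBuilder: (context, index) => const FeaturedStreamCard(),
            ),
          ),
        ),
        // Grid of past (archived) streams.
        SliverGrid(
          gridDelegate: const SliverGridDelegateWithFixedCrossAxisCount(
            crossAxisCount: 2,
            childAspectRatio: 16 / 9,
          ),
          delegate: SliverChildBuilderDelegate(
            (context, index) => const FeaturedStreamCard(),
            childCount: 6, // placeholder
          ),
        ),
      ],
    );
  }
}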

Finally, we can implement our FeaturedStreamCard widget using a simple AspectRatio and a card. An AspectRatio is really useful for maintaining an image ratio as the widget is scaled and resized.

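A minimal sketch of the card (the thumbnailUrl and onTap parameters are illustrative):

import 'package:flutter/material.dart';

class FeaturedStreamCard extends StatelessWidget {
  const FeaturedStreamCard({Key key, this.thumbnailUrl, this.onTap})
      : super(key: key);

  final String thumbnailUrl;
  final VoidCallback onTap;

  @override
  Widget build(BuildContext context) {
    // AspectRatio keeps the 16:9 thumbnail shape as the card is resized.
    return AspectRatio(
      aspectRatio: 16 / 9,
      child: Card(
        clipBehavior: Clip.antiAlias,
        child: InkWell(
          onTap: onTap,
          child: thumbnailUrl != null
              ? Image.network(thumbnailUrl, fit: BoxFit.cover)
              : Container(color: Colors.black12),
        ),
      ),
    );
  }
}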

Bringing everything together, we can configure the navigator in our previous screen, LandingPage, to route to our new screen by implementing the function onContinueToHomePressed.

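A sketch of that callback, assuming the static route helper from the HomePage sketch above:

void onContinueToHomePressed(BuildContext context) {
  // Push the home page using HomePage's static route helper.
  Navigator.of(context).push(HomePage.route);
}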

Video Page Layout

The final screen in our application is by far the most important. After all, what good is a live streaming app without a video player and a chat? 😝

The player's design contains two distinct elements: the player itself and a chat list at the bottom. For now, let's focus on the player. If you recall, we initially added a few packages to the project. One of the packages we added was the video player plugin yoyo_player. I chose this package since it comes with a friendly UI out of the box and supports HLS/RTMP streams. However, you can also use the official video_player package if you like.

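A sketch of that layout, using yoyo_player's constructor parameters as documented in the package's README (check the version you install, as the API may have changed):

import 'package:flutter/material.dart';
import 'package:yoyo_player/yoyo_player.dart';

class VideoPage extends StatelessWidget {
  const VideoPage({Key key, this.streamUrl}) : super(key: key);

  final String streamUrl;

  @override
  Widget build(BuildContext context) {
    return Scaffold(
      body: SafeArea(
        child: Column(
          children: [
            // Top half: the HLS player.
            Expanded(
              child: YoYoPlayer(
                url: streamUrl,
                aspectRatio: 16 / 9,
                videoStyle: VideoStyle(),
                videoLoadingStyle: VideoLoadingStyle(),
              ),
            ),
            // Bottom half: placeholder until we add the chat.
            Expanded(child: Container(color: Colors.black12)),
          ],
        ),
      ),
    );
  }
}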

As you can probably tell, we will use a Column and two Expanded widgets to create our layout. For now, we've only created the video player; we will look at the chat later on.

Backend Setup

Like all applications, we need to create a backend for our app. In our case, the term backend refers to a layer of the application responsible for communicating with external services. Since we will be using Mux and Stream in this tutorial, we will create a simple backend that implements the functionality necessary for our application.

First, let's configure our Mux backend. The code needed to implement this backend is minimal, as there are only two functions to implement: fetchPastLivestreams and fetchLivestreams. Both functions return a list of JSON objects, which we can convert to Dart objects by creating a simple model.

💡 Notice we are passing the Mux API string to our class. This is a very simple API that queries Mux for active and past live streams. You can find the sample API here. As good practice, you can pass these values to your app at runtime using --dart-define and a config file.

Class:

import 'dart:convert' show jsonDecode;
import 'dart:developer' show log;

import 'package:hfs/models/video_model.dart';
import 'package:http/http.dart' as http;
import 'package:meta/meta.dart';

@immutable
class MuxBackend {
  MuxBackend({
    @required this.client,
    @required this.muxApi,
  }) : assert(client != null);

  final http.Client client;
  final String muxApi;

  // Note: the endpoint paths below are illustrative; match them to your proxy API.
  Future<List<VideoModel>> fetchLivestreams() => _fetch('$muxApi/livestreams');
  Future<List<VideoModel>> fetchPastLivestreams() => _fetch('$muxApi/assets');

  Future<List<VideoModel>> _fetch(String url) async {
    final response = await client.get(url);
    log('GET $url -> ${response.statusCode}');
    final data = jsonDecode(response.body) as List;
    return data.map((json) => VideoModel.fromJson(json)).toList();
  }
}

Model:

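A simple model might look like this. The JSON field names assume the proxy API forwards Mux's response fields, so adjust them to your own API's shape:

class VideoModel {
  const VideoModel({this.playbackId, this.status, this.createdAt});

  final String playbackId;
  final String status;
  final String createdAt;

  factory VideoModel.fromJson(Map<String, dynamic> json) => VideoModel(
        playbackId: json['playback_ids'][0]['id'] as String,
        status: json['status'] as String,
        createdAt: json['created_at'] as String,
      );

  // Mux can render a thumbnail for any playback ID via image.mux.com.
  String get thumbnailUrl => 'https://image.mux.com/$playbackId/thumbnail.jpg';
}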

💡 Mux generates thumbnails for our videos! We can access these thumbnails by combining our playback ID with Mux's image URL.

Next, it's time to configure Stream for handling our chat.

Stream Overview

Stream provides an easy-to-integrate chat platform with client SDKs in multiple languages. Stream's goal is to help developers seamlessly integrate high-performance, low-latency chat into their apps with minimal setup and overhead.

In our case, Stream already supports live streaming out of the box as a predefined type. You can view and customize these settings by going to your Stream dashboard for the project we created earlier.

⚡️: getstream.io > project > chat > overview > channel types

Implementing the Stream Backend

Implementing our backend for Stream is very similar to implementing our Mux backend. In this case, we have three functions: one for generating a token, one for configuring a user, and one for configuring a channel. Since some of Stream's terminology may be unfamiliar, think of a channel as a box containing all the messages for a given conversation. A channel generally has a type (in our case, "livestream") and a unique ID.

You may notice we are creating channels with an ID generated from the video URL in our example below. This ensures that if a video is shared with friends, they can all enter and participate in the same channel.

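A sketch of that backend, written against a recent version of the stream_chat_flutter SDK (older releases name the client Client and use setUser instead of connectUser; the method names on our class are illustrative):

import 'package:stream_chat_flutter/stream_chat_flutter.dart';

class StreamBackEnd {
  const StreamBackEnd(this.client);

  final StreamChatClient client;

  // A client-side dev token is fine for a demo. In production, generate user
  // tokens on your own server with your Stream API secret.
  Token createToken(String userId) => client.devToken(userId);

  // Connecting the user opens the WebSocket connection the SDK needs before
  // we can watch a channel.
  Future<void> configureUser(String userId, String name) {
    return client.connectUser(
      User(id: userId, extraData: {'name': name}),
      createToken(userId).rawValue,
    );
  }

  // Deriving the channel id from the video URL means everyone watching the
  // same video lands in the same channel.
  Future<Channel> configureChannel(String videoUrl) async {
    final channel = client.channel(
      'livestream',
      id: videoUrl.hashCode.toString(),
    );
    await channel.watch();
    return channel;
  }
}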

As a matter of personal preference, I like to combine my different services/backends into a single class conveniently called Backend.

Here, we can create an initializer that constructs both backend classes.

import 'package:hfs/backend/mux_backend.dart';
import 'package:hfs/backend/stream_backend.dart';
import 'package:hfs/config.dart';
import 'package:http/http.dart' as http;
import 'package:meta/meta.dart';
import 'package:stream_chat_flutter/stream_chat_flutter.dart';

@immutable
class Backend {
  final StreamBackEnd streamBackEnd;
  final MuxBackend muxBackend;

  const Backend._({
    @required this.streamBackEnd,
    @required this.muxBackend,
  });

  // Convenience initializer that wires up both services in one place.
  // streamApiKey and muxApiUrl are assumed constants coming from config.dart.
  factory Backend.init() => Backend._(
        streamBackEnd: StreamBackEnd(StreamChatClient(streamApiKey)),
        muxBackend: MuxBackend(client: http.Client(), muxApi: muxApiUrl),
      );
}
Finally, we can initialize our backend in our primary function and pass it to our application.

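A sketch of main, assuming the app widget (here called LiveStreamApp, an illustrative name) that we'll create in the state management section:

import 'package:flutter/material.dart';
import 'package:hfs/backend/backend.dart';

void main() {
  // Construct the backend once and hand it to the widget tree.
  final backend = Backend.init();
  runApp(LiveStreamApp(backend: backend));
}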

State Management

Finally, we get to the most exciting part of developing a Flutter application: picking a state management pattern. 😄

I will keep things boring and use the wildly popular bloc package for handling our application's state. More specifically, I will be using cubit since I like its ease of use and minimal setup.

Examining the use cases and features of our application, we can identify a few candidates for cubits:

  • User management
  • Archived videos
  • Live videos
  • Channels

I am not going to cover the implementation of each cubit in-depth, but the general flow looks similar to the sample below:

Cubit:

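For example, the cubit responsible for live videos might look like this (LiveStreamsCubit and LiveStreamsState are illustrative names):

import 'package:bloc/bloc.dart';
import 'package:hfs/backend/backend.dart';

class LiveStreamsCubit extends Cubit<LiveStreamsState> {
  LiveStreamsCubit(this.backend) : super(const LiveStreamsState.loading());

  final Backend backend;

  // Ask the Mux backend for the currently active streams and emit the result.
  Future<void> fetchLiveStreams() async {
    try {
      final streams = await backend.muxBackend.fetchLivestreams();
      emit(LiveStreamsState.loaded(streams));
    } catch (error) {
      emit(LiveStreamsState.failed('$error'));
    }
  }
}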

State:

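And its matching state, a small class tracking a status plus the loaded streams (again, an illustrative sketch):

import 'package:hfs/models/video_model.dart';

enum LiveStreamsStatus { loading, loaded, failed }

class LiveStreamsState {
  const LiveStreamsState.loading()
      : status = LiveStreamsStatus.loading,
        streams = const <VideoModel>[],
        error = null;

  const LiveStreamsState.loaded(this.streams)
      : status = LiveStreamsStatus.loaded,
        error = null;

  const LiveStreamsState.failed(this.error)
      : status = LiveStreamsStatus.failed,
        streams = const <VideoModel>[];

  final LiveStreamsStatus status;
  final List<VideoModel> streams;
  final String error;
}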

To view the implementation of each bloc, you can look at the repository on GitHub:

Once we're finished implementing our cubits, we can move on to registering them using a BlocProvider for use in our application.

In main.dart, let's remove the default MyApp widget and create a new StatelessWidget. In this widget, we will create the MaterialApp for our application and register our bloc providers.

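A sketch of that widget, assuming the four cubits listed above (their exact names and constructors are illustrative):

import 'package:flutter/material.dart';
import 'package:flutter_bloc/flutter_bloc.dart';
import 'package:hfs/backend/backend.dart';
import 'package:stream_chat_flutter/stream_chat_flutter.dart';

class LiveStreamApp extends StatelessWidget {
  const LiveStreamApp({Key key, this.backend}) : super(key: key);

  final Backend backend;

  @override
  Widget build(BuildContext context) {
    return MultiBlocProvider(
      providers: [
        // The cascade (..) kicks off the first fetch as soon as each cubit
        // is created.
        BlocProvider(
            create: (_) => LiveStreamsCubit(backend)..fetchLiveStreams()),
        BlocProvider(
            create: (_) => ArchivedStreamsCubit(backend)..fetchPastStreams()),
        BlocProvider(create: (_) => UserCubit(backend)),
        BlocProvider(create: (_) => ChannelCubit(backend)),
      ],
      // StreamChat wraps the MaterialApp and exposes the Stream client to
      // every screen below it.
      child: StreamChat(
        client: backend.streamBackEnd.client,
        child: MaterialApp(home: const LandingPage()),
      ),
    );
  }
}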

For convenience, we can create a MultiBlocProvider to register all four of our cubit classes. Finally, we can wrap our MaterialApp in a StreamChat and pass the Stream client we created earlier as the required argument.

💡 Did you know you can trigger a function immediately after a cubit is created by using Dart's cascade notation? We use this format to load the initial live and archived video.

Bringing It All Together

Excellent work, you're doing great! The live streaming puzzle's final piece is to replace the static content in our application using live data from the cubit and backend we created.

Let's start with the easiest part, updating the landing page. We can modify the onPress function to configure a user and channel before navigating to the home page.

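A sketch of the updated callback. The cubit method names, text controllers, and VideoPage constructor below are illustrative:

Future<void> onCustomUrlGoPressed(BuildContext context) async {
  // The user must be connected (an open WebSocket) before a channel can be
  // watched, hence the awaits.
  await context.read<UserCubit>().configureUser(nicknameController.text);
  await context.read<ChannelCubit>().configureChannel(urlController.text);

  Navigator.of(context).push(
    MaterialPageRoute(
      builder: (_) => VideoPage(streamUrl: urlController.text),
    ),
  );
}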

Notice we are using await in onCustomUrlGoPressed when configuring a user. This is because the Stream SDK requires us to have an active WebSocket connection before configuring a channel.

Next, we can update our HomePage by replacing the contents of build with:

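Assuming the LiveStreamsState sketched earlier, the new build method might look like this:

@override
Widget build(BuildContext context) {
  return Scaffold(
    body: BlocBuilder<LiveStreamsCubit, LiveStreamsState>(
      builder: (context, state) {
        if (state.status == LiveStreamsStatus.loading) {
          // Still fetching the list of streams.
          return const Center(child: CircularProgressIndicator());
        }
        if (state.status == LiveStreamsStatus.failed) {
          // Something went wrong talking to the backend.
          return Center(child: Text(state.error));
        }
        return _HomePageContent(
          controller: _pageController,
          liveStreams: state.streams,
        );
      },
    ),
  );
}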

Here we are handling a few use-cases by showing a loader when the state is loading and a Text if an error occurs.

We can now move on to _HomePageContent to implement the live stream and archive cubits.

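The content widget can now render real data: live streams passed in from above, and archived streams read from their cubit (assuming its state mirrors LiveStreamsState). A sketch:

class _HomePageContent extends StatelessWidget {
  const _HomePageContent({Key key, this.controller, this.liveStreams})
      : super(key: key);

  final PageController controller;
  final List<VideoModel> liveStreams;

  @override
  Widget build(BuildContext context) {
    // Rebuilds whenever the archived streams cubit emits a new state.
    final archived = context.watch<ArchivedStreamsCubit>().state.streams;

    return CustomScrollView(
      slivers: [
        CupertinoSliverNavigationBar(largeTitle: const Text('Live')),
        // Currently live streams in a swipeable page view.
        SliverToBoxAdapter(
          child: SizedBox(
            height: 220,
            child: PageView.builder(
              controller: controller,
              itemCount: liveStreams.length,
              itemBuilder: (context, index) => FeaturedStreamCard(
                thumbnailUrl: liveStreams[index].thumbnailUrl,
              ),
            ),
          ),
        ),
        // Archived streams in a grid.
        SliverGrid(
          gridDelegate: const SliverGridDelegateWithFixedCrossAxisCount(
            crossAxisCount: 2,
            childAspectRatio: 16 / 9,
          ),
          delegate: SliverChildBuilderDelegate(
            (context, index) => FeaturedStreamCard(
              thumbnailUrl: archived[index].thumbnailUrl,
            ),
            childCount: archived.length,
          ),
        ),
      ],
    );
  }
}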

Finally, we can replace the temporary container in our video player page with an instance of StreamChannel. This widget, provided by the Stream SDK, scopes its subtree to a single channel. StreamChannel requires two parameters: a channel and a child. The child is used to display the messages in the channel.

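In the video page's Column, the second Expanded becomes the chat. The channel here comes from our ChannelCubit, and the widget names MessageListView and MessageInput match older stream_chat_flutter releases (newer versions prefix them with Stream):

Column(
  children: [
    // Top half: the video player from earlier.
    Expanded(
      child: YoYoPlayer(
        url: streamUrl,
        aspectRatio: 16 / 9,
        videoStyle: VideoStyle(),
        videoLoadingStyle: VideoLoadingStyle(),
      ),
    ),
    // Bottom half: the live chat, scoped to our channel.
    Expanded(
      child: StreamChannel(
        channel: channel,
        child: Column(
          children: [
            // Message history plus live updates for this channel.
            Expanded(child: MessageListView()),
            // Input field for composing and sending new messages.
            MessageInput(),
          ],
        ),
      ),
    ),
  ],
)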

Awesome, we're done! Let's run our application and start a live stream using the Larix app!


Congratulations 🥳 🎉

Wow! We covered a lot in this article, from learning about live streaming's inner workings to building a simple app with Flutter 📱.

This is just the tip of the iceberg. If you'd like to learn more and try building the project for yourself, the code can be found on my GitHub here.

Be sure to check out Mux and Stream to learn more about video streaming and chat. Both services offer a free trial without requiring a credit card. I encourage you to try them out and create your own live Flutter application.

Thank you for reading 💙

~ Nash
πŸ₯: https://www.twitter.com/Nash0x7e2
🔗: https://www.linkedin.com/in/neevash-ramdial/
