Building a Live Streaming App with WebRTC and Vue.js

18 min read
Stefan B.
Published May 21, 2024

The Stream Video and Audio SDK offers a JavaScript library that can be integrated with any web framework.

Building a project using plain JavaScript and HTML can be a valid approach, and we have multiple tutorials and sample apps showing exactly how to do that. However, in the modern web, it has become common to use a framework. The most popular ones include React, Angular, and Vue.js.

We have plenty of tutorials and even a custom SDK showing how to build a live streaming app using React. We also have a video demonstrating how to integrate the SDK into an Angular app.

This article covers integrating the SDK into a Vue.js application. We will build a live streaming application that can be used to broadcast and view a stream. Here's a demo of what we will build:

The final project is also on GitHub. Feel free to check it out (and give it a ⭐️).

Creating a New Vue.js Project

To set up a new project, we will follow the general recommendations on the Vue website, specifically the Quick Start section.

They recommend using Vite, so in our terminal of choice, let’s execute the following command:

bash
# using yarn
yarn create vue@latest

# using npm
npm create vue@latest

We need to answer a few questions on the type of setup:

bash
✔ Project name: vue-livestreaming
✔ Add TypeScript? Yes
✔ Add JSX Support? No
✔ Add Vue Router for Single Page Application development? No
✔ Add Pinia for state management? Yes
✔ Add Vitest for Unit testing? No
✔ Add an End-to-End Testing Solution? No
✔ Add ESLint for code quality? Yes
✔ Add Prettier for code formatting? Yes
✔ Add Vue DevTools 7 extension for debugging? (experimental) No

Scaffolding project in ./vue-livestreaming

Done.

This shows our setup, notably TypeScript (because we prefer having type annotations) and Pinia for state management. Note that both are optional; the project can just as well be built in plain JavaScript and without Pinia.

We’ve also enabled ESLint for static code quality checking and Prettier for code formatting.

This completes the setup. To test if we can run the project, we run the following commands in our terminal:

bash
cd vue-livestreaming
yarn && yarn dev

We see the default page and can start with the Vue.js project implementation.

Setting Up a Store

As we already mentioned, we are using Pinia to have an easy-to-access data model. It offers a way to define one or multiple stores that can serve different purposes, and we can access these stores from anywhere in our component tree. For our purposes, a single store is sufficient.

If we have just set up the new project and selected Yes when asked to add Pinia for state management, it is already configured, and we can create our store directly.

If we have not done that (or want to add it to an existing project), we can easily add it using the command line. We install it with this command:

bash
# using yarn
yarn add pinia

# using npm
npm install pinia

Then, inside of our main.ts (or .js if you’re more of a JavaScript person), we need to initialize it and add it to our app object like this:

ts
import { createApp } from 'vue'
import { createPinia } from 'pinia'

import App from './App.vue'

const pinia = createPinia()
const app = createApp(App)

app.use(pinia)
app.mount('#app')

Again, if we’ve configured the project with Pinia enabled, this is all done for us.

We can now proceed to define our store object. If it is not already there, we create a new folder inside the src directory and call it stores (even though we only define a single store, we follow the convention of keeping stores in a dedicated folder).

Then, we’ll set up an empty store that we will fill with functionality in the next chapters.

Create a new file called streamStore.ts and fill it with the following content:

ts
import { defineStore } from 'pinia'

export const useStreamStore = defineStore('stream', () => {
  return {}
})

We initialize the store using the defineStore function from the Pinia SDK. We give it a name - in our case, stream - and then return all the elements we want to expose.

These can be properties and functions (which, in turn, can modify properties). We will see how to handle that and integrate refs into the process later. Our store is now set up.
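To illustrate the pattern before we fill our own store with functionality, here is a minimal, generic setup store. The counter example is purely illustrative and not part of our app:

ts
import { ref } from 'vue'
import { defineStore } from 'pinia'

// Generic example of a setup store: state lives in refs, functions act as actions,
// and everything returned from the setup function is exposed to components.
export const useCounterStore = defineStore('counter', () => {
  const count = ref(0)

  function increment() {
    count.value++
  }

  return { count, increment }
})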

Note that this is one of two ways to define stores in Pinia, called Setup Stores. We have chosen this option over the alternative, called Option Stores. If you're interested in a comparison, check the links for each option and the documentation on which to pick.

Initializing Stream Video

We will need access to a StreamVideoClient object inside our app for different purposes. Creating live streams, joining them, going live, and more all rely on having a client object at our disposal.

That’s why we want to initialize it when we create the store so we can access it later. We will cover the usage later on, but first, let’s initialize the StreamVideoClient.

The first thing to do is to install the Stream Video JavaScript SDK using the following command:

bash
yarn add @stream-io/video-client

Inside the defineStore function, we need to have three objects:

  1. an apiKey: this is used to identify the Stream project on the backend; we can retrieve that using the Stream Dashboard.
  2. a user: an object to identify the currently logged-in user; this is of type User from the Stream package; it consists of two required properties (an id and a name), and we add one optional property (an image URL).
  3. a token: using the user’s id and the project secret (retrieved from the Dashboard), we need to issue a JWT for authentication.

To retrieve the API key, we open up the project in the Dashboard and copy it from there. See the demo below:

We can set up a token-issuing server using one of Stream's server-side SDKs. For development purposes only, we can also use Stream's JWT Generator to create a token for a specific user.
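In production, tokens must be generated on a server where the API secret is available. As a rough illustration of what such an endpoint does, here is a minimal sketch using the jsonwebtoken package; the createDevToken helper and the STREAM_API_SECRET variable are names we made up for this example, and Stream's server-side SDKs provide ready-made helpers for this:

ts
// Minimal server-side sketch (not part of the Vue app): issue a Stream user token.
import jwt from 'jsonwebtoken'

const apiSecret = process.env.STREAM_API_SECRET // never expose this in the browser

export function createDevToken(userId: string): string {
  if (!apiSecret) {
    throw new Error('STREAM_API_SECRET is not set')
  }
  // Stream user tokens are HS256-signed JWTs that carry a user_id claim
  return jwt.sign({ user_id: userId }, apiSecret, { algorithm: 'HS256' })
}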

The only requirement is to have this user registered on the backend. We can do this programmatically or generate a user on the dashboard. Going to Explorer → Users → Create new user allows us to create a user we can reference in code.

We want to put these values into an environment file so they are kept out of our source code. Let's create a new file at the root of our project called .env.local and define two values here:

bash
VITE_APP_API_KEY=<insert-api-key>
VITE_APP_TOKEN=<insert-token>

Notice that since we used Vite for project generation, we need to prefix our variables with VITE_ (here, VITE_APP_); this might vary for your setup. For more information, see the documentation section on modes and environment variables.
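Optionally, to get type checking and autocompletion for these variables, we can describe them in the scaffolded env.d.ts file by augmenting Vite's ImportMetaEnv interface (a sketch; the exact file name may differ in your setup):

ts
/// <reference types="vite/client" />

// Describe our custom variables so import.meta.env is fully typed
interface ImportMetaEnv {
  readonly VITE_APP_API_KEY: string
  readonly VITE_APP_TOKEN: string
}

interface ImportMeta {
  readonly env: ImportMetaEnv
}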

Next, write the code to initialize a StreamVideoClient instance inside the defineStore function in streamStore.ts. We begin by accessing the VITE_APP_API_KEY and VITE_APP_TOKEN from the import.meta.env object:

ts
const apiKey = import.meta.env.VITE_APP_API_KEY;
const token = import.meta.env.VITE_APP_TOKEN;

We then perform validation checks to ensure that both apiKey and token are defined. If either of these variables is undefined, an error is thrown to halt execution and alert the developer:

ts
if (apiKey === undefined) {
  throw new Error('API key is not defined');
}
if (token === undefined) {
  throw new Error('Token is not defined');
}

Once the environment variables are validated, we create a new instance of StreamVideoClient. This client is configured with the retrieved API key, token, and predefined user object. The user object includes static details, specifically an ID, a name, and an image URL:

ts
const streamVideoClient: StreamVideoClient = new StreamVideoClient({
  apiKey: apiKey,
  token: token,
  user: {
    id: 'Stefan',
    name: 'Stefan',
    image: 'https://getstream.io/random_svg/?id=Stefan&name=Stefan'
  }
});

With this, we securely load the credentials from an environment file, check if they are present, and then initialize the StreamVideoClient object safely using a hardcoded user.

Note that we are not covering a dynamic signup flow here. If you’re interested, please let us know (for example, on X), and we’ll do our best to cover this as well.

The last thing we do is include the streamVideoClient object in the return statement like this:

ts
return {
  streamVideoClient
}

The setup is finished, and we can continue building our livestreaming application.

Switching Between Broadcasting and Viewing

Before implementing the different features, we create a simple method to switch between broadcasting and viewing modes. We create a tab-switching mechanism with two buttons. For this, we first replace the content of the script tag inside of App.vue with the following content:

ts
import { ref } from 'vue'

const tabShowing = ref<'broadcast' | 'viewer'>('broadcast')

function switchTab(tab: 'broadcast' | 'viewer') {
  tabShowing.value = tab
}

We have a value for tabShowing that can be changed between two modes (the default being broadcast).

Then, inside the <main> element of the template, we add this code:

html
<div class="tabList">
  <button
    class="tabButton"
    :class="{ activeTab: tabShowing === 'broadcast' }"
    @click="switchTab('broadcast')"
  >
    Broadcast
  </button>
  <button
    class="tabButton"
    :class="{ activeTab: tabShowing === 'viewer' }"
    @click="switchTab('viewer')"
  >
    Viewer
  </button>
</div>
<section v-show="tabShowing === 'broadcast'"><h2>Broadcast</h2></section>
<section v-show="tabShowing === 'viewer'"><h2>Viewer</h2></section>

First, we show two buttons that call the switchTab function when clicked with their respective values. One activates the broadcast option, and the other activates the viewer option.

Then, we add two <section> elements, checking if tabShowing equals broadcast or viewer, and show an <h2> with the respective text (we will replace those with our custom components once we’ve built them).

We added CSS classes for styling options, so let’s complete this by adding the following to the bottom of the file:

css
<style scoped>
.tabList {
  display: flex;
  justify-content: center;
  align-items: center;
}

.tabButton {
  --background-color: white;
  color: var(--vue-green);
  padding: 0.5rem 2rem;
  background: var(--background-color);
  border-radius: 0;
  border: 1px solid transparent;
  transition: all 0.2s ease-in-out;
}

.tabButton:hover {
  border: 1px solid var(--vue-green);
}

.activeTab {
  --background-color: var(--vue-green);
  color: white;
}
</style>

For this to work, we added two CSS variables for the Vue.js colors inside the :root element of base.css:

css
:root {
  /** other content **/
  --vue-green: hsla(160, 100%, 37%, 1);
  --vue-green-light: hsla(160, 80%, 57%, 1);
}

With that, we have two buttons to switch between the features.

Let’s start with implementing the first one.

Adding a Form to Start a Livestream Using WebRTC

We want to start with the broadcasting feature and add the viewing option later. To let users share a live stream using WebRTC, we'll create a form to enter an ID.

Once that is entered, we show the user how the stream will look but keep it in backstage mode. With a go-live button, they can share the stream online.

Before we start writing the UI, let’s first define the logic in our store. We need three properties.

  1. The first one is a call object that represents the live stream itself (in the Stream world, the type is still considered a Call object since it's very versatile).
  2. The second one is a localParticipant, which describes the local user since we want to show a preview of what they will stream to the public on screen. We will want to subscribe to that value of our call (using rxjs), so we will need a reference to that subscription.
  3. The last property is called isBackstage. This is a principle in the Stream SDK that allows people who stream to check their settings and setup before starting the stream for viewers. By default, a call is created in backstage mode and will only be available to viewers after being manually taken live. Since we subscribe to updates of that value and want to clean that up properly later on, we need a reference to that subscription as well.

We open up streamStore.ts, and inside our defineStore function, we add the following code at the beginning:

ts
const call = ref<Call | undefined>()
const isBackstage = ref<boolean>(false)
const isBackstageSub = ref<Subscription | undefined>()
const localParticipant = ref<StreamVideoParticipant | undefined>()
const localParticipantSub = ref<Subscription | undefined>()

If you’re unsure how ref works or what it is, we recommend going through the reactivity fundamentals of Vue.js again.

We also added properties to store references to Subscription objects. We will listen to changes in the localParticipant and backstage values and hold onto these references so we can clean everything up properly later.

We need to import these types (together with the StreamVideoClient class and the Subscription type from rxjs), because otherwise the file will show errors. Add the imports to the top of the file:

ts
import { ref } from 'vue'
import { StreamVideoClient } from '@stream-io/video-client'
import type { Call, StreamVideoParticipant } from '@stream-io/video-client'
import type { Subscription } from 'rxjs'

Next, we want to define two functions: create a new call (or live stream) and end a call (or live stream).

We start by creating a new call, which takes a few steps. The first is to create a new Call object using the streamVideoClient we defined earlier, combined with the id the user entered. We need to give it the call type livestream since this will provide many default settings under the hood. Check the Call Types documentation page to learn more.

Then, we join the call (creating it if it's not already there) and enable both the camera and microphone. Next, we subscribe to the localParticipant Observable of the newly created call's state object and update our localParticipant ref whenever the Observable emits a new value.

Finally, we update the call ref we defined earlier with the newly created call.

Here is the full code for the createCall function, which we can add below the streamVideoClient initialization:

ts
async function createCall(id: string) {
  const newCall = streamVideoClient.call('livestream', id)
  await newCall.join({ create: true })
  await newCall.camera.enable()
  await newCall.microphone.enable()

  // Subscribe to the local participant
  localParticipantSub.value = newCall.state.localParticipant$.subscribe(
    (updatedLocalParticipant) => {
      localParticipant.value = updatedLocalParticipant
    }
  )

  // Subscribe to backstage property
  isBackstageSub.value = newCall.state.backstage$.subscribe((backstage) => {
    isBackstage.value = backstage
  })

  // Update the local call value
  call.value = newCall
}

The code for the endCall function is even shorter. We call endCall on the current value of our local call ref, call the unsubscribe functions of both of our subscriptions, and then set the call ref back to undefined.

Here’s the code:

ts
async function endCall() {
  await call.value?.endCall()

  localParticipantSub.value?.unsubscribe()
  isBackstageSub.value?.unsubscribe()

  call.value = undefined
}

The last thing to do here is to return the newly created properties and values at the end of the function:

ts
return {
  call,
  isBackstage,
  localParticipant,
  streamVideoClient,
  createCall,
  endCall
}

We can start building the UI, and the first thing we do is create a new file called BroadcastComponent.vue inside the components directory.

This component shows UI in two states: a form to enter an ID when no call is active and the stream preview otherwise.

Let’s build the form with its logic first. In the script, we need access to the call and localParticipant of the streamStore. To access their values, we need to extract them from the store object using Pinia’s storeToRefs function (see documentation).

Here’s the code for the script first, and we will explain the rest of its functionality afterward:

ts
import { computed, ref } from 'vue'
import { storeToRefs } from 'pinia'
import { useStreamStore } from '@/stores/streamStore'

const store = useStreamStore()
// isBackstage is needed further below for the go-live button
const { call, isBackstage, localParticipant } = storeToRefs(store)

const callId = ref('')

const isCallLive = computed(() => {
  return call.value && localParticipant.value
})

function startBroadcast() {
  if (callId.value) {
    store.createCall(callId.value)
  }
}

We hold a reference to the callId that we’ll use in the form. Also, the computed property isCallLive will determine whether to show the form or the call itself. The startBroadcast function checks if the callId contains a value and, if that’s the case, calls the createCall function we defined in the store.

The form's UI is conditionally shown (using a v-if check) if the computed isCallLive property is false. It has an input tied to the callId ref using the v-model directive. A button is added that calls the startBroadcast function on click.

This is the code we add to the template for now:

html
<template>
  <section v-if="!isCallLive" class="content-section input-form">
    <input type="text" v-model="callId" placeholder="Enter call ID" />
    <button @click="startBroadcast">Start Broadcast</button>
  </section>
</template>

Note that it adds two CSS classes to the <section> element. We’ll add them to the main.css file to be able to re-use them for the viewer component later on:

css
.content-section {
  display: flex;
  flex-direction: column;
  margin: 2rem auto;
  border: 1px solid var(--vue-green);
  border-radius: 0.5rem;
  gap: 1rem;
  background: var(--color-background);
  overflow: hidden;
}

.input-form {
  display: flex;
  gap: 2rem;
  margin: 1rem;
  padding: 2rem;
}

With that, the form to join the live stream is done.

Displaying the Broadcasted Livestream

We want to tackle the UI elements before we write the logic of when our broadcasted live stream should be displayed. We extract this into its own component and create a new file inside the components folder called VideoComponent.vue. This will be re-used for the viewing feature.

We'll build up the component step-by-step, explaining the code as we go. We start by adding a script tag:

ts
<script setup lang="ts">
</script>

We begin by importing the necessary functions from Vue and Pinia:

ts
import { ref, onMounted, onUnmounted } from 'vue';
import type { Call, StreamVideoParticipant } from '@stream-io/video-client';

The reason we need the lifecycle methods (onMounted, onUnmounted) will become obvious in a second.

We define the props to accept both a call and a participant object (note that these are the same types as the call and the localParticipant from our store):

ts
const props = defineProps<{
  call: Call | undefined;
  participant: StreamVideoParticipant | undefined;
}>();

We need HTML elements to display the video and audio. We also need references to them, because we have to manually bind the video and audio streams from the call to these elements (and unbind them once the component is destroyed). We will later attach the videoElement and audioElement refs to the elements in the template.

Here is the code for the definitions:

ts
const videoElement = ref<HTMLVideoElement | null>(null)
const audioElement = ref<HTMLAudioElement | null>(null)
const unbindVideoElement = ref<(() => void) | undefined>()
const unbindAudioElement = ref<(() => void) | undefined>()

The onMounted lifecycle hook binds the video and audio elements to the call when the component is mounted. It checks if the video and audio elements exist, then calls the bindVideoElement and bindAudioElement methods from the call object, storing the return values in the unbinding functions:

ts
onMounted(() => {
  if (videoElement.value) {
    unbindVideoElement.value = props.call?.bindVideoElement(
      videoElement.value,
      props.participant?.sessionId || 'sessionId',
      'videoTrack'
    )
  }
  if (audioElement.value) {
    unbindAudioElement.value = props.call?.bindAudioElement(
      audioElement.value,
      props.participant?.sessionId || 'sessionId'
    )
  }
})

The onUnmounted lifecycle hook ensures that both the video and audio bindings are unbound when the component is unmounted:

ts
onUnmounted(() => {
  unbindVideoElement.value?.()
  unbindAudioElement.value?.()
})

In the template, all we need to do is create the video and audio elements and ensure we correctly add the ref. We also define a width and a height for the video element.

html
<template>
  <video ref="videoElement" width="400" height="300" />
  <audio ref="audioElement" />
</template>

Finally, we ensure the video fits correctly within its container by adjusting the object-fit CSS property of the video element:

css
<style scoped>
video {
  object-fit: contain;
}
</style>

This reusable component effectively manages the video and audio elements’ lifecycle for a live call and cleans up when the component is unmounted.

We can now add it to the BroadcastComponent, starting with the import at the top (inside the script tag):

ts
import VideoComponent from './VideoComponent.vue'

We want to add a button that goes live with the current call or enters backstage mode when clicking it, so let’s add this:

ts
async function goLiveClicked() {
  if (isBackstage.value) {
    await call.value?.goLive()
  } else {
    await call.value?.stopLive()
  }
}

The button will have a text that says 'Go live' (when the call is in backstage mode) or 'End broadcast' (when it is live). We can add a computed property for that:

ts
const buttonText = computed(() => {
  return isBackstage.value ? 'Go live' : 'End broadcast'
})

Then, inside the template, we add a new section that we only display if isCallLive is true (using the v-if directive):

html
<section class="content-section" v-if="isCallLive">
  <VideoComponent :call="call" :participant="localParticipant" />
  <div class="button-row">
    <button @click="goLiveClicked">{{ buttonText }}</button>
    <button @click="store.endCall()">End stream</button>
  </div>
</section>

It binds the call and localParticipant to the properties of our newly created VideoComponent. It also adds two buttons, one for going live (or ending the broadcast) and one for ending the call.

For the button row, we added a CSS class that we define at the bottom of the file:

css
<style scoped>
.button-row {
  display: flex;
  gap: 1rem;
  align-items: center;
  justify-content: space-between;
  margin: 1rem;
}
</style>

With this, we have the broadcasting component finished and only need to import it into our App.vue component and show it. At the top, we do the import:

ts
import BroadcastComponent from './components/BroadcastComponent.vue'

In the <section> element that checks if tabShowing is equal to 'broadcast', we then replace the h2 element with the BroadcastComponent:

html
<section v-show="tabShowing === 'broadcast'"><BroadcastComponent /></section>

We can see it in action now:

We cannot view this livestream now, so let’s add that next.

Building the Viewing Feature

The viewing feature will re-use many of the principles we encountered when creating the broadcasting feature, so we will go over them more quickly.

First, we'll add the functionality to the streamStore. As we've done for the localParticipant, we now need a remoteParticipant, the person sharing the stream. In addition, we need to hold onto a reference to the subscription so we can clean it up later. We add this after the localParticipant definition in streamStore.ts:

ts
const remoteParticipant = ref<StreamVideoParticipant | undefined>()
const remoteParticipantSub = ref<Subscription | undefined>()

Now, we need to add a function to watch a stream, so let's add this next. It will differ slightly from createCall. We don't want to create a call if it's not there, and we don't want to enable our camera and microphone since we're only consuming the content. In fact, we want to actively disable the camera and the microphone, because by default they would get enabled (a setting we can change in the Stream Dashboard; see the documentation for more information).

Lastly, we don't subscribe to the localParticipant Observable but to the one for remoteParticipants. We're simplifying things by only using the first remoteParticipant we find. In production applications, more than one person might be sharing their stream, so we would need to account for that (we sketch a possible extension after the watchStream code below).

Here’s the code for the watchStream function:

ts
async function watchStream(id: string) {
  const newCall = streamVideoClient.call('livestream', id)
  await newCall.camera.disable()
  await newCall.microphone.disable()
  await newCall.join()

  remoteParticipantSub.value = newCall.state.remoteParticipants$.subscribe(
    (newRemoteParticipants) => {
      if (newRemoteParticipants && newRemoteParticipants.length > 0) {
        remoteParticipant.value = newRemoteParticipants[0]
      } else {
        remoteParticipant.value = undefined
      }
    }
  )

  call.value = newCall
}
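As an aside, if we wanted to support multiple broadcasters instead of picking the first one, a variation could keep the whole array. This is only a sketch; remoteParticipants and remoteParticipantsSub are names we introduce here and do not use elsewhere in this tutorial:

ts
// Sketch: track all remote participants instead of only the first one
const remoteParticipants = ref<StreamVideoParticipant[]>([])
const remoteParticipantsSub = ref<Subscription | undefined>()

// Inside watchStream, subscribe to the full list
remoteParticipantsSub.value = newCall.state.remoteParticipants$.subscribe((participants) => {
  remoteParticipants.value = participants
})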

We also need a function to leave the stream again. It looks very similar to the endCall function, except that we leave the call (instead of ending it) and unsubscribe from the remoteParticipantSub:

ts
async function leaveStream() {
  await call.value?.leave()
  remoteParticipantSub.value?.unsubscribe()

  call.value = undefined
}

Our return statement needs to be updated to look like this:

ts
return {
  call,
  isBackstage,
  localParticipant,
  remoteParticipant,
  streamVideoClient,
  createCall,
  endCall,
  watchStream,
  leaveStream
}

We can now create a new component in our components folder called ViewerComponent.vue. It will look similar to the BroadcastComponent, so we want to highlight only the differences here.

It has one function: it starts watching a live stream with a given ID by calling the watchStream function we defined in our streamStore.

Here is the logic for the entire component:

ts
import { computed, ref } from 'vue'
import { storeToRefs } from 'pinia'
import { useStreamStore } from '@/stores/streamStore'
import VideoComponent from './VideoComponent.vue'

const store = useStreamStore()
const { call, remoteParticipant } = storeToRefs(store)

const callId = ref('')

const showRemoteVideo = computed(() => {
  return call.value && remoteParticipant.value
})

function watchStream() {
  if (callId.value) {
    store.watchStream(callId.value)
  }
}

It shows a form for entering a callId when no remote video is available from the streamStore. Otherwise, it shows the VideoComponent configured with the remoteParticipant, along with a button to leave the stream.

Here’s the code for the UI:

html
<template>
  <section class="content-section" v-if="showRemoteVideo">
    <VideoComponent :call="call" :participant="remoteParticipant" />
    <div class="button-row">
      <button @click="store.leaveStream">Leave</button>
    </div>
  </section>
  <section class="input-form content-section" v-else>
    <input type="text" v-model="callId" placeholder="Enter stream id to join" />
    <button @click="watchStream">Watch Stream</button>
  </section>
</template>

The last remaining thing is to replace the <h2> tag in App.vue for the case when tabShowing equals viewer. Here's what we'll replace it with (remember to add the import for the ViewerComponent at the top, as shown after the snippet below):

html
<section v-show="tabShowing === 'viewer'"><ViewerComponent /></section>

With that, we have a fully functional application where we can broadcast live streams and view them. Here’s a demo of the viewing experience:

Summary

In this article, we’ve covered how to build a Vue.js livestreaming application. We’ve built a way to start a new live stream by only entering an ID. Users can see a preview of the stream (using the backstage mode) and then start streaming with a button click.

We also built a feature to view these live streams easily. We haven’t talked much about the underlying technology. The entire architecture runs on WebRTC by default, but we also provide options to use HLS since it’s still a very popular technology.

We hope you learned how to integrate Stream’s JavaScript SDK (which is open-source) into the very popular Vue.js framework. This can be adapted to other frameworks as well. We’re demonstrating this in this video about creating a calling application using Angular (and Angular Signals).

If you’re interested in the integration into other frameworks or want to learn more about topics like HLS streaming, feel free to reach out on X. Consider giving the GitHub repo a star if you enjoyed this article, and let us know about the cool things you build with our SDK.
