Broadcasting
Broadcasting serves as a means of transmitting live or pre-recorded content to a wide audience.
We can choose between two approaches to broadcasting the media:
- HLS (HTTP Live Streaming) - scales to very large audiences and allows buffering, at the cost of a slight delay
- WebRTC - the audience joins the call directly and watches in near real time
It is up to the integrators to decide which approach their apps will use for the audience to consume the streams.
We have built a livestream app tutorial that relies on the broadcasting feature. The demo shows how to implement both the HLS and the WebRTC approach to streaming.
Call type for broadcasting
Stream's infrastructure provides a few pre-built call types. Among them, the livestream type is best suited for broadcasting events. When a livestream call is created, it is set to backstage mode by default. The backstage mode makes it easy to build a flow where hosts can set up cameras and equipment before going live.
Starting and stopping the broadcast
We have the following Call methods at our disposal to start and stop the broadcast:
await call.startHLS();
await call.stopHLS();
Alternatively:
await call.goLive({ start_hls: true });
await call.stopLive({ continue_hls: false });
Once the broadcast has started, the playlist URL is available through the playlist_url property exposed on the Call state:
import { useCallStateHooks } from "@stream-io/video-react-sdk";

// omitted code ...

const YourComponent = () => {
  const { useCallEgress } = useCallStateHooks();
  const egress = useCallEgress();
  const m3u8Playlist = egress?.hls?.playlist_url;

  // omitted code ...
};
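Note that the call state hooks only work when rendered inside the SDK's providers. Below is a minimal sketch of how YourComponent could be mounted; the credentials and the call id are placeholders, and the client and call would normally be created in your own setup code.
import {
  StreamCall,
  StreamVideo,
  StreamVideoClient,
} from "@stream-io/video-react-sdk";

// assumption: placeholder credentials and call id
const client = new StreamVideoClient({
  apiKey: "REPLACE_WITH_API_KEY",
  user: { id: "viewer-user-id" },
  token: "REPLACE_WITH_USER_TOKEN",
});
const call = client.call("livestream", "my-livestream-id");

export const App = () => (
  <StreamVideo client={client}>
    <StreamCall call={call}>
      {/* YourComponent from the snippet above */}
      <YourComponent />
    </StreamCall>
  </StreamVideo>
);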
To play the video over HLS, a third-party library is required (for example, HLS.js).
Sample HLS.js integration
Below is a sample integration of HLS.js with Stream's HLS broadcasting feature.
The integration is broken down into the following steps:
- Install and initialize HLS.js
- Get the HLS m3u8 playlist URL from call.state
- Attach the HLS stream to a <video /> element and start playing
- Handle HLS.js events and errors
- Allow the user to select the quality level (e.g., 720p, 1080p)
import HLS, { type Level } from "hls.js";
import { type ChangeEvent, useEffect, useMemo, useState } from "react";
import { useCallStateHooks } from "@stream-io/video-react-sdk";
export const HLSPlayer = () => {
  const { useCallEgress } = useCallStateHooks();
  const egress = useCallEgress();
  // will point to an m3u8 playlist URL
  const playlistUrl = egress?.hls?.playlist_url;

  // start by creating a new HLS.js player instance
  const hls = useMemo(() => new HLS(), []);

  // get the video element where HLS.js will render the video
  const [videoRef, setVideoRef] = useState<HTMLVideoElement | null>(null);

  // store the available quality levels (720p, 1080p, etc.)
  // see how they are updated in the HLS.js event listeners below
  const [qualityLevels, setQualityLevels] = useState<Level[]>([]);

  // handle the user's stream quality change event
  // and instruct hls.js to load the selected level
  const onQualityChange = (e: ChangeEvent<HTMLSelectElement>) => {
    const selectedLevel = parseInt(e.target.value, 10);
    if (hls.currentLevel !== selectedLevel) {
      hls.loadLevel = selectedLevel;
    }
  };
  useEffect(() => {
    // HLS broadcasting is not available, do nothing...
    if (!videoRef || !playlistUrl) return;

    // listen to the hls.js error event and naively attempt to recover
    hls.on(HLS.Events.ERROR, (e, data) => {
      console.error("HLS error, attempting to recover", e, data);
      setTimeout(() => {
        hls.loadSource(playlistUrl);
      }, 1000);
    });

    // listen to the MANIFEST_PARSED event, this will tell you when
    // the available quality levels (720p, 1080p, etc.) have been loaded.
    hls.on(HLS.Events.MANIFEST_PARSED, (e, data) => {
      console.log("HLS manifest parsed", e, data);
      setQualityLevels(data.levels);
    });

    // listen to the LEVELS_UPDATED event, this will tell you when
    // the available quality levels (720p, 1080p, etc.) have changed.
    hls.on(HLS.Events.LEVELS_UPDATED, (e, data) => {
      console.log("HLS levels updated", e, data);
      setQualityLevels(data.levels);
    });

    // listen to the buffer end-of-stream event, this will tell you when
    // the stream has ended, e.g. the broadcaster has stopped streaming.
    // keep in mind, the viewer may continue to watch the buffered stream.
    // this event signals that there won't be any new data coming in.
    hls.on(HLS.Events.BUFFER_EOS, (e, data) => {
      console.log("HLS buffer eos", e, data);
    });

    // load the m3u8 playlist URL
    hls.loadSource(playlistUrl);
    // and attach it to the video element
    hls.attachMedia(videoRef);

    return () => {
      // clean up the listeners registered above and detach the player
      hls.removeAllListeners();
      hls.detachMedia();
    };
  }, [videoRef, playlistUrl, hls]);
  // render a simple UI (video player and quality selector)
  return (
    <>
      <video ref={setVideoRef} controls autoPlay />
      <select onChange={onQualityChange}>
        {qualityLevels.map((level, index) => (
          <option key={index} value={index}>
            {level.name || `${level.height}p`}
          </option>
        ))}
      </select>
    </>
  );
};
For more advanced integration, please refer to the HLS.js documentation or take a look at our livestreaming demo application (source code).
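Keep in mind that HLS.js is only needed in browsers that play HLS through Media Source Extensions; Safari can play HLS natively. A small sketch of such a fallback is shown below (the attachHlsStream helper is a hypothetical name, not part of either SDK):
import HLS from "hls.js";

export const attachHlsStream = (videoEl: HTMLVideoElement, playlistUrl: string) => {
  if (HLS.isSupported()) {
    // MSE-based playback via HLS.js (most browsers)
    const hls = new HLS();
    hls.loadSource(playlistUrl);
    hls.attachMedia(videoEl);
    return hls;
  }
  if (videoEl.canPlayType("application/vnd.apple.mpegurl")) {
    // native HLS playback (Safari and iOS browsers)
    videoEl.src = playlistUrl;
  }
  return undefined;
};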
Broadcasting via RTMP
Our systems provide first-class support for streaming from RTMP clients such as OBS. To connect OBS to a Stream call, please follow these steps:
RTMP URL and stream key
Our call instance exposes its RTMP address through call.state.ingress and the useCallIngress() call state hook.
The Stream Key, in our case, is a standard user token.
You can take this information and use it to configure OBS:
import { useCallStateHooks } from "@stream-io/video-react-sdk";
const { useCallIngress } = useCallStateHooks();
const ingress = useCallIngress();
const rtmpURL = ingress?.rtmp.address;
const streamKey = myUserAuthService.getUserToken(rtmpUserId);
console.log("RTMP url:", rtmpURL, "Stream key:", streamKey);
Configure OBS
- Go to Settings > Stream
- Select the custom service
- Server: enter the rtmpURL logged in the console
- Stream Key: enter the streamKey logged in the console
Press Start Streaming in OBS and the RTMP stream will now show up in your call just like a regular video participant.
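You can then render the RTMP-backed participant like any other one, for example with the SDK's ParticipantView component. The sketch below uses an illustrative component name and assumes it is rendered inside the same StreamVideo and StreamCall providers as the rest of your call UI:
import { ParticipantView, useCallStateHooks } from "@stream-io/video-react-sdk";

export const LivestreamParticipants = () => {
  const { useParticipants } = useCallStateHooks();
  const participants = useParticipants();
  return (
    <>
      {participants.map((participant) => (
        <ParticipantView key={participant.sessionId} participant={participant} />
      ))}
    </>
  );
};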