Closed Captions
The Stream API supports adding real-time closed captions (subtitles for participants) to your calls. This guide shows you how to implement this feature on the client side.
Prerequisites
Make sure that the closed caption feature is enabled in your app’s dashboard. The closed caption feature can be configured at the call type level, and the available options are:
available: the feature is available for your call and can be enabled.
disabled: the feature is not available for your call. In this case, it’s a good idea to hide any UI elements you have related to closed captions.
auto-on: the feature is available and is enabled automatically once the user is connected to the call.
It is also possible to override the call type’s default settings when creating a call:
await call.getOrCreate(
  transcription: const StreamTranscriptionSettings(
    transcriptionMode: TranscriptionSettingsMode.available,
    closedCaptionMode: ClosedCaptionSettingsMode.available,
  ),
);
You can check the current value like this:
print(call.state.value.settings.transcription.closedCaptionMode);
Enabling, disabling and tweaking closed captions
If you set closedCaptionMode to available, you need to explicitly enable closed captions before any captions are delivered:
await call.startClosedCaptions(); // enable closed captions
await call.stopClosedCaptions(); // disable closed captions
You can adjust how many captions are visible at once and how long each caption stays on screen by changing the Call Preferences:
streamVideo.makeCall(
  callType: StreamCallType.defaultType(),
  id: 'my-call-id',
  preferences: DefaultCallPreferences(
    closedCaptionsVisibleCaptions: 2, // maximum number of captions kept in the visible queue
    closedCaptionsVisibilityDurationMs: 2700, // how long (in milliseconds) a caption stays in the queue
  ),
);
Check if closed captions are enabled
final isCaptioningInProgress = call.state.value.isCaptioning;
Closed Captions UI
When enabled, the SDK provides access to closed captions through the Call.closedCaptions stream. This stream emits a list of StreamClosedCaption objects representing the captions to display, based on the settings specified in CallPreferences.
The following example demonstrates how to use the closedCaptions stream with a StreamBuilder widget to display closed captions in a UI:
StreamBuilder<List<StreamClosedCaption>>(
  stream: call.closedCaptions,
  builder: (context, snapshot) {
    if (snapshot.hasData) {
      final closedCaptions = snapshot.data!;
      if (!call.state.value.isCaptioning) {
        return const SizedBox.shrink();
      }
      return Container(
        color: Colors.black.withOpacity(0.5),
        padding: const EdgeInsets.all(8),
        child: Column(
          children: closedCaptions.map((caption) {
            return Row(
              children: [
                Text(
                  "${caption.user.name}: ",
                  style: const TextStyle(
                    color: Colors.white,
                    fontWeight: FontWeight.bold,
                    fontSize: 16,
                  ),
                ),
                Expanded(
                  child: Text(
                    caption.text.trim(),
                    maxLines: 3,
                    style: const TextStyle(
                      color: Colors.white,
                      fontSize: 16,
                    ),
                  ),
                ),
              ],
            );
          }).toList(),
        ),
      );
    }
    return const SizedBox();
  },
)
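In a real call screen, you would typically layer a widget like this on top of your call UI, for example anchored to the bottom of the screen. Here is a minimal sketch, assuming buildCallContent() and buildCaptionsOverlay() are your own helpers (the overlay being the StreamBuilder above); neither is an SDK API:
Widget buildCallScreen(BuildContext context) {
  return Stack(
    children: [
      // The regular call UI (participants grid, controls, ...).
      buildCallContent(),
      // The captions overlay from the example above, anchored to the bottom.
      Align(
        alignment: Alignment.bottomCenter,
        child: Padding(
          padding: const EdgeInsets.only(bottom: 80),
          child: buildCaptionsOverlay(),
        ),
      ),
    ],
  );
}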
Toggling closed captions
Only users with permission to start or stop closed captions can toggle captioning on and off. These permissions can be configured in the Stream Dashboard, under the Permissions section.
To give your users the ability to turn captioning on and off, you can use the ToggleClosedCaptionsOption widget from our SDK, or create your own widget that calls the Call.startClosedCaptions() and Call.stopClosedCaptions() methods, as sketched below.
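If you go the custom route, a minimal sketch could look like the widget below. Only startClosedCaptions(), stopClosedCaptions() and state.value.isCaptioning come from the SDK; the widget itself is hypothetical, and in a real app you would also rebuild it when the call state changes so the icon stays in sync.
import 'package:flutter/material.dart';
import 'package:stream_video_flutter/stream_video_flutter.dart';

class ClosedCaptionsToggle extends StatelessWidget {
  const ClosedCaptionsToggle({super.key, required this.call});

  final Call call;

  @override
  Widget build(BuildContext context) {
    // Read the current captioning state at build time.
    final isCaptioning = call.state.value.isCaptioning;

    return IconButton(
      icon: Icon(
        isCaptioning ? Icons.closed_caption : Icons.closed_caption_off,
      ),
      onPressed: () {
        if (isCaptioning) {
          call.stopClosedCaptions();
        } else {
          call.startClosedCaptions();
        }
      },
    );
  }
}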
Advanced usage
If the SDK’s default behavior does not meet your requirements, you can directly subscribe to closed caption events and implement custom logic.
Here’s how you can subscribe to the closed caption events:
call.callEvents.on<StreamCallClosedCaptionsEvent>((event) {
  print('Closed caption event: $event');
});
This gives you direct access to the events the backend emits for each generated caption. Using these events, you can build your own logic to decide which captions to display and for how long.
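For example, a minimal sketch of such custom logic could keep its own queue of recent caption events and drop each one after a fixed display window; the four-second window and the rebuild mechanism are assumptions, not SDK behavior, and how you read the caption text out of the event depends on the event class.
// Keep the most recent caption events in a simple queue.
final visibleEvents = <StreamCallClosedCaptionsEvent>[];

call.callEvents.on<StreamCallClosedCaptionsEvent>((event) {
  visibleEvents.add(event);

  // Drop the event again after a custom visibility window (assumed here).
  Future.delayed(const Duration(seconds: 4), () {
    visibleEvents.remove(event);
    // Trigger a rebuild of your captions UI here
    // (setState, a ValueNotifier, a Bloc, etc.).
  });
});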