Video Compositing

The video compositing feature lets you embed custom visuals (scoreboards, logos, watermarks, dynamic data) directly into the video stream before it is published over WebRTC. Because the compositing happens before the track reaches the SFU, everything you draw is visible to all participants, captured in recordings, and present in HLS/RTMP livestream outputs.

This builds on top of the custom video filter pipeline. You intercept every frame from the camera, composite your overlay onto it, and return the result.

For a complete, runnable example that burns a live scoreboard overlay into a livestream, check out the Livestream Overlay sample app. It demonstrates all the concepts covered in this guide: native filters, method-channel state sync, mirroring, and caching.

Step 1: Implement the compositing filter natively

Extend BitmapVideoFilter. You receive a Bitmap for each captured frame that you can draw into with a standard Canvas. Wrap it in VideoFrameProcessorWithBitmapFilter and expose it via a VideoFrameProcessorFactoryInterface, just like in the video filters guide.

OverlayVideoFilterFactory.kt
import android.graphics.Bitmap
import android.graphics.Canvas
import io.getstream.video.flutter.stream_video_filters.common.BitmapVideoFilter
import io.getstream.video.flutter.stream_video_filters.common.VideoFrameProcessorWithBitmapFilter
import io.getstream.webrtc.flutter.videoEffects.VideoFrameProcessor
import io.getstream.webrtc.flutter.videoEffects.VideoFrameProcessorFactoryInterface

class OverlayVideoFilterFactory : VideoFrameProcessorFactoryInterface {
    override fun build(): VideoFrameProcessor {
        return VideoFrameProcessorWithBitmapFilter { OverlayVideoFilter() }
    }
}

private class OverlayVideoFilter : BitmapVideoFilter() {
    override fun applyFilter(videoFrameBitmap: Bitmap) {
        val state = OverlayState.snapshot()

        val canvas = Canvas(videoFrameBitmap)
        val w = videoFrameBitmap.width.toFloat()
        val h = videoFrameBitmap.height.toFloat()

        // TODO: Draw your overlay here.
        // Use `w` and `h` to scale with the frame resolution.
    }
}
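
To make the TODO concrete, here is a minimal sketch of a draw routine that burns a semi-transparent banner with a title and a numeric value into the frame. The `title` and `value` fields are hypothetical, matching the examples suggested in the OverlayState TODOs in Step 2; sizes are derived from the smaller frame dimension so the overlay scales with resolution.

```kotlin
import android.graphics.Bitmap
import android.graphics.Canvas
import android.graphics.Color
import android.graphics.Paint

// Hypothetical helper: draws a banner in the top-left corner of the frame.
fun drawScoreBanner(videoFrameBitmap: Bitmap, title: String, value: Int) {
    val canvas = Canvas(videoFrameBitmap)
    val w = videoFrameBitmap.width.toFloat()
    val h = videoFrameBitmap.height.toFloat()

    // One "unit" is 1% of the smaller frame dimension, so the banner
    // looks the same at 720p and 1080p.
    val unit = minOf(w, h) / 100f

    val background = Paint().apply {
        color = Color.argb(160, 0, 0, 0) // semi-transparent black
    }
    val text = Paint().apply {
        color = Color.WHITE
        textSize = 6f * unit
        isAntiAlias = true
    }

    canvas.drawRect(2f * unit, 2f * unit, 50f * unit, 12f * unit, background)
    canvas.drawText("$title  $value", 4f * unit, 9.5f * unit, text)
}
```

You would call this from `applyFilter` with the fields read from your state snapshot.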

Step 2: Keep the overlay state in sync with Dart

Your overlay almost always depends on application state that lives on the Dart side, such as score values, player names, a timer, or the current speaker. Push state from Dart to a native store that the filter reads once per frame.

The recommended pattern is:

  • A thread-safe snapshot object on the native side (guarded by a mutex/lock).
  • A monotonic version bumped on every update, so iOS can invalidate its cached overlay image.
  • A method channel through which Dart pushes partial updates.

Native state store

OverlayState.kt
object OverlayState {
    data class Snapshot(
        // TODO: Declare the fields your overlay needs, e.g.
        // val title: String,
        // val value: Int,
        // val mirror: Boolean,
    )

    private val lock = Any()
    private var current = Snapshot(
        // TODO: Sensible defaults so the first frame already has something to draw.
    )

    fun snapshot(): Snapshot = synchronized(lock) { current }

    fun update(
        // TODO: Nullable parameters. `null` means "leave this field unchanged".
        // title: String? = null,
        // value: Int? = null,
        // mirror: Boolean? = null,
    ) {
        synchronized(lock) {
            current = current.copy(
                // title = title ?: current.title,
                // value = value ?: current.value,
                // mirror = mirror ?: current.mirror,
            )
        }
    }
}
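
Filled in for a hypothetical scoreboard (the field names and defaults are illustrative, not part of the SDK), and including the monotonic version from the pattern list above, the store might look like:

```kotlin
object ScoreOverlayState {
    data class Snapshot(
        val title: String = "Home vs Away",
        val value: Int = 0,
        val mirror: Boolean = false,
        // Monotonic version: bumped on every update so a renderer can
        // cheaply detect "state unchanged" and reuse a cached image.
        val version: Long = 0L,
    )

    private val lock = Any()
    private var current = Snapshot()

    fun snapshot(): Snapshot = synchronized(lock) { current }

    // Null parameters mean "leave this field unchanged".
    fun update(title: String? = null, value: Int? = null, mirror: Boolean? = null) {
        synchronized(lock) {
            current = current.copy(
                title = title ?: current.title,
                value = value ?: current.value,
                mirror = mirror ?: current.mirror,
                version = current.version + 1,
            )
        }
    }
}
```

Because `Snapshot` is an immutable data class, readers never see a half-applied update: the filter either gets the old snapshot or the new one.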

Method channel plumbing

Expose two methods on the native side: one to register the filter (as in the custom filter guide) and one to push overlay updates.

MainActivity.kt
import io.flutter.embedding.android.FlutterActivity
import io.flutter.embedding.engine.FlutterEngine
import io.flutter.plugin.common.MethodChannel
import io.getstream.webrtc.flutter.videoEffects.ProcessorProvider

class MainActivity : FlutterActivity() {
    private val CHANNEL = "sample.app.channel"

    override fun configureFlutterEngine(flutterEngine: FlutterEngine) {
        super.configureFlutterEngine(flutterEngine)

        MethodChannel(flutterEngine.dartExecutor.binaryMessenger, CHANNEL)
            .setMethodCallHandler { call, result ->
                when (call.method) {
                    "registerOverlayEffect" -> {
                        ProcessorProvider.addProcessor("overlay", OverlayVideoFilterFactory())
                        result.success(null)
                    }
                    "updateOverlayState" -> {
                        OverlayState.update(
                            // TODO: Read each field from `call.argument(...)` and
                            // forward to `OverlayState.update(...)`.
                        )
                        result.success(null)
                    }
                    else -> result.notImplemented()
                }
            }
    }
}
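
The `updateOverlayState` branch can unpack each field with `call.argument`, which returns null for absent keys — exactly the "leave unchanged" semantics the state store expects. A sketch, assuming the hypothetical `title`/`value`/`mirror` fields:

```kotlin
import io.flutter.plugin.common.MethodCall

// Hypothetical handler body for the "updateOverlayState" branch.
// Keys missing from the Dart-side map come back as null, and
// OverlayState.update treats null as "leave this field unchanged".
fun handleUpdateOverlayState(call: MethodCall) {
    OverlayState.update(
        title = call.argument<String>("title"),
        value = call.argument<Int>("value"),
        mirror = call.argument<Boolean>("mirror"),
    )
}
```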

Dart channel wrapper

Wrap the channel in a plain Dart class so the rest of your app doesn't need to know about MethodChannel:

overlay_channel.dart
import 'package:flutter/services.dart';

class OverlayChannel {
  static const _platform = MethodChannel('sample.app.channel');

  /// Registers the native overlay filter with WebRTC's processor provider.
  Future<void> registerOverlayEffect() async {
    await _platform.invokeMethod('registerOverlayEffect');
  }

  /// Pushes a partial update. Omitted fields keep their current native value.
  Future<void> updateOverlayState({
    // TODO: Add parameters matching the fields in your native `OverlayState`.
  }) async {
    final args = <String, dynamic>{};
    // TODO: Populate `args` with non-null fields only.
    await _platform.invokeMethod('updateOverlayState', args);
  }
}

Step 3: Apply the filter from Dart

Apply the filter through StreamVideoEffectsManager.applyCustomEffect, using the same name you registered natively ('overlay' in the snippets above). Push the current Dart-side state inside the register callback so the very first frame after registration already reflects your app's values.

import 'package:stream_video_filters/video_effects_manager.dart';

final effectsManager = StreamVideoEffectsManager(call);
final overlayChannel = OverlayChannel();

// Enable the overlay.
await effectsManager.applyCustomEffect(
  'overlay',
  registerEffectProcessorCallback: () async {
    await overlayChannel.registerOverlayEffect();

    // Seed native with the current Dart-side state.
    await overlayChannel.updateOverlayState(
      // title: myTitle,
      // value: myValue,
    );
  },
);

// Push a live update. The next captured frame reflects the new state.
await overlayChannel.updateOverlayState(/* ... */);

// Disable the overlay.
await effectsManager.disableAllFilters();

Mirroring and the local selfie preview

When the local participant is on the front camera, StreamVideoRenderer applies a horizontal flip at render time so the selfie preview looks correct. That flip only affects the local view. The pixels on the wire (and therefore remote participants and HLS/RTMP egress) are never mirrored.

Because compositing bakes pixels into the frame before that local flip, an overlay that reads correctly on remote views will read backwards on the local preview (and vice versa). You have two options:

  1. Disable the selfie mirror when joining the call:

    final options = CallConnectOptions(
      camera: TrackOption.enabled(
        constraints: const CameraConstraints(mirrorMode: MirrorMode.off),
      ),
    );
    await call.join(connectOptions: options);
  2. Keep the selfie mirror and pre-flip the overlay horizontally inside the filter when mirroring is active. Expose a mirror field in your overlay state and toggle it from Dart based on the active camera. Note that pre-flipping makes the overlay read correctly on your local preview, but it will appear mirrored for every remote viewer, in recordings, and in HLS/RTMP egress. In practice, option 1 is almost always the better choice for overlays that contain text or directional content.
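
If you go with option 2, the pre-flip itself is a single canvas transform applied before any overlay drawing. A sketch, assuming your overlay state carries a `mirror` flag:

```kotlin
import android.graphics.Bitmap
import android.graphics.Canvas

// Mirror subsequent draw calls horizontally around the frame's vertical
// center line. Call this before drawing the overlay when the selfie mirror
// is active; the render-time flip in StreamVideoRenderer then flips it
// back, so the local preview reads correctly.
fun applyMirrorIfNeeded(canvas: Canvas, frame: Bitmap, mirror: Boolean) {
    if (mirror) {
        canvas.scale(-1f, 1f, frame.width / 2f, 0f)
    }
}
```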

Performance tips

  • Read state once per frame. Acquire the lock, copy the snapshot, release. Avoid holding the lock while drawing.
  • Scale with frame size. Derive overlay sizes from min(width, height) so the overlay looks consistent across device resolutions and camera qualities.
  • Cache what you can on iOS. Text rendering and UIBezierPath stroking are expensive. Caching the rendered overlay keyed by (width, height, version) means you only re-render when the frame size or the overlay state actually changes.
  • Prefer BitmapVideoFilter for simple overlays on Android. It's less performant than a pure YUV filter due to ARGB↔YUV conversions, but for a handful of shapes and text per frame the difference is negligible and the Canvas API is much easier to work with.
  • Respect the frame orientation. On iOS, orient the upright overlay with oriented(input.originalImageOrientation) before compositing. Otherwise your overlay will be rotated or mirrored relative to the underlying frame depending on the device orientation.
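
The caching idea carries over to Android as well. A sketch (not part of the SDK), assuming your state store exposes the monotonic version described in Step 2: the overlay is pre-rendered into its own bitmap and only re-rendered when the frame size or state version changes; every other frame just blits the cached bitmap.

```kotlin
import android.graphics.Bitmap
import android.graphics.Canvas

// Hypothetical cache: `render` draws the overlay for a given frame size.
// The cached bitmap is rebuilt only when (width, height, version) changes.
class CachedOverlay(private val render: (Canvas, Int, Int) -> Unit) {
    private var cached: Bitmap? = null
    private var cachedKey: Triple<Int, Int, Long>? = null

    fun drawOnto(frame: Bitmap, version: Long) {
        val key = Triple(frame.width, frame.height, version)
        if (key != cachedKey) {
            cached?.recycle()
            cached = Bitmap.createBitmap(
                frame.width, frame.height, Bitmap.Config.ARGB_8888
            ).also { render(Canvas(it), frame.width, frame.height) }
            cachedKey = key
        }
        // Cheap path: composite the pre-rendered overlay onto the frame.
        Canvas(frame).drawBitmap(cached!!, 0f, 0f, null)
    }
}
```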