# Video Compositing

The video compositing feature lets you embed custom visuals (scoreboards, logos, watermarks, dynamic data) directly into the video stream before it is published over WebRTC. Because the compositing happens before the track reaches the SFU, everything you draw is visible to all participants, captured in recordings, and present in HLS/RTMP livestream outputs.

This builds on top of the [custom video filter](/video/docs/flutter/advanced/apply-video-filters/#advanced-adding-custom-video-filters) pipeline. You intercept every frame from the camera, composite your overlay onto it, and return the result.

<admonition type="tip">
For a complete, runnable example that burns a live scoreboard overlay into a livestream, check out the [Livestream Overlay sample app](https://github.com/GetStream/flutter-video-samples/tree/main/packages/video_livestream_overlay). It demonstrates all the concepts covered in this guide: native filters, method-channel state sync, mirroring, and caching.
</admonition>

## Step 1: Implement the compositing filter natively

<tabs groupId="current-os" queryString>

<tabs-item value="android" label="Android">

Extend `BitmapVideoFilter`. You receive a `Bitmap` for each captured frame that you can draw into with a standard `Canvas`. Wrap it in `VideoFrameProcessorWithBitmapFilter` and expose it via a `VideoFrameProcessorFactoryInterface`, just like in the [video filters guide](/video/docs/flutter/advanced/apply-video-filters/#advanced-adding-custom-video-filters).

```kotlin title="OverlayVideoFilterFactory.kt"
import android.graphics.Bitmap
import android.graphics.Canvas
import io.getstream.video.flutter.stream_video_filters.common.BitmapVideoFilter
import io.getstream.video.flutter.stream_video_filters.common.VideoFrameProcessorWithBitmapFilter
import io.getstream.webrtc.flutter.videoEffects.VideoFrameProcessor
import io.getstream.webrtc.flutter.videoEffects.VideoFrameProcessorFactoryInterface

class OverlayVideoFilterFactory : VideoFrameProcessorFactoryInterface {
    override fun build(): VideoFrameProcessor {
        return VideoFrameProcessorWithBitmapFilter { OverlayVideoFilter() }
    }
}

private class OverlayVideoFilter : BitmapVideoFilter() {
    override fun applyFilter(videoFrameBitmap: Bitmap) {
        val state = OverlayState.snapshot()

        val canvas = Canvas(videoFrameBitmap)
        val w = videoFrameBitmap.width.toFloat()
        val h = videoFrameBitmap.height.toFloat()

        // TODO: Draw your overlay here, reading values from `state`.
        // Use `w` and `h` to scale with the frame resolution.
    }
}
```

</tabs-item>

<tabs-item value="ios" label="iOS">

Extend `VideoFilter` from `stream_video_filters`. You receive each frame as a `CIImage` in `input.originalImage` along with its `originalImageOrientation`. Draw your overlay with `UIGraphicsImageRenderer`, turn it into a `CIImage`, and composite it over the original.

Cache the rendered overlay keyed on `(width, height, stateVersion)` to avoid re-rendering when the overlay state hasn't changed.

```swift title="OverlayVideoFrameProcessor.swift"
import Foundation
import stream_video_filters
import UIKit

final class OverlayVideoFrameProcessor: VideoFilter {
    @available(*, unavailable)
    override public init(filter: @escaping (Input) -> CIImage) { fatalError() }

    private var cached: (key: String, image: CIImage)?
    private let cacheQueue = DispatchQueue(label: "overlay.cache")

    init() {
        super.init(filter: { input in input.originalImage })
        self.filter = { [weak self] input in
            guard let self = self else { return input.originalImage }

            let orientation = input.originalImageOrientation
            let rawExtent = input.originalImage.extent

            let isSideways =
                orientation == .left || orientation == .right
                || orientation == .leftMirrored || orientation == .rightMirrored
            let displayWidth = isSideways ? rawExtent.height : rawExtent.width
            let displayHeight = isSideways ? rawExtent.width : rawExtent.height

            let state = OverlayState.shared.snapshot()
            let uprightOverlay = self.overlay(
                displayWidth: displayWidth, displayHeight: displayHeight, state: state
            )

            return uprightOverlay
                .oriented(orientation)
                .composited(over: input.originalImage)
                .cropped(to: input.originalImage.extent)
        }
    }

    private func overlay(displayWidth: CGFloat, displayHeight: CGFloat, state: OverlayState.Snapshot) -> CIImage {
        let key = "\(Int(displayWidth))x\(Int(displayHeight))-\(state.version)"
        return cacheQueue.sync {
            if let cached = cached, cached.key == key { return cached.image }

            let uiImage = Self.render(width: displayWidth, height: displayHeight, state: state)
            let ciImage = CIImage(image: uiImage) ?? CIImage.empty()
            cached = (key: key, image: ciImage)
            return ciImage
        }
    }

    private static func render(width: CGFloat, height: CGFloat, state: OverlayState.Snapshot) -> UIImage {
        let format = UIGraphicsImageRendererFormat()
        format.scale = 1
        format.opaque = false
        let renderer = UIGraphicsImageRenderer(size: CGSize(width: width, height: height), format: format)

        return renderer.image { ctx in
            // TODO: Draw your overlay here.
            // Use `width` and `height` to scale with the frame resolution.
        }
    }
}
```

</tabs-item>

</tabs>

## Step 2: Keep the overlay state in sync with Dart

Your overlay almost always depends on application state that lives on the Dart side, such as score values, player names, a timer, or the current speaker. Push state from Dart to a native store that the filter reads once per frame.

The recommended pattern is:

- A **thread-safe snapshot** object on native (backed by a mutex / lock).
- A **monotonic version** bumped on every update, so iOS can invalidate its cached overlay image.
- A **method channel** through which Dart pushes partial updates.

### Native state store

<tabs groupId="current-os" queryString>

<tabs-item value="android" label="Android">

```kotlin title="OverlayState.kt"
object OverlayState {
    data class Snapshot(
        // TODO: Declare the fields your overlay needs, e.g.
        // val title: String,
        // val value: Int,
        // val mirror: Boolean,
    )

    private val lock = Any()
    private var current = Snapshot(
        // TODO: Sensible defaults so the first frame already has something to draw.
    )

    fun snapshot(): Snapshot = synchronized(lock) { current }

    fun update(
        // TODO: Nullable parameters. `null` means "leave this field unchanged".
        // title: String? = null,
        // value: Int? = null,
        // mirror: Boolean? = null,
    ) {
        synchronized(lock) {
            current = current.copy(
                // title = title ?: current.title,
                // value = value ?: current.value,
                // mirror = mirror ?: current.mirror,
            )
        }
    }
}
```

</tabs-item>

<tabs-item value="ios" label="iOS">

```swift title="OverlayState.swift"
final class OverlayState {
    static let shared = OverlayState()

    struct Snapshot {
        // TODO: Declare the fields your overlay needs, e.g.
        // var title: String
        // var value: Int
        // var mirror: Bool
        var version: Int
    }

    private let lock = NSLock()
    private var current = Snapshot(
        // TODO: Sensible defaults so the first frame already has something to draw.
        version: 0
    )

    func snapshot() -> Snapshot {
        lock.lock(); defer { lock.unlock() }
        return current
    }

    func update(_ args: [String: Any]) {
        lock.lock(); defer { lock.unlock() }
        // TODO: Apply only the keys present in `args`:
        // if let v = args["title"] as? String { current.title = v }
        // if let v = args["value"] as? Int { current.value = v }
        // if let v = args["mirror"] as? Bool { current.mirror = v }
        current.version &+= 1
    }
}
```

The monotonic `version` is used as part of the cache key in `OverlayVideoFrameProcessor.overlay(...)`, so the next frame after `update(...)` invalidates the cached overlay image and re-renders automatically.

</tabs-item>

</tabs>

### Method channel plumbing

Expose two methods on the native side: one to register the filter (as in the [custom filter guide](/video/docs/flutter/advanced/apply-video-filters/#step-2---register-this-filter-in-your-native-module)) and one to push overlay updates.

<tabs groupId="current-os" queryString>

<tabs-item value="android" label="Android">

```kotlin title="MainActivity.kt"
import io.flutter.embedding.android.FlutterActivity
import io.flutter.embedding.engine.FlutterEngine
import io.flutter.plugin.common.MethodChannel
import io.getstream.webrtc.flutter.videoEffects.ProcessorProvider

class MainActivity : FlutterActivity() {
    private val CHANNEL = "sample.app.channel"

    override fun configureFlutterEngine(flutterEngine: FlutterEngine) {
        super.configureFlutterEngine(flutterEngine)

        MethodChannel(flutterEngine.dartExecutor.binaryMessenger, CHANNEL)
            .setMethodCallHandler { call, result ->
                when (call.method) {
                    "registerOverlayEffect" -> {
                        ProcessorProvider.addProcessor("overlay", OverlayVideoFilterFactory())
                        result.success(null)
                    }
                    "updateOverlayState" -> {
                        OverlayState.update(
                            // TODO: Read each field from `call.argument(...)` and
                            // forward to `OverlayState.update(...)`.
                        )
                        result.success(null)
                    }
                    else -> result.notImplemented()
                }
            }
    }
}
```

</tabs-item>

<tabs-item value="ios" label="iOS">

```swift title="AppDelegate.swift"
import Flutter
import UIKit
import stream_video_filters
import stream_webrtc_flutter

@main
@objc class AppDelegate: FlutterAppDelegate {
    private let CHANNEL = "sample.app.channel"

    override func application(
        _ application: UIApplication,
        didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?
    ) -> Bool {
        GeneratedPluginRegistrant.register(with: self)

        let controller = window?.rootViewController as! FlutterViewController
        let channel = FlutterMethodChannel(name: CHANNEL, binaryMessenger: controller.binaryMessenger)
        channel.setMethodCallHandler { [weak self] (call, result) in
            self?.handleMethodCall(call: call, result: result)
        }

        return super.application(application, didFinishLaunchingWithOptions: launchOptions)
    }

    func handleMethodCall(call: FlutterMethodCall, result: @escaping FlutterResult) {
        switch call.method {
        case "registerOverlayEffect":
            ProcessorProvider.addProcessor(OverlayVideoFrameProcessor(), forName: "overlay")
            result(nil)
        case "updateOverlayState":
            if let args = call.arguments as? [String: Any] {
                OverlayState.shared.update(args)
            }
            result(nil)
        default:
            result(FlutterMethodNotImplemented)
        }
    }
}
```

<admonition type="warning">

Remember to `import stream_webrtc_flutter`. `ProcessorProvider` is declared there, not in `stream_video_filters`.

</admonition>

</tabs-item>

</tabs>

### Dart channel wrapper

Wrap the channel in a plain Dart class so the rest of your app doesn't need to know about `MethodChannel`:

```dart title="overlay_channel.dart"
import 'package:flutter/services.dart';

class OverlayChannel {
  static const _platform = MethodChannel('sample.app.channel');

  /// Registers the native overlay filter with WebRTC's processor provider.
  Future<void> registerOverlayEffect() async {
    await _platform.invokeMethod('registerOverlayEffect');
  }

  /// Pushes a partial update. Omitted fields keep their current native value.
  Future<void> updateOverlayState({
    // TODO: Add parameters matching the fields in your native `OverlayState`.
  }) async {
    final args = <String, dynamic>{};
    // TODO: Populate `args` with non-null fields only.
    await _platform.invokeMethod('updateOverlayState', args);
  }
}
```

## Step 3: Apply the filter from Dart

Apply the filter through `StreamVideoEffectsManager.applyCustomEffect`, using the same name you registered natively (`overlay` in the snippets above). Push the current Dart-side state **inside** the register callback so the very first frame after registration already reflects your app's values.

```dart
import 'package:stream_video_filters/video_effects_manager.dart';

final effectsManager = StreamVideoEffectsManager(call);
final overlayChannel = OverlayChannel();

// Enable the overlay.
await effectsManager.applyCustomEffect(
  'overlay',
  registerEffectProcessorCallback: () async {
    await overlayChannel.registerOverlayEffect();

    // Seed native with the current Dart-side state.
    await overlayChannel.updateOverlayState(
      // title: myTitle,
      // value: myValue,
    );
  },
);

// Push a live update. The next captured frame reflects the new state.
await overlayChannel.updateOverlayState(/* ... */);

// Disable the overlay.
await effectsManager.disableAllFilters();
```

## Mirroring and the local selfie preview

When the local participant is on the front camera, `StreamVideoRenderer` applies a horizontal flip at render time so the selfie preview looks correct. That flip only affects the **local view**. The pixels on the wire (and therefore remote participants and HLS/RTMP egress) are never mirrored.

Because compositing bakes pixels into the frame _before_ that local flip, an overlay that reads correctly on remote views will read **backwards on the local preview** (and vice versa). You have two options:

1. **Disable the selfie mirror** when joining the call:

   ```dart
   final options = CallConnectOptions(
     camera: TrackOption.enabled(
       constraints: const CameraConstraints(mirrorMode: MirrorMode.off),
     ),
   );
   await call.join(connectOptions: options);
   ```

2. **Keep the selfie mirror** and pre-flip the overlay horizontally inside the filter when mirroring is active. Expose a `mirror` field in your overlay state and toggle it from Dart based on the active camera. Note that pre-flipping makes the overlay read correctly on _your_ local preview, but it will appear **mirrored for every remote viewer**, in recordings, and in HLS/RTMP egress. In practice, **option 1 is almost always the better choice** for overlays that contain text or directional content.
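
If you do go with option 2 on Android, the pre-flip amounts to mirroring the canvas around the frame's vertical center line before drawing. A minimal sketch, assuming your `OverlayState.Snapshot` exposes a `mirror: Boolean` field (that field and this class are illustrative, not part of the SDK):

```kotlin
import android.graphics.Bitmap
import android.graphics.Canvas

// Sketch only: assumes `OverlayState.Snapshot` has a `mirror` flag that
// Dart toggles when the front camera is active.
private class MirrorAwareOverlayFilter : BitmapVideoFilter() {
    override fun applyFilter(videoFrameBitmap: Bitmap) {
        val state = OverlayState.snapshot()
        val canvas = Canvas(videoFrameBitmap)
        val w = videoFrameBitmap.width.toFloat()

        canvas.save()
        if (state.mirror) {
            // Flip horizontally around the vertical center line so the overlay
            // reads correctly after the renderer mirrors the local preview.
            canvas.scale(-1f, 1f, w / 2f, 0f)
        }
        // TODO: Draw the overlay as usual; the transform pre-compensates it.
        canvas.restore()
    }
}
```

Everything drawn between `save()` and `restore()` inherits the flip, so the drawing code itself stays unchanged.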

## Performance tips

- **Read state once per frame.** Acquire the lock, copy the snapshot, release. Avoid holding the lock while drawing.
- **Scale with frame size.** Derive overlay sizes from `min(width, height)` so the overlay looks consistent across device resolutions and camera qualities.
- **Cache what you can on iOS.** Text rendering and `UIBezierPath` stroking are expensive. The `(width, height, version)` cache shown above means you only re-render when the overlay state actually changes.
- **Prefer `BitmapVideoFilter` for simple overlays on Android.** It's less performant than a pure YUV filter due to ARGB↔YUV conversions, but for a handful of shapes and text per frame the difference is negligible and the Canvas API is much easier to work with.
- **Handle orientation on iOS.** Orient the upright overlay with `oriented(input.originalImageOrientation)` before compositing. Otherwise your overlay will be rotated or mirrored relative to the underlying frame, depending on the device orientation.
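
The frame-size scaling tip reduces to a single pure function. A hypothetical sketch (the 1080-pixel reference short side is an assumption you'd tune for your design, not a library constant):

```kotlin
import kotlin.math.min

// Derive a scale factor from the short side of the frame, normalized against
// a reference short side, so overlay elements keep the same proportional size
// across device resolutions and camera qualities.
fun overlayScale(
    frameWidth: Int,
    frameHeight: Int,
    referenceShortSide: Float = 1080f,
): Float = min(frameWidth, frameHeight) / referenceShortSide
```

Multiply your design-time sizes (font sizes, margins, stroke widths) by `overlayScale(w, h)` when drawing; a badge designed at 1080p then renders at half size on a 960×540 frame and identically in portrait or landscape.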

