Custom Video Filters with React Native Community CLI

Learn how to create custom video filters in a React Native CLI app, using a grayscale filter as an example.

Step 1 - Add your custom filter natively on Android and iOS

On Android, create a video filter by implementing VideoFrameProcessorFactoryInterface from @stream-io/react-native-webrtc. For example, the following factory produces a filter that rotates every frame by 180 degrees:

import com.oney.WebRTCModule.videoEffects.VideoFrameProcessor
import com.oney.WebRTCModule.videoEffects.VideoFrameProcessorFactoryInterface
import org.webrtc.VideoFrame

class RotationFilterFactory : VideoFrameProcessorFactoryInterface {
    override fun build(): VideoFrameProcessor {
        return VideoFrameProcessor { frame, textureHelper ->
            VideoFrame(
                frame.buffer.toI420(),
                180, // apply rotation to the video frame
                frame.timestampNs
            )
        }
    }
}

For easier Bitmap processing, use VideoFrameProcessorWithBitmapFilter from @stream-io/video-filters-react-native. Extend BitmapVideoFilter to receive a Bitmap for each frame that you can manipulate directly.

Note that BitmapVideoFilter is less performant than processing frames directly, due to the YUV <-> ARGB conversion overhead incurred for every frame.

Example: grayscale video filter

Create a grayscale filter by extending BitmapVideoFilter:

import android.graphics.Bitmap
import android.graphics.Canvas
import android.graphics.ColorMatrix
import android.graphics.ColorMatrixColorFilter
import android.graphics.Paint
import com.oney.WebRTCModule.videoEffects.VideoFrameProcessor
import com.oney.WebRTCModule.videoEffects.VideoFrameProcessorFactoryInterface
import com.streamio.videofiltersreactnative.common.BitmapVideoFilter
import com.streamio.videofiltersreactnative.common.VideoFrameProcessorWithBitmapFilter

class GrayScaleVideoFilterFactory : VideoFrameProcessorFactoryInterface {
    override fun build(): VideoFrameProcessor {
        return VideoFrameProcessorWithBitmapFilter {
            GrayScaleFilter()
        }
    }
}

private class GrayScaleFilter : BitmapVideoFilter() {
    override fun applyFilter(videoFrameBitmap: Bitmap) {
        val canvas = Canvas(videoFrameBitmap)
        val paint = Paint().apply {
            val colorMatrix = ColorMatrix().apply {
                // map the saturation of the color to grayscale
                setSaturation(0f)
            }
            colorFilter = ColorMatrixColorFilter(colorMatrix)
        }
        canvas.drawBitmap(videoFrameBitmap, 0f, 0f, paint)
    }
}

Step 2 - Register this filter in your native module

Next, register the filter so that it can be referenced by name from JavaScript. Create an Android native module if your app does not already have one, then add a method that registers the filter factory with ProcessorProvider from @stream-io/react-native-webrtc:

import com.facebook.react.bridge.Promise
import com.facebook.react.bridge.ReactApplicationContext
import com.facebook.react.bridge.ReactContextBaseJavaModule
import com.facebook.react.bridge.ReactMethod
import com.oney.WebRTCModule.videoEffects.ProcessorProvider
// Example import path based on a typical project structure. Update this to match your project's package name and directory structure.
import com.example.myapp.videofilters.GrayScaleVideoFilterFactory

class VideoEffectsModule(reactContext: ReactApplicationContext) : ReactContextBaseJavaModule(reactContext) {
    override fun getName(): String = NAME

    @ReactMethod
    fun registerVideoFilters(promise: Promise) {
        ProcessorProvider.addProcessor("grayscale", GrayScaleVideoFilterFactory())
        promise.resolve(true)
    }

    companion object {
        private const val NAME = "VideoEffectsModule"
    }
}

When calling addProcessor, provide a filter name (e.g., grayscale) to use later in JavaScript.
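To see how that registered name is consumed, the per-track call that JavaScript makes later can be sketched in isolation. The FilterableTrack interface and applyEffectToTracks helper below are hypothetical, introduced only for illustration; the real track type comes from @stream-io/react-native-webrtc:

```typescript
// Hypothetical minimal shape of a video track, narrowed to the one method
// this sketch needs; the real type comes from @stream-io/react-native-webrtc.
interface FilterableTrack {
  _setVideoEffect(name: string): void;
}

// Apply a filter name (as registered with ProcessorProvider.addProcessor)
// to every video track of a stream; returns how many tracks were updated.
function applyEffectToTracks(
  tracks: FilterableTrack[],
  effectName: string,
): number {
  tracks.forEach((track) => track._setVideoEffect(effectName));
  return tracks.length;
}
```

The string passed as effectName must exactly match the name used in addProcessor, otherwise no processor is found and the frames pass through unfiltered.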

Step 3 - Apply the video filter in JavaScript

Call mediaStreamTrack._setVideoEffect(name) to apply a registered filter, and use disableAllFilters from useBackgroundFilters() to disable any active filter. The media stream can be read from the Call instance returned by the useCall hook. Example hook:

import {
  useBackgroundFilters,
  useCall,
} from "@stream-io/video-react-native-sdk";
import { useRef, useCallback, useState } from "react";
import { MediaStream } from "@stream-io/react-native-webrtc";
import { NativeModules } from "react-native";

type CustomFilters = "GrayScale" | "MyOtherCustomFilter";

export const useCustomVideoFilters = () => {
  const call = useCall();
  const isFiltersRegisteredRef = useRef(false);
  const { disableAllFilters } = useBackgroundFilters();
  const [currentCustomFilter, setCustomFilter] = useState<CustomFilters>();

  const applyGrayScaleFilter = useCallback(async () => {
    if (!isFiltersRegisteredRef.current) {
      // registering is needed only once per the app's lifetime
      await NativeModules.VideoEffectsModule?.registerVideoFilters();
      isFiltersRegisteredRef.current = true;
    }
    disableAllFilters(); // disable any other filter
    (call?.camera.state.mediaStream as MediaStream | undefined)
      ?.getVideoTracks()
      .forEach((track) => {
        track._setVideoEffect("grayscale"); // set the grayscale filter
      });
    setCustomFilter("GrayScale");
  }, [call, disableAllFilters]);

  const disableCustomFilter = useCallback(() => {
    disableAllFilters();
    setCustomFilter(undefined);
  }, [disableAllFilters]);

  return {
    currentCustomFilter,
    applyGrayScaleFilter,
    disableCustomFilter,
  };
};

Call applyGrayScaleFilter while in a call:

Preview of the grayscale video filter

The complete code is available in our React Native CLI sample app.