Custom Video Filters with React Native Community CLI
In this guide, you'll learn how to create and use your own video filter in a React Native Community CLI based app, using a grayscale filter as an example.
Step 1 - Add your custom filter natively in Android and iOS
To create a new video filter, you need to implement the VideoFrameProcessorFactoryInterface from @stream-io/react-native-webrtc. A simple example that applies a rotation to the video frame would look like the following:

import com.oney.WebRTCModule.videoEffects.VideoFrameProcessor
import com.oney.WebRTCModule.videoEffects.VideoFrameProcessorFactoryInterface
import org.webrtc.VideoFrame

class RotationFilterFactory : VideoFrameProcessorFactoryInterface {
    override fun build(): VideoFrameProcessor {
        return VideoFrameProcessor { frame, textureHelper ->
            VideoFrame(
                frame.buffer.toI420(),
                180, // apply a 180-degree rotation to the video frame
                frame.timestampNs
            )
        }
    }
}
To make it easier to process video frames as Bitmap objects, we export a VideoFrameProcessorWithBitmapFilter class from the @stream-io/video-filters-react-native library. The built-in background filters of the library are implemented using this class. To implement a video filter with Bitmap, create a class that extends the BitmapVideoFilter abstract class. This abstract class gives you a Bitmap for each video frame, which you can manipulate directly. By returning a new VideoFrameProcessorWithBitmapFilter instance with that filter, we can implement a bitmap-processing filter.
Note that BitmapVideoFilter is less performant than a normal video filter that does not use bitmaps. This is due to the overhead of certain operations, like YUV <-> ARGB conversions.
Example: grayscale video filter
We can create and set a simple video filter that turns the video frame to grayscale by creating a filter that extends the BitmapVideoFilter abstract class, like this:
import android.graphics.Bitmap
import android.graphics.Canvas
import android.graphics.ColorMatrix
import android.graphics.ColorMatrixColorFilter
import android.graphics.Paint
import com.oney.WebRTCModule.videoEffects.VideoFrameProcessor
import com.oney.WebRTCModule.videoEffects.VideoFrameProcessorFactoryInterface
import com.streamio.videofiltersreactnative.common.BitmapVideoFilter
import com.streamio.videofiltersreactnative.common.VideoFrameProcessorWithBitmapFilter

class GrayScaleVideoFilterFactory : VideoFrameProcessorFactoryInterface {
    override fun build(): VideoFrameProcessor {
        return VideoFrameProcessorWithBitmapFilter {
            GrayScaleFilter()
        }
    }
}

private class GrayScaleFilter : BitmapVideoFilter() {
    override fun applyFilter(videoFrameBitmap: Bitmap) {
        val canvas = Canvas(videoFrameBitmap)
        val paint = Paint().apply {
            val colorMatrix = ColorMatrix().apply {
                // map the saturation of the color to grayscale
                setSaturation(0f)
            }
            colorFilter = ColorMatrixColorFilter(colorMatrix)
        }
        canvas.drawBitmap(videoFrameBitmap, 0f, 0f, paint)
    }
}
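To build other color effects, only the Paint configuration needs to change. As an illustrative sketch that is not part of the library, a hypothetical SepiaFilter could supply an explicit 4x5 color matrix instead of calling setSaturation:

import android.graphics.Bitmap
import android.graphics.Canvas
import android.graphics.ColorMatrix
import android.graphics.ColorMatrixColorFilter
import android.graphics.Paint
import com.streamio.videofiltersreactnative.common.BitmapVideoFilter

private class SepiaFilter : BitmapVideoFilter() {
    override fun applyFilter(videoFrameBitmap: Bitmap) {
        val canvas = Canvas(videoFrameBitmap)
        val paint = Paint().apply {
            // a common sepia transform expressed as a 4x5 RGBA color matrix
            val colorMatrix = ColorMatrix(
                floatArrayOf(
                    0.393f, 0.769f, 0.189f, 0f, 0f,
                    0.349f, 0.686f, 0.168f, 0f, 0f,
                    0.272f, 0.534f, 0.131f, 0f, 0f,
                    0f, 0f, 0f, 1f, 0f
                )
            )
            colorFilter = ColorMatrixColorFilter(colorMatrix)
        }
        // redraw the frame onto itself through the sepia paint
        canvas.drawBitmap(videoFrameBitmap, 0f, 0f, paint)
    }
}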
On iOS, to add a new video filter, you need to specify an object that conforms to the VideoFrameProcessorDelegate protocol from the @stream-io/video-filters-react-native library and inherits from the NSObject class.
To make it easier to process video frames as CIImage, copy this VideoFilters.swift file into your app. How does this class work? If you implement the filter using the VideoFilter class, you will receive each frame of the user's local video as a CIImage, allowing you to apply filters to it. The VideoFilter class allows you to easily create your own filters: it contains the function that converts the original CIImage to an output CIImage, so you have complete freedom over the processing pipeline. If you instead need to access the raw video frame, you can look into the implementation of the VideoFilter class and adapt it to your own filter.
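For orientation only, here is a minimal sketch of the rough shape of the VideoFilter class, inferred from how it is used in the grayscale example below; the VideoFilters.swift file you copied is the source of truth and contains the full implementation:

import CoreImage
import Foundation

// Sketch only: names are inferred from the usage in the example below.
class VideoFilter: NSObject {
    // The data handed to the filter closure for each frame;
    // originalImage is the current local video frame as a CIImage.
    struct Input {
        let originalImage: CIImage
    }

    // Converts the original CIImage of each frame to an output CIImage.
    let filter: (Input) -> CIImage

    init(filter: @escaping (Input) -> CIImage) {
        self.filter = filter
    }
}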
Additionally, you will need to import the necessary headers in the bridging header file. This header file exposes the Objective-C files to Swift. If this is the first Swift file that you add to your app, Xcode will automatically offer to create a bridging header file. Your bridging header file should minimally have the two following header imports:

#import <React/RCTBridgeModule.h>
#import "ProcessorProvider.h"
Example: grayscale video filter
We can create and set a simple video filter that turns the video frame to grayscale by creating a filter class that extends the VideoFilter class, like this:
import CoreImage
import Foundation

final class GrayScaleVideoFrameProcessor: VideoFilter {
    @available(*, unavailable)
    override public init(
        filter: @escaping (Input) -> CIImage
    ) { fatalError() }

    init() {
        super.init(
            filter: { input in
                // apply the built-in monochrome photo effect to the frame
                let filter = CIFilter(name: "CIPhotoEffectMono")
                filter?.setValue(input.originalImage, forKey: kCIInputImageKey)
                let outputImage: CIImage = filter?.outputImage ?? input.originalImage
                return outputImage
            }
        )
    }
}
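Any built-in Core Image filter can be swapped in the same way; for example, using "CIPhotoEffectNoir" or "CISepiaTone" as the CIFilter name produces a different effect, as long as the closure still returns a CIImage.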
Step 2 - Register this filter in your native module
Now you have to add a method in your app to register this video filter with the @stream-io/video-filters-react-native library.
Follow the official React Native documentation to create a new Android native module if there is no native module in your app already. In that native module, add a method that adds the filter to the ProcessorProvider from the @stream-io/video-filters-react-native library. For example:
import com.facebook.react.bridge.Promise
import com.facebook.react.bridge.ReactApplicationContext
import com.facebook.react.bridge.ReactContextBaseJavaModule
import com.facebook.react.bridge.ReactMethod
import com.oney.WebRTCModule.videoEffects.ProcessorProvider
// Example import path based on a typical project structure. Update this to match your project's package name and directory structure.
import com.example.myapp.videofilters.GrayScaleVideoFilterFactory

class VideoEffectsModule(reactContext: ReactApplicationContext) : ReactContextBaseJavaModule(reactContext) {
    override fun getName(): String {
        return NAME
    }

    @ReactMethod
    fun registerVideoFilters(promise: Promise) {
        ProcessorProvider.addProcessor("grayscale", GrayScaleVideoFilterFactory())
        promise.resolve(true)
    }

    companion object {
        private const val NAME = "VideoEffectsModule"
    }
}
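If this native module is new to your app, it also has to be exposed through a ReactPackage that you add to the getPackages() list in your MainApplication. A minimal sketch, with the hypothetical name VideoEffectsPackage:

import com.facebook.react.ReactPackage
import com.facebook.react.bridge.NativeModule
import com.facebook.react.bridge.ReactApplicationContext
import com.facebook.react.uimanager.ViewManager

class VideoEffectsPackage : ReactPackage {
    // expose VideoEffectsModule to the JavaScript side
    override fun createNativeModules(reactContext: ReactApplicationContext): List<NativeModule> =
        listOf(VideoEffectsModule(reactContext))

    // this package ships no custom views
    override fun createViewManagers(reactContext: ReactApplicationContext): List<ViewManager<*, *>> =
        emptyList()
}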
In this step, we add a method to our iOS native module in Swift. If there is no native module in your app already, add a new one. In that native module, add a method that adds the filter to the ProcessorProvider from the @stream-io/video-filters-react-native library. For example:
@objc(VideoEffectsModule)
class VideoEffectsModule: NSObject {
    @objc(registerVideoFilters:withRejecter:)
    func registerVideoFilters(resolve: RCTPromiseResolveBlock, reject: RCTPromiseRejectBlock) {
        ProcessorProvider.addProcessor(GrayScaleVideoFrameProcessor(), forName: "grayscale")
        resolve(true)
    }
}
It is important to use the @objc modifiers to ensure the class and functions are exported properly to the Objective-C runtime.
Then create a private implementation file that will register the required information with React Native:
#import <React/RCTBridgeModule.h>
@interface RCT_EXTERN_MODULE(VideoEffectsModule, NSObject)
RCT_EXTERN_METHOD(registerVideoFilters:(RCTPromiseResolveBlock)resolve
withRejecter:(RCTPromiseRejectBlock)reject)
+ (BOOL)requiresMainQueueSetup
{
return NO;
}
@end
NOTE
When calling the addProcessor method, we need to provide a name for the filter that we are registering. In the above example, it is grayscale. This name is later used to refer to the filter from JavaScript.
Step 3 - Apply the video filter in JavaScript
To apply this video filter, you have to call the mediaStreamTrack._setVideoEffect(name) method. To disable the filters, you have to call the disableAllFilters method from the useBackgroundFilters() hook. Below is a small example of a hook that can be used to apply the grayscale video filter that we created. Note that the media stream is available inside the Call instance returned from the useCall hook.
import {
  useBackgroundFilters,
  useCall,
} from "@stream-io/video-react-native-sdk";
import { useRef, useCallback, useState } from "react";
import { MediaStream } from "@stream-io/react-native-webrtc";
import { NativeModules } from "react-native";

type CustomFilters = "GrayScale" | "MyOtherCustomFilter";

export const useCustomVideoFilters = () => {
  const call = useCall();
  const isFiltersRegisteredRef = useRef(false);
  const { disableAllFilters } = useBackgroundFilters();
  const [currentCustomFilter, setCustomFilter] = useState<CustomFilters>();

  const applyGrayScaleFilter = useCallback(async () => {
    if (!isFiltersRegisteredRef.current) {
      // registering is needed only once per the app's lifetime
      await NativeModules.VideoEffectsModule?.registerVideoFilters();
      isFiltersRegisteredRef.current = true;
    }
    disableAllFilters(); // disable any other filter
    (call?.camera.state.mediaStream as MediaStream | undefined)
      ?.getVideoTracks()
      .forEach((track) => {
        track._setVideoEffect("grayscale"); // set the grayscale filter
      });
    setCustomFilter("GrayScale");
  }, [call, disableAllFilters]);

  const disableCustomFilter = useCallback(() => {
    disableAllFilters();
    setCustomFilter(undefined);
  }, [disableAllFilters]);

  return {
    currentCustomFilter,
    applyGrayScaleFilter,
    disableCustomFilter,
  };
};
Now, all that is left is to call the applyGrayScaleFilter method while in a call.
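As an illustrative sketch (the component name and import path are hypothetical; adjust them to your project), a small toggle component built on the hook above could look like this:

import React from "react";
import { Button } from "react-native";
// hypothetical path; point this at wherever you placed the hook
import { useCustomVideoFilters } from "./useCustomVideoFilters";

export const GrayScaleToggle = () => {
  const { currentCustomFilter, applyGrayScaleFilter, disableCustomFilter } =
    useCustomVideoFilters();
  const isGrayScale = currentCustomFilter === "GrayScale";

  // toggle between applying and disabling the grayscale filter
  return (
    <Button
      title={isGrayScale ? "Disable grayscale" : "Enable grayscale"}
      onPress={isGrayScale ? disableCustomFilter : applyGrayScaleFilter}
    />
  );
};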
You can find the complete code for this example video filter module, including the grayscale filter, in our React Native Community CLI based sample app.