Custom Video Filters with Expo
In this guide, you’ll learn how to create and use your own video filter in Expo, using a grayscale filter as an example.
Step 1 - Set up a new local Expo module
Navigate to your project directory (the one that contains the package.json file) and run the following command. This is the recommended way to create a local Expo module:
npx create-expo-module@latest --local
For the inputs requested by this command, we pass the following values (as an example):
- Name of the module: video-effects
- Native module name: VideoEffects
- Android package name: io.getstream.videoeffects
Once the command completes, navigate to the directory <project-directory>/modules/video-effects/. All the relevant files for our module will live here.
Step 2 - Define our module structure
The generated files come with a default implementation of an example module and an example view with web support. Our implementation does not need web support, and no view is necessary.
First, let's remove the files for the view and for web support. Delete the following files:
- android/src/main/java/io/getstream/videoeffects/VideoEffectsView.kt
- ios/VideoEffectsView.swift
- src/VideoEffectsView.web.ts
- src/VideoEffectsModule.web.ts
- src/VideoEffects.types.ts
- src/VideoEffectsView.tsx
Now, let us configure the module appropriately. Our module will have a single method called registerVideoFilters. Once invoked, it registers the video filter with the WebRTC module. The files below should have the following content:
src/VideoEffectsModule.ts:
import { NativeModule, requireNativeModule } from 'expo';

declare class VideoEffectsModule extends NativeModule {
  registerVideoFilters(): void;
}

// This call loads the native module object from the JSI.
export default requireNativeModule<VideoEffectsModule>('VideoEffects');
expo-module.config.json:
{
  "platforms": ["apple", "android"],
  "apple": {
    "modules": ["VideoEffectsModule"]
  },
  "android": {
    "modules": ["io.getstream.videoeffects.VideoEffectsModule"]
  }
}
index.ts:
export { default } from './src/VideoEffectsModule';
Step 3 - Implement the Android module
Navigate to the directory <project-directory>/modules/video-effects/android/src/main/java/io/getstream/videoeffects. All the relevant files for your Android implementation live here.
To create a new video filter, you need to implement the VideoFrameProcessorFactoryInterface from @stream-io/react-native-webrtc. A simple example that applies a rotation to the video frames looks like the following:
import com.oney.WebRTCModule.videoEffects.VideoFrameProcessor
import com.oney.WebRTCModule.videoEffects.VideoFrameProcessorFactoryInterface
import org.webrtc.VideoFrame

class RotationFilterFactory : VideoFrameProcessorFactoryInterface {
    override fun build(): VideoFrameProcessor {
        return VideoFrameProcessor { frame, textureHelper ->
            VideoFrame(
                frame.buffer.toI420(),
                180, // apply rotation to the video frame
                frame.timestampNs
            )
        }
    }
}
To make it easier to process video frames as Bitmap objects, we export a VideoFrameProcessorWithBitmapFilter class from the @stream-io/video-filters-react-native library. The library's built-in background filters are implemented using this class. To implement a video filter with Bitmap, create a filter class that extends the BitmapVideoFilter abstract class. This abstract class gives you a Bitmap for each video frame, which you can manipulate directly. By returning a new VideoFrameProcessorWithBitmapFilter instance with that filter, we get a bitmap-processing filter.
Note that BitmapVideoFilter is less performant than a normal video filter that does not use bitmaps. This is due to the overhead of certain operations, like YUV <-> ARGB conversions.
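To give a feel for that overhead, here is a minimal TypeScript sketch of the per-pixel math behind one direction of such a conversion, using the standard full-range BT.601 coefficients. This is for illustration only; the actual conversion in the library happens natively, once per frame in each direction:

```typescript
// Full-range BT.601 RGB -> YUV conversion for a single pixel.
// A real frame runs this math (and its inverse) for every pixel of
// every frame, which is the extra cost a bitmap-based filter pays.
function rgbToYuv(r: number, g: number, b: number): [number, number, number] {
  const y = 0.299 * r + 0.587 * g + 0.114 * b;
  const u = -0.169 * r - 0.331 * g + 0.5 * b + 128;
  const v = 0.5 * r - 0.419 * g - 0.081 * b + 128;
  return [Math.round(y), Math.round(u), Math.round(v)];
}

// Pure white keeps full luma with neutral chroma:
console.log(rgbToYuv(255, 255, 255)); // [255, 128, 128]
```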
Example: grayscale video filter
We can create and set a simple video filter that turns the video frame grayscale by creating a filter that extends the BitmapVideoFilter abstract class, like this:
package io.getstream.videoeffects.videofilters

import android.graphics.Bitmap
import android.graphics.Canvas
import android.graphics.ColorMatrix
import android.graphics.ColorMatrixColorFilter
import android.graphics.Paint
import com.oney.WebRTCModule.videoEffects.VideoFrameProcessor
import com.oney.WebRTCModule.videoEffects.VideoFrameProcessorFactoryInterface
import com.streamio.videofiltersreactnative.common.BitmapVideoFilter
import com.streamio.videofiltersreactnative.common.VideoFrameProcessorWithBitmapFilter

class GrayScaleVideoFilterFactory : VideoFrameProcessorFactoryInterface {
    override fun build(): VideoFrameProcessor {
        return VideoFrameProcessorWithBitmapFilter {
            GrayScaleFilter()
        }
    }
}

private class GrayScaleFilter : BitmapVideoFilter() {
    override fun applyFilter(videoFrameBitmap: Bitmap) {
        val canvas = Canvas(videoFrameBitmap)
        val paint = Paint().apply {
            val colorMatrix = ColorMatrix().apply {
                // map the saturation of the color to grayscale
                setSaturation(0f)
            }
            colorFilter = ColorMatrixColorFilter(colorMatrix)
        }
        canvas.drawBitmap(videoFrameBitmap, 0f, 0f, paint)
    }
}
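For reference, setSaturation(0) collapses each pixel to its luminance; Android's ColorMatrix uses Rec.709-style weights for this. A minimal TypeScript sketch of the equivalent per-pixel computation, purely for illustration (the real work happens inside the native ColorMatrixColorFilter):

```typescript
// Per-pixel equivalent of Android's ColorMatrix.setSaturation(0):
// every channel collapses to the Rec.709-style luminance of the pixel.
function toGrayscale(r: number, g: number, b: number): [number, number, number] {
  const luma = Math.round(0.213 * r + 0.715 * g + 0.072 * b);
  return [luma, luma, luma]; // R, G and B all become the luminance
}

console.log(toGrayscale(255, 0, 0)); // pure red -> [54, 54, 54]
```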
Add this Kotlin file in a new folder named videofilters. Next, let us implement our Kotlin module. This will have the registerVideoFilters method.
package io.getstream.videoeffects

import expo.modules.kotlin.modules.Module
import expo.modules.kotlin.modules.ModuleDefinition
import io.getstream.videoeffects.videofilters.GrayScaleVideoFilterFactory
import com.oney.WebRTCModule.videoEffects.ProcessorProvider

class VideoEffectsModule : Module() {
    override fun definition() = ModuleDefinition {
        Name("VideoEffects")
        Function("registerVideoFilters") {
            ProcessorProvider.addProcessor("grayscale", GrayScaleVideoFilterFactory())
        }
    }
}
Lastly, navigate to the directory <project-directory>/modules/video-effects/android/. Let's add the required dependencies in the build.gradle file. At the bottom of the file, add the following content:
dependencies {
    implementation project(':stream-io_react-native-webrtc')
    implementation project(':stream-io_video-filters-react-native')
}
Step 4 - Implement the iOS module
Navigate to the directory <project-directory>/modules/video-effects/ios/. All our iOS module implementation will live here.
To add a new video filter, you need to specify an object that conforms to the VideoFrameProcessorDelegate protocol from the @stream-io/video-filters-react-native library and inherits from the NSObject class.
To make it easier to process video frames as CIImage, copy this VideoFilters.swift file into this directory. How does this class work? If you implement the filter using the VideoFilter class, you receive each frame of the user’s local video as a CIImage, allowing you to apply your filters. The VideoFilter class lets you easily create your own filters: it contains the function that converts the original CIImage to an output CIImage, so you have complete freedom over the processing pipeline. If you instead need access to the raw video frame, you can look into the implementation of the VideoFilter class and adapt it to your own filter.
Additionally, you need to import the necessary headers in a bridging header file, which exposes the Objective-C files to Swift. Add a new VideoEffects-Bridging-Header.h file with the following content:
#import "ProcessorProvider.h"
Example: grayscale video filter
We can create and set a simple video filter that turns the video frame grayscale by creating a filter that subclasses VideoFilter, like this:
import Foundation
import CoreImage

final class GrayScaleVideoFrameProcessor: VideoFilter {
  @available(*, unavailable)
  override public init(
    filter: @escaping (Input) -> CIImage
  ) { fatalError() }

  init() {
    super.init(
      filter: { input in
        let filter = CIFilter(name: "CIPhotoEffectMono")
        filter?.setValue(input.originalImage, forKey: kCIInputImageKey)
        let outputImage: CIImage = filter?.outputImage ?? input.originalImage
        return outputImage
      }
    )
  }
}
Next, let us implement our Swift module. This will have the registerVideoFilters
method.
import ExpoModulesCore

public class VideoEffectsModule: Module {
  public func definition() -> ModuleDefinition {
    Name("VideoEffects")
    Function("registerVideoFilters") {
      ProcessorProvider.addProcessor(GrayScaleVideoFrameProcessor(), forName: "grayscale")
    }
  }
}
Lastly, let's add the required dependencies in the VideoEffects.podspec file. Roughly in the middle of the file, add the following content:
s.dependency 'ExpoModulesCore'
s.dependency 'stream-react-native-webrtc'
Step 5 - Apply the video filter in JavaScript
To apply this video filter, you have to call the mediaStreamTrack._setVideoEffect(name) method. To disable the filters, call the disableAllFilters method from the useBackgroundFilters() hook. Below is a small example of a hook that can be used to apply the grayscale video filter that we created. Note that the media stream lives inside the Call instance returned from the useCall hook.
import {
  useBackgroundFilters,
  useCall,
} from "@stream-io/video-react-native-sdk";
import { useRef, useCallback, useState } from "react";
import { MediaStream } from "@stream-io/react-native-webrtc";
// NOTE: Ensure the relative path matches your project structure.
// In a standard Expo project, local modules are often located in the `modules` directory at the root level.
import VideoEffectsModule from '../../modules/video-effects';

type CustomFilters = "GrayScale" | "MyOtherCustomFilter";

export const useCustomVideoFilters = () => {
  const call = useCall();
  const isFiltersRegisteredRef = useRef(false);
  const { disableAllFilters } = useBackgroundFilters();
  const [currentCustomFilter, setCustomFilter] = useState<CustomFilters>();

  const applyGrayScaleFilter = useCallback(async () => {
    if (!isFiltersRegisteredRef.current) {
      // registering is needed only once per the app's lifetime
      VideoEffectsModule.registerVideoFilters();
      isFiltersRegisteredRef.current = true;
    }
    disableAllFilters(); // disable any other filter
    (call?.camera.state.mediaStream as MediaStream | undefined)
      ?.getVideoTracks()
      .forEach((track) => {
        track._setVideoEffect("grayscale"); // set the grayscale filter
      });
    setCustomFilter("GrayScale");
  }, [call, disableAllFilters]);

  const disableCustomFilter = useCallback(() => {
    disableAllFilters();
    setCustomFilter(undefined);
  }, [disableAllFilters]);

  return {
    currentCustomFilter,
    applyGrayScaleFilter,
    disableCustomFilter,
  };
};
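The once-per-lifetime registration guard used in the hook can also be factored into a small standalone helper. A sketch of that pattern (the registerOnce name is our own, not part of any SDK):

```typescript
// Hypothetical helper: wraps a registration function so that repeated
// calls invoke it at most once for the lifetime of the app.
function registerOnce(register: () => void): () => void {
  let registered = false;
  return () => {
    if (!registered) {
      register();
      registered = true;
    }
  };
}

// Usage sketch: no matter how many times the caller fires,
// the wrapped registration function runs only once.
let nativeCalls = 0;
const ensureFiltersRegistered = registerOnce(() => { nativeCalls += 1; });
ensureFiltersRegistered();
ensureFiltersRegistered();
console.log(nativeCalls); // 1
```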
Now, all that is left is to call the applyGrayScaleFilter method while in a call.
You can find the complete code for this example video filter module, including the grayscale filter, in our Expo sample app.