Expo

Our SDK is not available on Expo Go because it requires native code. However, you can use the expo-dev-client library to run your Expo app with a development build.

Development Build

If you haven’t already, prepare your project for Expo development builds.

SDK Installation

Add the Stream Video React Native SDK and its required dependencies to your project:

Terminal
npx expo install @stream-io/video-react-native-sdk
npx expo install @stream-io/react-native-webrtc
npx expo install @config-plugins/react-native-webrtc
npx expo install react-native-incall-manager
npx expo install react-native-svg
npx expo install @react-native-community/netinfo

So what did we install precisely?

  • @stream-io/video-react-native-sdk (SVRN) is Stream’s Video SDK. It contains the UI components, hooks, and utility functions that enable audio/video calls (see the client sketch after this list).
  • @stream-io/react-native-webrtc is a WebRTC module for React Native. SVRN depends on it for its components and utilities to render audio/video tracks and to interact with the phone’s media devices.
  • @config-plugins/react-native-webrtc is a config plugin that auto-configures @stream-io/react-native-webrtc when the native code is generated (npx expo prebuild).
  • react-native-incall-manager handles media routes, sensors, and events during an audio/video call.
  • react-native-svg provides SVG support to React Native. SVRN’s components and icons rely on this dependency.
  • @react-native-community/netinfo is used to detect the device’s connectivity state, type, and quality.
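
To see how these packages fit together once installed, here is a minimal sketch of creating a client and joining a call with the SDK’s TypeScript API. The apiKey, token, user, and call id values are placeholders for your own credentials, and component names may vary slightly between SDK versions, so treat this as an orientation rather than a drop-in snippet.

App.tsx
import React from 'react';
import {
  StreamVideo,
  StreamVideoClient,
  StreamCall,
  CallContent,
} from '@stream-io/video-react-native-sdk';

// Placeholder credentials -- replace with values from your Stream dashboard.
const apiKey = 'your-api-key';
const token = 'user-auth-token';
const user = { id: 'user-id', name: 'Jane' };

// Create the client once and reuse it across the app.
const client = new StreamVideoClient({ apiKey, user, token });
// Create (or get) a call of type "default" with an arbitrary id and join it.
const call = client.call('default', 'my-first-call');
call.join({ create: true });

export const MyApp = () => (
  // StreamVideo provides the client to the SDK's hooks and components,
  // StreamCall scopes them to a single call, and CallContent renders the call UI.
  <StreamVideo client={client}>
    <StreamCall call={call}>
      <CallContent />
    </StreamCall>
  </StreamVideo>
);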

Android-specific installation

Update the minSdk version

In your app.json file, add the following configuration for the expo-build-properties plugin:

app.json
{
  "expo": {
    ...
    "plugins": [
      // highlight-start
      [
        "expo-build-properties",
        {
          "android": {
            "minSdkVersion": 24
          }
        }
      ]
      // highlight-end
    ]
  }
}

Add config plugins

Add the config plugins for @stream-io/video-react-native-sdk and @config-plugins/react-native-webrtc to your app.json file:

app.json
{
  "expo": {
    ...
    "plugins": [
      // highlight-start
      "@stream-io/video-react-native-sdk",
      [
        "@config-plugins/react-native-webrtc",
        {
          // add your explanations for camera and microphone
          "cameraPermission": "$(PRODUCT_NAME) requires camera access in order to capture and transmit video",
          "microphonePermission": "$(PRODUCT_NAME) requires microphone access in order to capture and transmit audio"
        }
      ]
      // highlight-end
    ]
  }
}

If you are not using Expo EAS Build, run npx expo prebuild --clean after adding the config plugins to regenerate the native directories.

Permissions must also be granted by the user at runtime. Camera and microphone permissions are requested automatically the first time the app accesses the media stream, but other permissions, such as BLUETOOTH_CONNECT on Android, have to be requested manually. For the best user experience, we recommend requesting all necessary permissions yourself at an appropriate point in your app.

We recommend using the react-native-permissions library to request permissions in the app.
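
As a concrete example, the sketch below uses react-native-permissions to request the camera, microphone, and Android Bluetooth permissions in one place. The requestCallPermissions helper and the exact permission list are assumptions to adapt to your app’s needs.

permissions.ts
import { Platform } from 'react-native';
import { PERMISSIONS, RESULTS, requestMultiple } from 'react-native-permissions';

// Hypothetical helper: request the call-related permissions up front,
// before the user joins their first call.
export const requestCallPermissions = async () => {
  // BLUETOOTH_CONNECT only exists on Android 12+ (API 31); on older versions
  // react-native-permissions reports it as unavailable, which can be ignored.
  const permissions =
    Platform.OS === 'android'
      ? [
          PERMISSIONS.ANDROID.CAMERA,
          PERMISSIONS.ANDROID.RECORD_AUDIO,
          PERMISSIONS.ANDROID.BLUETOOTH_CONNECT,
        ]
      : [PERMISSIONS.IOS.CAMERA, PERMISSIONS.IOS.MICROPHONE];

  const statuses = await requestMultiple(permissions);

  // Surface anything the user did not grant so the app can show a rationale
  // or direct them to the system settings.
  Object.entries(statuses).forEach(([permission, status]) => {
    if (status !== RESULTS.GRANTED) {
      console.warn(`${permission} was not granted: ${status}`);
    }
  });
};

Call this helper early, for example on the screen where the user sets up or joins a call.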

Run on device

iOS

In iOS simulators, recording audio or video is not supported, so always test your app on an actual device for the best experience.

Android

In Android emulators, a static video stream can be sent, so the emulator can be used for testing. However, we recommend that you always test your app on an actual device for the best experience.

New Architecture (Fabric)

The SDK’s native modules and views are compatible with the New Architecture and Bridgeless mode through the New Renderer Interop Layers. These layers are automatically enabled when you turn on the New Architecture in React Native 0.74 and above. We recommend that you use React Native 0.74+ if you are using the New Architecture with the SDK.

