Accessibility

```tsx
import { StreamChat } from "stream-chat";
import { Chat, OverlayProvider } from "stream-chat-react-native";

const client = StreamChat.getInstance("api_key");

export const App = () => (
  <OverlayProvider accessibility={{ enabled: true }}>
    <Chat client={client}>{/** App components */}</Chat>
  </OverlayProvider>
);
```
The React Native SDK ships an opt-in accessibility layer for VoiceOver (iOS) and TalkBack (Android). When enabled, components add the appropriate `accessibilityRole`, `accessibilityState`, `accessibilityLabel`, `accessibilityValue`, and `accessibilityLiveRegion` attributes. The SDK announces incoming messages, AI typing transitions, and connection-state changes through a single imperative announcer, and modal/sheet surfaces apply the platform-correct focus-trap props. High-traffic surfaces like `MessageList` expose a "Scroll to bottom" rotor action so screen-reader users can jump to the latest messages without swiping through the entire list.
Accessibility is off by default. Existing integrations behave exactly as before until you opt in.
Best Practices
- Pass `accessibility={{ enabled: true }}` to `OverlayProvider`, not to `Chat`.
- Test on both iOS (VoiceOver) and Android (TalkBack) before shipping a release.
- When you author custom components, use `useA11yLabel('a11y/...')` instead of `t('a11y/...')` so labels short-circuit when an integrator opts out.
- Override `a11y/*` keys via `Streami18n` rather than hardcoding English strings in component code.
- When you replace an SDK component via `WithComponents`, the SDK no longer drives that component's accessibility — read `useAccessibilityContext()` if you want to mirror the SDK's behavior.
- Don't subscribe to `useScreenReaderEnabled()` inside list-item components — toggling the screen reader will re-render every row.
- Use `forceScreenReaderMode: true` during development to exercise accessible variants without enabling VoiceOver/TalkBack on the device.
- Avoid wrapping the announcer twice; do not mount a second `AccessibilityProvider` if you are already inside `OverlayProvider`.
- Hide decorative icons inside labeled controls using `accessibilityElementsHidden` (iOS) and `importantForAccessibility="no-hide-descendants"` (Android).
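For the last point, the two platform props can be kept together as a reusable constant. A minimal sketch (the constant name is ours, not an SDK export):

```typescript
// Spread onto a purely decorative icon so screen readers skip it.
// The containing control keeps its own accessibilityLabel.
const decorativeIconProps = {
  accessibilityElementsHidden: true, // iOS / VoiceOver
  importantForAccessibility: "no-hide-descendants" as const, // Android / TalkBack
};

// Usage inside a labeled Pressable: <SendIcon {...decorativeIconProps} />
```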
Enabling Accessibility
Pass an accessibility config to `OverlayProvider`. Defaults take over once `enabled` is true — there is nothing else to wire up.

Every other field in `AccessibilityConfig` is ignored unless `enabled` is true. Keeping the master switch off preserves the SDK's previous behavior bit-for-bit.
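The gating rule can be pictured as a tiny resolver. The following is a hypothetical sketch of the semantics just described, not the SDK's actual code, and it covers only two of the feature flags:

```typescript
// Hypothetical, trimmed-down config type for illustration.
type AccessibilityConfig = {
  enabled: boolean;
  announceNewMessages?: boolean;
  announceConnectionState?: boolean;
};

// When `enabled` is false, every feature resolves to "off",
// regardless of what the integrator passed.
function resolveAccessibility(config: AccessibilityConfig) {
  if (!config.enabled) {
    return {
      enabled: false,
      announceNewMessages: false,
      announceConnectionState: false,
    };
  }
  return {
    enabled: true,
    announceNewMessages: config.announceNewMessages ?? true, // default: true
    announceConnectionState: config.announceConnectionState ?? true, // default: true
  };
}
```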
Configuration Reference
| Prop | Description | Type | Default |
|---|---|---|---|
| `enabled` | Master toggle. When `false` no announcer mounts, no listeners attach, and `useA11yLabel` returns `undefined` so `t()` is skipped on hot paths. | `boolean` | `false` |
| `forceScreenReaderMode` | Forces "screen reader on" UI even when no screen reader is active. Useful during development. | `boolean` | `false` |
| `announceNewMessages` | Announces incoming messages through the live-region announcer (throttled and batched). | `boolean` | `true` |
| `announceTypingIndicator` | Announces typing transitions. Off by default — TalkBack and VoiceOver chatter on mobile is more disruptive than helpful for most chat surfaces. | `boolean` | `false` |
| `announceConnectionState` | Announces offline/online/reconnecting transitions. | `boolean` | `true` |
| `audioRecorderTapMode` | Whether the audio recorder swaps its hold-to-record gesture for a tap-toggle alternative. See Gesture Alternatives. | `'auto' \| 'always' \| 'never'` | `'auto'` |
| `imageGalleryScreenReaderMode` | Whether the image gallery swaps its multi-touch gestures for tap-driven controls. See Gesture Alternatives. | `'auto' \| 'always' \| 'never'` | `'auto'` |
| `messageActionsTrigger` | Whether to render an extra "More actions" button next to messages. `'auto'` shows it when a screen reader is active; `'always-button'` shows it for everyone; `'long-press'` keeps the gesture-only behavior. | `'long-press' \| 'auto' \| 'always-button'` | `'auto'` |
`announceTypingIndicator` defaults to `false` because mobile screen readers (especially TalkBack on Android) interrupt aggressively when a live region updates — announcing every keystroke from another participant tends to make a chat feel hostile rather than informative. Turn it on if your audience expects every typing transition called out, but test on a real device first.
Gesture Alternatives
Mobile gestures (long-press menus, hold-to-record audio, pinch/pan in the image gallery) are inaccessible to screen-reader users. The SDK exposes three knobs that control whether equivalent tap-driven UI is shown:
- `'auto'` — render the accessible variant when a screen reader is detected via `AccessibilityInfo`. Sighted users see no change.
- `'always'` — render the accessible variant for everyone. Useful when you want the same UI in every state.
- `'never'` — the SDK never swaps UI. Pick this when you ship a fully custom component via `WithComponents` and own the accessibility yourself.
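The resolution against the live screen-reader state can be expressed as a small pure helper (illustrative only; the helper name is not an SDK export):

```typescript
type GestureAltMode = "auto" | "always" | "never";

// Decide whether to render the tap-driven alternative
// for a gesture-only control.
function shouldUseAccessibleVariant(
  mode: GestureAltMode,
  screenReaderOn: boolean,
): boolean {
  if (mode === "always") return true;
  if (mode === "never") return false;
  return screenReaderOn; // "auto": follow the live screen-reader state
}
```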
`messageActionsTrigger` is wired end-to-end today. `audioRecorderTapMode` and `imageGalleryScreenReaderMode` are accepted but no-op until their underlying UI swaps ship; setting them now is safe and will take effect automatically once the SDK lands the corresponding UI.
Localization
All accessibility strings flow through `Streami18n` under the `a11y/*` namespace. The English defaults ship in every locale; override per-key the same way you override any other translation. See the Localization guide for the full setup.
```tsx
import { Streami18n } from "stream-chat-react-native";

const i18n = new Streami18n({ language: "nl" });

i18n.registerTranslation("nl", {
  "a11y/Avatar of {{name}}": "Avatar van {{name}}",
  "a11y/{{count}} new messages": "{{count}} nieuwe berichten",
  "a11y/New message from {{user}}": "Nieuw bericht van {{user}}",
});
```

The namespace covers labels for SDK buttons across the composer, attachment picker, image gallery, polls, message actions, and live-region announcements. The canonical list lives in `package/src/i18n/en.json` — search for keys prefixed with `a11y/` to see what is overridable.
Localizing custom buttons
The SDK's `Button` component accepts an `accessibilityLabelKey` (and optional `accessibilityLabelParams`) so the rendered accessibility label is resolved through the same translation pipeline. Prefer this over a hard-coded `accessibilityLabel` when you build SDK-style icon-only buttons:
```tsx
import { Button } from "stream-chat-react-native";

<Button
  accessibilityLabelKey="a11y/Send message"
  iconOnly
  LeadingIcon={SendIcon}
  type="solid"
  variant="primary"
  onPress={onSend}
/>;
```

The legacy `accessibilityLabel` prop still works as a fallback when `accessibilityLabelKey` is omitted. `AttachmentRemoveControl` exposes the same `accessibilityLabelKey` / `accessibilityLabelParams` props for the per-attachment remove icon.
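The precedence between the two props boils down to: translate the key when present, otherwise fall back to the plain label. A sketch with a stand-in translator (both function names here are illustrative, not SDK exports):

```typescript
type TranslateFn = (key: string, params?: Record<string, unknown>) => string;

// Illustrative resolution order: the key-based label wins,
// the legacy string label is the fallback.
function resolveButtonLabel(
  t: TranslateFn,
  accessibilityLabelKey?: string,
  accessibilityLabelParams?: Record<string, unknown>,
  accessibilityLabel?: string,
): string | undefined {
  if (accessibilityLabelKey) {
    return t(accessibilityLabelKey, accessibilityLabelParams);
  }
  return accessibilityLabel;
}
```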
Programmatic Announcements
Use `useAccessibilityAnnouncer()` to speak custom events that don't have a built-in SDK trigger — typically when your own components resolve a state that screen-reader users should hear about.
```tsx
import { useAccessibilityAnnouncer } from "stream-chat-react-native";

const PermissionPrompt = () => {
  const announce = useAccessibilityAnnouncer();

  const onGranted = () => {
    announce("Microphone access granted", "polite");
  };

  return /* ... */;
};
```

For state-driven announcements that change as a value transitions (a loading state moving through pending → success → error, for example), use `useAnnounceOnStateChange`. It dedupes consecutive identical messages and applies a built-in debounce so the screen reader is not flooded.
```tsx
import { useAnnounceOnStateChange } from "stream-chat-react-native";

const UploadStatus = ({ status }) => {
  const message =
    status === "uploading"
      ? "Uploading"
      : status === "done"
        ? "Upload complete"
        : status === "failed"
          ? "Upload failed"
          : null;

  useAnnounceOnStateChange(message);

  return /* ... */;
};
```

Both hooks are no-ops when `accessibility.enabled` is false, so it is safe to call them unconditionally.
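The dedupe-and-rate-limit behavior can be sketched in plain TypeScript. This is an illustration of the semantics with an injected clock for testability, not the SDK's implementation:

```typescript
// Speaks a message only when it differs from the previous one and at
// least `minIntervalMs` has passed since the last spoken announcement.
function makeDedupedAnnouncer(
  speak: (message: string) => void,
  minIntervalMs: number,
  now: () => number, // injected clock, e.g. Date.now in production
) {
  let lastMessage: string | null = null;
  let lastAt = -Infinity;

  return (message: string | null) => {
    if (message === null || message === lastMessage) return; // dedupe
    if (now() - lastAt < minIntervalMs) return; // rate limit
    lastMessage = message;
    lastAt = now();
    speak(message);
  };
}
```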
Announcing incoming messages on a custom list
`MessageList` and `MessageFlashList` already wire incoming-message announcements internally — you do not need to do anything special when you use them. If you build a custom message list (or a custom thread view), call `useIncomingMessageAnnouncements` to opt back into the same throttled, batched announcer behavior.
```tsx
import {
  useChannelContext,
  useChatContext,
  useIncomingMessageAnnouncements,
} from "stream-chat-react-native";

const CustomMessageList = () => {
  const { channel } = useChannelContext();
  const { client } = useChatContext();

  useIncomingMessageAnnouncements({
    channel,
    ownUserId: client.userID,
    // For a thread view: pass `threadList: true` and the active `parent_id`
    // as `activeThreadId` so only thread replies trigger announcements.
  });

  return /* ... */;
};
```

The hook deduplicates by message id, throttles to one announcement per second, and batches multiple incoming messages into a "{{count}} new messages" summary. It subscribes to `channel.on('message.new')` only when `accessibility.enabled` and `accessibility.announceNewMessages` are both true.
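The batching rule (a single message announced verbatim, a burst summarized by count) can be sketched as a pure function. This is illustrative only; the real hook also handles the throttling and the event subscription:

```typescript
// Given the messages that arrived within one throttle window, produce the
// announcement string. Mirrors the "{{count}} new messages" summary behavior.
function summarizeIncoming(
  batch: Array<{ id: string; user: string; text: string }>,
): string | null {
  // Dedupe by message id within the batch.
  const unique = Array.from(new Map(batch.map((m) => [m.id, m])).values());
  if (unique.length === 0) return null;
  if (unique.length === 1) {
    const [m] = unique;
    return `New message from ${m.user}: ${m.text}`;
  }
  return `${unique.length} new messages`;
}
```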
Detecting Screen Reader and Reduced Motion
Two utility hooks expose the live OS-level state:
```tsx
import {
  useReducedMotionPreference,
  useScreenReaderEnabled,
} from "stream-chat-react-native";

const ConfettiBurst = () => {
  const reduceMotion = useReducedMotionPreference();
  const duration = reduceMotion ? 0 : 1500;
  return /* ...animate using `duration`... */;
};

const RecorderButton = () => {
  const screenReaderOn = useScreenReaderEnabled();
  // Render the tap-mode variant when SR is on, otherwise the hold-to-record gesture.
  return /* ... */;
};
```

Do not call `useScreenReaderEnabled()` inside list-item components rendered by `MessageList`, `ChannelList`, or `ThreadList`. Every screen-reader toggle would re-render every row. Lift the subscription up to the parent screen and pass the boolean down only where it is needed.
Custom Components
When you replace an SDK component via the `WithComponents` override pattern, the SDK no longer wires accessibility for that component — you own it. To mirror the SDK's behavior in a custom component, read the resolved configuration from `useAccessibilityContext()`:
```tsx
import {
  useAccessibilityContext,
  useScreenReaderEnabled,
} from "stream-chat-react-native";

export const CustomAudioRecorder = () => {
  const { audioRecorderTapMode } = useAccessibilityContext();
  const screenReaderOn = useScreenReaderEnabled();

  const useTapMode =
    audioRecorderTapMode === "always" ||
    (audioRecorderTapMode === "auto" && screenReaderOn);

  return useTapMode ? <TapToRecordUI /> : <HoldToRecordUI />;
};
```

This keeps your custom component honest with whatever the integrator passed to `OverlayProvider`.
Bridging TalkBack double-tap to onPress
Some Android `Pressable` configurations don't reliably forward TalkBack's double-tap activation to the existing `onPress` handler. `useAccessibilityActivateAction` returns `accessibilityActions` and an `onAccessibilityAction` handler that bridge the screen-reader activate action back to your press callback. The hook returns `undefined` when accessibility is disabled, so you can spread its result unconditionally.
```tsx
import { Pressable } from "react-native";
import { useAccessibilityActivateAction } from "stream-chat-react-native";

const CustomActionButton = ({ accessibilityLabel, onPress }) => {
  const activateAction = useAccessibilityActivateAction({
    onPress,
    shouldHandleActivate: !!onPress && !!accessibilityLabel,
  });

  return (
    <Pressable
      accessible
      accessibilityLabel={accessibilityLabel}
      accessibilityRole="button"
      onPress={onPress}
      {...activateAction}
    >
      {/* ... */}
    </Pressable>
  );
};
```

The SDK's own `Button` component already wires this hook internally, so you only need it when you build a custom pressable from the React Native primitives.
Testing
- Set `forceScreenReaderMode: true` while iterating locally to render the accessible variants without enabling VoiceOver or TalkBack on the device.
- Pair with the `Streami18n` override flow above to verify that localized labels reach the screen reader correctly.
- Enable VoiceOver on iOS and TalkBack on Android on a real device before sign-off.
- Run through the same flows with Reduce Motion enabled to catch components whose animations should respect the preference.
Platform Notes
- iOS / VoiceOver — focus management uses `AccessibilityInfo.setAccessibilityFocus(reactTag)`. After a modal opens or a layout shift, defer the focus call by one frame (`requestAnimationFrame`) so the accessibility tree has settled.
- Android / TalkBack — the SDK uses `AccessibilityInfo.announceForAccessibility` for both platforms because the cross-platform behavior is more predictable than relying on `accessibilityLiveRegion` alone. If you author your own announcement code and target Android specifically, both mechanisms are available to you. `setAccessibilityFocus` on Android requires the React tag from `findNodeHandle(ref)`. If your custom component override uses a functional ref, expose a method or pass the ref through.
- Attachment picker — when the picker opens, its bottom sheet is treated as a single accessibility group. Focus transitions smoothly from the composer into the picker contents, and the backdrop is excluded from the accessibility tree. No integrator action is required; this is automatic when accessibility is enabled.
If you need to step outside the SDK's accessibility surface — for example to call `setAccessibilityFocus` on your own view, query the `boldText` or `grayscale` flags, or add custom announcement behavior — work directly against the React Native `AccessibilityInfo` API. The SDK's hooks (`useScreenReaderEnabled`, `useReducedMotionPreference`, `useAccessibilityAnnouncer`) are thin wrappers, and you can mix them with raw `AccessibilityInfo` calls in the same component.