# AI Integrations

The AI UI components are designed specifically for AI-first applications written in Jetpack Compose. Paired with our real-time [Chat API](https://getstream.io/chat/), they make it easier to integrate with and render responses from LLM providers such as ChatGPT, Gemini, and Anthropic, or from any custom backend, by providing out-of-the-box components that can render Markdown, code blocks, tables, thinking indicators, images, charts, and more.

This library includes the following components that assist with this task:

- `StreamingText` - a composable that progressively reveals text content word by word with a smooth animation, perfect for displaying AI-generated responses in real time, similar to ChatGPT. Includes built-in Markdown rendering with support for code blocks, code fences, and Chart.js diagrams.
- `ChatComposer` - a fully featured prompt composer with attachments and speech input.
- `SpeechToTextButton` - a reusable button that records voice input and streams the recognized transcript back into your UI.
- `AITypingIndicator` - a component that can display different states of the LLM (thinking, checking external sources, etc.).

You can find a complete ChatGPT clone sample that uses these components [here](https://github.com/GetStream/stream-chat-android-ai).

## Installation

The AI components are available on Maven Central. Add the dependency to your module's `build.gradle.kts`:

```kotlin
dependencies {
    implementation("io.getstream:stream-chat-android-ai-compose:$version")
}
```

### Snapshot Releases

To use snapshot releases, you need to add the Sonatype snapshot repository to your `settings.gradle.kts`:

```kotlin
dependencyResolutionManagement {
    repositories {
        google()
        mavenCentral()
        maven { url = uri("https://central.sonatype.com/repository/maven-snapshots") }
    }
}
```

Find the latest snapshot version in the [Maven Central snapshot repository](https://central.sonatype.com/repository/maven-snapshots/io/getstream/stream-chat-android-ai-compose/).
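With the snapshot repository in place, you can then depend on a snapshot build. The version below is only a placeholder; replace it with the latest snapshot version from the repository linked above:

```kotlin
dependencies {
    // Placeholder version; look up the actual latest snapshot in the repository
    implementation("io.getstream:stream-chat-android-ai-compose:X.Y.Z-SNAPSHOT")
}
```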

## StreamingText

The `StreamingText` composable progressively reveals text content word by word with a smooth animation, perfect for displaying AI-generated responses in real time. It includes built-in Markdown rendering with support for code blocks, tables, images, charts, and more.

Here's an example of how to use it:

```kotlin
import io.getstream.chat.android.ai.compose.ui.component.StreamingText

@Composable
fun AssistantMessage(
    text: String,
    isGenerating: Boolean
) {
    StreamingText(
        text = text,
        animate = isGenerating
    )
}
```

Additionally, you can control the animation speed with the `chunkDelayMs` parameter, which sets the delay between revealed chunks. The default value is 30ms.
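For example, to slow the reveal down, you can raise the delay between chunks. This is a minimal sketch based on the `chunkDelayMs` parameter named above; the exact set of overloads may differ from what's shown here:

```kotlin
StreamingText(
    text = text,
    animate = isGenerating,
    // Delay between revealed chunks; the default is 30ms
    chunkDelayMs = 60
)
```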

## AITypingIndicator

The `AITypingIndicator` is used to present different states of the LLM, such as "Thinking", "Checking External Sources", etc. You can specify any text you need. There's also a nice animation when the indicator is shown.

**Basic Usage:**

```kotlin
import io.getstream.chat.android.ai.compose.ui.component.AITypingIndicator

@Composable
fun ThinkingIndicator() {
    AITypingIndicator(
        label = { Text("Thinking") }
    )
}
```

**Customization:**

```kotlin
AITypingIndicator(
    modifier = Modifier.padding(16.dp),
    label = { Text("Processing...") },
    indicator = {
        // Custom indicator composable
        CircularProgressIndicator()
    }
)
```

## ChatComposer

The `ChatComposer` is a complete chat input component that provides text input, attachment support, voice input, and send/stop buttons. It manages state internally and provides a polished UI with automatic keyboard handling.

**Basic Usage:**

```kotlin
import io.getstream.chat.android.ai.compose.ui.component.ChatComposer
import io.getstream.chat.android.ai.compose.ui.component.MessageData

@Composable
fun ChatScreen(isGenerating: Boolean) {
    ChatComposer(
        onSendClick = { messageData: MessageData ->
            // Handle message send
            sendMessage(messageData.text, messageData.attachments)
        },
        onStopClick = {
            // Handle stopping AI streaming
            stopStreaming()
        },
        isGenerating = isGenerating,
    )
}
```

The composer automatically shows different buttons based on state: stop button when generating, send button when text is entered, and voice button when text is empty. It also automatically resets attachments once a message is sent.

## SpeechToTextButton

`SpeechToTextButton` turns voice input into text using Android's `SpeechRecognizer`. When tapped, it asks for microphone access, records audio, and forwards the recognized transcript through its callback.

```kotlin
import io.getstream.chat.android.ai.compose.ui.component.SpeechToTextButton
import io.getstream.chat.android.ai.compose.ui.component.rememberSpeechToTextButtonState

@Composable
fun VoiceInput() {
    val state = rememberSpeechToTextButtonState(
        onFinalResult = { transcript ->
            // Handle recognized text
            appendToInput(transcript)
        }
    )

    SpeechToTextButton(
        state = state
    )
}
```
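Note that speech recognition on Android requires the `RECORD_AUDIO` permission. Even if the component requests microphone access at runtime (as described above), the permission still needs to be declared in your `AndroidManifest.xml`:

```xml
<uses-permission android:name="android.permission.RECORD_AUDIO" />
```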

You can also use `onPartialResult` to receive real-time updates as the user speaks:

```kotlin
val state = rememberSpeechToTextButtonState(
    onPartialResult = { partialText ->
        // Called with partial results as user speaks
        text = partialText
    },
    onFinalResult = { finalText ->
        // Called with the final result when recording stops
        text = finalText
    }
)
```
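Putting the two callbacks together, here is a sketch of feeding the transcript into a text field, so the user sees the recognition live and the final result when recording stops. The `DictationField` composable and its local `text` state are assumptions for illustration, not part of the library:

```kotlin
@Composable
fun DictationField() {
    // Local text state updated by both partial and final results
    var text by remember { mutableStateOf("") }

    val state = rememberSpeechToTextButtonState(
        onPartialResult = { text = it }, // live updates while the user speaks
        onFinalResult = { text = it }    // final transcript when recording stops
    )

    Row(verticalAlignment = Alignment.CenterVertically) {
        TextField(
            value = text,
            onValueChange = { text = it },
            modifier = Modifier.weight(1f)
        )
        SpeechToTextButton(state = state)
    }
}
```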

These components are designed to work seamlessly with our existing [Chat SDK](https://getstream.io/tutorials/android-chat/). Our [developer guide](https://getstream.io/chat/solutions/ai-integration/) explains how to get started building AI integrations with Stream and Jetpack Compose.


---

This page was last updated at 2026-04-17T17:33:31.220Z.

For the most recent version of this documentation, visit [https://getstream.io/chat/docs/sdk/android/v6/ai-integrations/overview/](https://getstream.io/chat/docs/sdk/android/v6/ai-integrations/overview/).