Build an AI Assistant with Flutter

Sahil Kumar
Published December 6, 2024

In this tutorial, we will demonstrate how easy it is to build an AI assistant into a Flutter app using the Stream Chat SDK on both the client (Flutter) and server (Node.js) sides. For this example, we will use the Anthropic and OpenAI APIs as the LLM service, but you can use any LLM service with Stream Chat. Stream offers a free Maker Plan so developers can leverage innovative integrations at any scale.

Talk is cheap, so here’s a video of the result:

We will use our new UI components for AI to render messages as they come, with animations similar to those of popular LLMs, such as ChatGPT. Our UI components can render LLM responses that contain markdown, code, tables, and much more.

We also provide UI for thinking indicators that can react to the new AI-related events we have on the server side.

The entire code can also be found here.

1. Project Setup

To follow along, we need version 8.3.0 or later of the Stream Chat Flutter SDK. Starting with this version, the SDK contains UI components that facilitate the integration of AI into our chat feature.

First, let’s create and set up the Flutter project. Create a new Flutter Project and name it stream_chat_ai_assistant_flutter_example (or any other name you prefer).

```bash
flutter create stream_chat_ai_assistant_flutter_example
```

Next, we add the required dependencies from StreamChat and the UI components.

  • Add the following dependencies to your pubspec.yaml file:

```yaml
dependencies:
  flutter:
    sdk: flutter
  stream_chat: ^8.3.0
  stream_chat_flutter: ^8.3.0
```

  • Run flutter pub get to install the dependencies:

```bash
flutter pub get
```

With that, we have our Flutter project ready and can add some code.

2. Setting Up the StreamChat Client

Next, we need to set up the StreamChat client. To do this, we open the main.dart file and add the following code:

```dart
import 'package:flutter/material.dart';
import 'package:stream_chat_flutter/stream_chat_flutter.dart';

Future<void> main() async {
  final client = StreamChatClient('zcgvnykxsfm8');

  final user = await client.connectUser(
    User(id: 'anakin_skywalker'),
    'eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9eyJ1c2VyX2lkIjoiYW5ha2luX3NreXdhbGtlcJ9.ZwCV1qPrSAsie7-0n61JQrSEDbp6fcMgVh4V2CB0kM8',
  );

  debugPrint('User connected: ${user.id}');

  runApp(MyApp(client: client));
}

class MyApp extends StatelessWidget {
  const MyApp({
    super.key,
    required this.client,
  });

  final StreamChatClient client;

  @override
  Widget build(BuildContext context) {
    return MaterialApp(
      title: 'Stream Chat AI Assistant',
      theme: ThemeData.light(),
      darkTheme: ThemeData.dark(),
      themeMode: ThemeMode.light,
      // We are going to create the channel list page in the next step.
      home: const ChatAiAssistantChannelListPage(),
      builder: (_, child) => StreamChat(client: client, child: child),
    );
  }
}
```

The code above creates the StreamChatClient object and connects a hardcoded user with a hardcoded token. For production apps, you would want a proper setup where the user and token are provided after a login flow; a rough sketch of such a flow follows. You can learn more about the client's setup in our docs.
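As an illustration only (this is not part of the tutorial's code), a token-based connect flow could look like the sketch below, using the SDK's connectUserWithProvider. The /chat-token endpoint is hypothetical; your backend would mint the user token with your STREAM_API_SECRET after authenticating the user:

```dart
import 'package:dio/dio.dart';
import 'package:stream_chat_flutter/stream_chat_flutter.dart';

/// A minimal sketch: connect [userId] with a token fetched from our own
/// (hypothetical) backend instead of a hardcoded string.
Future<OwnUser> connectAuthenticatedUser(
  StreamChatClient client,
  String userId,
) {
  return client.connectUserWithProvider(
    User(id: userId),
    (userId) async {
      // Ask the backend for a fresh Stream user token.
      // 'https://example.com/chat-token' is a placeholder endpoint.
      final response = await Dio().post<Map<String, dynamic>>(
        'https://example.com/chat-token',
        data: {'user_id': userId},
      );
      return response.data!['token'] as String;
    },
  );
}
```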

3. Setting Up the Channel List

Next, let’s present the Stream Chat channel list component. When a channel is tapped, we will open the channel view with the message list. To do this, we create a new file called chat_ai_assistant_channel_list_page.dart and add the following code:

```dart
import 'package:flutter/material.dart';
import 'package:stream_chat_ai_assistant_flutter_example/src/chat_ai_assistant_channel_page.dart';
import 'package:stream_chat_flutter/stream_chat_flutter.dart';

class ChatAiAssistantChannelListPage extends StatefulWidget {
  const ChatAiAssistantChannelListPage({super.key});

  @override
  State<ChatAiAssistantChannelListPage> createState() =>
      _ChatAiAssistantChannelListPageState();
}

class _ChatAiAssistantChannelListPageState
    extends State<ChatAiAssistantChannelListPage> {
  // Create a channel list controller to fetch the channels.
  late final _controller = StreamChannelListController(
    client: StreamChat.of(context).client,
    filter: Filter.in_(
      'members',
      [StreamChat.of(context).currentUser!.id],
    ),
    presence: true,
    limit: 30,
  );

  @override
  void dispose() {
    _controller.dispose();
    super.dispose();
  }

  @override
  Widget build(BuildContext context) {
    return Scaffold(
      appBar: AppBar(
        title: const Text('AI Assistant Channels'),
      ),
      body: StreamChannelListView(
        controller: _controller,
        onChannelTap: (channel) {
          // Navigate to the chat page when a channel is tapped.
          Navigator.of(context).push(
            MaterialPageRoute(
              builder: (context) {
                // We are going to create the chat page in the next step.
                return ChatAIAssistantChannelPage(
                  channel: channel,
                );
              },
            ),
          );
        },
      ),
    );
  }
}
```

4. Setting Up the Channel Page

Next, in order to open the channel page, we need to create a new file called chat_ai_assistant_channel_page.dart and add the following code:

```dart
import 'dart:async';

import 'package:flutter/material.dart';
import 'package:stream_chat_flutter/stream_chat_flutter.dart';

class ChatAIAssistantChannelPage extends StatefulWidget {
  const ChatAIAssistantChannelPage({
    super.key,
    required this.channel,
  });

  final Channel channel;

  @override
  State<ChatAIAssistantChannelPage> createState() =>
      _ChatAIAssistantChannelPageState();
}

class _ChatAIAssistantChannelPageState
    extends State<ChatAIAssistantChannelPage> {
  @override
  Widget build(BuildContext context) {
    return StreamChannel(
      channel: widget.channel,
      child: const Scaffold(
        appBar: StreamChannelHeader(),
        body: StreamMessageListView(),
        bottomNavigationBar: StreamMessageInput(),
      ),
    );
  }
}
```

When we run the app at this point, we see the channel list, and tapping an item navigates to the channel view with all of its messages. If the list is still empty, we can create a channel first, as sketched below.
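As a quick aside (not part of the tutorial's code), a channel can be created anywhere the client is available. The id, name, and members below are example values:

```dart
// Create (or get) a messaging channel and start watching it so it
// appears in the channel list.
final channel = client.channel(
  'messaging',
  id: 'ai-assistant-demo',
  extraData: {
    'name': 'AI Assistant Demo',
    'members': ['anakin_skywalker'],
  },
);
await channel.watch();
```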

5. Running the Backend

Before adding AI features to our Flutter app, let’s set up our Node.js backend. The backend exposes two endpoints for starting and stopping an AI agent for a particular channel. While the agent is running, it listens to all new messages and forwards them to the LLM. It delivers the results by sending a message and then updating its text as the response streams in.

We use the Anthropic API and the new Assistants API from OpenAI in this sample, and we also include an example of function calling. By default, Anthropic is selected, but we can pass openai as the platform parameter in the start-ai-agent request if we want to use OpenAI, as sketched below.
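For example, assuming the request contract just described, selecting OpenAI from Dart could look like this (my-channel-id is a placeholder):

```dart
import 'package:dio/dio.dart';

Future<void> main() async {
  // Start the AI agent with the OpenAI platform instead of the
  // default Anthropic one.
  await Dio().post<void>(
    'http://localhost:3000/start-ai-agent',
    data: {'channel_id': 'my-channel-id', 'platform': 'openai'},
  );
}
```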

The sample also supports sending different states of the typing indicator (for example, Thinking, Checking external sources, etc.).

To run the server locally, we need to clone it:

```bash
git clone https://github.com/GetStream/ai-assistant-nodejs.git your_local_location
```

Next, we need to set up our .env file with the following keys:

```bash
ANTHROPIC_API_KEY=insert_your_key
STREAM_API_KEY=insert_your_key
STREAM_API_SECRET=insert_your_secret
OPENAI_API_KEY=insert_your_key
OPENWEATHER_API_KEY=insert_your_key
```

The STREAM_API_KEY and STREAM_API_SECRET can be found in our app's dashboard. To get an ANTHROPIC_API_KEY, we can create an account at Anthropic. Alternatively, we can get an OPENAI_API_KEY from OpenAI.

The example also uses OpenAI’s function calling, which lets the agent invoke a function when it recognizes a specific query. In this sample, we can ask what the weather is like in a particular location. If you want to support this feature, you can get your API key from OpenWeather (or any other service, but we would need to update the request in that case). Triggering the feature is just a matter of sending a regular chat message, as sketched below.
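For instance, once the agent is started in a channel, any regular message is forwarded to the LLM. Assuming a channel object in scope, a message like this should trigger the weather function call on the server (the city is an arbitrary example):

```dart
// The agent watches the channel, so a plain chat message is all
// that is needed to invoke the weather function on the server.
await channel.sendMessage(
  Message(text: "What's the weather like in Amsterdam?"),
);
```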

Next, we need to install the dependencies using the npm install command.

After the setup is done, we can run the sample from the root with the following command:

bash
1
npm start

This will start listening to requests on localhost:3000.

6. Use a Service for Backend Interaction

Returning to the Flutter app, let’s write the code needed to interact with the server we set up in the previous step.

To make the HTTP requests, we will use the Dio package. First, we need to add the package to our pubspec.yaml file:

```yaml
dependencies:
  dio: ^5.7.0
```

Next, we create a new file called chat_ai_assistant_service.dart and add the following code:

```dart
import 'dart:convert';

import 'package:dio/dio.dart';

class ChatAIAssistantService {
  factory ChatAIAssistantService() => _instance;
  static final _instance = ChatAIAssistantService._();

  ChatAIAssistantService._() : _client = Dio() {
    _client
      ..options.baseUrl = 'http://localhost:3000'
      ..options.headers = {
        'Content-Type': 'application/json',
      }
      ..interceptors.addAll([LogInterceptor()]);
  }

  final Dio _client;

  Future<Response<T>> startAIAgent<T>(String channelId) async {
    final result = await _client.post<T>(
      '/start-ai-agent',
      data: jsonEncode({'channel_id': channelId}),
    );

    return result;
  }

  Future<Response<T>> stopAIAgent<T>(String channelId) async {
    final result = await _client.post<T>(
      '/stop-ai-agent',
      data: jsonEncode({'channel_id': channelId}),
    );

    return result;
  }
}
```

This service exposes methods that start and stop the AI agent for a given channel identifier; a quick usage sketch follows.
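Since the service is a singleton, it can be used from anywhere (my-channel-id is a placeholder):

```dart
final service = ChatAIAssistantService();

// Start the agent for a channel, and stop it again later.
await service.startAIAgent('my-channel-id');
await service.stopAIAgent('my-channel-id');
```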

7. Creating a Typing Indicator

With the backend interaction covered, we can switch our focus to building the UI. Let’s start by adding some code to handle the AI typing indicator. To do this, we create a new file called chat_ai_assistant_typing_indicator_handler.dart and add the following code:

```dart
import 'dart:async';

import 'package:flutter/cupertino.dart';
import 'package:stream_chat/stream_chat.dart';

class ChatAIAssistantTypingStateHandler
    extends ValueNotifier<AITypingStateValue> {
  ChatAIAssistantTypingStateHandler({
    required this.channel,
  }) : super(const AITypingStateValue()) {
    _startListeningWatchersStream();
    _startListeningAiTypingStateStream();
  }

  final Channel channel;

  static const _botUserId = 'ai-bot';

  StreamSubscription<List<User>>? _channelWatchersSubscription;
  void _startListeningWatchersStream() async {
    // Fetch the watchers list to get the initial state.
    //
    // Note: This is a workaround to get the initial state of the watchers.
    // This is needed because the watchersStream doesn't emit the initial state.
    final channelState = await channel.query(
      watchersPagination: const PaginationParams(limit: 5, offset: 0),
    );

    _updateBotPresenceFromWatchers(channelState.watchers);

    // Start listening to the channel's watchers stream.
    _channelWatchersSubscription = channel.state?.watchersStream.listen(
      _updateBotPresenceFromWatchers,
    );
  }

  StreamSubscription<Event>? _aiTypingStateSubscription;
  void _startListeningAiTypingStateStream() {
    _aiTypingStateSubscription = channel.on().listen(
      (event) {
        final state = switch (event.type) {
          EventType.aiIndicatorUpdate => (event.aiState, event.messageId),
          EventType.aiIndicatorClear => (AITypingState.idle, null),
          EventType.aiIndicatorStop => (AITypingState.idle, null),
          _ => null,
        };

        if (state == null) return;

        value = value.copyWith(
          aiTypingState: state.$1,
          aiMessageId: state.$2,
        );
      },
    );
  }

  void _updateBotPresenceFromWatchers(List<User>? watchers) {
    if (watchers == null) return;

    value = value.copyWith(
      isBotPresent: watchers
          .where((it) => it.id.startsWith(_botUserId))
          .any((it) => it.online),
    );
  }

  @override
  void dispose() {
    _aiTypingStateSubscription?.cancel();
    _channelWatchersSubscription?.cancel();
    super.dispose();
  }
}

class _NullConst {
  const _NullConst();
}

const _nullConst = _NullConst();

class AITypingStateValue {
  const AITypingStateValue({
    this.isBotPresent = false,
    this.aiMessageId,
    this.aiTypingState = AITypingState.idle,
  });

  final bool isBotPresent;
  final String? aiMessageId;
  final AITypingState aiTypingState;

  AITypingStateValue copyWith({
    bool? isBotPresent,
    Object? aiMessageId = _nullConst,
    AITypingState? aiTypingState,
  }) {
    return AITypingStateValue(
      isBotPresent: isBotPresent ?? this.isBotPresent,
      // This was done to support nullability of aiMessageId in copyWith.
      aiMessageId: switch (aiMessageId == _nullConst) {
        true => this.aiMessageId,
        false => aiMessageId as String?,
      },
      aiTypingState: aiTypingState ?? this.aiTypingState,
    );
  }

  @override
  bool operator ==(Object other) {
    if (identical(this, other)) return true;

    return other is AITypingStateValue &&
        other.isBotPresent == isBotPresent &&
        other.aiMessageId == aiMessageId &&
        other.aiTypingState == aiTypingState;
  }

  @override
  int get hashCode =>
      isBotPresent.hashCode ^ aiMessageId.hashCode ^ aiTypingState.hashCode;
}
```

This handler reacts to the events sent by the Node.js server and, based on them, exposes whether the AI bot is present in the channel, whether the typing indicator should be shown, and which message is currently being generated.
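As a quick sanity check (not needed in the final app), the handler can be observed on its own, assuming a connected channel:

```dart
final handler = ChatAIAssistantTypingStateHandler(channel: channel);

// Log every AI state change; in the app we will instead rebuild the
// UI with a ValueListenableBuilder (see the next step).
handler.addListener(() {
  final state = handler.value;
  debugPrint(
    'bot present: ${state.isBotPresent}, '
    'typing state: ${state.aiTypingState}, '
    'message id: ${state.aiMessageId}',
  );
});
```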

8. Add UI to Handle the AI

Now that we’ve laid the groundwork, let’s modify our widgets to include these AI capabilities. We will go step by step.

  1. First, we will add a ValueListenableBuilder to the ChatAIAssistantChannelPage to react to the changes in the AI assistant state and typing indicator.
```dart
class ChatAIAssistantChannelPage extends StatefulWidget {
  const ChatAIAssistantChannelPage({
    super.key,
    required this.channel,
  });

  final Channel channel;

  @override
  State<ChatAIAssistantChannelPage> createState() =>
      _ChatAIAssistantChannelPageState();
}

class _ChatAIAssistantChannelPageState
    extends State<ChatAIAssistantChannelPage> {
  late final ChatAIAssistantTypingStateHandler _typingStateHandler;

  @override
  void initState() {
    super.initState();
    _typingStateHandler = ChatAIAssistantTypingStateHandler(
      channel: widget.channel,
    );
  }

  @override
  void dispose() {
    _typingStateHandler.dispose();
    super.dispose();
  }

  @override
  Widget build(BuildContext context) {
    return StreamChannel(
      channel: widget.channel,
      child: ValueListenableBuilder(
        valueListenable: _typingStateHandler,
        builder: (context, value, _) => const Scaffold(
          appBar: StreamChannelHeader(),
          body: StreamMessageListView(),
          bottomNavigationBar: StreamMessageInput(),
        ),
      ),
    );
  }
}
```
  2. Next, we want to show a button in the top-right corner to start and stop the AI agent. To do this, we create a new widget called ToggleAIAssistantButton and layer it over the message list.
```dart
class ToggleAIAssistantButton extends StatelessWidget {
  const ToggleAIAssistantButton({
    super.key,
    required this.child,
    this.onPressed,
  });

  final Widget child;
  final VoidCallback? onPressed;

  @override
  Widget build(BuildContext context) {
    return ElevatedButton(
      onPressed: onPressed,
      child: child,
    );
  }
}
```
```dart
Future<void> _toggleAIAssistant(bool toggleState) async {
  final channelId = widget.channel.id;
  if (channelId == null) return;

  try {
    await switch (toggleState) {
      true => ChatAIAssistantService().startAIAgent(channelId),
      false => ChatAIAssistantService().stopAIAgent(channelId),
    };
  } catch (e) {
    debugPrint('Failed to toggle AI assistant: $e');
  }
}

@override
Widget build(BuildContext context) {
  return StreamChannel(
    channel: widget.channel,
    child: ValueListenableBuilder(
      valueListenable: _typingStateHandler,
      builder: (context, value, _) => Scaffold(
        appBar: const StreamChannelHeader(),
        body: Stack(
          children: [
            const StreamMessageListView(),
            // Add a button to toggle the AI assistant.
            Align(
              alignment: Alignment.topRight,
              child: ToggleAIAssistantButton(
                child: Text(value.isBotPresent ? 'Stop AI' : 'Start AI'),
                onPressed: () => _toggleAIAssistant(!value.isBotPresent),
              ),
            ),
          ],
        ),
        bottomNavigationBar: const StreamMessageInput(),
      ),
    ),
  );
}
```
  3. Next, we will modify our StreamMessageInput to react to changes in the AI assistant state and show a button that stops the AI response while one is being generated.
```dart
StreamMessageInput(
  // Add a button to stop the AI response if it's in progress.
  sendButtonBuilder: value.aiMessageId != null
      ? (_, controller) => IconButton(
            color: const Color(0XFF006BFE),
            onPressed: () => widget.channel.stopAIResponse(),
            icon: const Icon(Icons.stop_circle_rounded),
          )
      : null,
);
```
  4. Next, we will modify our StreamMessageListView to display a different UI component when a message is being generated by the AI agent. To do this, we customize the messageBuilder to render AI messages with the StreamingMessageView component, which animates the text as it streams in.
```dart
StreamMessageListView(
  messageBuilder: (_, details, ___, defaultWidget) {
    // Customize the message widget based on whether it's an
    // AI generated message or not.
    if (details.message.isAI) {
      return defaultWidget.copyWith(
        textBuilder: (context, message) {
          // Use the `StreamingMessageView` for AI messages
          // to animate the typing effect.
          return StreamingMessageView(
            text: message.text ?? '',
          );
        },
        bottomRowBuilderWithDefaultWidget: (
          context,
          message,
          defaultWidget,
        ) {
          // Hide the edited label for AI messages.
          return defaultWidget.copyWith(
            showEditedLabel: false,
          );
        },
      );
    }

    return defaultWidget;
  },
);
```
  5. Finally, we need to show a typing indicator while the AI agent generates a message. To do this, we wrap the StreamMessageListView in a Column, add an AITypingIndicatorStateView below the message list, and create a new TypewriterState variable to sync the typing state with the StreamingMessageView.
```dart
Column(
  crossAxisAlignment: CrossAxisAlignment.start,
  children: [
    Expanded(
      child: StreamMessageListView(
        messageBuilder: (_, details, ___, defaultWidget) {
          // ...previous code
          return StreamingMessageView(
            text: message.text ?? '',
            // Update the typewriter state to animate the typing effect.
            onTypewriterStateChanged: (state) {
              if (state == _typewriterState) return;
              WidgetsBinding.instance.addPostFrameCallback((_) {
                setState(() => _typewriterState = state);
              });
            },
          );
          // ...previous code
        },
      ),
    ),
    // Show the AI typing indicator when the AI assistant is
    // generating a response.
    AITypingIndicatorStateView(
      typewriterState: _typewriterState,
      aiTypingState: value.aiTypingState,
    ),
  ],
);
```

Finally, we are ready to run the app! If we open a channel and start the AI agent, we can ask it some questions.

Conclusion

In this tutorial, we have built an AI assistant bot that works seamlessly with StreamChat’s Flutter SDK:

  • We have shown how to use our AI components to render LLM responses containing markdown, code, tables, and more.
  • We have shown how to create a server that starts and stops AI agents that respond to user questions.
  • You have learned how to customize our Flutter SDK to integrate these new AI features.

If you want to learn more about our AI capabilities, head to our AI solutions page. Additionally, check our Flutter docs to learn how you can provide more customizations to your chat apps. Create your free Stream account to get started today.
