Visualizing Stream Chat Data with Kibana: A Step-by-Step Guide

5 min read
Dumebi Okolo
Published April 11, 2025

In our previous article, we discussed setting up your Stream Chat application with advanced search features using Elasticsearch. In this article, we will combine this powerful tool with visualization software to enhance our chat application’s features and capabilities.

(Note: This guide assumes some familiarity with Node.js, Docker, and basic command-line usage.)

Combining Stream Chat, Elasticsearch, and Kibana gives us the following:

  • Instant Chat: Stream Chat handles user presence, real-time updates, push notifications, etc.
  • Powerful Visualization: Kibana turns raw chat messages into line charts, bar graphs, or real-time dashboards.
  • Search & Analytics: Elasticsearch indexes your messages for full-text search, advanced aggregations, and near-real-time queries.

If you followed along in the first article, hurray! You have built your (first) chat application or system. But the success of a product or application depends heavily on how much it is actually used, and a good way to find out is to collect and analyze usage data. This is where we bring Kibana's power to Stream Chat!

By combining these tools, you can:

  • Monitor message volume over time.
  • Identify top users or channels quickly.
  • Catch anomalies or suspicious spikes in chat traffic.

This tutorial walks you from Node.js code that captures Stream Chat webhooks to Kibana dashboards that bring your chat data to life.


Below is a flow diagram showing the components involved with collecting data:

Kibana then uses the data collected by Elasticsearch:

Explanation of the flow diagram:

  1. User sends a message → Stream triggers message.new.
  2. Node.js webhook receives the event and indexes the data into Elasticsearch.
  3. Kibana queries Elasticsearch to visualize chat analytics.

We will briefly go over setting up our Stream Chat with Elasticsearch as we did previously.

Prerequisites & Project Setup

  1. Node.js (v14+).
  2. Stream Chat account
    • Sign up for free or log in, create an app, and note your API Key & API Secret.
  3. Docker (to run Elasticsearch + Kibana).
  4. ngrok (optional) if you want to test webhooks locally.

Create Project Folder

```shell
mkdir stream-chat-kibana-app
cd stream-chat-kibana-app
npm init -y
npm install express body-parser @elastic/elasticsearch stream-chat dotenv
```

Create a .env file:

```shell
STREAM_API_KEY=<YOUR_STREAM_API_KEY>
STREAM_API_SECRET=<YOUR_STREAM_API_SECRET>
STREAM_APP_ID=<YOUR_STREAM_APP_ID>
ELASTICSEARCH_NODE=http://localhost:9200
PORT=3000
```

Capturing Stream Chat Events via Node.js

We’ll build an Express server that handles message events from Stream and indexes their metadata. Save it as server.js:

Privacy Consideration

Note: Recall that in our previous article, we handled message.new in our express server. However, logging full chat messages in Elasticsearch may raise privacy and security concerns, especially if your application has strict data policies. Since we only need analytics (e.g., message count per user, message size), we will log metadata instead of full message content.
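To make the metadata-only approach concrete, here is a small sketch of the document we will index per message. The `toMessageMetadata` helper is our own illustration (not part of the Stream SDK), and the sample payload mirrors the shape of a `message.new` webhook event:

```javascript
// Sketch: extract privacy-safe metadata from a Stream `message.new` payload.
// `toMessageMetadata` is a hypothetical helper, not part of the Stream SDK.
function toMessageMetadata(message) {
  return {
    user_id: message.user.id,
    channel_id: message.channel_id,
    created_at: message.created_at,
    // Store only the byte size of the text, never the text itself.
    message_size: Buffer.byteLength(message.text ?? '', 'utf-8'),
  };
}

// Sample payload shaped like a `message.new` event body
const sample = {
  id: 'msg-1',
  user: { id: 'alice' },
  channel_id: 'general',
  created_at: '2025-04-11T10:00:00Z',
  text: 'héllo', // "é" is 2 bytes in UTF-8
};

console.log(toMessageMetadata(sample).message_size); // → 6
```

Note that the returned object never contains the `text` field, so nothing sensitive reaches Elasticsearch.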

```javascript
import 'dotenv/config';
import express from 'express';
import bodyParser from 'body-parser';
import { Client as ElasticsearchClient } from '@elastic/elasticsearch';
import { StreamChat } from 'stream-chat';

const app = express();
app.use(bodyParser.json());

const esClient = new ElasticsearchClient({
  node: process.env.ELASTICSEARCH_NODE,
});

try {
  const info = await esClient.info();
  console.log('Elasticsearch connected:', info);
} catch (err) {
  console.error('Elasticsearch connection error:', err);
}

const serverClient = StreamChat.getInstance(
  process.env.STREAM_API_KEY,
  process.env.STREAM_API_SECRET
);
console.log('Stream server client initialized.');

app.post('/stream-webhook', async (req, res) => {
  try {
    const { type, message } = req.body;
    console.log('Webhook event type:', type);

    if (type === 'message.new') {
      await esClient.index({
        index: 'stream-chat',
        id: message.id,
        body: {
          user_id: message.user.id,
          channel_id: message.channel_id,
          created_at: message.created_at,
          // Log only the message size, not its content
          message_size: Buffer.byteLength(message.text, 'utf-8'),
        },
      });
      console.log(`Indexed metadata for message ${message.id} to Elasticsearch`);
    }

    res.status(200).send('Webhook processed');
  } catch (error) {
    console.error('Error in webhook route:', error);
    res.status(500).send('Server error');
  }
});

app.get('/', (req, res) => {
  res.send('Server is up and running');
});

const port = process.env.PORT || 3000;
app.listen(port, () => {
  console.log(`Server listening on port ${port}`);
});
```

Configure Webhook in Stream:

  • If local, run ngrok http 3000 → set https://<ngrok-id>.ngrok.io/stream-webhook in your Stream Dashboard under Events & Webhooks.
  • Enable message.new.

Are you confused about how to set ngrok up? Refer to our previous article.


Dockerizing Elasticsearch & Kibana

Create a Docker Network

```shell
docker network create es-network
```

Launch Elasticsearch

```shell
docker run -d --name es-dev --network es-network \
  -p 9200:9200 -p 9300:9300 \
  -e "discovery.type=single-node" \
  -e "xpack.security.enabled=false" \
  -e "ES_JAVA_OPTS=-Xms512m -Xmx512m" \
  docker.elastic.co/elasticsearch/elasticsearch:8.10.2
```

Launch Kibana

```shell
docker run -d --name kibana-dev --network es-network \
  -p 5601:5601 \
  -e ELASTICSEARCH_HOSTS=http://es-dev:9200 \
  docker.elastic.co/kibana/kibana:8.10.2
```

If your Docker setup went correctly, you should see both the es-dev and kibana-dev containers running on your Docker containers page.

After you have confirmed that your setup is correct, open http://localhost:5601 (the port we mapped for Kibana) to see your project come to life!


Creating a Data View in Kibana

When Kibana loads, you’ll see a “Welcome to Elastic” screen. Click the “Explore on my own” button to get started. Once in the Kibana environment:

  1. Go to Stack Management → Data Views (Index Patterns in older versions).
  2. Click Create data view → type stream-chat (our index name).
  3. If you have a time field (e.g., created_at), select it so Kibana can do time-based filtering.
  4. Save.

Now Kibana knows to look at the stream-chat index for data.


Building Visualizations & Dashboards

Discover Your Data

Follow this process to view your chat data:

  1. Kibana → Discover.
  2. Select stream-chat from the data view dropdown.
  3. Adjust the time range (top-right) if your data has a date field.

You should see your message data. If not, confirm Node.js is indexing data (check logs or do a GET /_cat/indices in Kibana Dev Tools).
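As an extra sanity check, you can pull the most recent document straight from the index in Kibana Dev Tools. The index and field names below match the ones used in server.js; the sort assumes created_at was auto-mapped as a date field:

```
GET stream-chat/_search
{
  "size": 1,
  "sort": [{ "created_at": "desc" }]
}
```

If this returns a hit with user_id, channel_id, created_at, and message_size, your webhook pipeline is working end to end.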

Visualize Message Volume Over Time

If you want to know the message volume over a given period, or how many messages are being sent through your application overall, Kibana makes this easy:

  1. Analytics → Visualize Library.
  2. Create new visualization → Choose Line chart.
  3. Pick stream-chat data view.
  4. X-axis: Date histogram on created_at.
  5. Y-axis: Count of records.
  6. (Optional) Split series by user_id to see multiple lines.
  7. Save your chart.
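Under the hood, a chart like this boils down to an Elasticsearch date_histogram aggregation. If you are curious, you can run a rough equivalent yourself in Dev Tools; the hourly interval is illustrative, and user_id.keyword assumes the default dynamic mapping created a keyword sub-field:

```
GET stream-chat/_search
{
  "size": 0,
  "aggs": {
    "messages_over_time": {
      "date_histogram": {
        "field": "created_at",
        "calendar_interval": "hour"
      },
      "aggs": {
        "per_user": {
          "terms": { "field": "user_id.keyword" }
        }
      }
    }
  }
}
```

The buckets in the response correspond to the points on the line chart, and the per_user sub-aggregation is what the "split series by user_id" option draws as separate lines.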

Kibana gives visualization suggestions to match your data visualization needs.

Dashboard

  1. Kibana → Dashboard → Create new.
  2. Add the saved line chart.
  3. Optional: Add more charts (top channels, user stats).
  4. Save your dashboard.

Now, you have a custom Kibana dashboard showing real-time chat metrics.


Advanced Kibana Use Cases

  1. Filters & Drilldowns: Filter by a specific user or channel directly in the chart.
  2. Geo-based Data: If your messages store location/IP info, plot them on Kibana Maps.
  3. Machine Learning: Kibana’s ML features can detect anomalies in chat volume or user activity.
  4. Moderation: Index flagged messages or user bans, then track them with Kibana’s visuals.

Quick Troubleshooting Reference

  1. Webhook Not Firing

    • Use ngrok if you’re testing locally.
    • Ensure message.new events are enabled in Stream.
  2. Elasticsearch Index Missing

    • Check Docker logs for ES startup errors.
    • Run GET _cat/indices in Kibana Dev Tools to confirm the stream-chat index.
  3. Kibana “Server Not Ready”

    • Wait a minute; Kibana can take time to initialize.
    • Verify enough Docker memory is allocated (2–4GB recommended).
  4. No Documents in Discover

    • Make sure your data view (stream-chat) matches the actual index name.
    • Check time filter (top-right). If created_at is out of range, you’ll see zero docs.

By pairing Stream for real-time messaging with Elasticsearch and Kibana for analytics, you have an efficient feedback loop: instantly updated chat plus dynamic insights into how users interact. With just a few Docker containers and a lightweight Node.js webhook, you can:

  • Monitor user engagement and channel popularity in real-time.
  • Customize visuals and dashboards that highlight critical metrics.
  • Scale to thousands (or millions) of messages while retaining lightning-fast search and analytics.

Whether you run a community platform or build in-app chat for a global audience, this integration helps you monitor user behavior and deliver a data-driven chat experience.

Try out Stream Chat for your project. Explore Kibana’s docs for advanced features like alerting and ML.
