Stream Chat Benchmarks 5M Concurrent Connections in a Single Channel

Stream is proud to announce a new industry benchmark for scaling real-time chat-as-a-service, with 5 million concurrent connections recently supported in a single chat channel. This engineering achievement underscores Stream Chat’s position as the preferred in-app chat provider for enterprise organizations hosting the world’s largest online events. Stream’s infrastructure currently powers messaging and activity feed experiences for more than one billion end users, with an industry-leading 99.999% uptime SLA available to enterprise customers.

Live events with such high concurrency create unique challenges for scaling cloud infrastructure, with thousands of users connecting to the chat API every second and millions of message deliveries fanning out across the channel. In terms of uptime and performance, the stakes couldn’t be higher: even a minute or two of downtime or degraded functionality can completely derail a one-hour live event. Compare that to a brief service interruption in the middle of the night on an app like Slack, where asynchronous conversations are the norm, and the need for especially reliable chat infrastructure in this use case becomes clear.

Stream Chat’s New Benchmark at a Glance

Stream’s test was designed to simulate a realistic chat scenario, with roughly 4,000 users joining the channel per second and three messages sent to the channel per second. Because every message fans out to all 5 million connected clients, that works out to 15 million message deliveries per second at peak load. Average message_send latency remained well under 40 milliseconds throughout the test, and platform stability was maintained end to end. Thanks to architectural upgrades to the Stream Chat infrastructure, Stream is confident that the 5M concurrent figure and its associated metrics still fall far below the upper limit of our systems’ capacity. Previous benchmark tests included 1.5M and 3M concurrent users sending messages in a single channel.
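The fan-out arithmetic behind the peak-load figure is straightforward: every message posted to the channel must be delivered to every connected client, so deliveries scale with the product of the message rate and the connection count. A quick back-of-the-envelope check, using the figures from the test above:

```python
# Back-of-the-envelope fan-out math for the benchmark described above.
concurrent_connections = 5_000_000  # clients connected to the single channel
messages_sent_per_sec = 3           # messages posted to the channel per second

# Each posted message is delivered to every connected client.
deliveries_per_sec = concurrent_connections * messages_sent_per_sec

print(f"{deliveries_per_sec:,} message deliveries per second")  # 15,000,000
```

This is why single-channel concurrency is the hard part of the problem: the delivery load grows multiplicatively with both the audience size and the message rate.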

This new benchmark puts Stream Chat in a league of its own compared to similar in-app chat providers whose infrastructure cannot scale to support as many concurrent connections. Some providers advertise support for as many as 1 million concurrent connections, but in practice, their customers report compromised functionality and technical failures above a threshold of around 200,000 to 300,000 clients connected to the same channel. Rather than engineering a truly scalable solution, some chat providers also simulate a high-concurrency experience by breaking large groups of end users into many smaller, more manageable channels.
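To make the sharding workaround concrete, here is a minimal sketch of how such a scheme typically assigns users to sub-channels. This is an illustration of the pattern described above, not Stream’s architecture; the channel names, the 250,000-client cap, and the hashing scheme are all hypothetical:

```python
# Illustrative sketch (NOT Stream's approach) of the channel-sharding
# workaround: users are hashed into many smaller sub-channels so that
# no single channel exceeds a "safe" connection count.
# The cap and channel naming below are hypothetical.
import hashlib

MAX_CLIENTS_PER_SHARD = 250_000  # hypothetical per-channel safety limit


def shard_for(user_id: str, total_users: int) -> str:
    """Deterministically assign a user to one of N sub-channels."""
    # Ceiling division: enough shards to keep each under the cap.
    num_shards = -(-total_users // MAX_CLIENTS_PER_SHARD)
    digest = hashlib.sha256(user_id.encode()).hexdigest()
    shard_index = int(digest, 16) % num_shards
    return f"event-chat-shard-{shard_index}"


# With 5 million users, this yields 20 sub-channels instead of one.
print(shard_for("user-42", total_users=5_000_000))
```

The cost of this approach is visible to end users: each client only sees the traffic of its own shard, so the audience never shares one genuine room-wide conversation, which is the gap the single-channel benchmark above is meant to close.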

Many of today’s most prominent companies already trust Stream’s white-label chat solution to power seamless user communication during live events. If your platform requires a no-compromise in-app messaging experience without the hassle of developing and maintaining back-end chat infrastructure at scale, the Stream Chat API and client-side SDKs can help conserve engineering resources and greatly accelerate time to market. Activate your free Stream Chat trial today to unlock full-featured access to the highest-performing in-app chat infrastructure available.