When building a chat application, real-time events are crucial to providing an engaging user experience and adding custom functionality or behaviors specific to your use case.
With Stream Chat webhooks, you can provide endpoints on your server to receive events as they happen, allowing you to consume the requests using domain-specific logic, handle custom chat commands, implement custom notification systems, and much more. For example - in a Customer Support setting - you may want to change the "open" status of a ticket in your database when an agent gets added to a channel.
Depending on the size of your application, many hundreds - even thousands - of events can be happening at any given time, from user presence changes to new messages and updates to a channel or its members - all of which get forwarded to your webhook endpoint.
However, until now, if the recipient server failed to keep up with the events - or could not receive the webhook request for whatever reason - the event was lost and could not be handled or retried by your server.
Because of this, we recently made some updates to how our Webhook system works to help keep your apps running smoothly, even in unpredictable or undesirable conditions.
Webhook events will now retry for up to 15 seconds - or a maximum of five attempts, whichever comes first - in the case of a failed delivery.
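Because delivery now retries on failure, a practical way to stay within that window is to acknowledge each delivery immediately and run slow domain logic in the background. The sketch below is illustrative and not part of Stream's SDK: `handle_delivery`, `process`, and the in-process queue are hypothetical names, and a production server would more likely hand events to a task queue or message broker.

```python
import queue
import threading

# Events are acknowledged immediately and processed on a worker thread,
# so slow domain logic never causes a delivery to time out and retry.
events: "queue.Queue[dict]" = queue.Queue()

def process(event: dict) -> None:
    # Placeholder for your application's domain-specific handling.
    pass

def handle_delivery(event: dict) -> int:
    """Called by the HTTP layer for each incoming webhook POST."""
    events.put(event)   # enqueue for background processing
    return 200          # acknowledge right away so no retry is triggered

def worker() -> None:
    while True:
        event = events.get()
        process(event)  # may be slow; does not delay the HTTP response
        events.task_done()

threading.Thread(target=worker, daemon=True).start()
```

Returning a 2xx quickly keeps the retry mechanism as a safety net for genuine outages rather than a source of duplicate traffic.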
The incoming request now includes an X-Webhook-ID header that provides a unique identifier for the webhook, consistent across retries, so you can deduplicate requests if necessary. More information on all of the headers included with each Stream Chat webhook event can be found in our Webhooks Documentation.
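Since retried deliveries carry the same X-Webhook-ID value, your endpoint can skip events it has already seen. Here is a minimal in-memory sketch of that idea; `WebhookDeduplicator` and `handle_webhook` are illustrative names, and a real deployment would typically use a shared store such as Redis so deduplication works across server processes.

```python
import time

class WebhookDeduplicator:
    """Tracks recently seen webhook IDs so retried deliveries can be skipped."""

    def __init__(self, ttl_seconds: float = 60.0):
        self.ttl = ttl_seconds
        self._seen: dict = {}  # webhook ID -> time first seen

    def is_duplicate(self, webhook_id: str) -> bool:
        now = time.monotonic()
        # Drop entries older than the TTL so memory stays bounded.
        self._seen = {k: t for k, t in self._seen.items() if now - t < self.ttl}
        if webhook_id in self._seen:
            return True
        self._seen[webhook_id] = now
        return False

def handle_webhook(headers: dict, body: dict, dedup: WebhookDeduplicator) -> str:
    # The X-Webhook-ID header stays the same across retries of one event.
    webhook_id = headers.get("X-Webhook-ID", "")
    if webhook_id and dedup.is_duplicate(webhook_id):
        return "duplicate-ignored"
    # ... domain-specific processing of `body` goes here ...
    return "processed"
```

The TTL only needs to cover the retry window, so even a short expiry keeps handlers idempotent without unbounded memory growth.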
Webhook connections are now reused when possible, including support for keep-alive, to make sure requests are speedy and efficient.
All of the above updates are now available to all customers, so you can benefit from more reliable Webhooks starting today!
For more information on these updates and all things webhooks with Stream Chat, check out our Webhooks Documentation - and as always, let us know your feedback through the comment form on the docs page.