Webhooks

Webhooks provide a powerful way to implement custom logic for moderation.

Configure webhook URL

You can configure the webhook URL from the Stream dashboard.

  1. Go to the dashboard
  2. Select the app for which you want to receive webhook events
  3. Click on “Preferences” under the “Moderation” section in the left navigation
  4. Set the URL as shown in the following screenshot

[Screenshot: webhook URL field on the Moderation Preferences page]

Webhook events

The webhook URL will receive the following events (a minimal handler sketch follows the list):

  • review_queue_item.new
    • This event notifies you of new content available for review. The content could be a message, activity, reaction, or even a user profile.
    • This event is triggered when content is flagged for the first time. Consequently, you’ll also receive the associated flags as part of the payload.
  • review_queue_item.updated
    • This event notifies you when existing flagged content receives additional flags or when a moderator performs an action on flagged content.
    • The payload includes the new flags or details of the action performed.
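For illustration, here is a minimal receiver for these events, sketched in TypeScript with Node's built-in http module. The handler names (handleNewItem, handleUpdatedItem), the port, and the loose payload typing are placeholders rather than part of any Stream SDK, and request authentication is omitted.

import { createServer } from "node:http";

// Placeholder handlers: replace these with your own moderation logic.
function handleNewItem(event: any): void {
  console.log("New review queue item:", event.review_queue_item?.id, "flags:", event.flags?.length);
}

function handleUpdatedItem(event: any): void {
  console.log("Updated review queue item:", event.review_queue_item?.id);
}

const server = createServer((req, res) => {
  let body = "";
  req.setEncoding("utf8");
  req.on("data", (chunk) => { body += chunk; });
  req.on("end", () => {
    try {
      const event = JSON.parse(body);
      // Dispatch on the event type described above.
      if (event.type === "review_queue_item.new") {
        handleNewItem(event);
      } else if (event.type === "review_queue_item.updated") {
        handleUpdatedItem(event);
      }
      res.writeHead(200);
      res.end();
    } catch {
      res.writeHead(400);
      res.end();
    }
  });
});

// Expose this endpoint at the URL configured in the dashboard.
server.listen(8080);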

Webhook Events Payload Structure

The webhook event payload is structured as follows:

{
  "review_queue_item": {
    --> See ReviewQueueItem shape in the table below
  },
  "flags": [
    --> See ModerationEventFlag shape in the table below
  ],
  "action": {
    --> See ModerationEventActionLog shape in the table below
  },
  "type": "review_queue_item.new | review_queue_item.updated",
  "created_at": "timestamp",
  "received_at": "timestamp"
}
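For orientation, the envelope could be described with a TypeScript type along the following lines. The nested shapes are left loosely typed here because they are documented in the tables below; treating flags and action as optional is an assumption based on the event descriptions above.

interface ModerationWebhookEvent {
  review_queue_item: Record<string, unknown>;   // see the ReviewQueueItem Shape table
  flags?: Record<string, unknown>[];            // see the ModerationEventFlag Shape table
  action?: Record<string, unknown>;             // see the ModerationEventActionLog Shape table
  type: "review_queue_item.new" | "review_queue_item.updated";
  created_at: string;                           // timestamp
  received_at: string;                          // timestamp
}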

ReviewQueueItem Shape

This shape contains the following fields:

  • id (string): Unique identifier of the review queue item.
  • created_at (datetime): Timestamp when the review queue item was created.
  • updated_at (datetime): Timestamp when the review queue item was last updated.
  • entity_type (string): The type of entity under review. Default types include:
    • chat message: stream:chat:v1:message
    • feeds activity: stream:feeds:v2:activity
    • feeds reaction: stream:feeds:v2:reaction
    • user: stream:user
    For custom moderation, entity_type can be any unique string.
  • moderation_payload (object): The exact content that was sent for automated moderation, e.g., { texts: ["fuck you"], images: ["https://sampleimage.com/test.jpg"], videos: [] }
  • status (string): Possible values are:
    • partial: synchronous moderation results are ready, but asynchronous moderation is still pending.
    • completed: both synchronous and asynchronous moderation have run.
  • recommended_action (string): Action recommended by Stream's moderation engines for the entity/content. Possible values are:
    • flag: Stream moderation recommends flagging the content for manual review.
    • remove: Stream moderation recommends removing the content.
  • completed_at (datetime): Timestamp when all moderation engines finished assessing the content.
  • languages (array)
  • severity (int)
  • reviewed_at (datetime): Time at which the entity was reviewed.
  • reviewed_by (string): ID of the moderator who reviewed the item. This value is set when a moderator takes an action on the review queue item from the dashboard.
  • message (object, Chat product only): The chat message object under review.
  • entity_creator (object): The user who created the entity under review. For chat, this is the user who sent the message under review; for activity feeds, it is the actor of the activity or reaction.
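Based on the table above, the item could be sketched as the following TypeScript interface. Which fields are optional, the element type of languages, and the use of string for datetimes are assumptions; message and entity_creator are left loosely typed.

interface ReviewQueueItem {
  id: string;
  created_at: string;                       // datetime
  updated_at: string;                       // datetime
  entity_type: string;                      // e.g. "stream:chat:v1:message", or any custom string
  moderation_payload?: {
    texts?: string[];
    images?: string[];
    videos?: string[];
  };
  status: "partial" | "completed";
  recommended_action?: "flag" | "remove";
  completed_at?: string;                    // datetime
  languages?: string[];                     // element type assumed
  severity?: number;
  reviewed_at?: string;                     // datetime
  reviewed_by?: string;                     // set when a moderator acts from the dashboard
  message?: Record<string, unknown>;        // Chat product only
  entity_creator?: Record<string, unknown>; // user who created the entity under review
}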

ModerationEventFlag Shape

The flags field of the webhook payload is an array of flags associated with the moderation event. Each flag contains the following fields:

  • type: Type of the flag, i.e. the name of the moderation provider that created it. Possible values are:
    • ai_image
    • user_report: a user flagged another user.
    • automod_semantic_filters
    • automod_platform_circumvention
    • block_list
    • ai_text
    • automod: the user was flagged by Stream's internal rules. Currently there is one rule set up: if more than three pieces of content from the same user are flagged by the moderation engines, the user is automatically flagged as well, so the user ends up on the users list page in the dashboard.
    • automod_toxicity
  • reason: Description of the reason for the flag.
  • created_at: Timestamp when the flag was created.
  • updated_at: Timestamp when the flag was updated.
  • labels: Classification labels for the content under review, provided by the moderation engine (e.g., ai_text, ai_image) that created this flag.
  • result: Complete result object from the moderation engine.
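A corresponding TypeScript sketch for a single flag entry; the table does not state field types, so string timestamps, a string array for labels, and the optionality of fields are assumptions.

interface ModerationEventFlag {
  type: string;              // e.g. "ai_text", "user_report", "block_list", "automod", ...
  reason?: string;           // description of the reason for the flag
  created_at: string;        // timestamp
  updated_at: string;        // timestamp
  labels?: string[];         // classification labels from the engine that created the flag
  result?: unknown;          // complete result object from the moderation engine
}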

ModerationEventActionLog Shape

This shape logs actions taken during the moderation event:

  • id (string): Unique identifier (UUID) of the action log.
  • created_at (datetime): Timestamp when this action was performed.
  • type (string): Type of action performed on the review queue item. Possible values are:
    • delete_message
    • delete_activity
    • delete_reaction
    • delete_user
    • ban
    • custom
    • unban
    • restore
    • unblock
  • user_id (string): ID of the user (or moderator) who performed the action.
  • reason (string): Reason attached by the moderator for the action. This can be any string value.
  • custom (object): Additional data about the action.
    For a ban action, the custom object contains the following properties:
    • timeout (int): Duration in minutes.
    • shadow (bool): Shadow block. Only applicable to chat moderation.
    • channel_ban_only (bool): Whether the ban applies only to a channel. Only applicable to chat moderation.
    • channel_cid (string): Channel CID in which the user is banned. Only applicable to chat moderation.
    For a delete_user action, the custom object contains the following properties:
    • mark_messages_deleted (bool)
    • hard_delete (bool)
    • delete_conversations (bool)
    For a delete_reaction or delete_message action, the custom object contains the following property:
    • hard_delete (bool)
  • target_user_id (string): Same as the entity creator ID of the review queue item.
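Putting the action log together, a hedged TypeScript sketch might look as follows. The describeAction helper and its message format are purely illustrative, and field optionality is an assumption.

interface ModerationEventActionLog {
  id: string;                         // UUID of the action log
  created_at: string;                 // datetime
  type: string;                       // "ban", "delete_message", "delete_user", "custom", ...
  user_id: string;                    // user (or moderator) who performed the action
  reason?: string;
  custom?: {
    timeout?: number;                 // ban: duration in minutes
    shadow?: boolean;                 // ban: shadow block (chat only)
    channel_ban_only?: boolean;       // ban: channel-level ban only (chat only)
    channel_cid?: string;             // ban: channel CID (chat only)
    mark_messages_deleted?: boolean;  // delete_user
    hard_delete?: boolean;            // delete_user, delete_message, delete_reaction
    delete_conversations?: boolean;   // delete_user
    [key: string]: unknown;           // custom actions may attach arbitrary data
  };
  target_user_id: string;             // same as the entity creator of the review queue item
}

// Illustrative helper: summarize an action for logging or auditing.
function describeAction(action: ModerationEventActionLog): string {
  switch (action.type) {
    case "ban": {
      const timeout = action.custom?.timeout;
      const shadow = action.custom?.shadow;
      return `Moderator ${action.user_id} banned ${action.target_user_id}` +
        (timeout ? ` for ${timeout} minutes` : "") +
        (shadow ? " (shadow ban)" : "");
    }
    case "delete_user": {
      const hard = action.custom?.hard_delete;
      return `Moderator ${action.user_id} deleted user ${action.target_user_id}` +
        (hard ? " (hard delete)" : "");
    }
    default:
      return `Moderator ${action.user_id} performed ${action.type} on ${action.target_user_id}`;
  }
}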