Moderation Certification Course

Navigating the dashboard interface

The Stream Dashboard is your central hub for managing AI Moderation. This lesson walks you through selecting the correct app environment, navigating the dashboard, and using key moderation tools like queues, policies, logs, and filters. By understanding how to move through the interface, you’ll be able to configure settings, review flagged content, and ensure your team is working in the right environment without costly mistakes.

The Stream Dashboard is your command center for configuring and monitoring AI Moderation. Before diving into the details, it’s essential to understand how to navigate the interface and ensure you’re working within the correct app environment, especially if your organization manages multiple apps.

Selecting the Right App

When you log in to the Stream Dashboard, the first thing to check is which app environment you're in. Stream allows organizations to manage multiple apps, such as dev, staging, or production, within a single account.

To select or switch apps:

  • Look at the top-left corner of the dashboard for the App Selector dropdown.
  • Click the dropdown to view all apps associated with your account.
  • Select the correct app before making any changes; each app has its own moderation settings, policies, and queues.

Pro Tip: Use consistent naming conventions (e.g., “AppName-Prod”, “AppName-Staging”) to avoid misconfigurations across environments.
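That naming convention can also be enforced in code. Here is a minimal sketch (the `STREAM_APP_ENV` variable and credential map are illustrative assumptions, not part of Stream's API) that resolves the right app credentials per environment and fails loudly rather than silently running against the wrong app:

```python
import os

# Illustrative credential map keyed by the app names suggested above.
# In practice these values would come from a secrets manager, not source code.
APP_CREDENTIALS = {
    "AppName-Prod":    {"api_key": "prod-key-placeholder"},
    "AppName-Staging": {"api_key": "staging-key-placeholder"},
}

def resolve_app(env_var: str = "STREAM_APP_ENV") -> dict:
    """Pick the credentials for the current environment.

    Raises if the environment is unset or unknown, so a deploy can
    never quietly point at the wrong app.
    """
    app_name = os.environ.get(env_var)
    if app_name not in APP_CREDENTIALS:
        raise RuntimeError(
            f"Unknown or unset {env_var}={app_name!r}; "
            f"expected one of {sorted(APP_CREDENTIALS)}"
        )
    return APP_CREDENTIALS[app_name]
```

Keeping the lookup strict means a misconfigured deployment crashes at startup instead of modifying production moderation settings by accident.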

Dashboard Navigation Overview

The left-hand sidebar of the dashboard is your main navigation hub. Here’s a quick overview of where to find key moderation tools:

  • Moderation → Overview
    Visualize moderation trends, volume, and reviewer activity over time to help optimize policies.

  • Moderation → API Keys
    View and manage your app's API keys and secrets.

  • Moderation → Filters
    View and manage all your filters, including semantic filters, blocklists, allowlists, and regex patterns.

  • Moderation → Policies
    Create, edit, and manage moderation policies tied to AI, blocklists, and regex patterns.

  • Moderation → Rules
    Create, edit, and manage moderation rules for content and users.

  • Moderation → Users
    View and manage flagged users in real-time. Filter by user ID, content type, date, etc.

  • Moderation → Content
    View and manage flagged content in real-time. Filter by user ID, reporter type, date, etc.

  • Moderation → Chat Explorer
    Browse and search across channels to review flagged content in context and identify patterns.

  • Moderation → General
    Configure global moderation settings, including default actions, queue behavior, and review workflows tailored to your platform’s needs.

  • Moderation → Webhooks
    Get real-time notifications for moderation events, so your team can trigger automated workflows the moment content is flagged, reviewed, approved, or removed.

  • Moderation → Audit Logs
    Access a historical log of all moderation decisions and system activity for compliance and transparency.
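When consuming those webhook notifications, a common pattern is to verify that each request really came from your app before triggering automated workflows. The sketch below assumes the payload is signed with an HMAC-SHA256 hex digest of the raw request body keyed by your API secret (check your app's webhook documentation for the exact header and signing scheme):

```python
import hashlib
import hmac

def is_valid_signature(api_secret: str, raw_body: bytes, signature: str) -> bool:
    """Return True if `signature` matches the HMAC-SHA256 hex digest of the
    raw webhook body, keyed with the app's API secret.

    Uses compare_digest for a constant-time comparison.
    """
    expected = hmac.new(
        api_secret.encode("utf-8"), raw_body, hashlib.sha256
    ).hexdigest()
    return hmac.compare_digest(expected, signature)
```

Verify against the raw bytes of the request body, before any JSON parsing or re-serialization, since even a reordered key would change the digest.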

Filter Tools

Use the filters within the various queues to quickly find users, flagged messages, or specific moderation events.

Within the Moderation Queue, apply filters by:

  • Date
  • Content Type
  • User ID
  • Reporter ID
  • Reporter Type
  • Reason
  • Moderator Action

This helps you and your team focus on high-priority content first.
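The filter dimensions above amount to matching queue items on one or more fields. A small sketch of that logic (the dictionary field names are illustrative; the dashboard applies equivalent filtering server-side):

```python
def filter_queue(items, **criteria):
    """Return queue items matching every given filter dimension, e.g.
    date, content_type, user_id, reporter_id, reporter_type, reason,
    or moderator_action. Field names here are illustrative.
    """
    return [
        item for item in items
        if all(item.get(field) == value for field, value in criteria.items())
    ]

# Example: surface only items flagged for a specific reason.
queue = [
    {"user_id": "u1", "content_type": "message", "reason": "spam"},
    {"user_id": "u2", "content_type": "image",   "reason": "nsfw"},
]
high_priority = filter_queue(queue, reason="nsfw")
```

Combining several criteria (say, a user ID plus a date range) narrows the queue the same way stacking filters does in the dashboard.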

Selecting the correct app and understanding the dashboard layout are critical to avoiding costly mistakes. Once you're in the right environment, the left-hand navigation and filtering tools give you quick access to every layer of moderation, from policy design to real-time review.

What’s Next

Now that you know how to move through the dashboard and select the right environment, let’s dig into the most important Admin capabilities, including policies, harms, severity levels, and more.