Why Audit Logs and Reporting Matter
Moderation isn’t just about detecting and acting on harmful content. It’s also about showing your work to your community, your leadership, and, in some industries, regulators. Audit logs and reporting ensure every action is recorded, every reviewer is accountable, and every policy can be refined with real data.
Stream’s Dashboard provides detailed logs for individual decisions and high-level reporting to identify patterns and trends.
Audit Logs
Audit logs track every moderation decision across your platform, creating a tamper-proof record for accountability. A sketch of what a single log entry might look like follows the field list below.
What’s Tracked
- Action → The enforcement applied (e.g., Flag, Block, Shadowblock, Bounce).
- User → The account responsible for the content, including their User ID.
- User Content → The exact message, post, or snippet of media flagged.
- Action By → Who took the action (AI Mod or a specific moderator).
- Date & Time → A precise timestamp of when the action occurred.
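To make the list concrete, a single log entry could be modeled as a record like the sketch below. The type and field names (`AuditLogEntry`, `actionBy`, and so on) are illustrative assumptions, not Stream's actual export schema, so verify them against your own Dashboard export.

```typescript
// Hypothetical shape of one audit log entry, mirroring the fields above.
// Names are illustrative assumptions, not Stream's actual export schema.
interface AuditLogEntry {
  action: "flag" | "block" | "shadowblock" | "bounce"; // enforcement applied
  userId: string;      // the account responsible for the content
  userContent: string; // the exact message, post, or media snippet flagged
  actionBy: string;    // "ai_mod" or a specific moderator's ID
  timestamp: string;   // ISO 8601 date and time of the action
}

const example: AuditLogEntry = {
  action: "block",
  userId: "user_4821",
  userContent: "Buy followers now at ...",
  actionBy: "ai_mod",
  timestamp: "2025-06-12T14:03:27Z",
};
```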
Additional filters at the top of the log view let admins narrow results by:
- Action Taken
- Moderation Category (e.g., self-harm, harassment)
- Entity ID (the specific piece of content)
- User ID (to track repeat offenders)
- Action By (AI Mod vs human mod)
Pro Tip: The Export Logs button lets you pull these records into your compliance system or BI dashboards, or archive them for audits.
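As one example of that workflow, the sketch below reads an exported CSV and replicates the dashboard's User ID filter to review every action taken against one account. The file name, column names, and naive comma-splitting are assumptions for illustration; check the format of your actual export before relying on them.

```typescript
import { readFileSync } from "node:fs";

// Naive CSV parsing for illustration; assumes no quoted commas in fields.
// Column names mirror the filters above and are assumptions about the export.
type LogRow = Record<string, string>;

function parseCsv(path: string): LogRow[] {
  const [header, ...lines] = readFileSync(path, "utf8").trim().split("\n");
  const columns = header.split(",");
  return lines.map((line) => {
    const values = line.split(",");
    return Object.fromEntries(columns.map((col, i) => [col, values[i] ?? ""]));
  });
}

// Replicate the dashboard's "User ID" filter: pull every logged action
// against a single account to spot repeat offenders.
const rows = parseCsv("moderation_audit_export.csv");
const repeatOffender = rows.filter((row) => row["user_id"] === "user_4821");
console.log(`${repeatOffender.length} logged actions for user_4821`);
```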
Why It Matters
- Accountability → Shows which moderator made each decision.
- Compliance → Provides an auditable trail for legal, regulatory, or internal policy requirements.
- Transparency → Builds community trust when you can explain enforcement consistently.
- Error Correction → Makes it easy to spot and fix mistakes in moderation.
Best Practice: Require moderators to leave notes on false positives, escalations, and overrides so audit logs aren’t just records, but rich learning tools.
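If you want those notes to stay consistent and machine-readable, a small tagged structure like the sketch below can help. The tag names and fields are assumptions to adapt to your own workflow, not a Stream data model.

```typescript
// Illustrative structure for moderator notes attached to a log entry.
// Tag names and fields are assumptions; adjust to your own review workflow.
type NoteTag = "false_positive" | "escalation" | "override";

interface ModeratorNote {
  entityId: string;   // the piece of content the note refers to
  tag: NoteTag;
  note: string;       // free-text explanation of the decision
  moderatorId: string;
  createdAt: string;  // ISO 8601 timestamp
}

const overrideNote: ModeratorNote = {
  entityId: "msg_9910",
  tag: "override",
  note: "AI flagged as harassment; context shows friendly banter between teammates.",
  moderatorId: "mod_kelly",
  createdAt: "2025-06-12T15:20:00Z",
};
```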
Reporting and Insights
Where audit logs focus on granularity, reporting provides the big picture. Stream’s Dashboard includes an Overview tab with metrics that help admins evaluate performance and optimize policies; a small aggregation sketch follows the metrics list below.
What You’ll See in Reports
- Total Usage → Counts of moderated messages, images, and videos.
- Top Detected Harms → Breakdown of the most frequent categories flagged by AI (e.g., PII, scam, hate speech).
- Moderator Action Distribution → A chart showing what actions moderators are taking most often (e.g., Mark Reviewed, Delete, Ban).
- Moderated Entities by Engine → Distribution of flagged content by detection engine (AI Text, AI Image, AI Video, Semantic Filters, Blocklist, etc.).
- AI Moderation by Action → Breakdown of how the system responded: how many Flags, Blocks, Bounces, Shadowblocks, etc.
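If you need these breakdowns outside the Dashboard, a few lines of aggregation over exported log entries can reproduce them, for example the action distribution overall and for AI Mod alone. The entry shape below is an illustrative assumption, not Stream's reporting implementation.

```typescript
// Count actions to approximate the "Moderator Action Distribution" and
// "AI Moderation by Action" breakdowns from exported log entries.
// The entry shape is an illustrative assumption, not Stream's actual schema.
interface LogEntry {
  action: string;    // e.g., "flag", "block", "bounce", "shadowblock"
  actionBy: string;  // "ai_mod" or a moderator ID
}

function actionDistribution(entries: LogEntry[]): Map<string, number> {
  const counts = new Map<string, number>();
  for (const { action } of entries) {
    counts.set(action, (counts.get(action) ?? 0) + 1);
  }
  return counts;
}

const entries: LogEntry[] = [
  { action: "flag", actionBy: "ai_mod" },
  { action: "block", actionBy: "ai_mod" },
  { action: "flag", actionBy: "mod_kelly" },
];

// Overall distribution, plus the AI-only slice.
console.log(actionDistribution(entries));
console.log(actionDistribution(entries.filter((e) => e.actionBy === "ai_mod")));
```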
How To Use Reporting
- Optimize Policies → If “spam” is 60% of your queue, strengthen regex or blocklist rules to reduce the load (a simple pattern sketch follows this list).
- Identify Training Needs → If one moderator frequently overrides AI flags, retrain that moderator or revisit the thresholds they keep overriding.
- Spot Emerging Risks → Surges in harassment or new slang terms may signal cultural shifts in your community.
- Measure Efficiency → Track how fast moderators clear flagged content and escalate critical harms.
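As a concrete example of the first bullet, a tightened spam pattern might look like the sketch below. The pattern is deliberately simple and purely illustrative; test any real rule against your own traffic before adding it to a blocklist.

```typescript
// A deliberately simple spam pattern for illustration only: catches
// "buy/free followers/likes/subs" pitches and bare shortened links.
// Tune and test against real traffic before adding it to a blocklist.
const spamPattern = /\b(?:buy|free)\s+(?:followers|likes|subs)\b|https?:\/\/bit\.ly\/\S+/i;

const samples = [
  "Buy followers now, limited offer!",
  "gg everyone, nice match",
  "claim your prize https://bit.ly/xyz123",
];

for (const message of samples) {
  console.log(`${spamPattern.test(message)} -> ${message}`);
}
```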
Compliance and Legal Use Cases
For many industries (finance, healthcare, education, gaming), auditability isn’t optional. Audit logs and reports help you:
- Demonstrate compliance with trust & safety standards.
- Provide records during disputes (e.g., why a user was banned).
- Respond to regulators or auditors with documented evidence of moderation activity.
- Support transparency reports for the public or your community.
Pro Tip: Pair Stream’s logs with your internal compliance processes. Export reports regularly and archive them in your company’s data retention system.
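A minimal archiving step might look like the sketch below: copy each downloaded export into a dated archive folder and prune anything older than your retention window. The paths, file naming, and 365-day window are assumptions to align with your own data retention policy.

```typescript
import { copyFileSync, mkdirSync, readdirSync, statSync, unlinkSync } from "node:fs";
import { join } from "node:path";

// Illustrative archive routine: the paths and the 365-day retention window
// are assumptions; align them with your actual data retention policy.
const ARCHIVE_DIR = "./moderation-archive";
const RETENTION_DAYS = 365;

export function archiveExport(exportPath: string): void {
  mkdirSync(ARCHIVE_DIR, { recursive: true });

  // Store each export under a dated, unambiguous name.
  const stamp = new Date().toISOString().slice(0, 10); // YYYY-MM-DD
  copyFileSync(exportPath, join(ARCHIVE_DIR, `audit-log-${stamp}.csv`));

  // Prune archives older than the retention window.
  const cutoff = Date.now() - RETENTION_DAYS * 24 * 60 * 60 * 1000;
  for (const file of readdirSync(ARCHIVE_DIR)) {
    const fullPath = join(ARCHIVE_DIR, file);
    if (statSync(fullPath).mtimeMs < cutoff) {
      unlinkSync(fullPath);
    }
  }
}
```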
Best Practices
- Centralize Reporting: Use Stream’s dashboard as the source of truth for moderation decisions.
- Regular Audits: Schedule monthly reviews of logs to check for patterns, errors, or policy drift.
- Use Tags + Notes: Enrich logs with moderator notes and escalation tags for more context.
- Close the Loop: Use insights from reports to adjust policies, thresholds, or moderator workflows.
- Automate Exports: Set up regular exports to external BI tools if your organization requires deeper analysis.
Audit logs and reporting make your moderation system transparent, accountable, and data-driven. Logs capture every decision for accountability and compliance, while reporting surfaces patterns that help you refine policies, train moderators, and protect your community at scale.
Next, we’ll explore real-world moderation workflows, following flagged content from detection through queues, actions, notes, and reporting, so you can see how all the pieces come together.