The Digital Services Act (DSA) is reshaping the landscape of content moderation for digital platforms operating within the European Union (EU). With compliance mandatory for all in-scope services since 17 February 2024, understanding what the regulation means for platforms of all sizes is essential. This article breaks down the DSA's requirements, who must comply, the potential penalties, and how moderation practices can be adapted to meet these obligations.
Navigating Related Regulations
The DSA is part of a broader framework of content moderation laws globally, including:
- United States: Section 230 (Communications Decency Act), California Consumer Privacy Act (CCPA), Children's Online Privacy Protection Act (COPPA)
- United Kingdom: Online Safety Act 2023 (passed as the Online Safety Bill)
- Germany: Network Enforcement Act (NetzDG)
- India: IT Rules 2021
Penalties for Non-Compliance
Fines for failing to meet DSA obligations can be significant:
- Platforms with fewer than 45 million EU users: Subject to penalties defined by individual member states.
- VLOPs and VLOSEs: Can face fines of up to 6% of their global annual turnover for non-compliance.
As of now, no fines have been issued. However, there are several ongoing investigations for potential violations of the DSA, with the most notable examples being:
- TikTok: In February 2024, the European Commission initiated formal proceedings against TikTok to assess its compliance with DSA obligations, particularly concerning the protection of minors, advertising transparency, data access for researchers, as well as the risk management of addictive design and harmful content [European Commission Press Release]
- Meta (Facebook and Instagram): In April and May 2024, formal proceedings were opened against Meta to examine its adherence to DSA requirements related to minor protection and content moderation [European Commission Press Release]
- X (formerly Twitter): In December 2023, the European Commission began investigating X for potential DSA breaches, focusing on risk management, content moderation, dark patterns, advertising transparency, and data access for researchers [European Commission Press Release]
Why Content Moderation Matters
Content moderation is fundamental to the health and success of digital platforms for several reasons:
- User Engagement: Platforms that fail to prevent the spread of harmful or inappropriate content risk losing user trust and engagement.
- App Store Compliance: Both the App Store and Google Play Store mandate user-blocking and content moderation capabilities.
- Legal Compliance: The DSA imposes stringent obligations to ensure a safer digital environment for EU users.
It is a critical safeguard, ensuring that online spaces remain safe, trustworthy, and welcoming for users. By proactively managing harmful or inappropriate content, platforms can build stronger relationships with their users, protect their reputations, and comply with evolving industry and legal standards. Effective moderation is not just a protective measure; it's a cornerstone of sustainable growth and long-term platform success.
Who Must Comply with the DSA?
The DSA applies to online platforms and intermediaries offering services to users in the EU's 27 member states, regardless of where the provider is established. Specific compliance requirements differ based on the type and size of the service:
- General Compliance: Applies to most digital platforms, including those with fewer than 45 million EU users. It's important to note that the DSA differentiates between intermediary services (those that offer network infrastructure), hosting services (cloud and web hosting services), and online platforms (online marketplaces and app stores), with obligations varying based on the category.
- Very Large Online Platforms (VLOPs) and Search Engines (VLOSEs): Platforms and search engines with more than 45 million average monthly active users in the EU face stricter regulations and higher reporting standards.
- Small Enterprises: Companies with fewer than 50 employees and under €10 million in annual revenue are partially exempt.
What DSA Requirements Apply to Online Platforms?
The DSA establishes a comprehensive set of expectations for platforms regarding transparency, user protection, and content handling. These requirements are designed to create a safer and more accountable digital ecosystem, emphasizing clear communication with users and consistent enforcement of platform policies. Online platforms, regardless of their size, must demonstrate a commitment to fairness and accessibility in their content moderation practices to comply with these new regulations.
Here's what digital platforms need to consider:
1. Transparent Moderation Policies
Platforms are required to include detailed explanations in their terms and conditions that clearly outline their content moderation practices. This includes specifying the types of content that are restricted and providing the rationale behind these restrictions. Platforms must describe the procedures and measures used to enforce these policies, such as content removal processes or user bans. Transparent communication not only helps users understand the rules but also builds trust in the platform's governance.
In practice, platforms' terms and conditions must cover the following (a structured-policy sketch follows this list):
- Content restrictions and the rationale for imposing them
- Procedures and measures for policy enforcement, such as content removal or user bans
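One practical way to keep these disclosures consistent with actual enforcement is to maintain the policy as structured data that both the terms-and-conditions page and the moderation pipeline read from. The sketch below is a minimal, hypothetical Python example; the category names, rationales, and enforcement actions are illustrative and not prescribed by the DSA.

```python
from dataclasses import dataclass

@dataclass
class PolicyRule:
    """One restricted-content category, with the rationale and enforcement
    measures a platform discloses in its terms and conditions."""
    category: str           # e.g. "hate_speech" (illustrative)
    rationale: str          # why the restriction exists
    enforcement: list[str]  # measures applied when the rule is violated

# Hypothetical policy; real categories and measures are platform-specific.
MODERATION_POLICY = [
    PolicyRule(
        category="hate_speech",
        rationale="Protects users from content attacking protected groups.",
        enforcement=["content_removal", "account_warning", "account_ban"],
    ),
    PolicyRule(
        category="spam",
        rationale="Keeps feeds usable and prevents scams.",
        enforcement=["content_removal", "rate_limiting"],
    ),
]

def render_policy_section() -> str:
    """Render the policy as human-readable text for the terms and conditions."""
    return "\n".join(
        f"- {rule.category}: {rule.rationale} "
        f"Enforcement: {', '.join(rule.enforcement)}."
        for rule in MODERATION_POLICY
    )
```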
2. Reporting Obligations
A cornerstone of the DSA is its emphasis on transparency through comprehensive reporting. Platforms are required to publish annual reports that provide detailed insights into their content moderation practices, enabling regulators, users, and other stakeholders to evaluate their efforts. These reports must not only highlight how platforms manage harmful content but also offer a deeper look into the mechanisms and resources used to ensure compliance. By mandating detailed disclosures, the DSA aims to foster accountability and build trust, ensuring that moderation efforts are both effective and fair.
The reporting obligations include several key metrics and areas of focus (a minimal report structure is sketched after this list):
- Orders from authorities: Number of content removal orders, categorized by type of content and country.
- Notice and action outcomes: Number of notices submitted by users and trusted flaggers, the actions taken in response, and whether automated tools were involved.
- Company-initiated moderation: Number of content pieces flagged, type of actions taken, and the detection method.
- Automated means: Accuracy, error rates, and safeguards related to AI-based moderation.
- Human resources: Staffing and training information for moderation teams.
- Complaints: Data on complaints received, actions taken, and average handling times.
- Out-of-court disputes: Number of disputes submitted to out-of-court dispute settlement bodies and their outcomes.
- Suspensions: Number of suspensions imposed.
- Active users: Average monthly active users (MAU) in the EU.
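To make the list above concrete, here is a rough sketch of how these metrics might be gathered into a single report object. It is a minimal Python illustration; the field names and structure are assumptions, since the DSA specifies what must be reported, not a schema.

```python
from dataclasses import dataclass, field

@dataclass
class TransparencyReport:
    """Hypothetical container for the DSA reporting metrics listed above."""
    period: str                                                        # e.g. "2024"
    authority_orders: dict[str, int] = field(default_factory=dict)     # country -> removal orders
    notices_received: int = 0                                          # user and trusted-flagger notices
    notices_actioned: int = 0
    notices_handled_automatically: int = 0
    own_initiative_removals: dict[str, int] = field(default_factory=dict)  # detection method -> count
    automated_accuracy: float = 0.0            # share of automated decisions upheld on human review
    moderation_headcount: int = 0
    complaints_received: int = 0
    complaints_reversed: int = 0
    avg_complaint_handling_days: float = 0.0
    out_of_court_disputes: int = 0
    suspensions_imposed: int = 0
    avg_monthly_active_users_eu: int = 0

def summarize(report: TransparencyReport) -> str:
    """Produce a one-line summary suitable for an internal dashboard."""
    return (f"{report.period}: {report.notices_received} notices, "
            f"{report.complaints_received} complaints, "
            f"{report.avg_monthly_active_users_eu:,} avg MAU in the EU")
```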
Example DSA Transparency Report
Transparency reporting is a cornerstone of the DSA's approach to fostering accountability and trust between platforms, users, and regulators. These reports provide a comprehensive view of a platform's efforts to create a safer digital space by detailing how content moderation decisions are made and enforced.
An illustrative example of a DSA-compliant transparency report showcases the type of information platforms are expected to disclose, including data on content removal orders from authorities, notice-and-action outcomes, automated moderation performance, and complaints handling. These reports not only fulfill regulatory requirements but also serve as a valuable tool for platforms to demonstrate their commitment to user safety and regulatory compliance.
For a clearer understanding of what such a report might look like, you can view an example here.
3. Notice and Action Mechanisms
The DSA places significant emphasis on creating efficient and transparent notice-and-action mechanisms to handle harmful content and enforce platform policies effectively. These systems are vital for maintaining user trust, ensuring fair treatment, and adhering to regulatory requirements. Platforms must implement structured processes that allow users and trusted entities to report harmful content while also providing clarity and accountability in how these reports are addressed. A well-designed notice-and-action system not only streamlines content moderation but also ensures a balance between protecting users and safeguarding free expression.
Platforms must establish a robust notice and action system that does the following (see the sketch after this list):
- Prioritizes trusted flaggers
- Confirms receipt of notices
- Informs both the reporter and the reported party about the action taken
- Suspends users who repeatedly post manifestly illegal content, and suspends the processing of notices from reporters who frequently submit unfounded reports
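A minimal sketch of such a workflow, assuming an in-memory queue and a hypothetical send_message callback, might look like the code below; a production system would persist notices and plug into the platform's existing moderation tooling.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class Notice:
    notice_id: str
    reporter_id: str
    content_author_id: str
    content_id: str
    reason: str
    trusted_flagger: bool = False

@dataclass
class NoticeQueue:
    pending: list[Notice] = field(default_factory=list)

    def submit(self, notice: Notice, send_message) -> None:
        """Queue a notice, prioritizing trusted flaggers, and confirm receipt."""
        if notice.trusted_flagger:
            self.pending.insert(0, notice)  # trusted flaggers jump the queue
        else:
            self.pending.append(notice)
        send_message(notice.reporter_id,
                     f"We received your report {notice.notice_id} at "
                     f"{datetime.now(timezone.utc).isoformat()}.")

    def resolve(self, notice: Notice, action: str, send_message) -> None:
        """Record the decision and inform both the reporter and the reported party."""
        send_message(notice.reporter_id,
                     f"Report {notice.notice_id}: action taken was '{action}'.")
        send_message(notice.content_author_id,
                     f"Action '{action}' was applied to your content "
                     f"({notice.reason}). You may appeal this decision.")
```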
4. User Complaint Processes
To ensure fairness and accountability, the DSA mandates that platforms establish a comprehensive complaints mechanism for users affected by moderation decisions. This mechanism is crucial for maintaining user trust and ensuring that moderation actions are not only effective but also just. Platforms must provide users with a clear and accessible process to contest moderation decisions, giving them the opportunity to present their case and seek resolution.
This process must remain open for at least six months, ensuring adequate time for users to challenge decisions. Furthermore, platforms are required to promptly reverse any actions found to be unfounded, reinforcing the importance of fairness and transparency in their content moderation practices. By addressing user concerns efficiently and equitably, platforms can demonstrate their commitment to upholding users' rights while maintaining compliance with the DSA.
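The six-month window and the duty to reverse unfounded decisions translate into a couple of simple checks. The sketch below is illustrative only; the 183-day constant approximates six months, and the function names are assumptions rather than any prescribed interface.

```python
from datetime import datetime, timedelta, timezone

# The DSA requires the internal complaint system to accept complaints for at
# least six months after the user is informed of the decision (approximated
# here as 183 days).
COMPLAINT_WINDOW = timedelta(days=183)

def can_file_complaint(decision_notified_at: datetime) -> bool:
    """Return True while the user is still within the complaint window."""
    return datetime.now(timezone.utc) - decision_notified_at <= COMPLAINT_WINDOW

def review_complaint(original_action: str, complaint_upheld: bool) -> str:
    """If the complaint is upheld, the original action must be reversed promptly."""
    if complaint_upheld:
        return f"reverse:{original_action}"  # e.g. restore content, lift ban
    return f"confirm:{original_action}"
```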
How Can Online Platforms Ensure Compliance?
A Multi-Layered Approach to Content Moderation
Effective content moderation combines multiple defensive layers (see the pipeline sketch after this list):
- AI-Powered Screening: Deploy AI moderation to analyze and filter harmful text, images, and videos.
- Custom Filters: Use semantic filtering and tailored AI engines to address specific harmful behaviors.
- Basic Safeguards: Implement tools like word blocklists to manage content during live events.
- User Reports: Supplement automated systems with user-generated content reports to identify missed issues.
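One common way to combine these layers is a pipeline in which each stage can short-circuit the rest, with user reports feeding review queues asynchronously. The sketch below is illustrative; the blocklist entries and the toxicity scorer are placeholders rather than any specific vendor's API.

```python
from typing import Optional

# Layer 1: basic safeguard, a word blocklist (illustrative entries only).
BLOCKLIST = {"badword1", "badword2"}

def blocklist_filter(text: str) -> Optional[str]:
    return "blocked_term" if any(w in text.lower() for w in BLOCKLIST) else None

# Layer 2: AI screening, a stand-in for a real classifier or moderation API.
def ai_screen(text: str) -> Optional[str]:
    score = fake_toxicity_score(text)  # hypothetical model call
    return "ai_flagged" if score > 0.8 else None

def fake_toxicity_score(text: str) -> float:
    """Placeholder scoring function; a real system would call a model."""
    return 0.9 if "attack" in text.lower() else 0.1

# Layer 3: user reports feed review queues asynchronously (not shown here).

def moderate(text: str) -> str:
    """Run the layers in order and return the first verdict, else 'allow'."""
    for layer in (blocklist_filter, ai_screen):
        verdict = layer(text)
        if verdict:
            return verdict
    return "allow"

print(moderate("this is a personal attack"))  # -> "ai_flagged"
```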
Essential Features for Compliance
Platforms should integrate key moderation features to align with DSA mandates:
- User Blocking and Muting: Basic functions that enhance user safety.
- Reporting Tools: Allow users to flag inappropriate content.
- Transparency Indicators: Show when content is removed or flagged.
- Complaints Process: Provide pathways for users to contest moderation decisions.
- AI Accuracy Reporting: Monitor and report on the performance of automated moderation systems (a measurement sketch follows this list).
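Accuracy for the last point is typically estimated by sampling automated decisions and having human reviewers re-label them. The sketch below computes simple agreement and error rates under that assumption; the sample format and field names are hypothetical.

```python
def automated_moderation_accuracy(samples: list[dict]) -> dict:
    """Compute accuracy and error rates from human-reviewed samples.

    Each sample is assumed to look like:
    {"auto_decision": "remove" | "allow", "human_decision": "remove" | "allow"}
    """
    total = len(samples)
    if total == 0:
        return {"accuracy": None, "false_removal_rate": None, "missed_rate": None}

    agreed = sum(s["auto_decision"] == s["human_decision"] for s in samples)
    false_removals = sum(
        s["auto_decision"] == "remove" and s["human_decision"] == "allow"
        for s in samples)
    missed = sum(
        s["auto_decision"] == "allow" and s["human_decision"] == "remove"
        for s in samples)

    return {
        "accuracy": agreed / total,
        "false_removal_rate": false_removals / total,
        "missed_rate": missed / total,
    }

# Example: three reviewed samples, one automated mistake.
print(automated_moderation_accuracy([
    {"auto_decision": "remove", "human_decision": "remove"},
    {"auto_decision": "remove", "human_decision": "allow"},
    {"auto_decision": "allow",  "human_decision": "allow"},
]))  # -> accuracy ~0.67, false_removal_rate ~0.33, missed_rate 0.0
```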
Recommendations for Platforms
Given the complexity and variability of global laws, platforms should adopt flexible moderation strategies that include:
- Comprehensive Moderation Policies: Ensure clear, enforceable, and transparent policies.
- Robust User Tools: Prioritize user safety features like blocking, muting, and reporting.
- Adaptable AI Systems: Use AI that can adjust based on evolving regulations.
Conclusion
Moderating the content on your intermediary services, hosting services, and online platforms is critical to creating a secure and trustworthy user experience. Now that the DSA is enforceable, platforms must act proactively to meet compliance standards and avoid significant penalties. Stream's AI moderation solution can support your organization in implementing thorough moderation practices and maintaining transparent communication with users, helping you stay ahead in this evolving regulatory landscape. Book a demo with our team today.