Trust and Safety: Reduce Risk on Your Platform

13 min read

Trust is the foundation of any strong relationship—especially between a brand and its user base.

Emily R.
Published June 17, 2022

It is the responsibility of online platforms to create a safe environment their users can trust, enjoy, and recommend to others. When businesses fail to moderate user-generated content, they risk alienating their audience and contributing to churn. Many businesses are looking to Trust and Safety policies and professionals to protect their customers and reputations.

What is Trust and Safety?

Trust and Safety (T&S) is an umbrella term for the department, technology, policy, and mission online platforms establish to protect their users. Fraud, harassment, offensive content, and spam are just a few examples of the risks Trust and Safety teams aim to mitigate within their communities.

T&S tactics like creating community guidelines, enforcing violation consequences, and implementing moderation software are essential steps organizations must take to build brand loyalty, safeguard their reputation, and deliver a positive experience for all parties interacting with the forum and each other.

It is an ever-evolving concept that businesses must keep a pulse on to provide users the same high level of protection, even as they introduce new communication channels and as words take on new meanings.

For more information: Why You Need to Know More About Digital Trust and Safety

Why is Digital Trust and Safety Important?

There is an element of trust inherently expected in all social interactions; however, humans have been communicating face-to-face far longer than over a screen. The anonymity of the internet emboldens some users to do and say things they ordinarily would not in person.

When users are bullied, harassed, threatened, or exposed to inappropriate content, it negatively impacts their experience and those who witness it. These violations erode trust users previously had in the business, especially when minors are involved, and they become more likely to churn.

Effective digital T&S programs create a safe and inclusive environment conducive to growing the size and diversity of the audience while improving core business metrics, like:

  • Customer Acquisition Cost (CAC)
  • User Engagement
  • Lifetime Value (LTV)
  • Churn Rate
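To make two of these metrics concrete, here is a minimal Python sketch using the standard simple definitions of churn rate and LTV. The figures are hypothetical, not benchmarks from any particular platform.

```python
# Illustrative figures only; the formulas are the common simple definitions,
# not metrics pulled from any particular platform or tool.

def churn_rate(customers_at_start: int, customers_lost: int) -> float:
    """Share of customers lost over the period."""
    return customers_lost / customers_at_start

def lifetime_value(avg_monthly_revenue_per_user: float, monthly_churn: float) -> float:
    """Rough LTV estimate: average revenue per user divided by churn rate."""
    return avg_monthly_revenue_per_user / monthly_churn

churn = churn_rate(10_000, 400)                      # 0.04 -> 4% monthly churn
print(f"Churn rate: {churn:.0%}")
print(f"LTV: ${lifetime_value(12.50, churn):.2f}")   # $312.50 expected per user
```

In this simple model, cutting monthly churn from 4% to 3% raises the LTV estimate from about $313 to about $417, which is one reason safety work shows up in revenue metrics.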

Trust may feel like a matter of perception, but it is firmly rooted in safety, and safety is just as important as trust in online communities. Inadequate moderation can result in cyberbullying, the illicit sale of weapons, the spread of misinformation, and radicalized extremist behavior that puts both users and non-users at risk. Companies that stand by and tolerate these unsafe behaviors on their apps risk irreparable damage to their brand reputation.

How Does it Work?

Nearly all modern corporations have Trust and Safety policies in place that are tailored to the specific needs of their platform. For example, a banking app’s policy might focus more on fraud prevention than a social media app’s.

But even if your business does not have a formal Trust and Safety policy or team in place, there are three core elements of T&S that probably already exist:

  1. Community guidelines are the most common gateway into the world of Trust and Safety. They do not require a special team or big budget to create, yet they are an effective way to make it clear to users that your organization cares about what happens on its watch and that certain behaviors will not be tolerated. In need of inspiration? Interactive livestream provider Twitch is recognized for its comprehensive and effective community guidelines; review them and identify which would make the most sense to adapt for your own platform.

  2. Content moderation is another component of T&S policy your business might already be familiar with. Whether someone on your team manually reviews content, handles complaints, and issues judgments, or you have integrated moderation software to help lighten the load, staying up to date on how users communicate on your platform is critical to keeping it safe. A minimal sketch of this kind of review workflow appears after this list.

  3. Trust and Safety practitioners are industry professionals who can strengthen the security of your app by monitoring community health, average complaint time-to-resolution, detection coverage, and the true impact of moderation efforts. Partnering with a T&S consulting company like Genpact is another option for teams that are not ready to bring on dedicated hires.
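To make the content moderation element above more concrete, here is a minimal Python sketch of a manual review workflow, assuming a simple keyword blocklist and an in-memory review queue. The terms and data shapes are placeholders, not recommendations.

```python
# Hypothetical blocklist and review queue; in a real system these would live
# in your moderation tooling, not in module-level variables.
BLOCKLIST = {"buy followers", "free crypto", "spamlink.example"}

review_queue: list[dict] = []

def screen_post(post: dict) -> str:
    """Publish clean posts; hold anything matching the blocklist for human review."""
    text = post["text"].lower()
    if any(term in text for term in BLOCKLIST):
        review_queue.append(post)
        return "pending_review"
    return "approved"

print(screen_post({"id": 1, "text": "Win FREE crypto today!"}))   # pending_review
print(screen_post({"id": 2, "text": "Great stream last night."})) # approved
```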

T&S professionals differ from members of your customer support team because of their unique skill sets that qualify them to oversee the following Trust and Safety processes:

  • Acceptable Use Enforcement: Standard content moderation plus the review of harmful assets from known hate groups or criminal organizations, as well as scanning for spam, phishing, and malware-driven posts.
  • Marketplace Surveillance: Identifying, removing, and reporting to law enforcement (if need be) the sale of counterfeit, stolen, or fake digital and physical goods by a vendor.
  • Fraud Investigations: Inquiries into compromised accounts created using stolen credit cards or leveraging identity theft to make purchases, scam other users, and spread malware.

Designating an internal stakeholder to establish and enforce community guidelines can be a sufficient solution. However, if there is a high volume of violations and the offenses border on criminal activity, partnering with T&S specialists can ensure swift resolution of complaints while protecting other users and the reputation of your company.

For more information: The State of Platform Trust & Safety Standards

How to Establish a Trust and Safety Team

Trust and Safety teams include security-conscious software engineers, empathetic community managers, industry experts willing to investigate complex subject matter, and detail-oriented content moderators. If your platform is looking for its first T&S hire, explore existing role listings on job boards to get a sense of the standard experience and qualifications your next team member should have.

There are three fundamental ways to ensure all of your hard work pays off in the form of a successful Trust and Safety team:

  • Prepare Documentation: Taking the time to finalize your app’s community guidelines and T&S training documentation will set new hires up for success and ensure that proper protocol is consistently followed from day one. This will allow T&S professionals, from moderators to developers, to confidently perform their duties and ramp up more quickly in their roles.

  • Maintain Morale: T&S work can be draining, as specialists interact only with the most difficult, offensive, and fraudulent users and cases. Maintain morale by checking in on your team, offering mental health and wellness benefits, and encouraging them to lean on one another for help with particularly tricky cases. These tactics will help keep your T&S team motivated, efficient, and collaborative.

  • Stay Agile: New conspiracy theories, symbols, and slang terms pop up every day, and it is more than likely that your T&S employees will be the ones to discover them. Creating an environment where T&S specialists feel comfortable sharing these insights, and confident they will be incorporated into the community guidelines and internal review process, is critical to keeping your documentation current and effective.

For More Information: Your Guide to Building a Complete Trust & Safety Team

Trust and Safety Policy Best Practices & Common Challenges

Trust and Safety operations can run smoothly with the right processes, but they are not without challenges. Organizations dedicated to upholding the principles of Trust and Safety can join the Digital Trust and Safety Partnership, which works with consumer and user advocates, policymakers, law enforcement, relevant NGOs, and T&S experts to develop best practices for its members to follow.

For more information: Digital Trust & Safety Best Practices Framework

Top 5 Best Practices

  • Design with Intention: Consider your app’s content-related risks during the product development process.
  • Document Thoroughly: Create explainable processes for content governance and identify and designate Trust and Safety stakeholders.
  • Test Efficacy of Process: Conduct enforcement operations to ensure proper implementation of content governance.
  • Review Performance: Evaluate the performance of processes and make improvements as needed.
  • Amplify Policies: Ensure that your company’s Trust & Safety policies are published where users can easily find them.

Top 4 Common Challenges

  • Volume: Digital T&S teams struggle to find an efficient, accurate solution capable of moderating a high volume of users and content.
  • Context: Automated moderation tools will sometimes misread the context of a situation in which flagged terms are present but benign. Combining the efforts of a human moderator and an artificial intelligence (AI) based solution can help navigate nuanced content, create a fairer review system, and decrease Trust and Safety issues; a minimal sketch of this pairing follows this list.
  • Channel Type: The way users communicate on a digital forum can change over time, and Trust and Safety processes that work for one channel may not be as effective for the next. When channel developers and Trust and Safety teams collaborate, new channels can be built with user safety top of mind.
  • False Positives: Similar to misreading context, T&S teams that rely heavily on AI-based moderation solutions can issue consequences based on false-positive readings from the system. Blocking and banning innocent users erodes trust as surely as exposing them to offensive content.
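Here is one minimal Python sketch of the human-plus-AI pairing described in the Context item above, assuming a hypothetical toxicity score between 0 and 1 produced by some upstream classifier. The thresholds are illustrative, not tuned values.

```python
# Hypothetical routing logic: act automatically only on confident scores,
# send ambiguous content to a human, and let clearly benign content through.
REMOVE_THRESHOLD = 0.95   # near-certain violation
REVIEW_THRESHOLD = 0.60   # ambiguous; a moderator decides

def route_content(text: str, toxicity_score: float) -> str:
    """Return the moderation action for a piece of user-generated content."""
    if toxicity_score >= REMOVE_THRESHOLD:
        return "remove"
    if toxicity_score >= REVIEW_THRESHOLD:
        return "human_review"   # a person weighs the context before any consequence
    return "allow"

print(route_content("that boss fight destroyed me", 0.35))  # allow
print(route_content("you people don't belong here", 0.72))  # human_review
```

Routing mid-confidence content to a person instead of auto-banning is also one way to keep the false positives described in the last item in check.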

Examples of Trust and Safety Sensitive Verticals & Policy

Nearly every business today incorporates some element of digital transactions, whether financial, informational, or otherwise, so every business should understand the consequences of failing to implement a digital Trust and Safety strategy. Let’s look at four industries that should pay special attention to Trust and Safety threats, along with a few examples of existing policies from companies in each vertical.

Social/Team

As online communities continue to grow, it is critical to ensure the integrity of platforms remains strong—from a technical standpoint and a moral and ethical one. Let’s look at the policies two popular social forums have in place to protect users from cyberattacks and each other.

For more information: Establishing Trust and Safety in Social Communities

Facebook Trust and Safety

Users can view Facebook’s Community Standards Enforcement Report from its central trust and safety page, which transparently lays out the policies for specific content infractions, the number of violations, and their subsequent consequences.

While these resources for Facebook users have always been available, its internal enforcement guidelines have not. When a leaked list of VIP Facebook users revealed that they had immunity from violation consequences, the company chose to publish its internal moderation and appeals processes in a last-ditch effort to regain the trust of its users.


eCommerce

The digital economy is exploding, with millions of eCommerce sites (and more created daily) letting users shop anytime, from anywhere. But Trust and Safety regulations become complicated when a supplier transacts with a consumer in another country.

For more information: Five Ways to Increase Trust in eCommerce

Etsy Trust and Safety

Etsy has committed investment toward expanding its Trust and Safety team, processes, and tech stack, as well as toward educating sellers on how to make their shops successful and educating buyers to build trust with them while also setting behavioral expectations.

To keep eCommerce “human,” Etsy notes that it will never entirely defer to automated moderation tools; instead, it commits to expanding its content review team to ensure that all judgments are issued fairly and consider the situational context.

Gaming

The barrier to entry is low for most video and mobile gaming applications. With millions of unverified users emboldened by anonymity, community guidelines and auto-moderation tools are critical to mitigating harmful content and creating a safe environment that players can trust.

For more information: Fostering Trust & Safety for the Gaming Community and Metaverse

Discord Trust and Safety

Discord transparently provides examples of how its Trust and Safety team approaches multiple complaint scenarios, what they consider when creating new policies, and an example of a recently updated policy on a particular violation.

The company references its quarterly Transparency Reports to emphasize the importance of establishing trust with users and to remind them of its promise to hold the Discord Trust and Safety team and its policies accountable.

Marketplace

Trust is as valuable as currency in virtual marketplaces. These apps often require access to personally identifiable information (PII) like a user’s legal name, address, and credit card number. An online marketplace must protect its users not only from data security breaches but also from other users who might misuse or abuse this information, and from exposure to physical harm during face-to-face transactions.

For more information: Trust and Safety— Considerations for a Successful Marketplace

Instacart Trust and Safety

While Instacart’s policies cover a wide swath of subject matter, they all center on three key “ingredients”: safety, respect, and integrity.

While Instacart shares less detail about its violation review and decision-making process than the examples above, it lists the contact information of its Trust and Safety team on each page. The company is dedicated to improving the Instacart experience for all parties and has released new safety features within the past year to bolster those efforts.

Regulations that Mandate Trust and Safety

In the United States

Section 230 of the U.S. Communications Decency Act (CDA) of 1996 generally shields digital forum providers from liability for third-party content, but that protection has exceptions, including federal criminal law (which covers offenses like sex trafficking and illegal drug sales) and intellectual property claims such as copyright infringement. And while platforms are generally not held legally responsible for hosting third-party content, that does not mean they can turn a blind eye to community guideline violations without inviting a certain degree of risk.

In the European Union

This year, the European Parliament and Council reached a provisional political agreement on the Digital Services Act (DSA), setting the standard for more responsible businesses and safer online services for users. The act includes new transparency obligations so users are better informed about how content is recommended to them, increased protection for victims of cybercrime, and steep fines of up to 6% of the liable platform’s revenue for DSA violations.

Implementing a Digital Trust and Safety Strategy

Businesses creating a Trust and Safety strategy from scratch should take stock of existing processes that might overlap with T&S. Identifying security-critical areas of your app that are lacking and reviewing current content for infractions are helpful practices that can inform where your T&S team should focus its efforts first. From there, new hires, new technology, and new trends are the three key ingredients to staying at the top of your business’s Trust and Safety game.

Integrate a Content Moderation Solution

Automated moderation software can provide much-needed assistance to your team if your platform hosts a high volume of users and content. Features like blocklists, shadowbanning, and time-outs can serve as the first line of defense to protect your online community from violations. However, moderation technology should work in concert with your Trust and Safety team, not replace it.
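As a rough illustration of how those first-line features might fit together, here is a minimal Python sketch of an escalation policy, assuming a hypothetical UserRecord type, a one-hour time-out, and a three-strike shadowban threshold; none of this reflects the behavior of any specific moderation product.

```python
# Hypothetical escalation policy: early blocklist hits earn a one-hour
# time-out, repeat offenders are shadowbanned, and appeals go to humans.
import time
from dataclasses import dataclass

@dataclass
class UserRecord:
    user_id: str
    violations: int = 0
    shadowbanned: bool = False
    timeout_until: float = 0.0  # unix timestamp

def enforce(user: UserRecord, blocklist_hit: bool) -> str:
    """Apply an automated consequence and return the action taken."""
    if not blocklist_hit:
        return "allow"
    user.violations += 1
    if user.violations >= 3:
        user.shadowbanned = True               # user still posts, but no one else sees it
        return "shadowban"
    user.timeout_until = time.time() + 3600    # one-hour time-out
    return "timeout"

user = UserRecord(user_id="u_123")
print(enforce(user, blocklist_hit=True))   # timeout
print(enforce(user, blocklist_hit=True))   # timeout
print(enforce(user, blocklist_hit=True))   # shadowban
```

Appeals and anything the automated layer cannot resolve should still land with your Trust and Safety team.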

For More Information: The Ultimate Guide to Chat Moderation

Trust and Safety best practices evolve in response to advancements in communication technology. To ensure your digital T&S strategy remains effective throughout these changes, your company must commit to staying on top of industry trends and communicating policy updates to platform users.

For More Information: Join the Trust & Safety Professional Association

Trust and Safety Questions

1. What do Trust and Safety teams do?
Trust and Safety teams comprise professionals who specialize in moderation, T&S-conscious product development, research, and customer support. They work in concert to ensure that online applications are protected from cybercrime and harmful user-generated content while also offering 1:1 recourse to users with T&S-related complaints.

2. What is Trust and Safety in tech?
As it relates to technology, Trust and Safety is a catch-all term for the work of governing platforms, protecting users, establishing and enforcing community guidelines, and moderating content. The tech industry evolves quickly, so Trust & Safety departments, professionals, and policies must, too. Tech-based T&S groups must stay up to date on the latest trends in malware, phishing, neologisms, and cybercrime in order to effectively moderate digital forums.

3. Why is trust important in safety?
In order for users to feel safe, they must trust that the app or business they’re using values them and has their best interests at heart. Building trust with your audience as a brand takes time, but losing it is far too easy. That is why it is important to have T&S policies and people in place, so that even if something unsavory does occur, there is a direct line of communication and recourse to remind users that a team is looking out for them.

4. How do you build Trust and Safety?
Building T&S takes time—it requires earning the trust of your users, establishing a set of community guidelines, implementing moderation tools, and assembling a team. Following Trust and Safety best practices can help break down this seemingly daunting process and ensure that the time and resources you dedicate to it are used wisely.

5. What does a Trust and Safety analyst do?
A Trust and Safety analyst can act as a consultant or work in-house at a business to assess the current risk to users and the brand itself based on a host of factors, including the number of community guideline violations, complaint time-to-resolution, and the platform’s data security. The analyst then takes the results into consideration and develops a plan of action the business can implement to strengthen its level of T&S.

Benefits of Improving the Trust and Safety of Your Brand

The benefits of improving your platform's digital Trust and Safety far outweigh the difficulties of creating a strong team, integrating a moderation solution, and implementing a strategy. Safe and trusting users turn into loyal brand ambassadors who boost the metrics that mean the most to your application. A strong T&S program can also improve your online visibility, keep your brand out of the wrong headlines, and earn a reputation of zero tolerance for hate speech, bullying, and other cybercrimes, all of which make your site a less risky, more appealing partner to investors and advertisers.
