The Importance of Human Content Moderation & Digital Wellness



In this episode of the Brand Safety Exchange, our host, Tiffany Xingyu Wang, sits down with Phil Tomlinson, the newest board member appointed to the Oasis Consortium, for a deep dive into the important role humans play in an effective content moderation strategy, and how to think about digital wellness and building resilience for content moderation teams.

Phil is currently the Global Lead of Trust & Safety at TaskUs, a global digital outsourcer headquartered in Texas that provides content moderation services and technology to augment trust and safety across online platforms for high-growth technology companies in categories including social, gaming, dating, and fintech.

After 15 years in the space, Phil holds a rare perspective, having worn several hats in the realm of risk and fraud and managed various customer-facing business operations at companies such as Cognizant and Twitter.

The Power of Connectivity, Inclusivity, and Access to Information

For Phil, who grew up in Johannesburg, everything started when he was 16 with the 1994 election in South Africa, which saw Nelson Mandela become its first democratically elected leader. This landmark event coincided with the advent of the internet in the region, popping the information bubble South Africa had been trapped in, and opening up its citizens to the rest of the world.

As he recalls in the conversation with Tiffany, “The internet consisted of chats, and forums and bulletin boards. And that's where I kind of spent my time.” Being on those platforms, Phil quickly noticed the huge amount of eye-opening information in circulation. In South Africa, “dangerous” information had meant anything that contradicted the government’s propaganda, so the information now flowing freely was a revelation for its population. “It just blew my brain, I fell in love with the idea that information and knowledge could transform an individual and transform society and that was the key to accessing education. At its core, the internet is access to information and equitable access to information.”

The Evolution of Trust and Safety Issues

However, Phil could immediately see the downside to the freedom the internet brought: with an unfettered ability to provide and receive information, platforms and online communities can quickly become black holes of abuse, exploitation, and attack.

Some of these dynamics became evident during his early role as a content moderator at Twitter. “The big problems we were trying to solve in those days were really spam and porn.” Over the years, however, trust and safety attacks became more radical and more complex, including the challenge of navigating them in the many, many languages beyond English. The motivations of bad actors have also changed: initially they were financially driven, but today truth is rapidly distorted as a vector for pushing personal or political agendas, and online speech can quickly turn into real-world violence. The spread of internet access to areas of the world where free speech has traditionally been restricted has also played a role in amplifying those episodes of violence.

As technologies and accessibility have evolved, so has the format of content, which means there are more dark corners of the internet and the definition of toxicity is constantly changing. Moderation is now needed across live audio, memes, short-form video, and community rooms, for example. As Tiffany adds, “it has since become an arms race out there.”

The Role of Human Content Moderation and How to Safeguard Digital Wellness

As the definition and manifestation of toxicity changes across time and space, it makes finding the right content moderation approach even more important. Today, effective content moderation involves maintaining the right balance between technology and human input as the core mechanisms for your efforts.

For example, while AI can undoubtedly offer the right tools and necessary scalability for moderation to take place, it is human beings who are the real judgment-makers and who can and should take a hyperlocal approach to laws and customs. It is far too risky to simply copy and paste a Western model of content moderation onto behavior in other jurisdictions.

Phil also adds that, "you need to make sure that the staff are employed within markets where you operate. For example, if you have folks in India doing content moderation, they should be speaking to psychologists and mental health professionals in their market, because they understand the local health care rules, they understand the local ethical approach to mental health in the workplace, they understand the cultural nuances.”

And with technology comes a unique set of challenges. A key issue facing the industry right now is data flow: when a client working with multiple partners controls and holds back its data, the result is a disconnected system in which critical information cannot be shared or acted on. The more closely that ecosystem can work together in a symbiotic way, the better trust and safety can be implemented.

Finally, you have to have people looking outside of what some would consider the “mainstream” online communities to understand what might be coming next. Keep an eye on, for example, Telegram and Discord channels, where emerging trends may surface first.

The conversation concludes with Tiffany and Phil sharing an important message of support for the digital well-being of moderators. Day in and day out, their role centers on exposure to toxic and traumatic content. It is extremely important for employers to support the trust and safety team by understanding their needs and providing them with specialized resources.

In addition, with global layoffs in the tech industry, the constant emergence of new threats, and the arrival of a whole new set of Web 3.0 governance questions, the next wave of industry innovation will bring new challenges, but also new opportunities to get things right.

Key Trust & Safety Lessons

  • Content moderation teams should deeply understand the local culture and local frames of reference for the material and behaviors they are responsible for managing.
  • Content moderation involves creating and maintaining the right synergy between technology and human beings, and that optimization can include well-designed and well-managed data feedback loops.
  • Support the health and well-being of your trust and safety teams with specialized and localized resources.
  • Observe key trends emerging on platforms such as Discord and Telegram to see what may play out as media and social connectivity continue to evolve.

Watch the full interview here or listen and subscribe to the Brand Safety Exchange podcast on Apple Podcasts and Spotify.