
The State of Transparency Reports: What’s Working, and What’s Not

Written by Tiffany Xingyu Wang | Jan 1, 2020

Transparency reports have been around for a while. For almost a decade, tech companies have been publishing information about their operations and their enforcement of guidelines, along with data on government and third-party requests for user data.

At their best, transparency reports are a critical measure of how a platform manages its own guidelines for online behavior: whether offensive or dangerous behavior is recognized, and dealt with in an effective and consistent manner. They are also where the public can find information on government and third-party requests for data: how many requests were made, and whether the platform complied with them.

These reports can help the public understand a company’s policies and procedures, and how well it protects sensitive data.

What is a transparency report?

A transparency report is a document that a company publishes on a regular basis, outlining its:

  • Operations
  • Guidelines
  • Enforcement
  • Terms of service
  • Requests for data
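
As a rough illustration only (not any platform’s actual format), the categories above could be captured in a simple structured record. The field names below are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class TransparencyReport:
    """Hypothetical, minimal record for a single reporting period."""
    company: str
    period: str                          # e.g. "2020-H1"
    operations_summary: str              # plain-language description of operations
    guidelines_url: str                  # community guidelines in force during the period
    terms_of_service_url: str
    enforcement_actions: dict            # e.g. {"hate_speech_removals": 1200}
    government_data_requests: int
    government_requests_complied: int
    third_party_data_requests: int
    third_party_requests_complied: int
```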

In order to be an effective tool, a transparency report should be:

  • Published regularly - at least annually
  • Accessible - on a company’s website, ideally
  • Consistent - using the same metrics year-over-year to make it possible to track trends
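
Consistency is what makes trend lines possible. As a hedged sketch (assuming metrics are published as simple name-to-count mappings), metrics reported under the same names in two periods can be compared mechanically:

```python
def year_over_year_change(previous: dict, current: dict) -> dict:
    """Percent change for every metric reported in both periods.

    Metrics missing from either period are flagged rather than silently
    dropped, because inconsistent metric sets are what break trend analysis.
    """
    changes = {}
    for name in sorted(set(previous) | set(current)):
        if name in previous and name in current and previous[name]:
            changes[name] = 100.0 * (current[name] - previous[name]) / previous[name]
        else:
            changes[name] = "not comparable"
    return changes

# Illustrative numbers only, chosen to mirror the shifts described below.
print(year_over_year_change(
    {"government_data_requests": 100, "content_restrictions": 100},
    {"government_data_requests": 123, "content_restrictions": 140},
))
# {'content_restrictions': 40.0, 'government_data_requests': 23.0}
```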

Facebook’s 2020 transparency report offers some interesting findings. Government requests for user data rose 23% from the second half of 2019 to the first half of 2020, and content restrictions rose 40%, an increase the company attributed to new restrictions related to COVID-19.

What is working, and what is not?

The Electronic Frontier Foundation (EFF) publishes an annual transparency report tracker called “Who Has Your Back.” In it, the EFF reviews transparency reports from 16 top tech companies and grades them in six categories:

  • Legal Requests
  • Platform Policy Requests
  • Notice 
  • Appeals Mechanisms
  • Appeals Transparency
  • Santa Clara Principles

In the 2019 report [1] (the most recent year available), the EFF noted that platforms are adopting some content moderation best practices, such as giving users the ability to appeal the removal of content. In some cases, however, the appeals process is better developed than the notification process for removals, which creates a critical gap: a user who is never properly told that content was removed cannot know to appeal that removal.

Many companies measure content removal, but do not break those removals down by whether the affected person belongs to a marginalized group or protected class. This leaves a gap in our understanding of whether content moderation policies are being enforced in a way that has an undue impact on a protected group.
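
As a hedged sketch of what such measurement could look like (the field names and group labels are hypothetical, and a real implementation would need strong privacy safeguards), removals could be tallied by the affected user’s self-reported group membership:

```python
from collections import Counter

def removals_by_group(removal_log: list) -> Counter:
    """Count content removals per self-reported group membership.

    Each log entry is assumed to look like:
        {"content_id": "...", "reason": "hate_speech", "group": "none" or "<group label>"}
    A strong skew toward one group can flag enforcement worth auditing.
    """
    return Counter(entry.get("group", "unknown") for entry in removal_log)

# Hypothetical data only; real reports would aggregate at far larger scale.
log = [
    {"content_id": "a1", "reason": "harassment", "group": "none"},
    {"content_id": "a2", "reason": "hate_speech", "group": "group_x"},
    {"content_id": "a3", "reason": "hate_speech", "group": "group_x"},
]
print(removals_by_group(log))  # Counter({'group_x': 2, 'none': 1})
```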

Closing these gaps and others like them will be very important to improving the usefulness of transparency reports going forward.

A recent study [2] conducted by the European Commission found that “many online users engage with platforms on the basis of blind trust, ignorant of common practices in the online world or of the ways in which platforms generate revenue.” The Commission recommended a combination of transparency by design and transparency by default, which would be the joint responsibility of platforms and regulators.

Making strides

As more users, organizations, and platforms become invested in organizing and monitoring user-generated content and requests for personal data, transparency reports will become more important for building trust between users, platforms, and regulators. Getting there will require a few changes: actionable standards for transparency reporting, with metrics that are consistent from one platform to another and from one reporting period to the next; data on how content moderation affects users in protected or marginalized groups; and regular monitoring and evaluation by a trusted third-party organization that publishes its findings for the public.

[1] https://www.eff.org/wp/who-has-your-back-2020
[2] https://ec.europa.eu/info/sites/info/files/transparency_of_platforms-study-final-report_en.pdf