Internet Censorship Course / Book Workshop
Transparency reports are documents that provide information about the activities of a company or organization.
These reports typically include information about the organization's finances, operations, and compliance with laws and regulations. They may also cover the organization's policies and procedures, as well as its commitment to ethical and responsible business practices. Companies often publish transparency reports to demonstrate their commitment to transparency and accountability.
However, these reports are based on companies' self-reporting and thus may not be complete or accurate. For example, companies may withhold details about the specific content that was removed, who requested the removal, the legal basis for it, and so forth.
In this activity, you will investigate the transparency reports of several companies and organizations to determine what they reveal, and what information you might like to know that the reports do not disclose.
Read the transparency reports for two of the following companies, and, if you have time, find one more that is not listed:
While these transparency reports cover a wide variety of activities, in this class we are primarily interested in activities that relate to speech (e.g., takedown notices, removal of content, censorship). Try to focus your attention on those portions of the reports. We have linked a few of these sections above for your convenience.
The European Union has recently passed a new regulation called the Digital Services Act (DSA). This regulation requires large online platforms to publish transparency reports that provide detailed information about their content moderation practices. In this part of the activity, you will explore some of the information that platforms do (and do not) make available as part of their obligations under the DSA.
The Digital Services Act (DSA) is a landmark European regulation that became fully applicable in February 2024, with some obligations applying to Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs) starting in August 2023. It aims to make the digital world safer by ensuring that what is illegal offline is also illegal online, holding intermediaries accountable. The DSA imposes various transparency and content moderation requirements, especially on VLOPs, which are platforms with over 45 million average monthly active users in the EU.
One of the key obligations under the DSA is for platforms to submit their content moderation decisions, known as "Statements of Reasons" (SoRs), to a public Transparency Database maintained by the European Commission. These statements must include detailed information about each decision, including whether it was automated and on what legal or policy grounds it was made, as well as the pathways available for appeal. Furthermore, the DSA allows vetted researchers to access platform data to study systemic risks, such as online gender-based violence.
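If you want to go beyond browsing the Transparency Database's web interface, the European Commission also publishes daily dumps of the submitted SoRs (see the data-download section of the database's site). Below is a minimal sketch, in Python, of tallying a downloaded dump by platform, automation flag, and violation category. The column names used here (`platform_name`, `automated_decision`, `category`) are assumptions about the dump schema, so inspect the CSV header and adjust them before relying on the output.

```python
# Rough sketch: summarize a Statements of Reasons (SoR) dump downloaded
# (and unzipped) from the DSA Transparency Database's data-download page.
# NOTE: the column names below are assumptions about the dump schema;
# check the CSV header of the file you actually downloaded.
import csv
import sys
from collections import Counter

def summarize_sors(csv_path: str) -> None:
    platforms = Counter()   # SoR counts per platform
    automated = Counter()   # automated vs. human decisions
    categories = Counter()  # alleged violation categories

    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            platforms[row.get("platform_name", "unknown")] += 1
            automated[row.get("automated_decision", "unknown")] += 1
            categories[row.get("category", "unknown")] += 1

    print("SoRs per platform (top 10):")
    for name, count in platforms.most_common(10):
        print(f"  {name}: {count}")
    print("\nAutomated vs. human decisions:", dict(automated))
    print("\nTop violation categories:")
    for cat, count in categories.most_common(5):
        print(f"  {cat}: {count}")

if __name__ == "__main__":
    summarize_sors(sys.argv[1])
```

Running this on a day's dump (e.g., `python summarize_sors.py sor-dump.csv`) gives counts you can compare against the figures a platform chooses to highlight in its own transparency report.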
Think about the following questions: