Internet Censorship Course / Book Workshop
In this exercise, you will explore the extent to which different platforms specify content moderation policies. We will explore platforms in several categories: posting forums, social media, and web hosting.
Content moderation is the process of monitoring user-generated content (UGC) on websites, social media, and other online platforms to ensure it meets certain standards of quality and appropriateness. Content moderation involves reviewing and removing content that violates a platform’s terms of service. Each platform independently sets these terms of service and determines what content is acceptable.
Content moderators review, approve, or reject UGC based on predetermined criteria, such as obscenity, hate speech, and copyright infringement. Content moderation is an important part of maintaining a safe and secure online environment.
Investigate the content moderation policies of the following types of platforms:
Try to find information about the content moderation policies for one site from each category. In particular, explore content delivery and web hosting, a category of platforms whose content moderation practices are not often discussed.
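As you read each platform's published policy documents, it can help to note which moderation-related terms appear and how often. The sketch below is a hypothetical helper for that kind of survey; the keyword list and sample text are illustrative assumptions, not drawn from any real platform's terms of service.

```python
# Hypothetical helper for surveying a platform's published policy text.
# Keywords and sample text below are illustrative, not from a real platform.

MODERATION_KEYWORDS = [
    "content moderation", "terms of service", "acceptable use",
    "hate speech", "copyright", "obscenity", "removal", "appeal",
]

def find_policy_mentions(text: str) -> dict[str, int]:
    """Count case-insensitive occurrences of each moderation keyword."""
    lowered = text.lower()
    return {kw: lowered.count(kw) for kw in MODERATION_KEYWORDS if kw in lowered}

# Example: scan a short, made-up excerpt of a policy document.
sample_policy = (
    "Our Acceptable Use Policy prohibits hate speech and copyright "
    "infringement. Users may appeal any removal decision."
)

print(find_policy_mentions(sample_policy))
```

A tally like this will not capture nuance, but comparing which terms a forum, a social media site, and a web host even mention can be a useful starting point for the comparison this exercise asks for.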
Consider some of the following aspects of each platform's content moderation: