A recent investigation by The New York Times has shed light on Pornhub's content moderation practices around child sexual abuse material (CSAM) roughly five years ago. According to the report, while executives publicly committed to removing CSAM from the platform, internal debates and differing interpretations of what counted as 'obvious' illegal content hindered those efforts.
According to the investigation, disagreements arose over the apparent age of children depicted in videos. One employee reportedly said they treated material as 'obvious' CSAM only when the children involved appeared to be as young as three years old. This revelation raises concerns about the effectiveness of Pornhub's moderation system and its ability to protect children from exploitation.
The New York Times report highlights the complexities involved in identifying and removing CSAM from large online platforms. It also underscores the importance of clear definitions, consistent enforcement, and robust monitoring to prevent the distribution of illegal and harmful content. The findings have reignited discussions about the responsibility of online platforms in safeguarding children and combating online child sexual abuse.