Facebook has over 2 billion users, all producing and sharing content. Most of it is harmless, but some of it isn’t. How does Facebook regulate content showing child abuse, animal cruelty, self-harm and hate speech? This revealing documentary offers unique undercover footage from inside Facebook’s “moderating hub”. It presents a stark picture of an organization that puts money before morality, one for which extreme content equals extreme profits.
A video of a man hitting a small boy was shared more than 44,000 times on Facebook within two days of being posted. The video is still widely available on the platform. Nicci Astin, an online child abuse campaigner, has repeatedly complained to Facebook about the video, but was told that it did not violate the company’s community standards.
In training at CPL Resources, a Dublin-based content moderation contractor that has worked with Facebook since 2010, the video is shown as an example of content that should be marked as disturbing, meaning it remains on the site but is restricted to certain viewers. A moderator at CPL explains that “if you start censoring too much then people lose interest in the platform… It’s all about making money at the end of the day”.