Meta Ramps Up Removal of Violence-Inciting Content
US tech giant Meta, formerly known as Facebook, took action on 21.7 million comments, videos, photos, and posts that were violent or incited violence in Q1 2022, according to its latest Community Standards Enforcement Report. This figure is nearly double that of Q4 2021, when the company detected and acted on 12.4 million pieces of violent content. Instagram also saw an increase in enforcement, though a more modest one, rising from 2.6 million incidents at the end of last year to 2.7 million in the first quarter of 2022.
But does the increase reflect a rise in the number of violent posts, or does it simply show that Meta is getting better at discovering them? According to the report, the company would argue the latter, claiming that over 98 percent of detected violent content was taken down before users reported it, following an “expansion of proactive detection technology.”
Meta hit the news again in recent weeks following the live streaming of the racist mass shooting in Buffalo, New York, footage of which was then uploaded to Facebook. The platform has been heavily criticized for being too slow to remove the content, with The Washington Post reporting that the footage remained online for more than ten hours, during which time it was shared 46,000 times.
Our chart shows data from the past nine months, since Meta introduced an additional category of detected content labeled ‘Violence and Incitement.’ This complements the older ‘Violent and Graphic Content’ category, which focuses on graphic imagery rather than explicit calls for violence.
Chart description: This chart shows the increase in violent content detected on Facebook since Q4 2021.