Facebook has published the fourth edition of its Community Standards Enforcement Report.
The report revealed that between July and September 2019, Facebook removed 11.6 million pieces of content relating to child nudity and child sexual exploitation, while Instagram removed 754,000.
Over 99% of those were “proactively detected,” meaning Facebook’s own systems found the content before users or third parties reported it.
This report marks the first time the tech giant has released figures for Instagram and included the number of posts related to self-harm and suicide, following public outrage over the death of 14-year-old Molly Russell.
The teenager took her own life in 2017; soon afterwards, her father found explicit self-harm and suicide material on her Instagram account.
Facebook’s vice president, Guy Rosen, said: “We remove content that depicts or encourages suicide or self-injury, including certain graphic imagery and real-time depictions that experts tell us might lead others to engage in similar behaviour.
“We place a sensitivity screen over content that doesn’t violate our policies but that may be upsetting to some, including things like healed cuts or other non-graphic self-injury imagery in a context of recovery.”
A total of 2.5 million pieces of content related to self-harm and suicide were removed from Facebook, and 845,000 from Instagram.
A further 4.4 million pieces of drug-sales content were removed from Facebook and 1.5 million from Instagram, and 2.3 million pieces of firearm-sales content were removed from Facebook.
The post #Privacy: More than 11m child abuse posts removed by Facebook appeared first on PrivSec Report.