Facebook has published the fifth edition of its Community Standards Enforcement Report, providing metrics on how well it enforced its policies from October 2019 through March 2020. Facebook has spent the last few years building tools, teams, and technologies to help protect elections from interference, prevent misinformation from spreading on its apps, and keep people safe from harmful content.
So when the COVID-19 crisis emerged, Facebook had the tools and processes in place to move quickly, and it was able to continue finding and removing content that violates its policies. When the company temporarily sent its content reviewers home due to the pandemic, it increased its reliance on automated systems and prioritized high-severity content for its teams to review, in order to keep its apps safe during this time.
This report includes data only through March 2020, so it does not reflect the full impact of the changes Facebook made during the pandemic. The company anticipates seeing the impact of those changes in its next report, and possibly beyond, and it will be transparent about them. For example, for the past seven weeks Facebook couldn’t always offer the option to appeal content decisions and account removals, so it expects the number of appeals to be much lower in its next report. Facebook also prioritized removing harmful content over measuring its efforts, so it may not be able to calculate the prevalence of violating content during this time. This report shows the impact of the advancements Facebook has made in the technology it uses to proactively find and remove violating content.
What’s New in This Report?
Facebook is now including metrics across twelve policies on Facebook and across ten policies on Instagram. The report introduces Instagram data in four issue areas: Hate Speech, Adult Nudity and Sexual Activity, Violent and Graphic Content, and Bullying and Harassment. For the first time, Facebook is also sharing data on the number of appeals people make against content it has taken action on on Instagram, and the number of decisions it overturns, either based on those appeals or when it identifies the issue itself. It has also added data on its efforts to combat organized hate on Facebook and Instagram.
Progress in Finding and Removing Violating Content
Facebook improved the technology that proactively finds violating content, which helped it remove more violating content so that fewer people saw it.
Facebook continued to expand its proactive detection technology for hate speech to more languages and improved its existing detection systems. Its proactive detection rate for hate speech increased by more than 8 points over the past two quarters, totaling almost a 20-point increase in just one year. As a result, Facebook is able to find more content and can now detect almost 90% of the content it removes before anyone reports it. In addition, thanks to other improvements to its detection technology, Facebook doubled the amount of drug content it removed in Q4 2019, removing 8.8 million pieces of content.
On Instagram, Facebook improved its text and image matching technology to help find more suicide and self-injury content. As a result, the company increased the amount of content it took action on by 40% and increased its proactive detection rate by more than 12 points since the last report. It also made progress in combating online bullying by introducing several new features to help people manage their experience and limit unwanted interactions, and it announced new Instagram controls today. Facebook is sharing enforcement data for bullying on Instagram for the first time in this report, including taking action on 1.5 million pieces of content in both Q4 2019 and Q1 2020.
Lastly, improvements to the technology that finds and removes content matching existing violations in Facebook’s databases helped it take down more child nudity and sexually exploitative content on Facebook and Instagram.
Over the last six months, Facebook has started to rely more on technology to prioritize content for its teams to review, based on factors such as virality and severity. Going forward, it plans to leverage technology to also take action on content, including removing more posts automatically. This will enable its content reviewers to focus their time on content where more nuance and context are needed to make a decision.
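Facebook has not published how its review prioritization actually works, but the idea of ranking content for human review by severity and virality can be sketched as a simple scoring queue. Everything below — the field names, severity weights, and scoring function — is a purely illustrative assumption, not Facebook’s method.

```python
import heapq

# Hypothetical severity weights; Facebook's real system is not public.
SEVERITY_WEIGHTS = {"high": 3.0, "medium": 2.0, "low": 1.0}

def priority_score(severity: str, virality: float) -> float:
    """Combine severity and virality (e.g. views per hour) into one score.
    Higher score means the post should be reviewed sooner."""
    return SEVERITY_WEIGHTS[severity] * virality

def build_review_queue(posts):
    """Return post IDs ordered most urgent first (max-heap via negation)."""
    heap = [(-priority_score(p["severity"], p["virality"]), p["id"]) for p in posts]
    heapq.heapify(heap)
    return [pid for _, pid in (heapq.heappop(heap) for _ in range(len(heap)))]

posts = [
    {"id": "a", "severity": "low", "virality": 50.0},
    {"id": "b", "severity": "high", "virality": 10.0},
    {"id": "c", "severity": "medium", "virality": 40.0},
]
print(build_review_queue(posts))  # → ['c', 'a', 'b']
```

Under this toy scoring, a moderately severe post spreading fast can outrank a high-severity post with little reach — one way such trade-offs between severity and virality might be balanced.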
The Community Standards Enforcement Report is published in conjunction with Facebook’s biannual Transparency Report, which shares numbers on government requests for user data, content restrictions based on local law, intellectual property takedowns, and internet disruptions.
In the future, Facebook will share Community Standards Enforcement Reports quarterly, so the next report will be released in August.