
Facebook’s community report reveals content that violates its policies

Facebook has disclosed enforcement data for ten policies on Facebook and four policies on Instagram, applicable worldwide including in Pakistan, describing the categories of content that violate its standards and lead to removal or a ban.

In the fourth edition of its Community Standards Enforcement Report, covering the second and third quarters of 2019, the company detailed the content that violated its policies and was subsequently blocked or banned on its social networking platforms.

Facebook reports several metrics: the prevalence of violating content; the amount of content it took action on, called actioned content; and its proactive rate, meaning how much violating content it detected before anyone reported it.

The metrics also include appealed content, which describes how much content people appealed after Facebook took action, and restored content, which describes how much content was restored after the company initially took action.

In this first report covering Instagram, Facebook details enforcement in four policy areas: child nudity and child sexual exploitation; regulated goods, specifically illicit firearm and drug sales; suicide and self-injury; and terrorist propaganda.

While it uses the same proactive detection systems to find and remove harmful content across both Instagram and Facebook, the metrics may be different across the two services.

Facebook has also recently strengthened its policies around self-harm and made improvements to its technology to find and remove more violating content.

On suicide and self-injury content, the statement said: “On Facebook, we took action on about 2 million pieces of content in Q2 2019, of which 96.1% we detected proactively, and we saw further progress in Q3 when we removed 2.5 million pieces of content, of which 97.1% we detected proactively.”


On Instagram, the company reported similar progress, removing about 835,000 pieces of content in Q2 2019, of which 77.8% was detected proactively, and about 845,000 pieces of content in Q3 2019, of which 79.1% was detected proactively.

Facebook’s Dangerous Individuals and Organizations policy bans all terrorist organizations from having a presence on its services. The company has identified a wide range of groups as terrorist organizations based on their behavior. Previous reports covered only its efforts against al Qaeda, ISIS, and their affiliates, because Facebook focused its measurement efforts on the groups understood to pose the broadest global threat.

Facebook has now expanded the report to include the actions it is taking against all terrorist organizations. While the rate at which it detects and removes content associated with al Qaeda, ISIS, and their affiliates on Facebook has remained above 99%, the rate at which it proactively detects content affiliated with any terrorist organization is 98.5% on Facebook and 92.2% on Instagram. The company says it will continue to invest in automated detection techniques and to iterate on its tactics, because bad actors will keep changing theirs.

In this report, Facebook has added prevalence metrics for content that violates its suicide and self-injury and regulated goods (illicit sales of firearms and drugs) policies for the first time. Because the company cares most about how often people may see violating content, it measures prevalence, the frequency at which people may encounter such content on its services. For the policy areas addressing the most severe safety concerns, namely child nudity and sexual exploitation of children, regulated goods, suicide and self-injury, and terrorist propaganda, the likelihood that people view violating content is very low, and Facebook removes much of it before people see it.

As a result, when Facebook samples views of content to measure prevalence for these policy areas, it often does not find enough violating samples, or sometimes any, to reliably estimate the metric. Instead, it estimates an upper limit on how often someone would see content that violates these policies. In Q3 2019, this upper limit was 0.04%, meaning that for each of these policies, out of every 10,000 views on Facebook or Instagram, an estimated 4 views at most contained violating content.
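To illustrate the arithmetic behind such an upper bound, here is a minimal sketch assuming a simple one-sided binomial estimator (the statistical "rule of three"); the function, the sample size, and the estimator itself are illustrative assumptions, since Facebook does not disclose its actual methodology.

```python
# Illustrative only: Facebook's exact estimator is not public.
# Assumes the "rule of three": if 0 violations are observed in n
# sampled views, the 95% upper confidence limit on prevalence is
# roughly 3 / n.

def prevalence_upper_bound(violating: int, sampled: int) -> float:
    """Rough 95% upper bound on prevalence from a view sample."""
    if violating == 0:
        return 3 / sampled            # rule of three for zero observations
    return (violating + 3) / sampled  # crude widening for small counts

# Hypothetical example: 0 violating views in a sample of 7,500 views
# gives an upper bound of 0.0004, i.e. 0.04%, matching the scale of
# the figure reported for Q3 2019.
bound = prevalence_upper_bound(0, 7500)
print(f"{bound:.2%}")                            # 0.04%
print(f"{bound * 10_000:.0f} per 10,000 views")  # 4 per 10,000 views
```

The point of the sketch is only the interpretation: an upper limit of 0.04% translates to at most 4 violating views per 10,000 sampled.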

Over the last two years, Facebook has invested in proactive detection of hate speech so that it can detect this harmful content before people report it, and sometimes before anyone sees it. Its detection techniques include text and image matching, meaning it identifies images and identical strings of text that have already been removed as hate speech, and machine-learning classifiers that look at signals such as language, along with the reactions and comments on a post, to assess how closely it matches common phrases, patterns, and attacks previously seen in content that violates its hate speech policies.
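As a rough illustration of the "identical strings of text" matching idea, here is a minimal sketch using hashes of normalized text; the names and data are hypothetical, and real systems also rely on perceptual hashing for images and on machine-learning classifiers, neither of which is shown here.

```python
# Illustrative sketch only: Facebook's matching systems are not public.
# Shows the general idea of text matching via hashes of content that
# has previously been removed as violating.
import hashlib

def normalize(text: str) -> str:
    """Collapse case and whitespace so trivial edits still match."""
    return " ".join(text.lower().split())

def text_hash(text: str) -> str:
    return hashlib.sha256(normalize(text).encode("utf-8")).hexdigest()

# Hypothetical store of hashes of text already removed as hate speech.
known_violating_hashes = {text_hash("example of previously removed text")}

def matches_known_violation(post_text: str) -> bool:
    """Flag posts identical (after normalization) to removed content."""
    return text_hash(post_text) in known_violating_hashes

print(matches_known_violation("Example of   previously removed TEXT"))  # True
print(matches_known_violation("a brand new post"))                      # False
```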

Facebook says it will continue to invest in systems that enable it to proactively combat hateful content on its services, as well as in the processes it uses to ensure accuracy in removing violating content while safeguarding content that discusses or condemns hate speech.

Just as Facebook reviews decisions made by its content review team to monitor their accuracy, its teams routinely review removals made by its automated systems to ensure policies are being enforced correctly. Facebook also re-reviews content when people appeal and say the company made a mistake in removing their posts.

Facebook has also launched a new page where people can view examples of how its Community Standards apply to different types of content and see where it has drawn the line.

Saman Siddiqui
