Facebook's “dangerous individuals and organizations” list leaked
Facebook bans more than 4,000 people and groups it deems dangerous, including white supremacists, militarized social movements and suspected terrorists.
The Intercept on Tuesday released a leaked list of dangerous individuals and organizations that Facebook does not allow on its platform, giving insight into how the social network moderates content that could lead to offline violence. Experts told The Intercept that Facebook’s list and policy suggest the company is imposing tougher restrictions on marginalized groups.
More than half of the list consists of suspected foreign terrorists, predominantly from the Middle East and South Asia, and many of them Muslim. Facebook uses a three-tier system that determines what kind of enforcement action the company will take. Terrorist groups, hate groups and criminal organizations fall into Tier 1, the most restrictive level.
The least restrictive Tier 3 includes militarized social movements, which, according to The Intercept, “are predominantly right-wing American anti-government militias, which are virtually entirely white.”
Brian Fishman, Facebook’s policy director for counterterrorism and dangerous organizations, said in a series of tweets that the version of the list published by The Intercept is not exhaustive and is constantly updated.
“Defining and identifying dangerous organizations on a global scale is extremely difficult. There are no hard and fast definitions agreed upon by everyone,” he said. Fishman also pointed out that terrorist groups like ISIS and al Qaeda have hundreds of individual entities, which skews the number of entities from a particular region.
“It is important that FB documents each Wilayat of ISIS to facilitate enforcement, but counting each separately to support the argument that the overall list is biased is misleading,” he tweeted. The Tier 1 list, he said, also includes more than 250 white supremacist organizations.
Facebook has faced mounting pressure to be more transparent about its policy against dangerous individuals and organizations. In January, the oversight board tasked with reviewing the social network’s toughest content moderation decisions overturned a decision to remove a post the company said violated that policy, noting that the “rules were not sufficiently clear for users.” The board recommended that Facebook publish its list of dangerous organizations and individuals, or at least list examples.
Fishman said Facebook does not share the list in order “to limit legal risks, limit security risks and minimize opportunities for groups to circumvent the rules,” but added that the company is working to improve the policy.