Facebook has removed a string of far-right extremists from its platform, saying their accounts violated its policies against dangerous individuals and organizations.
Among those banned from the platform are Alex Jones and his controversial site Infowars, Milo Yiannopoulos, Paul Joseph Watson, Laura Loomer, Paul Nehlen and Louis Farrakhan.
These figures are now banned from creating accounts on both Instagram and Facebook, but users will still be able to publish posts supporting their viewpoints.
The crackdown comes as the social media giant has been criticized for its failure to curb the spread of misinformation, as well as hateful and extremist content on the site.
‘We’ve always banned individuals or organizations that promote or engage in violence and hate, regardless of ideology,’ a Facebook spokesperson told Dailymail.com.
‘The process for evaluating potential violators is extensive and it is what led us to our decision to remove these accounts today.’
Prior to Thursday’s announcement, Jones and Infowars had been banned from creating accounts or distributing content on Facebook, Twitter, Spotify and Apple’s App Store.
But the latest move represents a step up in action taken against Jones, who has suggested on his Infowars shows that the 2012 Sandy Hook massacre was a hoax, among other sensational claims.
Any accounts that share content from Infowars will see it immediately removed from Facebook or Instagram, and will be subject to a ban if they violate this policy more than once.
Yiannopoulos, Watson, Loomer, Nehlen and Farrakhan will be barred from creating accounts on Instagram and Facebook.
Louis Farrakhan (left) and Alex Jones (right) are among the numerous far-right and anti-Semitic figures banned by Facebook and Instagram. The firm said it conducts a lengthy process to determine whether an account violates its policies around hateful and dangerous content
Facebook will also take down any pages, groups and accounts created to represent them, as well as events that promote their participation.
A Facebook spokesperson told Dailymail.com that it conducts a lengthy process to determine which people or groups it considers to have a violent or hateful mission.
The factors include whether the person has called for or directly carried out acts of violence against people based on characteristics like race, ethnicity or national origin.
The firm also considers whether they are a self-described or identified follower of a hateful ideology, and whether they use hate speech or slurs in their bio on Facebook, Instagram or other social sites. It also weighs whether they have had pages, groups and accounts removed from Facebook or Instagram for violating its hate speech policies, the spokesperson added.
WHAT DO FACEBOOK’S GUIDELINES FOR CONTENT SAY?
Facebook has disclosed its rules and guidelines for deciding what its 2.2 billion users can post on the social network.
1. Credible violence
Facebook says it considers the language, context and details in order to distinguish casual statements from content that constitutes a credible threat to public or personal safety.
2. Dangerous individuals and organisations
Facebook does not allow any organizations or individuals that are engaged in terrorist activity, organized hate, mass or serial murder, human trafficking, or organized violence or criminal activity.
3. Promoting or publicising crime
Facebook says it prohibits people from promoting or publicizing violent crime, theft and/or fraud. It does not allow people to depict criminal activity or admit to crimes they or their associates have committed.
4. Coordinating harm
The social network says people can draw attention to harmful activity that they may witness or experience as long as they do not advocate for or coordinate harm.
5. Regulated goods
The site prohibits attempts to purchase, sell, or trade non-medical drugs, pharmaceutical drugs and marijuana, as well as firearms.
6. Suicide and self-injury
The rules for ‘credible violence’ also apply to suicide and self-injury.
7. Child nudity and sexual exploitation of children
Facebook does not allow content that sexually exploits or endangers children. When it becomes aware of apparent child exploitation, it reports it to the National Center for Missing and Exploited Children (NCMEC).
8. Sexual exploitation of adults
The site removes images that depict incidents of sexual violence and intimate images shared without permission from the people pictured.
9. Bullying
Facebook removes content that purposefully targets private individuals with the intention of degrading or shaming them.
10. Harassment
Facebook’s harassment policy applies to both public and private individuals.
It says that context and intent matter, and that the site will allow people to share and re-share posts if it is clear that something was shared in order to condemn or draw attention to harassment.
11. Privacy breaches and image privacy rights
Users should not post personal or confidential information about others without first getting their consent, says Facebook.
12. Hate speech
Facebook does not allow hate speech because it says it creates an environment of intimidation and exclusion and in some cases may promote real-world violence.
13. Graphic violence
Facebook will remove content that glorifies violence or celebrates the suffering or humiliation of others.
It will, however, allow graphic content (with some limitations) to help people raise awareness about issues.
14. Adult nudity and sexual activity
The site restricts the display of nudity or sexual activity.
It will also default to removing sexual imagery to prevent the sharing of non-consensual or underage content.
15. Cruel and insensitive
Facebook says it has higher expectations for content that it defines as cruel and insensitive.
It defines this as content that targets victims of serious physical or emotional harm.
16. Spam
Facebook is trying to prevent false advertising, fraud and security breaches.
It does not allow people to use misleading or inaccurate information to artificially collect likes, followers or shares.
17. Misrepresentation
Facebook will require people to connect on Facebook using the name that they go by in everyday life.
18. False news
Facebook says that there is also a fine line between false news and satire or opinion.
For these reasons, it won’t remove false news from Facebook, but, instead, significantly reduce its distribution by showing it lower in News Feed.
19. Memorialisation
Facebook will memorialise accounts of people who have died by adding “Remembering” above the name on the person’s profile.
The site will not remove, update or change anything about the profile or the account.
20. Intellectual property
Facebook users own all of the content and information that they post on Facebook, and have control over how it is shared through their privacy and application settings.
21. User requests
Facebook says it will comply with:
- User requests for removal of their own account
- Requests for removal of a deceased user’s account from a verified immediate family member or executor
- Requests for removal of an incapacitated user’s account from an authorised representative
22. Additional protection of minors
Facebook complies with:
- User requests for removal of an underage account
- Government requests for removal of child abuse imagery depicting, for example:
  - Beating by an adult
  - Strangling or suffocating by an adult
- Legal guardian requests for removal of attacks on unintentionally famous minors