Facebook publishes internal guidelines for banning posts

Facebook is giving its users more insight into what is and isn’t permitted on the social network.

Josh Edelson/AFP/Getty Images

Facebook wants to be more transparent about what is and isn’t allowed on the world’s biggest social network.

On Tuesday, the company released an expanded version of its Community Standards, the rules that determine what’s acceptable for its 2.2 billion users to post.

Facebook’s rules themselves haven’t changed. What’s new is the release of the detailed guidelines its content moderators use to handle objectionable material. Previously, users could see only surface-level descriptions of what they couldn’t post. Now the guidelines spell out how Facebook handles specific situations and defines specific terms.

For example, the social network says it defines a “mass murder” as a murder that “results in 4 or more deaths in one incident.” And in the section on harassment, Facebook says people can’t send a message that “calls for death, serious disease or disability, or physical harm” or “claims that a victim of a violent tragedy is lying about being a victim.”

Facebook is also expanding its rules around appeals. Before, you could request an appeal only if your Facebook profile, Page or Group was taken down; now you can also challenge the social network over the removal of an individual piece of content. Users can likewise appeal Facebook’s decision to keep up content they’d reported as a violation of the company’s rules.

“These standards will continue to evolve as our community continues to grow,” Monika Bickert, vice president of product policy and counterterrorism, said last week at a press briefing in Facebook’s Menlo Park, California, headquarters. “Now everybody out there can see how we’re instructing these reviewers.”

Facebook has been in the hot seat since last month’s scandal involving Cambridge Analytica, a digital consultancy that improperly accessed data on up to 87 million Facebook users without their consent. The controversy has put several of Facebook’s policies and practices under the microscope.

Bickert says the new transparency around Facebook’s Community Standards doesn’t have anything to do with that controversy, however.

“I’ve been in this job for five years,” Bickert said. “We’ve wanted to do this for that entire time.”

After Facebook published the guidelines, the Anti-Defamation League applauded Facebook for its transparency, but said the company needs to go further. The organization wants Facebook to work with independent organizations and academic researchers “to open up Facebook’s data around hate speech for study.”

“It is imperative for Facebook to explain how hate content spreads on the platform, and how their policies are enforced in ways consistent with both internal standards and with the ethical standards of civil society,” the ADL said in a statement. 

Hot seat

Facebook has been under pressure to clarify its moderation guidelines since the 2016 election, when Russian trolls abused Facebook with a combination of paid ads and organic posts to sow discord among American voters. Many conservatives have also criticized the platform for what they see as political bias.

When Mark Zuckerberg was grilled by Congress two weeks ago, lawmakers repeatedly asked him about what is — and isn’t — allowed on Facebook.

Rep. David McKinley, a Republican from West Virginia, mentioned illegal listings for opioids posted on Facebook, and asked why they had not been taken down. Other Republican lawmakers asked why the social network had removed posts by Diamond and Silk, two African-American Trump supporters with 1.6 million Facebook followers.

In November, Facebook said it would expand its content moderation workforce to 20,000 people, up from 10,000 last year. In his testimony to Congress, Zuckerberg said the real breakthrough will come when artificial intelligence tools are able to proactively police the platform’s content, though it will take “years” before that kind of technology is good enough.

In the meantime, Bickert said Facebook’s moderators do a good job overall of taking down inappropriate material. Still, some things fall through the cracks.

“We have millions of reports every week,” Bickert said. “So even if we maintain 99 percent accuracy, there’s still going to be mistakes made.”

Facebook has also talked about further expanding its appeals process to include opinions of people outside the company. In an interview with Vox earlier this month, Zuckerberg mentioned the idea of a Facebook “Supreme Court,” made up of independent members who don’t work for the company. Their role would be to make the “final judgement call” on what’s acceptable speech on Facebook.

Bickert didn’t address that idea last week, but said the company is “always exploring new options” for appeals.

Facebook also said it wants community input on how its guidelines should evolve. In May, it’s launching a new forum called Facebook Open Dialogue to get feedback on its policies. The first events will take place in Paris, Berlin and the UK. Events in the US, India and Singapore are planned for later this year.

First published April 23, 5:36 p.m. PT.
Update, April 24 at 10:36 a.m. PT: Adds statement from the Anti-Defamation League
