Facebook steps up battle against fake news in groups and messaging

Facebook’s like logo is seen on an Android mobile phone. (Getty Images)

Facebook says its users are posting more in the social network’s private spaces, including groups and messaging, a shift that could make it harder for the tech giant to police offensive content.

On Wednesday, the company said it’s taking new steps to stop misinformation, scams and other “problematic” content from going viral on the platform, with some of the changes applying to private Facebook groups, which let users post content only group members can see. The company has been criticized in the past for not doing enough to stop misinformation about vaccines and other topics from spreading in groups.

The new steps offer a glimpse into how the world’s largest social network, which has more than 2 billion users worldwide, is moderating content as users pivot to sharing more information privately.

Facebook’s vice president of integrity, Guy Rosen, speaks at a press conference on Wednesday at the tech company’s headquarters in Menlo Park, California. (Queenie Wong/CNET)

“Ultimately, the balance between protecting people’s privacy and protecting public safety is something that societies have been grappling with for centuries probably, and we’re certainly grappling with it,” Guy Rosen, Facebook’s vice president of integrity, said during a press conference at the company’s Menlo Park, California, headquarters.

The company said that in the coming weeks it’ll start looking at how administrators and moderators of Facebook groups decide what content to keep up. That’ll help Facebook determine whether a group is violating the social network’s rules. The company is also launching a Group Quality feature so group administrators can see what content was removed and flagged, including fake news. Facebook groups that repeatedly share misinformation will appear lower in the social network’s News Feed.

Facebook has community standards that prohibit hate speech, nudity, violence and other offensive content. Misinformation and clickbait, though, don’t always violate Facebook’s rules, unless there’s a risk of offline violence or the content is trying to discourage or prevent people from voting.

Antivaccine content, for example, can fall in a “gray area” because it’s challenging to link content to something that happens offline, said Tessa Lyons, Facebook’s head of News Feed integrity. 

Facebook said it’s been using technology, human reviewers and user reports to flag and remove content in groups that violates its rules, even if the groups aren’t public. That’s allowed Facebook to proactively detect offensive content even before someone reports it to the company, Rosen said.

The company said it’ll also soon let people remove their posts and comments from a group even if they’re no longer a member.

This week, Facebook is also adding a verified badge for high-profile people in its messaging app, helping users tell whether a scammer is impersonating someone else. Earlier this year, as part of an effort to combat misinformation, the company released a tool to let users know if a message has been forwarded.

The social network unveiled a variety of other steps it’s taking to combat fake news, following criticism that its efforts aren’t working well enough. Facebook said it’s working with journalists, fact-checking experts, researchers and other groups to find new solutions to fight misinformation more quickly. The Associated Press, which reportedly stopped fact-checking for the company in February, is returning to fact-check videos and Spanish content in the US.

Facebook acknowledged it still has more work to do as user behavior on the site changes.

Users are sharing photos and videos that vanish in 24 hours via a feature called Stories. That makes policing the content challenging.

“Now there’s a clock ticking and that’s actually a huge amount of pressure,” said Alex Deve, Facebook’s product management director who works on Stories.

Users can also use text, stickers or drawings to change a photo or video in a way that violates Facebook’s rules. Someone could also string together images of a product, a price tag and an email address to sell an item the social network doesn’t allow, such as guns or drugs. Individually, each image might be fine, but put together they would violate Facebook’s rules.

“We actually don’t have all the answers,” Deve said. “There’s a lot of things here we are learning.”

Originally published April 10, 10 a.m. PT.
Update, 12:17 p.m.: Adds remarks from Facebook’s press conference.
Update, 2:20 p.m.: Adds more background about Facebook Stories.