Hateful remarks flood YouTube livestream of congressional hearing on hate

Alexandria Walden, Google’s counsel for free expression and human rights, testifies at a House Judiciary Committee hearing about the rise of white nationalism on social media sites.

Getty Images

YouTube shut down the comment section on its livestream of a congressional hearing on white nationalism Tuesday after the section filled with hateful comments, underscoring the problem lawmakers had gathered to discuss.

Many of the comments expressed anti-Semitic views or decried multicultural societies. Others expressed white pride.

The comments overwhelmed the livestream as Neil Potts, Facebook’s public policy director, and Alexandria Walden, counsel for free expression and human rights at Google, appeared before the House Judiciary Committee to discuss the role of the platforms in the rise of white nationalism. Both companies have been under mounting pressure to combat hate speech following a string of hate-ridden events, including a white nationalist rally in Charlottesville, Virginia, in 2017 that led to one death; a shooting at a Pittsburgh synagogue last year that killed 11; and a terrorist rampage in New Zealand earlier this year that left 50 Muslim worshippers dead.

The New Zealand shootings renewed pressure on tech companies because the gunman livestreamed part of the attack on Facebook. The video spread to other social media sites, including Twitter and Google-owned YouTube, raising questions about the ability of tech companies to combat hate speech.

The flood of toxic comments during the congressional hearing demonstrated the difficulty tech companies, which often rely on users to flag inappropriate comments, have in monitoring activity on their platforms.

On Tuesday, YouTube opted to shut down a basic element of its service to control the problem.

“Hate speech has no place on YouTube. We’ve invested heavily in teams and technology dedicated to removing hateful comments/videos,” YouTube said in a tweet. “Due to the presence of hateful comments, we disabled comments on the livestream of today’s House Judiciary Committee hearing.”

Before the four-hour hearing was over, anonymous YouTube users had made racist and anti-Semitic comments on the platform. The irony wasn’t lost on observers.

CNN reporter Donie O’Sullivan said in a tweet that having hateful comments appear beside a live video of a congressional hearing about hate on social media was “the most meta thing today.”

Rep. Jerry Nadler, chairman of the House Judiciary Committee and a New York Democrat, emphasized the seriousness of the issue.

“White nationalism and its proliferation online have real consequences,” Nadler said. “Americans have died because of it.”

Facebook’s Potts told the committee that it isn’t “simple” for the world’s largest social network to decide which posts it should keep and which it should pull down, because of the huge amount of information that flows through the site. Facebook has more than 2 billion users worldwide.

The remark was echoed by Google’s Walden, who said removing hate speech can be complex because content may be offensive but not violate YouTube’s policies against hate speech or inciting violence. It’s also contentious because there are disagreements on where to draw the line between political speech and hate speech.

“Overaggressive enforcement can also inadvertently silence voices that are using the platform to make themselves heard on these important issues,” Walden said.

Neutral platforms or editorial publications?

At one point during the hearing, officials from Google and Facebook were asked if their sites were neutral platforms or editorial publications. Potts said Facebook is a tech company. Walden said Google’s YouTube is a “free and open platform” for users to upload their own content.

Other witnesses included representatives of civil rights groups that have urged both tech firms and the government to take action against hate speech.

Eileen Hershenov, senior vice president of policy at the Anti-Defamation League, said the resurgence of white supremacy has been fueled in part by social media sites, including fringe websites Gab and 8chan. The New Zealand mosque shooter used 8chan, a message board, to share his Facebook Live video.

White supremacists marching in Charlottesville, Virginia, in August 2017.

Samuel Corum/Getty Images

“These platforms are like round-the-clock digital white supremacist rallies, creating online communities that amplify their vitriolic fantasies,” she said.

Candace Owens, a conservative activist and commentator who testified during the hearing, accused lawmakers of holding the hearing for political gain. Owens was named in a manifesto written by the man accused of carrying out the New Zealand shootings.

“It’s about fear-mongering, power and control,” Owens said about the hearing.

In several cases, questions from lawmakers highlighted their limited understanding of how some of the world’s biggest tech platforms work. Some asked Facebook whether users can report a post or if Instagram has the same rules as the social network.

Officials from Facebook and YouTube tried to assure lawmakers that they’ve been stepping up their efforts to combat hate speech, but they also emphasized some of the challenges they face. Both companies said they’ve invested in people and technology to help flag and remove hate speech before it spreads but are also balancing safety with giving people a voice.

Their efforts, however, weren’t evident in YouTube’s comment section, which caught the attention of lawmakers.

During the hearing, Nadler read snippets from a Washington Post story about the hateful comments that had appeared during the livestream. “This just illustrates part of the problem we’re dealing with,” he said.

On Wednesday, a Senate committee will hold a hearing titled “Stifling Free Speech: Technological Censorship and the Public Discourse.” Representatives from Facebook, Google and Twitter are expected to attend.

It’s unclear whether the comment section in that livestream will be open.

Originally published April 8, 5 a.m. PT.
Updates, 1:29 p.m.: Adds info on scheduled Senate hearing; April 9, 6 a.m.: Includes more information about Christchurch attack; 11:01 a.m.: Adds background from hearing; 12:36 p.m.: Includes comments from Nadler.

