
Is social media doing enough to curb hate speech?

Many recent articles have explored social media's role in hate speech.

GOLDEN VALLEY, Minn. -- Social media allows us to share parts of our lives online, connect with others and express our views. But a recent New York Times article dug into the ugly side of social media.

Before the Pittsburgh synagogue attack, the shooting suspect had made anti-Semitic posts on a social networking site called Gab. The site is temporarily offline, but its CEO Andrew Torba said that "Gab isn't going anywhere" and to "please keep pointing the finger at a social network instead of pointing the finger at the alleged shooter who holds sole responsibility for his actions."

But many recent articles have explored social media's role in hate speech.

On Monday, the New York Times searched one anti-Semitic hashtag on Instagram and found nearly 11,700 posts.

"You can go online, you can find other people that support your ideologies and it kind of creates this negative feedback loop," said Lucas Youngvorst.

Youngvorst teaches in the Communication Department at Normandale Community College in Bloomington. He has studied how social media shapes our interactions.

"I think particularly in light of recent events, particularly in light of how our political climate is, it's more important than ever to explore technology and how we're using that to interact with others," Youngvorst said.

The New York Times reported that Facebook, Twitter and YouTube have all announced plans to invest in artificial intelligence and other technology that finds and removes content, like hate speech, from their sites. But Facebook told the New York Times that this year only 38 percent of hate speech on its site was flagged by its systems, while it was able to track down and remove 96 percent of adult nudity and more than 99 percent of terrorist content.

Facebook said during a hearing related to Russian interference in the 2016 election that it would add 10,000 more people to work on safety and security issues by the end of 2018. The CEO of YouTube also announced last year that Google wanted more than 10,000 people working in 2018 to address content that might violate the company's policies.

"Online, all it takes is the click of a 'like' button on some sort of hate speech meme that then expresses this ideology. You didn't have to form an argument, you didn't have to support your opinion," Youngvorst said. "We start to think that it has less of an effect. Oh, all I did was click a 'share' button. All I did was click a 'like' button. But soon when that post masses over 100,000 likes, it starts to get a viewpoint of this is a popular opinion. And what to you is a single 'like' click, has now spanned into a viewpoint that's supported by hundreds of thousands of people. That's a major effect."
