
Pallone Opening Remarks at Hearing on Social Media’s Role in Radicalizing America

September 24, 2020

Energy and Commerce Chairman Frank Pallone, Jr. (D-NJ) delivered the following opening remarks at today's Consumer Protection and Commerce Subcommittee hearing titled "Mainstreaming Extremism: Social Media's Role in Radicalizing America":

Extremists who sow chaos and division, incite violence and impose their radical beliefs and ideologies remain an ever-present threat to our welfare, society and way of life. The magnitude of that threat has become painfully clear in recent years.

Eleven worshipers massacred in a rampage at a synagogue in Pittsburgh; more than 46 shoppers killed or injured in a mass shooting targeting 'Mexicans' at a shopping center in El Paso, Texas; nine Black worshipers murdered by a white supremacist during a Bible study in Charleston, South Carolina; two killed and one wounded by a vigilante shooter in Kenosha, Wisconsin.

Extremists are even exploiting COVID-19 to divide our nation in this time of crisis. According to the New Jersey Office of Homeland Security and Preparedness, in March, a New Jersey white supremacist group altered its propaganda and social media posts to falsely claim that immigrants and Jewish people are behind the pandemic. Its members have since attended anti-lockdown events to push these egregiously false claims.

There is a lot of anger and resentment across the country. Just last night, we saw pleas for justice in the Breonna Taylor case left unanswered by the judicial system. This led to violence in the streets, including the shooting of two police officers. Such violence is absolutely unacceptable, and we must work to promote civility and unity. Peacefully assembling and seeking redress from the government is a Constitutional right. But I fear that extremists online and in the streets are taking advantage of these turbulent times to divide our nation and threaten its people.

Acts of violence, disruption and misinformation represent a troubling trend of increasing extremist activity across the country. Four of the six most deadly years for domestic extremist killings have occurred since 2015. And experts warn that current political and social polarization, coupled with increasing economic disparity, creates ideal conditions for even further rises in extremism.

While extremism is not new, methods for indoctrinating, radicalizing and mobilizing individuals have evolved over the past decade. The internet, an efficient tool for disseminating hate and radical ideologies once relegated to the fringes of society, has become the predominant incubator for extremism.

And social media platforms have served as the primary outlets for bigotry, conspiracy theories and incitements to violence. Social media companies represent their platforms as forums for connecting people, but extremists are exploiting that same ability to connect like-minded and susceptible individuals in order to recruit and radicalize.

We have seen too many stories of young people, many of whom are socially and physically isolated, who find a sense of belonging with people who turn out to be members of a hate group. They are lured in by progressively more extreme content and then risk losing these new social connections if they do not conform. And conforming too often includes taking part in acts of hatred, intimidation and even violence.

This is deeply concerning, and the activity is magnified by social media's business model, which only makes the problem worse. The primary goal of social media platforms is gaining more active users. Their profitability depends on growth and engagement — more eyes on their pages for longer periods of time mean more advertising dollars. And as research has shown, lies, outrage and novelty are more engaging than neutral, bland content.

It's not that employees at these companies make a conscious choice to allow this hateful and harmful content on their sites, but their algorithms are programmed to optimize for growth and engagement, often without regard for the content that achieves it. These algorithms don't just let harmful content exist on the platform; they actively amplify it.

The deluge of harmful content drowns out counter-messaging — content that could correct misinformation, undercut extremists' narratives and discredit the ideologies and actions of violent extremists. Because people are already overwhelmed with information, this bad speech cannot be corrected solely with more speech.

Rather than acknowledge and tackle the root of the problems on their platforms, social media companies address only their public relations aspects. They rely heavily on user reporting to address extremist content, allowing disinformation and violent, inciting content to be viewed thousands of times before it's flagged or removed.

What's more, there is no doubt that some political figures are fanning the flames of extremism. This toxic political rancor is contributing to the rise of right-wing extremism, which has become the greatest terrorist threat facing our nation today. The majority of all terrorist incidents in the U.S. since 1994 were committed by right-wing extremists.

We've had several hearings over the past few years trying to shine a light on these problems and imploring social media companies to act. It's clear they won't do it on their own. I look forward to hearing from our witnesses today about their experiences and their ideas for how Congress can help.