Pallone Stresses Need to Hold Big Tech Accountable by Sunsetting Section 230
"I reject Big Tech’s constant scare tactics about reforming Section 230. Reform will not 'break the internet' or hurt free speech. The First Amendment – not Section 230 – is the basis for our nation’s free speech protections and those protections will remain in place regardless of what happens to Section 230."
Energy and Commerce Committee Ranking Member Frank Pallone, Jr. (D-NJ) delivered the following opening remarks today at a Communications and Technology Subcommittee hearing on “Legislative Proposal to Sunset Section 230 of the Communications Decency Act”:
Today we continue the Committee’s work of holding Big Tech accountable by discussing draft legislation that Chair Rodgers and I circulated that would sunset Section 230 of the Communications Decency Act at the end of 2025. While I believe that Section 230 has outlived its usefulness and has played an outsized role in creating today’s “profits over people” internet, a sunset gives us time to have a serious conversation about what concepts are worth keeping.
Section 230 was codified nearly 30 years ago as a “Good Samaritan” statute designed to allow websites to restrict harmful content. While it was intended to be just one part of the Communications Decency Act, it was almost immediately left to exist on its own when most of that Act was struck down as unconstitutional. Section 230 was written when the internet largely consisted of simple websites and electronic bulletin boards. Today, the internet is dominated by powerful trillion-dollar companies. Many of these companies have made their fortunes using sophisticated engagement and recommendation algorithms and artificial intelligence to harvest and manipulate our speech and data, all in an effort to maximize the time we spend on their platforms and to sell advertising.
Unfortunately, these platforms are not working for the American people, especially our children. But that shouldn’t surprise us. These companies aren’t required to operate in the public interest, like broadcasters. Nor do they have robust editorial standards, like newspapers. They aren’t a regulated industry, like so many other important sectors of our economy. The vast majority are publicly traded companies with a singular duty under corporate law – to maximize value for their shareholders by increasing their profits.
As a result, they face constant pressure to grow their user base, which these days means hooking children and teens. They introduce addictive features to keep us watching and clicking. They exploit our data to develop granular profiles on each of us to sell advertising, and then provoke our emotions to monetize our engagement. And with Section 230 operating as a shield from liability when people are harmed, making money remains the primary factor driving decisions.
As a result, provocative videos glorifying suicide and eating disorders, dangerous viral challenges, horrific child abuse images, merciless bullying and harassment, graphic violence, and other pervasive and targeted harmful content are being fed nonstop to children and adults alike. Just this week, a popular event and ticketing platform was found to have been promoting illegal opioid sales to people searching for addiction recovery gatherings. We deserve better.
The fact that Section 230 has operated as a near complete immunity shield for social media companies is due to decades of judicial opinions trying to parse its ambiguities and contradictions. Judges have attempted to apply it to technologies and business models that could not have been envisioned when it was drafted. The courts have expanded on Congress’s original intent and have created blanket protections for Big Tech that have resulted in these companies operating without any transparency or accountability. I do not believe that anyone could come before us now and credibly argue that we should draft Section 230 the same way today.
Despite all of this, some courts have started to scrutinize the limits of Section 230 more closely. Moreover, major search engines have recently begun to substitute their own AI-generated content for search results that direct users to third-party sites. Not only does this demonstrate an intentional step outside the shelter of Section 230’s liability shield and raise significant questions about its future relevance, but it also upsets settled assumptions about the economics of content creators and the reach of user speech.
It is only a matter of time before more courts chip away at Section 230, or the Supreme Court or technological progress upends it entirely. And state legislatures are growing impatient, increasingly passing bills seeking to impose liability on tech platforms. But, for now, we are left with the status quo – a patchwork where, more often than not, Bad Samaritans receive broad protection from a statute intended to promote decency on the internet.
Congress should not wait for the courts—we should act. Our bipartisan draft legislation would require Big Tech and others to work with Congress over the next 18 months to develop and enact a new legal framework that works for the internet of today. I believe we can work together to develop a framework that restores the internet’s intended purpose of free expression, prosperity, and innovation.
And I reject Big Tech’s constant scare tactics about reforming Section 230. Reform will not “break the internet” or hurt free speech. The First Amendment – not Section 230 – is the basis for our nation’s free speech protections and those protections will remain in place regardless of what happens to Section 230.
We simply cannot allow Big Tech to continue to enjoy liability protections that no other industry receives. I look forward to the discussion, and I yield back the balance of my time.
###