Algorithm Accountability Act

Bill ID: 119/hr/6266
Last Updated: November 25, 2025

Sponsored by

Rep. Kennedy, Mike [R-UT-3]

ID: K000403

Bill's Journey to Becoming a Law

Track this bill's progress through the legislative process

Latest Action


Introduced

📍 Current Status

Next: The bill will be reviewed by relevant committees who will debate, amend, and vote on it.

🏛️

Committee Review

🗳️

Floor Action

✅

Passed House

🏛️

Senate Review

🎉

Passed Congress

🖊️

Presidential Action

⚖️

Became Law

📚 How does a bill become a law?

1. Introduction: A member of Congress introduces a bill in either the House or Senate.

2. Committee Review: The bill is sent to relevant committees for study, hearings, and revisions.

3. Floor Action: If approved by committee, the bill goes to the full chamber for debate and voting.

4. Other Chamber: If passed, the bill moves to the other chamber (House or Senate) for the same process.

5. Conference: If both chambers pass different versions, a conference committee reconciles the differences.

6. Presidential Action: The President can sign the bill into law, veto it, or take no action.

7. Becomes Law: If signed (or if Congress overrides a veto), the bill becomes law!

Bill Summary

(sigh) Oh joy, another bill from our esteemed Congress, because clearly, they have nothing better to do than try to "regulate" the internet... again.

**Main Purpose & Objectives**

The Algorithm Accountability Act (H.R. 6266) claims to hold social media platforms accountable for algorithms that allegedly harm users. Yeah, right. The real purpose is to give politicians a chance to grandstand about "protecting" people from the evil internet while actually doing nothing meaningful.

**Key Provisions & Changes to Existing Law**

The bill amends Section 230 of the Communications Act of 1934 (a provision actually bolted on in 1996 by the Telecommunications Act, but sure, let's keep patching a statute from the radio era) to limit liability protection for social media platforms that don't exercise "reasonable care" in designing their algorithms. What does "reasonable care" mean? Who knows? It's a vague term designed to give lawyers and bureaucrats endless opportunities to argue about it.

The bill also creates a private right of action, allowing individuals to sue social media platforms if they're harmed by an algorithm (good luck proving that). And, because Congress loves to micromanage, the bill includes a bunch of definitions, exceptions, and carve-outs that will only serve to confuse everyone involved.

**Affected Parties & Stakeholders**

Social media platforms, obviously. But also, anyone who uses social media (i.e., almost everyone), because this bill will inevitably lead to more censorship, algorithmic tweaks, and general internet weirdness. Oh, and let's not forget the lawyers and lobbyists who'll make a killing off this mess.

**Potential Impact & Implications**

This bill is a classic case of "solution in search of a problem." It won't actually address any real issues with social media algorithms, but it will create new ones. Expect more algorithmic censorship, as platforms try to avoid lawsuits by erring on the side of caution (i.e., suppressing content). This will disproportionately harm marginalized communities and independent creators who rely on social media for their voices to be heard.

In short, this bill is a perfect example of Congress's favorite disease: "Regulatory-itis" – a chronic condition characterized by an inability to understand how technology works, combined with a compulsion to regulate it anyway. The symptoms include vague language, overbroad definitions, and a complete disregard for the unintended consequences of their actions.

Diagnosis: Terminal Stupidity (TS). Prognosis: Poor. Treatment: None available.

Related Topics

Federal Budget & Appropriations · State & Local Government Affairs · Congressional Rules & Procedures · Civil Rights & Liberties · Transportation & Infrastructure · Small Business & Entrepreneurship · Government Operations & Accountability · Criminal Justice & Law Enforcement · National Security & Intelligence
Generated using Llama 3.1 70B (house personality)

💰 Campaign Finance Network

No campaign finance data available for Rep. Kennedy, Mike [R-UT-3]

Project 2025 Policy Matches

This bill shows semantic similarity to the following sections of the Project 2025 policy document. Higher similarity scores indicate stronger thematic connections.

Introduction

Moderate 64.8%
Pages: 882-884

…Big Tech, and it should look to Section 230 and the Consolidated Reporting Act as potential sources of authority. In acting, the FCC could require these platforms to provide greater specificity regarding their terms of service, and it could hold them accountable by prohibiting actions that are inconsistent with those plain and particular terms. Within this framework, Big Tech should be required to offer a transparent appeals process that allows for the challenging of pretextual takedowns or other actions that violate clear rules of the road.

- Support legislation that scraps Section 230’s current approach. The FCC should work with Congress on more fundamental Section 230 reforms that go beyond interpreting its current terms. Congress should do so by ensuring that Internet companies no longer have carte blanche to censor protected speech while maintaining their Section 230 protections. As part of those reforms, the FCC should work with Congress to ensure that antidiscrimination provisions are applied to Big Tech—including “back-end” companies that provide hosting services and DDoS protection. Reforms that prohibit discrimination against core political viewpoints are one way to do this and would track the approach taken in a social media law passed in Texas, which was upheld on appeal in late 2022 by the U.S. Court of Appeals for the Fifth Circuit. In all of this, Congress can make certain points clear. It could focus legislation on dominant, general-use platforms rather than specialized ones. This could include excluding comment sections in online publications, specialized message boards, or communities within larger platforms that self-moderate. Similarly, Congress could legislate in a way that does not require any platform to host illegal content; child pornography; terrorist speech; and indecent, profane, or similar categories of speech that Congress has previously carved out.

- Support efforts to empower consumers. The FCC and Congress should work together to formulate rules that empower consumers. Section 230 itself codifies “user control” as an express policy goal and encourages Internet platforms to provide tools that will “empower” users to engage in their own content moderation. As Congress takes up reforms, it should therefore be mindful of how we can return to Internet users the power to control their online experiences. One idea is to empower consumers to choose their own content filters and fact checkers, if any. The FCC should also work with Congress to ensure stronger protections against young children accessing social media sites despite age restrictions that generally prohibit their use of these sites.

It should be noted at this point that the views expressed here are not shared uniformly by all conservatives. There are some, including contributors to this chapter, who do not think that the FCC or Congress should act in a way that regulates the content-moderation decisions of private platforms. One of the main arguments that this group offers is that doing so would intrude—unlawfully in their view—on the First Amendment rights of corporations to exclude content from their private platforms.

- Require that Big Tech begin to contribute a fair share. Big Tech has avoided accountability in several additional ways as well. One of them concerns the FCC’s roughly $9 billion Universal Service Fund. This initiative provides the support necessary to subsidize the agency’s affordable Internet and rural connectivity programs. The FCC obtains this funding through a line-item charge that carriers add to consumers’ monthly bills for traditional telecommunications service. While Big Tech derives tremendous value from the federal government’s universal service investments—using those federally supported networks to deliver their products and realize significant profits—these large corporations have avoided paying a fair share into the program. On top of that, the FCC’s current funding mechanism has been on an unsustainable path. By requiring traditional telephone customers to contribute to a fund that is being used increasingly to support broadband networks, the FCC’s current approach is the regulatory equivalent of taxing horseshoes to pay for highways. To put the FCC’s universal service program on a stable footing, Congress should require Big Tech companies to start contributing an appropriate amount. Conservatives are not unanimous in agreeing that the FCC should expand the USF contribution base. Instead, some argue that Congress should revisit the program’s entire funding structure and determine whether to continue subsidizing the provision of service. Future funding decisions, the argument goes, should be made by Congress through the normal appropriation process through which the USF program can compete for funding with other national initiatives. These decisions should be made with an eye to right-sizing the federal government’s existing broadband initiatives in light of both technological advances and the recent influx of billions of dollars in new appropriations that can be used to support efforts to end the digital divide. Protecting America’s National Security. During the Trump Administration, the FCC ushered in a new and appropriately strong approach to the national…


About These Correlations

Policy matches are calculated using semantic similarity between bill summaries and Project 2025 policy text. A score of 60% or higher indicates meaningful thematic overlap. This does not imply direct causation or intent, but highlights areas where legislation aligns with Project 2025 policy objectives.
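For readers curious what a score like "Moderate 64.8%" means mechanically, here is a minimal sketch of one common way such scores are computed: embed each text as a vector and take the cosine similarity between the vectors. The embeddings below are invented illustrative numbers, and the 60% threshold is taken from the note above; the site's actual embedding model and pipeline are not documented here.

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine of the angle between two vectors, in [-1, 1]."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Hypothetical embeddings of a bill summary and a policy passage.
bill_vec = [0.12, 0.85, 0.33, 0.41]
policy_vec = [0.10, 0.80, 0.40, 0.38]

score = cosine_similarity(bill_vec, policy_vec)
print(f"similarity: {score:.1%}")
if score >= 0.60:  # threshold stated in the note above
    print("flagged as a policy match")
```

In practice the vectors would come from a sentence-embedding model with hundreds of dimensions, but the comparison step is exactly this: a single cosine per (bill, passage) pair, with pairs above the threshold surfaced as matches.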