
A federal moratorium on state AI rules is coming to a head. Why it matters

How state and local governments can regulate AI would be restricted under a proposal before Congress. AI industry leaders say the move would ensure the US can lead in innovation, but critics say it could mean fewer consumer protections around rapidly developing technologies.

The proposal, passed by the House of Representatives, says no state or political subdivision "may enforce any law or regulation regulating AI models, AI systems or automated decision systems" for 10 years. In May, the House added it to the full budget bill, which also includes extensions of the 2017 federal tax cuts and cuts to services such as Medicaid and SNAP. The Senate has made changes that would apply the moratorium only to states that accept funds under the $42.5 billion Broadband Equity, Access and Deployment (BEAD) program.


AI developers and some lawmakers say federal action is needed to keep states from creating a patchwork of different rules and regulations across the US that could slow the technology's growth. Since OpenAI's ChatGPT exploded onto the scene in late 2022, the rapid rise of generative AI has had companies rushing to incorporate the technology into as many spaces as possible. The economic stakes are significant, as the US and China race to see whose technology will dominate, but generative AI also poses privacy, transparency and other risks to consumers that lawmakers are trying to temper.

"[Congress has] done no meaningful legislation to protect consumers for years," Ben Winters, director of AI and privacy at the Consumer Federation of America, told me. "If the federal government fails to take action and then says no one else can take action either, that will only benefit tech companies."

Efforts to limit states' ability to regulate artificial intelligence could mean fewer consumer protections around a technology that increasingly penetrates every aspect of American life. "There is a lot of discussion at the state level, and I think it's important to approach this problem at multiple levels," said Anjana Susarla, a professor at Michigan State University who studies AI. "We can do it at the national level. We can do it at the state level. I think we need both."

Several states have begun regulating AI

The proposed language would bar states from enforcing any AI regulation, including rules already on the books. The exceptions are rules and laws that make AI development easier and those that apply the same standards to non-AI models and systems that do similar things. These kinds of regulations are already starting to pop up. The biggest focus is not in the US but in Europe, where the European Union has already implemented standards for AI. But states are starting to take action, too.

Colorado passed a set of consumer protections last year, set to take effect in 2026. California adopted more than a dozen AI-related laws last year. Other state laws and regulations often deal with specific issues, such as deepfakes, or require AI developers to publish information about their training data. At the local level, some regulations also address potential employment discrimination when AI systems are used in hiring.

"States are all over the map when it comes to what they want to regulate in AI," said Arsen Kourinian, a partner at the law firm Mayer Brown. So far in 2025, state lawmakers have introduced at least 550 AI-related bills, according to the National Conference of State Legislatures. In a House committee hearing last month, Rep. Jay Obernolte, a Republican from California, signaled a desire to get ahead of more state regulation. "We have a limited amount of legislative runway to be able to get that problem solved before the states get too far ahead of it," he said.

Although some states have laws on the books, not all of them have gone into effect or seen any enforcement. That limits the potential short-term impact of a moratorium, said Cobun Zweifel-Keegan, managing director in Washington for the International Association of Privacy Professionals. "There is no enforcement yet," he said.

A moratorium would likely deter state lawmakers and policymakers from developing and proposing new regulations, Zweifel-Keegan said. "The federal government would become the primary and potentially sole regulator around AI systems," he said.

What a moratorium on state AI regulation would mean

AI developers have asked for any guardrails placed on their work to be consistent and streamlined.

"We need, as an industry and as a country, one clear federal standard, whatever it may be," Alexandr Wang, founder and CEO of the data company Scale AI, told lawmakers at an April hearing. "But we need one. We need clarity as to one federal standard and have preemption to prevent this outcome where you have 50 different standards."

At a Senate Commerce Committee hearing in May, OpenAI CEO Sam Altman told Sen. Ted Cruz, a Republican from Texas, that an EU-style regulatory system would be "disastrous" for the industry. Altman suggested the industry develop its own standards instead.

Asked by Sen. Brian Schatz, a Democrat from Hawaii, whether industry self-regulation is enough, Altman said he thought some guardrails would be good, but "it's easy for it to go too far." (Disclosure: Ziff Davis, CNET's parent company, in April filed a lawsuit against OpenAI, alleging it infringed Ziff Davis copyrights in training and operating its AI systems.)

Not all AI companies are backing a moratorium, however. In a New York Times op-ed, Anthropic CEO Dario Amodei called it "far too blunt an instrument" and said the federal government should instead create transparency standards for AI companies. "Having this national transparency standard would help not only the public but also Congress understand how the technology is developing, so that lawmakers can decide whether further government action is needed," he wrote.

Concerns from companies, both the developers that create AI systems and the "deployers" that use them in interactions with consumers, often stem from fears that states will mandate significant work, such as impact assessments or transparency notices, before a product can launch, Kourinian said. Consumer advocates say more regulation is needed, and that hampering states' ability to act could hurt users' privacy and safety.

A moratorium on specific state rules and laws could result in more consumer protection issues being handled in court or by state attorneys general, Kourinian said. Existing laws around unfair and deceptive practices would still apply. "Time will tell how judges interpret those issues," he said.

Susarla said the pervasiveness of AI across industries means states might be able to regulate issues such as privacy and transparency more broadly, without focusing on the technology itself. But a moratorium on AI regulation could lead to such policies being tied up in lawsuits. "There has to be some kind of balance between 'we don't want to stop innovation,' but on the other hand we also need to recognize that there can be real consequences," she said.

A lot of policy around the governance of AI systems already happens through so-called technology-agnostic rules and laws, Zweifel-Keegan said. "It's also worth remembering that there are many existing laws, and there is potential for new laws that don't trigger the moratorium but still apply to AI systems as long as they apply to other systems as well," he said.

The proposed 10-year moratorium on state AI laws is now in the hands of the US Senate, whose Commerce, Science and Transportation Committee has held hearings on AI. (Photo: Nathan Howard/Bloomberg via Getty Images)

Will the AI moratorium pass?

The bill is now in the hands of the US Senate, and debate over the moratorium has picked up as more people become aware of the proposal. The measure did clear a significant procedural hurdle: the Senate parliamentarian ruled that it passes the so-called Byrd Rule, which requires that provisions included in a budget reconciliation bill actually deal with the federal budget. Winters told me that tying the moratorium to states' acceptance of BEAD broadband funds likely helped it clear that bar.

Whether it now passes in its current form is a political question rather than a procedural one, Winters said. Senators on both sides of the aisle, including Republicans Josh Hawley and Marsha Blackburn, have expressed concerns about taking regulatory authority away from the states.

"I do think that whether it will pass as currently written is a strong political question, if not a procedural one," he said.

Any bill the Senate approves would then have to go back to the House, where the budget bill passed by the narrowest of margins. Even some House members who voted for the bill have since voiced distaste for the moratorium, notably Rep. Marjorie Taylor Greene, a key ally of President Donald Trump. The Georgia Republican posted on X this week that she is "adamantly opposed" to the moratorium and would not vote for the bill if it returns to the House with the measure still included.

At the state level, a letter signed by 40 state attorneys general, from both parties, called on Congress to reject the moratorium and instead create a broader regulatory system. "This bill does not propose any regulatory scheme to replace or supplement the laws enacted or currently under consideration by the states, leaving Americans entirely unprotected from the potential harms of AI," they wrote.


