INTERVIEW: Future of Life Institute’s Jason Van Beek outlines the risks of federal AI preemption pushed by Congress
THE LOWDOWN:
Congress is pushing a ten-year federal preemption on artificial intelligence (AI) technology, facing bipartisan pushback from states, organizations, and some federal lawmakers.
Jason Van Beek, FLI’s chief government affairs officer, outlined the risks of the federal preemption in an interview with the Washington Reporter, saying that the past couple of years have seen an effort by state legislatures “to regulate AI in some aspects.”
Van Beek noted that one of the “many risks that people are seeing with this development of AI and super intelligence” is that of job replacement, “and not just like one of the kinds of jobs, but whole categories.”
Forty state attorneys general sent a bipartisan letter calling on Congress to remove the proposal, and several Republican lawmakers on Capitol Hill have been outspoken against it, forming strange bedfellows with unlikely allies.
In technology, Moore’s Law holds that the number of transistors on a microchip doubles approximately every two years. That pace has largely held through the rapid technological advancements of the past several decades.
Now, artificial intelligence (AI) has entered the chat, and is growing much, much faster. The conversation around AI recently reached a fever pitch in the political world as well, with Congress eyeing a ten-year federal preemption on the technology.
The move by Congress, which would prohibit states from enforcing AI-related laws for ten years, has ruffled bipartisan feathers in state governments. Forty state attorneys general sent a bipartisan letter calling on Congress to remove the proposal, and several Republican lawmakers on Capitol Hill have been outspoken against it, forming strange bedfellows.
On top of the political response to the proposal, groups like the Future of Life Institute (FLI) have published their own action plans with recommendations for the federal government. FLI has also taken its fight to the airwaves with a new six-figure ad campaign against the federal AI preemption.
Jason Van Beek, FLI’s chief government affairs officer, outlined the risks of the federal preemption in an interview with the Washington Reporter.
Van Beek told the Reporter that the past couple of years have seen an effort by state legislatures “to regulate AI in some aspects.”
“Some narrow aspects, some broader aspects, and this has been something that the AI companies have decided it’s probably not in their best interest, to put it mildly, and have decided to lobby Congress to preempt those efforts,” Van Beek said.
Van Beek noted the provision in President Donald Trump’s Big, Beautiful Bill that contains a “sweeping preemption for all of these efforts by the states and localities to regulate AI.”
“And so it would make all of those efforts or all of those laws non-binding, non-enforceable, and irrelevant,” Van Beek said. “The unfortunate part of what Congress is trying to do here is that what they’re passing is just a preemption. There’s no corresponding regulation of AI companies that is paired with it.”
“What is going on here is just a sweeping preemption,” he said. “No rules at all for AI companies and anything the states have passed is irrelevant,” he continued. “And, basically, the end result is that the sandwich shop down the street is going to have more rules to follow than some of these huge AI companies [that are] very much associated with Big Tech.”
Van Beek noted that one of the “many risks that people are seeing with this development of AI and super intelligence” is that of job replacement, “and not just like one of the kinds of jobs, but whole categories.”
“That’s sort of a sober prediction of some of these folks,” Van Beek added. “And… there’s just no effort at all, as far as I can tell, to come to terms with that. It’s just basically, let’s go full speed ahead with developing this technology, let the companies run wild, and just be completely unprepared for some of these foreseeable aspects of this.”