Republican governors oppose 10-year moratorium on state AI laws in GOP tax bill
28 comments
·June 28, 2025 · chisleu
norir
I would love to see a reverse Atlas Shrugged where all the programmers just stopped working and we could see how much the executive class could do without them through the magic of AI. As it stands, I feel most workers are increasingly facilitating their own dispossession.
greybox
You're a principal engineer who doesn't see the value in training juniors ...
chisleu
I did not say that I don't see the value in training juniors. I said that I don't have a need for them anymore. I can teach Claude in 1 API call what takes a day to walk a junior through.
Furthermore, I think we are going to find less and less work for Juniors to do because Seniors are blasting through code at a faster and faster pace now.
I'm not the only one saying that the entry level market is already getting trashed...
hyperliner
This sounds like “You are a manager who doesn’t see the value in training typists” or “You are a refrigerator seller who doesn’t see the value in training icemen.”
lazyeye
It's more than this.
You may think your job's not at risk because you're a plumber. But you're not realising that you will be competing with millions of new plumbers fleeing AI-decimated industries, pushing wages down dramatically.
And what if China wins on AI, and Huawei can now produce tech gear that is dramatically superior to and cheaper than global competitors'? Chinese tech would then dominate the globe, giving enormous power and control to the CCP.
chisleu
Absolutely right.
leptons
So "states' rights" doesn't really matter to these people like they've been saying it does.
siliconc0w
There are about a zillion examples of them citing a principle like 'states' rights' and then immediately abandoning it when it suits them, for things like gun control, abortion access, seizing control of a state's national guard, gender affirming care, etc.
The problem is that they are directionally correct in that it would be bad to have a patchwork of laws around AI, but the alternative is to leave it to Congress, which has consistently shown an inability to thoughtfully regulate or reform anything - just pass mega spending bills and increase the debt limit.
mikem170
> it would be bad to have a patchwork of laws around AI
Why would that be bad? And for whom?
Wouldn't it be better to have a variety of laws around something new, and figure out over time what is optimal? Wouldn't this be better than having one set of laws that can be more easily compromised via regulatory capture? Why the common assumption that bigger and more uniform is better? Is that to encourage bigger companies and bigger profits? Has that been a good thing?
siliconc0w
Because if you want to sell an AI product you now need to hire an army of lawyers to do the state-by-state compliance. This dramatically increases the costs and slows down critical innovation. Another common argument is that any regulation will allow China to 'win the AI race' but I don't entirely agree with that premise - it's not a 'race' and if China 'wins' it'll be because they largely use their debt to finance effective high ROI industrial policy rather than mega tax breaks.
peterhadlaw
"care"
baby_souffle
> So "states' rights" doesn't really matter to these people like they've been saying it does.
Nor does the deficit (and at least a dozen other big issues)
The term "performative bad faith" comes to mind...
frogperson
Just call it what it is. Fascism and authoritarianism.
FranzFerdiNaN
Yep. Conservatism only cares about one thing: protecting its own in-group while hurting the rest.
thrance
They have no values. The only thing one can find them consistently advocating for is their own selfish interests.
shortrounddev2
Republicans do not have principles, only an unceasing desire for power. Any time they quote some principle at you, they are lying. They are trying to manipulate your sense of fairness to cynically get what they want. They will stab you in the back at the first opportunity. Republicans can not be trusted under any circumstances
techpineapple
It’s wild, the dichotomy between libertarian and trad conservative in the Republican Party. You’ve got both the people who want to automate all the jobs away, and Tucker Carlson saying if FSD eliminates 2 million trucking jobs then we shouldn’t do it.
ETH_start
Libertarians don't believe that automation leads to fewer jobs being available. They look at the past 200 years of automation and see that as more tasks are automated, labor productivity simply increases.
techpineapple
I think this has changed. Historically, yes, you’re right, but I think modern thinkers either believe productivity will accelerate so we can have UBI (Sam Altman), or in some cases have a very utilitarian perspective that if we need fewer people, we need fewer people (Peter Thiel).
apwell23
they simply don't care about jobs numbers. chips fall where they may
GuinansEyebrows
'Dark Money' [0] describes how this ideological situation came to be. pretty interesting stuff.
chisleu
The way I feel about this is: 1. It's going to pass. 2. It's not going to get overturned by this Supreme Court. 3. It's going to have a bigger impact on the world than any other law.
Right now, The US and China are in an AI war. The US is doing everything it can to stop China from making progress on AI like it was a nuclear bomb. And it just might be that consequential in 10 years.
Where I am now is past the "3 sleepless nights" of 'Co-Intelligence' fame.
If you haven't seen a properly contexted (50k-100k tokens, depending on the size of the project(s)) LLM work in a code repo, then you have no idea why so many of us are terrified. LLMs are already taking jobs. My company laid off 7% of the workforce directly because of LLMs' impact. I say that not because the CEO said it, but because I see it in my day to day. I'm a Principal Engineer and I just don't have need of Juniors anymore. They used to be super useful because they were teachable, and after some training you could offload more and more work to them and free up your time to work on harder problems.
With MCPs, LLMs aren't limited to the editor window anymore. My models update my JIRA tickets for me and rip content from the wiki into its markdown memory bank, which is kept in the repo and accelerates everyone's work. They're connecting to databases to find out schemas and example column data. Shit, as I'm typing this, one is currently deploying a new version of a container to ECR/ECS/Fargate with terraform for a little project I'm working on.
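To make the database part concrete: the kind of schema lookup an MCP tool exposes is just a query the model can call instead of guessing column names. A minimal, self-contained sketch using Python's built-in sqlite3 (the function name and demo table are made up for illustration; a real MCP server would register something like this as a tool):

```python
import os
import sqlite3
import tempfile

def get_table_schema(db_path, table):
    """Return (column_name, column_type) pairs for a table -- the kind
    of lookup an MCP database tool could expose so the model can see
    real schemas instead of hallucinating them."""
    conn = sqlite3.connect(db_path)
    try:
        # PRAGMA table_info rows are (cid, name, type, notnull, default, pk).
        # Note: table names can't be parametrized, so a real tool should
        # validate `table` against sqlite_master before interpolating it.
        return [(r[1], r[2]) for r in conn.execute(f"PRAGMA table_info({table})")]
    finally:
        conn.close()

# Demo against a throwaway database file.
path = os.path.join(tempfile.mkdtemp(), "demo.db")
with sqlite3.connect(path) as c:
    c.execute("CREATE TABLE tickets (id INTEGER PRIMARY KEY, title TEXT)")

print(get_table_schema(path, "tickets"))  # [('id', 'INTEGER'), ('title', 'TEXT')]
```

The point isn't the SQL; it's that once a lookup like this is wired up as a tool, the model can fetch ground truth about your environment on demand.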
I believe we are in the very early days of this technology. I believe that over the next ten years, we are going to be inundated with new potential for LLMs. It's impossible to foresee all the changes this is going to bring to society. The use cases are going to explode as each tiny new feature or new mode evolves.
My advice is to get off of the sidelines and level up your skills to include LLM integrations. Understand how they work, how to use them effectively, how to program system integrations for them... agents especially! Agents can be highly effective at many use cases. For instance, an Agent that watches a JIRA board for new tickets which contain prompts to be executed in certain repos, then executes the prompt and creates a PR for the changes. All in a context that is fully aware of your environment, deployment, CI/CD, secrets management, etc.
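The JIRA-watching agent described above boils down to a poll-execute-PR loop. Here's a minimal sketch; `fetch_new_tickets`, `run_prompt_in_repo`, and `open_pr` are hypothetical stand-ins for the JIRA REST API, an LLM coding session run inside a repo checkout, and a git/PR integration:

```python
def fetch_new_tickets():
    # Stand-in for a JIRA query such as: status = "To Do" AND label = "agent".
    # The ticket carries the target repo and the prompt to execute there.
    return [{"key": "PROJ-101", "repo": "billing-service",
             "prompt": "Add retry logic to the payment client"}]

def run_prompt_in_repo(repo, prompt):
    # Stand-in for an LLM session executed inside a checkout of `repo`,
    # with full context on deployment, CI/CD, secrets management, etc.
    return f"diff for {repo}: {prompt}"

def open_pr(repo, ticket_key, diff):
    # Stand-in for pushing a branch and opening a pull request
    # that links back to the originating ticket.
    return f"PR opened on {repo} for {ticket_key}"

def poll_once():
    """One pass of the watch loop: new ticket -> run prompt -> open PR."""
    results = []
    for ticket in fetch_new_tickets():
        diff = run_prompt_in_repo(ticket["repo"], ticket["prompt"])
        results.append(open_pr(ticket["repo"], ticket["key"], diff))
    return results

print(poll_once())  # ['PR opened on billing-service for PROJ-101']
```

A production version would run this on a schedule, mark tickets in-progress so they aren't picked up twice, and leave a human to review the PR before merge.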
Anything will be possible sooner than we expect. It's going to impact the poorest people the most. A really cyberpunk reality could be upon us faster than we think, including starving masses struggling to get enough to even survive.