Ask HN: Is your company forcing use of AI?
I know Microsoft has slowly started to force everyone to use AI. Is it also happening in your company?
61 comments · July 1, 2025
peanutz454
Not forced, but encouraged. Everyone used it earlier without revealing it; now it's out in the open. Knowing how and when to use your tools properly is a good idea.
InfamousRece
At least 20% of code must be AI-generated, with a goal of at least 80% by the end of the year. The CEO declared that vibe coders create better solutions because they are “goal oriented”, as opposed to traditional coders, who are “process oriented”.
dep_b
We have Cursor. It's a step back from using Claude + MCP, and it hallucinates a lot because of poor context management. But that's not the real reason I'm using LLMs less than before:
* The codebase consists of many modules with their own repo each
* The test back end has a lot of gotchas, borked settings or broken ongoing work by the back-end team
* The code is incredibly layered
I'm spending up to 10% of my time writing actual code. The rest is overhead: multi-repo PRs, debugging, talking to people, et cetera.
Once I've found the issue, the code is the easy part, and explaining it all to the LLM is more work than just writing it.
Assistive coding tools need to get a lot better.
c12
Not forced, but the tooling has been made available to those who ask. Work has provided Microsoft Copilot through Teams and GitHub Copilot through my IDE of choice.
I found the Microsoft Copilot to be reasonably good when given complete context with extremely limited scope, such as being handed the WSDL for a SOAP service and asked to write functions that make calls, then unit tests for the whole thing (roughly the shape of the sketch below). This had a right way and a wrong way of doing things, and it did it almost perfectly.
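For illustration, a minimal sketch of that kind of task, assuming a hypothetical order-status WSDL and Python's zeep SOAP client; the endpoint, operation, and field names here are all made up:

    # Thin wrapper around one SOAP operation from a WSDL, plus a unit test.
    # Hypothetical endpoint and operation; zeep is a common Python SOAP client.
    import unittest
    from unittest import mock

    from zeep import Client

    WSDL_URL = "https://example.com/orders?wsdl"  # hypothetical WSDL

    def get_order_status(order_id: str) -> str:
        """Call the (hypothetical) GetOrderStatus operation from the WSDL."""
        client = Client(WSDL_URL)  # parses the WSDL and exposes typed operations
        response = client.service.GetOrderStatus(orderId=order_id)
        return response.status

    class GetOrderStatusTest(unittest.TestCase):
        def test_returns_status_field(self):
            # Patch the SOAP client so the test never touches the network.
            with mock.patch("__main__.Client") as fake_client:
                fake_client.return_value.service.GetOrderStatus.return_value = (
                    mock.Mock(status="SHIPPED")
                )
                self.assertEqual(get_order_status("42"), "SHIPPED")

    if __name__ == "__main__":
        unittest.main()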
However, give it any problem that requires imagination, one with n+1 valid ways of being done, and it flounders and produces mostly garbage.
Compared to the Microsoft Copilot, I found the GitHub Copilot to feel lobotomised! It failed on the aforementioned WSDL task, and where Microsoft's could be asked "what inconsistencies can you see in this WSDL" and catch all of them, GitHub's was unable to answer beyond pointing out a spelling mistake I had already made it aware of.
I have personally tinkered with Claude, and it's quite impressive.
My colleagues have had similar experiences, with some uninstalling the AI tooling out of frustration at how "useless" it is. Others, like myself, have begun using it for the grunt work, mostly as an "intelligent boilerplate generator".
esskay
Ours is going the other way and wants people not to use it. It's a losing battle, but the people making decisions are a bit fuddy-duddy about this sort of stuff; we just keep getting links posted about how much energy it consumes to talk to ChatGPT.
dyl000
The place I work seems open to the fact that it's not an all-seeing, all-knowing force in the world, though we do use it as a quicker search engine.
I've heard of companies that are shoehorning it into everything; I suspect many of them are just playing the game to get better valuations.
boxed
Using AI as a search engine seems like the worst of the worst. "You can't lick a badger twice" is a thing... (that being one of the made-up idioms Google's AI Overviews confidently invented meanings for).
tupac_speedrap
Yep, the same thing happened while blockchain was a thing: all the companies were suddenly doing it to look valuable in front of shareholders or the board, but in reality it's a niche technology that isn't useful to most companies.
pfp
At least in my 2.5-person devops team, no.
Also, I can't imagine how being handed a bunch of autogenerated Terraform and Ansible code would help me. Maybe 10% of my time is spent actually writing the code; the rest is running it (Ansible is slow), troubleshooting incidents, discussing how to solve them and how to implement new stuff, etc.
If someone works in a devops position where AI is more of a threat, I'd like to hear more about it.
frocodillo
No one's being forced, but we're encouraged to explore and experiment with AI tools, and not just for writing code. There's a firm belief in the company as a whole that the winners of the "AI age" will be the companies able to use AI tools to improve their internal workflows and become more productive. So we get to try out lots of different things, and we make sure to share what we learn with each other.
ceva
The company where I work (fintech) is actually halting all AI projects for the next couple of months due to the huge costs involved; Copilot stays, however.
cyphax
I work at a small web company (.NET-based, in the Netherlands) and we're just experimenting with it. We have a paid Copilot subscription, but nothing about it is mandatory in any way. This place is conservative, though, in the sense that self-hosting is the norm and cloud services like Azure or even GitHub are not (we self-host Gitea), apart from MS 365 for Teams and e-mail.
Kiyo-Lynn
It's not forced, but the atmosphere has definitely shifted. These days, before we even start on a task, the first question is often "Can we solve this with AI?" Over time, it starts to feel like we're working around the tool instead of focusing on the actual problem.
mrweasel
No, and most people seem to avoid using AI for pretty much anything. The usage I've seen has been mostly inspirational.
We are allowed to use AI for coding, with the understanding that we are responsible for the code generated by the AI: not legally, obviously, but functionally.