Ask HN: Has AI stolen the satisfaction from programming?
72 comments
October 13, 2025
conductr
If programming is woodworking, using AI is IKEA assembly, except they packed mostly the wrong parts in the box, so I have to go back and forth with customer service to get the right ones, and the hardware doesn't always function as intended, leaving me to source my own.
It’s a different, less enjoyable, type of work in my opinion.
D13Fd
Plus the item you build may not be exactly what you initially requested, and you'll have to decide whether it's something you are willing to accept.
butlike
> It’s a different, less enjoyable, type of work
This is an elegant way of putting it. I like it.
tpoacher
Not necessarily an identical thought to OP, but, anecdotally (n=1), my experience teaching the exact same course on Advanced Java Programming for the last 4 years has been that the students seem to be getting more and more cynical, and seem to think of programming less and less as an art or as a noteworthy endeavour in itself. Very few people have actually vocalised the "why do I even need to learn this if I can write a prompt" sentiment out loud, but it has been voiced, and even from those who don't say it there's a very definite 'vibe' that is all but screaming it.
Whereas the vibe in the lecture theatre 4 years ago was far more nerdy and enthusiastic. It makes me feel very sorry for this new generation that they will never get to enjoy the same feeling of satisfaction from solving a hard problem with code they thought through and wrote from scratch.
Ironically, I've had to incorporate some AI stuff in my course as a result of needing to remain "current", which almost feels like it validates that cynical sentiment that this soulless way is the way to be doing things now.
nxor
Has the school changed?
And can we assume that because AI has made it easy to solve some hard problems, other hard problems won't arise?
Not that I don't agree
And hasn't the internet generally added to this attitude?
And if it makes you feel any better, as someone around that age, this environment seems to have also led some of us to go out of our way to not outsource all our thinking
abnercoimbre
The OP said coding now feels like:
> import solution from chatgpt
Which reminded me of all the students in classes (and online forums) mocking non-nerds who wanted easy answers to programming problems. It would seem the non-nerds are getting their way now.
andy99
I taught intro to programming ~15-20 years ago. Back then everyone just copied each other's assignments. Plus ça change
themafia
> That's the labor-saving promise.
Where are the labor saving _measurements_? You said it yourself:
> You'd think about the problem. Draw some diagrams. Understand what you're actually trying to do.
So why are we relying on "promises?"
> If I use AI to help, the work doesn't feel like mine.
And when you're experiencing an emergency and need to fix or patch it, this comes back to haunt you.
> So all credit flows to the AI by default.
That's the point. Search for some of the code it "generates." You will almost certainly find large parts of it, verbatim, inside of a GitHub repository or on an author's webpage. AI takes the credit so you don't get blamed for copyright theft.
> Am I alone in this?
I find the thing to be an overhyped scam at this point. So, no, not at all.
LordDragonfang
> You will almost certainly find large parts of it, verbatim, inside of a github repository or on an authors webpage. AI takes the credit so you don't get blamed for copyright theft.
Only if you're doing something trivial or highly common, in which case it's boilerplate that shouldn't be copyrighted. We already had this argument when Oracle sued Google over Java. We already had the "just stochastic parrots" conversation too, and concluded it's a specious argument.
heavyset_go
> We already had this argument when Oracle sued Google over Java.
"It's boilerplate therefore it isn't IP" isn't the argument that was made by Google, nor is it the argument that the case was decided upon.
It was decided that Google's use of the API met the four determining factors used by courts to ascertain whether use of IP is fair use. The court found that even though it was Oracle's copyrighted IP, it was still fair use to use it in the way Google did.
https://en.wikipedia.org/wiki/Google_LLC_v._Oracle_America,_...
themafia
> in which case it's boilerplate that shouldn't be copyrighted
Let's say it's boilerplate code filled with comments that are designed to assist in understanding the API being written against. Are the comments somehow not covered because they were added to "boilerplate code?" Even if they're reproduced verbatim as well?
> We already had the "just stochastic parrots" conversation too
Oh, I was not part of those conversations, perhaps you can link me to them? The mere stated existence of them is somewhat underwhelming and entirely unconvincing. Particularly when it seems easy to ask an LLM to generate code and then to search for elements of that code on the Internet. With that methodology you wouldn't need to rely on conversations but on actual hard data. Do you happen to know if that is also available?
mattlondon
> The new way: The entire premise of AI coding tools is to automate the thinking, not just the typing
I'd disagree. For me, I direct the AI to implement my plan - it handles the trivia of syntax and boilerplate etc.
I now work kinda at the "unit level" rather than the "syntax level" of old. AI never designs the code for me, more fills in the gaps.
I find this quite satisfying still - I get stuff done but in half the time because it handles all the boring crap - the typing - while I still call the shots.
mpliax
Don't you have to go over whatever the chatbot spits out? Isn't that part more boring than writing the code yourself?
CuriouslyC
To be honest, the only time I got satisfaction out of programming in the past was when I implemented a really hard algorithm or created a really beautiful design, and the AI doesn't replace me there; it just automates the menial part of the work.
recursivedoubts
Here is how I'm using it:
Do all the stuff you mention the old way. If I have a specific, crappy API that I have to deal with, I'll ask AI to generate the specific functionality I want with it (no more than a method or two). When it comes to testing, I'll write a few tests (some simple, some complicated) and then ask AI to generate a set of tests based on those examples. I then run and audit the tests to make sure they are sensible. I always end my prompts with "use the simplest, minimal code possible"
I am mostly keeping the joy of programming while still being more productive in areas I'm not great at (exhaustive testing, having patience with crappy APIs)
Not world-changing, but I think it has increased my productivity.
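For concreteness, the "seed tests" step in that workflow might look something like this, purely as an illustrative sketch (the `slugify` helper and its behavior are hypothetical, pytest-style tests assumed). You hand-write one simple case and one edge case, then ask the AI to extend the set in the same style and audit what comes back:

```python
import re

# Hypothetical helper under test -- not from the thread, just an example.
def slugify(text: str) -> str:
    """Lowercase, trim, and collapse runs of non-alphanumerics into '-'."""
    return re.sub(r"[^a-z0-9]+", "-", text.strip().lower()).strip("-")

# Hand-written seed tests: one simple, one complicated, as described above.
# These set the style and expectations before asking AI to generate more.
def test_slugify_basic():
    assert slugify("Hello World") == "hello-world"

def test_slugify_messy_input():
    assert slugify("  --Already//Slugged!!  ") == "already-slugged"
```

The point of the seed tests is that the AI-generated additions can be diffed against a known-good style, which makes the auditing step much quicker.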
saulpw
I agree completely, you are not alone! I've heard the argument "if you don't like AI just don't use it" but there is this nagging feeling just as you describe. Like the mere existence of AI as a coding tool has sucked all the dopamine out of my brain.
leakycap
> There's this constant background feeling that whatever I just did, someone else would've done it better and faster.
You're having an imposter-syndrome-type response to AI's ability to outcode a human.
We don't look at compilers and beat our fists because we can't write assembly by hand... so why expect your human brain to code as easily or quickly as AI?
The problem you are solving now becomes the higher-level problem. You should absolutely be driving the projects and outcomes, but using AI along the way for programming is part of the satisfaction of being able to do so much more as one person.
mhaberl
>The new way: The entire premise of AI coding tools is to automate the thinking, not just the typing.
That’s the promise, but not the reality :) Try this: pick a random startup idea from the internet, something that would normally take 3–6 months to build without AI. Now go all in with AI. Don’t worry about enjoyment; just try to get it done.
You’ll notice pretty quickly that it doesn’t get you very far. Some things go faster, until you hit a wall (and you will hit it). Then you either have to redo parts or step back and actually understand what the AI built so far, so you can move forward where it can’t.
>I was thinking of all the classic exploratory learning blog posts. Things that sounded fun. Writing a toy database to understand how they work, implementing a small Redis clone. Now that feels stupid. Like I'd be wasting time on details the AI is supposed to handle.
It was "stupid" then, too: better alternatives already existed, but you do it to learn.
> Am I alone in this?
Absolutely not. But understand it is just a tool, not a replacement. Use it and you will soon find the joy again; it is there.
baq
It’s the opposite for me. I can get so much more of what I want done built quicker, and if I’m not familiar with a framework, it isn’t an issue anymore unless we’re talking about the bleeding edge.
nerdsniper
AI makes me a lot more adventurous in terms of the projects I take on. I usually end up having to rewrite everything from scratch after POC proves that my vision is possible and actually works - including the whole old-school RTFM.
It’s a huge help for diving into new frameworks, troubleshooting esoteric issues (even if it can’t solve it its a great rubber duck and usually highlights potential areas of concern for me to study), and just generally helping me get in the groove of actually DOING something instead of just thinking about it. And, once I do know what I’m doing and can guide it method by method and validate/correct what it outputs, it’s pretty good at typing faster than I can.
journal
It's draining. I think I've read more synthetic text in the last three years than all text I've ever encountered in life.
I've been trying to articulate why coding feels less pleasant now.
The problem: You can't win anymore.
The old way: You'd think about the problem. Draw some diagrams. Understand what you're actually trying to do. Then write the code. Understanding was mandatory. You solved it.
The new way: The entire premise of AI coding tools is to automate the thinking, not just the typing. You're supposed to describe a problem and get a solution without understanding the details. That's the labor-saving promise.
So I feel pressure to always, always, start by info-dumping the problem description to AI and gamble on a one-shot. Voice transcription for 10 minutes, hit send, hope I get something first try; if not, hope I can iterate until something works. And even when something does work, there's zero satisfaction, because I don't have the same depth of understanding of the solution. It's no longer my code, my idea. It's just some code I found online. `import solution from chatgpt`
If I think about the problem, I feel inefficient. "Why did you waste 2 hours on that? AI would've done it in 10 minutes."
If I use AI to help, the work doesn't feel like mine. When I show it to anyone, the implicit response is: "Yeah, I could've prompted for that too."
The steering and judgment I apply to AI outputs is invisible. Nobody sees which suggestions I rejected, how I refined the prompts, or what decisions I made. So all credit flows to the AI by default.
The result: Nothing feels satisfying anymore. Every problem I solve by hand feels too slow. Every problem I solve with AI feels like it doesn't count. There's this constant background feeling that whatever I just did, someone else would've done it better and faster.
I was thinking of all the classic exploratory learning blog posts. Things that sounded fun. Writing a toy database to understand how they work, implementing a small Redis clone. Now that feels stupid. Like I'd be wasting time on details the AI is supposed to handle. It bothers me that my reaction to these blog posts has changed so much. Three years ago I would be bookmarking a post to try it out for myself that weekend. Now those 200 lines of simple code feel only a one-sentence prompt away, and thus a waste of time.
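(For anyone who hasn't seen those posts: the kind of weekend exercise being described is roughly this, a toy in-memory key-value store with a tiny GET/SET/DEL command dispatcher. This is an illustrative sketch, not anyone's actual project; the real exercises add networking, a wire protocol, persistence, etc.)

```python
# Toy sketch of a "small Redis clone" starting point: an in-memory
# store with a minimal command dispatcher. Illustrative only.

class ToyRedis:
    def __init__(self):
        self._data = {}

    def execute(self, command: str) -> str:
        parts = command.split()
        if not parts:
            return "ERR empty command"
        op = parts[0].upper()
        if op == "SET" and len(parts) == 3:
            self._data[parts[1]] = parts[2]
            return "OK"
        if op == "GET" and len(parts) == 2:
            return self._data.get(parts[1], "(nil)")
        if op == "DEL" and len(parts) == 2:
            return "1" if self._data.pop(parts[1], None) is not None else "0"
        return "ERR unknown command"

db = ToyRedis()
db.execute("SET lang python")
```

The learning value was never the 200 lines themselves but what you discover while growing them: parsing, data structures, concurrency, eviction.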
Am I alone in this?
Does anyone else feel this pressure to skip understanding? Where thinking feels like you're not using the tool correctly? In the old days, I understood every problem I worked on. Now I feel pressure to skip understanding and just ship. I hate it.