Does Visual Studio rot the mind? (2005)

indigovole

It's a really interesting question.

I'm sure you could go back 40 years earlier and find programmers complaining about using FORTRAN and COBOL compilers instead of writing the assembly by hand.

I think that the assembler->compiler transition is a better metaphor for the brain->brain+AI transition than Visual Studio's old-hat autocomplete etc.

After working with Cursor for a couple of hours, I had a bunch of code that was working according to my tests, but when I integrated it, I found that Claude had inferred a completely different protocol for interacting with a data structure than the rest of my code was using. Yeah, write better tests... but I then found that I did not really understand most of the code Claude had written, even though I'd approved every change on a granular level. I worked manually through a solid hour of wtf before I figured out the root cause, then Clauded my way through the fix.

I can picture an assembly developer having a similar experience trying to figure out why the compiler generated _this_ instead of _that_ in the decade where every byte of memory mattered.

Having lived through the dumb editor->IDE transition, though, I _never_ had anything like that experience of not understanding what I'd done in hours 1 and 2 at the very beginning of hour 3.

yoyohello13

This feels very similar to the "Tutorial Hell" effect, where I can watch videos/read books and fully feel like I understand everything. However, when hand touches keyboard, I realize I didn't really retain any of it. I think that's something that makes AI code gen so dangerous: even if you think you understand and can troubleshoot the output, is your perception accurate?

irishloop

Yeah I never really learn something until I actually hack away at it, and even then I need to really understand things on a granular level.

cassepipe

I have done that too much. Now, when I read a solution while learning, I always make sure that I can implement it myself; otherwise I don't consider it learned. I apply the same rule to LLM code.

golergka

> Where I can watch videos/read books, and fully feel like I understand everything. However, when hand touches keyboard I realize I didn't really retain any of it.

Always type everything out from a tutorial when you follow it. Don't even copy and paste; literally type it out. And make small adjustments here and there, according to your personal taste and experience.

ashoeafoot

The Torturial

jgrahamc

I was programming 40 years ago and was very happy to be able to use "high-level languages" and not write everything in assembly. The high-level languages enabled expressiveness that was hard with lower levels.

scarface_74

In 1985, any time I needed any level of performance on my 1 MHz Apple //e, or when BASIC didn't expose the functionality I needed, I still had to use assembly. Mostly around double hi-res graphics and sound.

jgrahamc

Yep. That stuff was necessary then and still is today. Just look at DeepSeek doing low(er) level NVIDIA stuff and making the GPUs work hard.

the__alchemist

This is a tangent, but perhaps relevant:

I think sending your LLM all relevant data structures (structs, enums, function signatures, etc.) is mandatory for any code-related query. It avoids problems like this; in many cases it seems required to get results that integrate cleanly.
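
As a sketch, a context block for a C-related query might look like the one below. Every name and signature here is hypothetical, purely to illustrate the shape of what to paste in ahead of the actual question:

    /* Context handed to the model so it writes code against the real
       interfaces instead of inventing its own protocol. */

    #include <stddef.h>
    #include <stdint.h>

    typedef enum { RING_OK, RING_FULL, RING_EMPTY } ring_status;

    typedef struct {
        uint8_t *buf;
        size_t   head;  /* next write index */
        size_t   tail;  /* next read index */
        size_t   cap;   /* capacity in bytes */
    } ring_buffer;

    /* Signatures alone are enough for the model to match the calling
       conventions; function bodies can be omitted. */
    ring_status ring_push(ring_buffer *rb, uint8_t byte);
    ring_status ring_pop(ring_buffer *rb, uint8_t *out);
    size_t      ring_len(const ring_buffer *rb);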

phyllistine

If you're active on social media (Twitter), you will still see people, like the FFmpeg account, bashing higher-level languages (C) and praising hand-written assembly.

mystified5016

Embedded programming is still like this. Most people just don't inspect the assembly produced by their compiler. Unless you're working on an extremely mainstream chip with a bleeding edge compiler, your assembly is going to be absolutely full of complete nonsense.

For instance, if you aren't aware, AVR and most other uCs have special registers and instructions for pointers. Say you put a pointer to an array in Z. You can load the value at Z and increment or decrement the pointer as a single instruction in a single cycle.

GCC triples the cost of this operation with some extremely naive implementations.

Instead of doing 'LD Z+', GCC gives you:

    inc Z
    ld Z
    dec Z

Among other similar annoyances. You can carefully massage the C++ code to get better assembly, but that can take many hours of crazy-making debugging. Sometimes it's best to just write the damn assembly by hand.

In this same project, I had to implement Morton ordering on a 3D bit field (don't ask). The C implementation was well over 200 instructions but by utilizing CPU features GCC doesn't know about, my optimized assembly is under 30 instructions.
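
For reference, the core of 3D Morton (Z-order) encoding in plain, portable C looks something like the sketch below. This is the generic technique, not the AVR-tuned assembly described above:

    #include <stdint.h>
    #include <stdio.h>

    /* Spread the low 10 bits of v so two zero bits separate each
       original bit: bit i moves to bit 3*i. */
    static uint32_t part1by2(uint32_t v)
    {
        v &= 0x000003ff;
        v = (v | (v << 16)) & 0xff0000ff;
        v = (v | (v <<  8)) & 0x0300f00f;
        v = (v | (v <<  4)) & 0x030c30c3;
        v = (v | (v <<  2)) & 0x09249249;
        return v;
    }

    /* Interleave x, y, z into a single Morton index. */
    uint32_t morton3d(uint32_t x, uint32_t y, uint32_t z)
    {
        return part1by2(x) | (part1by2(y) << 1) | (part1by2(z) << 2);
    }

    int main(void)
    {
        printf("%u\n", morton3d(1, 0, 0)); /* 1 */
        printf("%u\n", morton3d(0, 1, 0)); /* 2 */
        printf("%u\n", morton3d(1, 1, 1)); /* 7 */
        return 0;
    }

Each shift-and-mask step is cheap on a machine with a barrel shifter, which an 8-bit AVR lacks; that is where the gap between compiled C and hand-written assembly comes from.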

Modern sky-high abstracted languages are the source of brain rot, not compilers or IDEs in general. Most programmers are completely and utterly detached from the system they're programming. I can't see how one could ever make any meaningful progress or optimization without any understanding of what the CPU actually does.

And this is why I like embedded. It's very firmly grounded in physical reality. My code is only slightly abstracted away from the machine itself. If I can understand the code, I understand the machine.

lukev

And this is appropriate for your domain and the jobs you work on.

If your job was to build websites, this would drive you insane.

I think I'm coming around to a similar position on AI dev tools: it just matters what you're trying to do. If it's a well known problem that's been done before, by all means. Claude Code is the new Ruby on Rails.

But if I need to do some big boy engineering to actually solve new problems, it's time to break out Emacs and get to work.

iuvcaw

The vast majority of time spent building software has little to do with optimization. Sky-high abstracted brain rot languages are useful precisely because usually you don’t need to worry about the type of details that you would if you were optimizing performance

And then if you are optimizing for performance, you can make an incredible amount of progress just fixing the crappy Java etc code before you need to drop down a layer of abstraction

Even hedge funds, which make money by executing trades fractions of a millisecond faster than others, use higher-level languages and fix performance issues within those languages if needed

larve

As a long time embedded programmer, I don't understand this. Even 20 years ago, there is no way I really understood the machine, despite writing assembly and looking at compiler output.

10 years ago, running an ARM core at 40 MHz, I barely had the need to inspect my compiler's assembly. I could still roughly read things when I needed to (since embedded compilers tend to have bugs more regularly), but there's no way I could write assembly anymore. I had no qualms at the time using a massively inefficient library like Arduino to try things out. If it works and the timing is correct, it works.

These days, when I no longer do embedded for work, I have no qualms writing my embedded projects in MicroPython. I want to build things, not micro-optimize assembly.

kelseyfrog

Absolutely it does. In the same way Socrates warned us about books, VS externalizes our memory and ability, and makes us reliant on a tool to accomplish something we have the ability to do without one. This reliance goes even further, making us dependent on it as our natural ability withers from neglect.

I cannot put it more plainly: it incentivizes us to let a part of us atrophy. It would be like giving up the ability to run a mile because our reliance on cars weakened our legs and de-conditioned us to the point of making it physically impossible.

snarfy

If your job for 30 years is to move bags of cement 50 miles each day, is it not more productive to use a truck than your legs? Even if you use the truck so much you could not move a bag of cement with your legs anymore due to atrophy?

You could make the same argument about most tools. Do calculators rot the brain?

yoyohello13

> Do calculators rot the brain?

Yes, they unequivocally make people worse at mental math. Now whether that's bad or not is debatable. Same with any tool, it's about tradeoffs and only you can determine whether the tradeoffs make sense for you.

BuckRogers

>they unequivocally make people worse at mental math. Now whether that's bad or not is debatable.

Not debatable, because as long as calculators exist and are available, nothing is lost. You can put that mental energy towards whatever is relevant. There's nothing special about math that makes doing it in your own mind more valuable than other tasks. Any of us that do software for a living understand that mental work is real work and that your capacity per day is limited. If your boss wants to pay you to do longhand division instead of getting projects done, I'd call that man a liar.

The people that make these arguments that we "lose" something are usually academics or others that aren't actually in the trenches getting things done. They think we're getting weaker mentally or as a society. I'm physically tired at the end of the day, and I sit at a desk. I'll do math when the calculators are gone, I have a lot of tasks and responsibilities otherwise.

jchw

> Do calculators rot the brain?

I don't know if I'd agree with the phrasing that it rots the brain, but broadly I would expect that if you rely on it for arithmetic your mental arithmetic will atrophy over time.

Unfortunately though it may not be such an exaggeration after all. It's probably too early to say, but there is definitely mounting evidence that some of these useful tools are literally, genuinely rotting our brains.

https://www.fatherly.com/news/gps-google-maps-cognitive-decl...

Even if we just take this at face value and assume it's accurate though, it's still not immediately clear what to actually do with this knowledge.

gspencley

> Unfortunately though it may not be such an exaggeration after all. It's probably too early to say, but there is definitely mounting evidence that some of these useful tools are literally, genuinely rotting our brains.

> https://www.fatherly.com/news/gps-google-maps-cognitive-decl...

I think you may have fallen into a bit of confirmation bias there. The article you linked to words itself like so:

"According to research published in the journal PLOS One, navigating with a map and compass might help stave off cognitive decline and symptoms associated with Alzheimer’s Disease."

This means that the study found that there may be positive benefits to cognitive health when one routinely engages in navigation related activities using a compass and a map.

That is a very very VERY different statement than: "Using Google Maps causes cognitive decline."

In order to demonstrate any kind of causality between using Maps and cognitive decline, you would have to start with the additional premise that everyone previously navigated with a compass and map on such a regular basis that the change represents a switch to a less healthy lifestyle.

I think that statement is specious at best.

I'm old enough to remember living without Google Maps. And the amount of time that I reached for a paper map and a compass was so infrequent that Google Maps represented something new for me more than it represented something different. That is to say, it wasn't replacing something I did regularly so much as it gave me a reason to start using any kind of a "map" in the first place.

Most people I know would say the same thing. We had some map skills when we needed them ... but we tried to avoid needing them more often than not.

So yeah, the study might have merit. But I don't think it suggests what you think it suggests.

w0m

> Do calculators rot the brain?

According to my math teachers in high school - yes.

sunshowers

How different is this argument from "does Rust rot the mind"? After all, Rust means that the part of the brain that is paranoid about careful memory safety issues no longer has to be engaged.

Like most things, the answer is that it depends. I think autocomplete is great, and I've come to appreciate LLMs doing boilerplate editing, but I'm quite worried about people using LLMs to do everything and pumping out garbage.

tarunkotia

Great point, to me it feels more like running barefoot versus with padded shoes. Barefoot running will build your foot muscle based on your natural form over time, whereas padded shoes may interfere with your natural form and at worst encourage bad form which may lead to injuries as your mileage increases.

Over the years of building software, I've tended to spend more time thinking about how all the pieces (i.e. from the package manager) fit together rather than building them from scratch, and fitting two pieces together requires a deeper understanding of both pieces and of how the combination of the two behaves in the environment.

tomlue

Completely agree. Though I think this statement could benefit from pointing out that cars also help people go much faster, and do things they otherwise couldn't.

wahern

Relatedly, people who rely too much on GPS for navigation (i.e. online automated route planning), especially real-time, turn-by-turn instruction, seem to have poor spatial awareness, at least at the local geographic level. I doubt the loss of that skill is a meaningful impediment in modern life[1], but I personally would not want to lose it. Tools like Google Maps are extremely useful, but I use them to augment my own navigation and driving skills. I'll briefly study the route before departing, choose major thoroughfares to minimize complexity, and try to memorize the last few turns (signage and a practiced understanding of how highways and roads are laid out is sufficient for getting close enough).

[1] No impediment for them. It's an impediment for me when the car in front of me is clearly being driven by somebody blithely following the computer's instructions, with little if any anticipation of successive turns, and so driving slowly and even erratically.

aaronbaugher

Yes. You can see a difference between the person who learned to do a process "by hand" and then uses technology to make it faster or easier, versus the person who never learned to do it without the tech at all.

hooverd

The ICE, and more generally the automobile, has been a great technology with lots of benefits. But we did all huff alarming amounts of lead for a generation and built our cities around them to our detriment.

fransje26

> But we did all huff alarming amounts of lead for a generation and built our cities around them to our detriment.

And yet, this has nothing to do with the ICE itself, and everything to do with the greed of the Ethyl Corporation and the generation of corrupted minions that knowingly enabled their disastrous scheme.

dartos

Besides going fast, what does a car allow people to do that they couldn’t before?

mikedelfino

Cars allow people to travel longer distances more conveniently, access remote areas, transport goods efficiently, and have greater independence in their daily lives. They also enable emergency services to respond quickly, support economic growth by facilitating trade, and provide opportunities for leisure travel that were previously impractical.

kelseyfrog

Go through drive-thrus.

daedrdev

In a day I can travel many times farther than I can walk, carrying hundreds or thousands of pounds of stuff with me if needed.

DontBreakAlex

Reach farther places? Move around heavy loads?

jason_zig

are you serious?

sbuttgereit

But also gives us back time and mental capacity to do other things which were previously out of reach because of what we had to focus our minds and time on.

In some cases, maybe even many or the majority of cases, that trade isn't a bad one. It really depends on how you use it.

bongodongobob

Socrates just argued about semantics. If you take anything away from those dialogues it should be that they were confused and didn't really have many good ideas.

hooverd

Being an active-transit crank, I'd like to point out that if computers are a "bicycle for the mind", LLMs are an SUV.

QuercusMax

An SUV? Unnecessary for most people, a huge waste of power, and bad for the environment? Dangerous to bystanders?

fragmede

but great for the occupants

bluedino

There should be an updated article, "Does Visual Studio Code Rot the Mind?"

In the old days, when we set our development environments up by hand, we usually ended up learning quite a bit about the setup. Or we at least had to know things like what's installed, what paths are where, blah blah.[1]

With Visual Studio Code being what almost everyone uses these days, everything is a click-to-install plugin. People don't realize Python isn't part of Visual Studio Code. The same for gcc, ssh, etc. You end up with tickets like "Can't run Python", or "Can't connect to the server", but it's all Visual Studio Code issues.

[1]: And yes, I realize a lot of us didn't know what we were doing back then, and the shotgun approach of "`apt-get install foobar` fixed this for me!" was a standard 'solution'

Conscat

My background is heavily biased towards C++ but I don't really feel like you can make it work in VS Code without understanding, at minimum, where your clangd is actually located. The C++ plugins don't install what you need where you need them, and the launch.json really does not just work. My ex was unable to set up a C++26 toolchain for VS Code because without my help he couldn't configure the settings to connect to the right version of clang tools, and I don't think he ever got the GDB debug adapter working at all.
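
For what it's worth, once you know where everything lives, the manual fix usually comes down to a few lines of settings. A minimal sketch for the clangd extension follows; the paths are hypothetical and need adjusting for your own toolchain:

    {
        // Point the extension at the clangd you actually want,
        // rather than whatever it finds first. Example paths only.
        "clangd.path": "/usr/lib/llvm-19/bin/clangd",
        "clangd.arguments": [
            "--query-driver=/usr/bin/g++-14"
        ]
    }

But knowing to write that, and what paths to put in it, is exactly the understanding the click-to-install plugins never teach you.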

mattmanser

One of the definite positives of AI is that kind of stuff is now fairly easy to solve.

It surfaces a lot of the ways people found to fix those problems in forums/git issues/random blogs that were hard to track down past google spam.

And it tends to list the possible fixes in a nice little bullet point list of things to try to get everything working.

bluedino

> One of the definite positives of AI is that kind of stuff is now fairly easy to solve.

I agree that 'AI' feels like a fancier/faster 'Google' (I've done the thing where I find a GitHub issue from 4 years ago that explains the problem), but when will we see a local AI agent that looks at your current configuration/issue, solves the problem for you, and then gives you a summary of what it did?

woodrowbarlow

my version of this was "do convenient linux distros and binary package managers rot the mind?" after setting up a freebsd box with everything built from source and everything configured by hand. then again from building an arch-based Linux system from the wiki. you learn a lot from doing things by hand... but eventually you need to get stuff done.

yoyohello13

The number of times I've had to help a colleague troubleshoot some VSCode thing seems to bear this out. I'm regarded as a 'wizard' because I actually know how to use git/build tools without a gui. It's kind of shocking to me how many developers' knowledge ends at the run button.

It doesn't bother me too much, because I like being needed, lol. But it would probably make our team more productive if people bothered to learn how our tooling works.

dokyun

Seems like Visual Studio rots the mind so badly nowadays that people would rather clamor for support for useless garbage like LSP in their editors, and if it can't be supported very well, it's foremost the editor's fault, not the fault of the crap they want to tack onto an otherwise fine product. People have been doing this with Emacs, saying Emacs is slow because its LSP implementations are slow and unresponsive, not that LSP is badly designed, or that an editor shouldn't need an asynchronous network protocol to provide IDE features. It also demonstrates the mentality that people increasingly prefer to have their computing served to them on a silver platter by a third party rather than being able to control it locally.

korse

Required supplementary reading?!?

I like the verse form.

https://users.cs.utah.edu/~elb/folklore/mel.html

ferguess_k

I don't think IntelliSense and autocomplete bring brain rot. AI could, because it tries to solve the problem FOR you, while IntelliSense and autocomplete merely help you find stuff. These two are completely different.

As much as I look up to Charles Petzold, I believe one of the best qualities of an engineer is knowing how to prioritize things in life. We only have limited brain power, so it's best to spend it on the areas we care about, and to keep everything else at the lowest standard the customer will accept. I'd argue that as long as you don't mind not memorizing class names, variable names, or API details, and as long as you don't become a slave to IntelliSense and autocomplete, you are perfectly fine.

I'd argue that this is even fine if you only use AI to bootstrap yourself or "discuss" with it as with a rubber duck.

inerte

The excitement about AI and Vibe Coding made me think about this (somewhat old) article.

voidfunc

Vibe... coding... I'm getting old, what is this?

jadbox

Throwing user stories at an LLM and hoping it builds the right thing. It's like letting a product manager generate code without paying attention to the details. It's as terrible as it sounds, but debatably okay for fast prototypes.

qingcharles

I think this is what they're referring to -- see this video:

https://www.tiktok.com/@rileybrown.ai/video/7473731306845768...

shmoogy

Letting AI write most of the code (using Cursor, Windsurf, Aider, or any similar solution).

You go back and forth with the AI (or let agent mode and MCP interactions do it) to figure out any build errors / exceptions and resolve them.

OulaX

Wow! That page has the best styling ever! No fluff, no SPA, no colors, nothing! Just text, straight to the point!

yupyupyups

Still rough around the edges on mobile. Too much margin on the right, and code blocks appear tiny.

dang

Related. Others?

Does Visual Studio rot the mind? (2005) - https://news.ycombinator.com/item?id=29760171 - Jan 2022 (143 comments)

Does Visual Studio Rot the Mind? (2005) - https://news.ycombinator.com/item?id=22258198 - Feb 2020 (118 comments)

Does Visual Studio rot the mind? - https://news.ycombinator.com/item?id=3386102 - Dec 2011 (2 comments)

Do modern IDEs make us dumber? - https://news.ycombinator.com/item?id=387495 - Dec 2008 (37 comments)

seltzered_

For those trying to remember: Microsoft IntelliSense (autocompletion within the IDE) was a big subject of debate going back to ~1999, over whether it would make programmers lazier or more productive.

Some of this stuff would be found in old Slashdot discussions, but it seems harder to find; I'm finding an old John Carmack interview as one mention: https://m.slashdot.org/story/7828

Here might be the original slashdot discussion of "Does visual studio rot the brain?" (2005) https://tech.slashdot.org/story/05/10/26/1935250/does-visual...

tra3

I see where this is going in the context of tools like Cursor. The common wisdom is that if you don't use it, you lose it. Muscles atrophy without stimulus (regular exercise).

On the other hand, this seems to echo the transition from assembly -> higher level languages. The complaint was that we lose the skill or the context to understand what the machine is really doing.

I've written a whole bunch of LLM-assisted code in the past few weeks so I sympathize. If I'm generous, I have a high level understanding of the generated code. If I have to troubleshoot it, I'm basically diving into an unknown code base.

Anyway, I'm just rambling here. More tools is better; it gives everyone an opportunity to work the way they want to.

_fat_santa

I feel like with tools like Cursor, while you may be able to cobble together a todo app or something else simple without any programming knowledge, for anything bigger you will still need to know how the underlying thing works.

With everyone now using LLMs, the question goes from "who can code the best" to "who can prompt the best", but that still comes down to who knows more about the underlying code.

larve

Yes, better understanding of the stack makes you a more efficient prompt engineer. And yes, I don't say this ironically: prompt engineering / vibe coding is not easy, as evidenced by the number of people who think it's only good for greenfield yeet prototypes.

A lot of my programming skills have atrophied over the last two years, and a lot of skills have become that much sharper (architecture, algorithm and design pattern knowledge, designing user facing tools and dev tools, setting up complex infrastructures, frontend design, ...) because they are what allow me to fly with LLMs.
