AI will make our children stupid

61 comments

· December 20, 2025

niceguy1827

Aren't people already stupid enough? The author wrote this article without even checking whether the existing trend in children's IQ already shows some level of stupidity.

And please excuse my language. I probably watch George Carlin videos a bit too much.

> For example, a 2018 analysis by researchers at Northwestern University and the University of Oregon found that average IQ scores in the U.S. began declining slightly after 1995, particularly in younger generations. This reversal mirrors findings in several European countries, including Norway, Denmark, and the UK.

https://nchstats.com/average-iq-by-state-in-us/

hn_throwaway_99

The human body is famously a "use it or lose it" system. For example, the US (and most of the developed world) has had a large reduction in grip strength since just 40 years ago as Americans get ever more sedentary. I think most people of a certain age can relate to how they've gotten a lot worse at remembering and following directions now that "the Google lady" just tells you right where to turn.

The same thing is happening/will happen with AI. If you don't go through the hard brain work of thinking things up for yourself, especially writing, your writing skills will deteriorate. We'll see that on a giant scale as more and more kids lean on ChatGPT to "check their homework".

ares623

“Bad thing X is already happening. If that’s not being solved then making X exponentially worse is therefore okay.”

What are 20 PRs per day worth?

Engineers will literally burn the world if it means looking good for their employers.

cons0le

Isn't AI too new to study its effects on kids?

bgwalter

The authors (there are two) take the position that yes, people are already stupid enough, but that it will get much worse:

"We may soon look back on this era of TikTok, Love Island and Zack Polanski as an age of dignity and restraint."

FpUser

>"I probably watch George Carlin videos a bit too much."

My favorite. Love the guy. Too bad he is dead.

xnx

AI will be a super-tutor for the curious and a tool to outsource all thinking for the incurious.

WhyOhWhyQ

The job doesn't pay you to be curious. It pays you to get stuff done. Curiosity makes you jobless. Most of the Silicon Valley people who frequent this website larp as curious people, but are basically incurious status seekers.

fn-mote

> The job doesn't pay you to be curious.

YOUR job doesn’t pay you to be curious.

Well, you could say mine doesn’t either, literally, but the only reason I am in this role, and the driving force behind my major accomplishments in the last 10 years, has been my curiosity. It led me to do things nobody in my area had the (ability|foolishness) to do, and then it led me to develop enough improvements that things work really well now.

agumonkey

I'd be curious whether jobs like yours aren't on the tail of the distribution. It's very common that in work groups, curiosity/creativity gets ignored if not punished. I've seen this even in small techie groups: there was a natural emergence of boundaries beyond which people don't get to think (you're overstepping, that's not your role, you're doing too much). It seems like a Pavlovian reflex when leadership doesn't know how to operate without assigning roles.

wrs

I mean, think of all the people getting paid eight-digit compensation right now because they were curious about this dead-end deep learning stuff 15 years ago for no good reason!

armchairhacker

We need some curious people. Otherwise nothing gets discovered, including solutions to future problems.

Spooky23

We do. But the would-be modern nobility are quite happy with being a sort of feudal lord.

WhyOhWhyQ

Fully expecting to get banned for my comment, but I'll just go on. Look at the Silicon Valley heroes and they're all business types. There are a few rare exceptions.

AnimalMuppet

Curiosity as your only trait makes you jobless. Curiosity enough to learn something new can help you remain employed.

turtletontine

I don’t necessarily think you’re wrong, but I’m skeptical that the curious will really meaningfully learn from LLMs. There’s a huge gap between reading something and thinking “gee that’s interesting, I’m glad I know that now,” and really doing the work and deeply understanding things.

This is part of what good teaching is about! The most brilliant, engaged students will watch a lecture and think "wow nice, I understand it now!" and as soon as they try to do the homework they realize there are all kinds of subtleties they didn't consider. That's why pedagogically well-crafted assignments are so important: they force students to really learn and guide them along the way.

But of course, all this is difficult and time consuming, while having a “conversation” with a language model is quick and easy. It will even write you flowery compliments about how smart you are every time you ask a follow up question!

tnias23

I find LLMs useful for quickly building mental models for unfamiliar topics. This means that instead of beating my head against the wall trying to figure out the mental model, I can beat my head against the wall trying to do next steps, like learning the lower level details or the higher level implications. Whatever is lost not having to struggle through figuring out the mental model is easily outweighed by being able to spend that time applying myself elsewhere.

wrs

I have some success by trying to explain something to an LLM, having it correct me with its own explanation that isn’t quite right either, correcting it with a revised explanation, round and round until I think I get it.

Sort of the Feynman method but with an LLM rubber duck.

agumonkey

Yes, and it will mostly depend on the culture/economy. If you create incentives for kids to explore ideas through LLMs, they'll become very knowledgeable (and maybe somewhat confident). Otherwise it will be the TikTok of cognition.

Ten bucks says there will be a law to enforce exponential backoff, so that after a few questions you need to get good before the LLM delays things by an hour.

nineteen999

I mean, it's totally possible to be curious about some things and less curious about others.

There's few things more annoying than a human that thinks it has the most accurate and up-to-date AI-level knowledge about everything.

AlexandrB

This is assuming the current AI business model (losing lots of money). As with the internet as a whole, AI companies will probably be incentivized to waste your time and increase "engagement" as they seek revenue. At that point, AI will only be a good tutor if you're extremely diligent at avoiding the engagement bait.

aeon_ai

Amen.

ChrisMarshallNY

Yeah, when they allowed calculators in the classroom, we all started getting dumber.

If that sounded silly, it's exactly what people said would happen when calculators arrived (I grew up in the last generation where they weren't allowed; I know lots of folks younger than me who I think are smarter than I am).

barapa

I don't find this all that compelling. Different technologies can have different effects. And why would future effects be influenced by the accuracy of random people's predictions of other events in the past?

femiagbabiaka

The calculator analogy doesn’t really work, speaking as someone who is more of an AI booster than skeptic. The addition of calculators to the classroom necessitated a change in pedagogy. So now kids learn how to do math without them, and then add them once the fundamentals are there. Learning how to think is even more foundational.

PaulDavisThe1st

Quite debatable. Learning how to think frequently involves basic math skills like "hmm, they claim a two order of magnitude effect, but is that even feasible?" When you can't do "math" like that in your head, your ability to think is significantly impaired, as we are currently seeing.

femiagbabiaka

I think we agree? If LLMs will be included in classroom learning at all, it has to be done with an understanding of how it will affect learning outcomes, and it’s not clear that the effect will be the same as introducing calculators was, at all.

jerome-jh

AI can be a great tool. It can make our children (and us) lazier, but not necessarily stupider. Short video platforms OTOH certainly make our children stupider (and depressed).

m4ck_

yeah but it's totally gonna usher us into a workless utopia where everyone has everything they ever wanted, because everything will be free! Or at least it will if we allow AI companies to operate completely unregulated and unimpeded.

pjerem

lol.

pluc

Don't worry; by the time your children are effectively stupid, you will be stupid enough not to realize it and instead will praise them for how well they can verbalize what they want. You will call it cognitive progress and you will thank AI for it.

aqula

People said the same thing about the internet: that you should be getting your information from actual books and that the internet will make you lazy and complacent. There was a time when you were encouraged to write code on paper first instead of typing it directly, because it made you think clearly. Some time before that, calculators apparently dulled your mental faculties, so you should have been hand rolling all your calculations. Go back in time far enough and you'll find Socrates disparaging writing because it weakens your memory and destroys your mind.

And yet humanity is here and seems to be doing all right. Every generation has managed to produce smart people who have been able to push the boundaries of scientific and technological progress. If anything, we may be getting smarter.

What history has repeatedly shown is that when you reduce friction for the human brain, it goes and finds more complex things to do. Such a periodic removal of friction may very much be a necessity for progress, because it allows the paradigm of thought to shift to a higher level. The same should happen with AI as well.

mo_42

Did the invention of the steam engine and all other heavy machines make us physically weaker? I guess so. People working on (literal) heavy stuff don't need the strength they used to.

But now they move around even more heavy stuff with machines.

I think something similar might happen to our brains. Maybe we won't be able to work ourselves through every detail of a mathematical proof, of a software program, or of a treatise on philosophy. But we'll be able to accomplish intellectual work that only really smart people could accomplish. I think this is what counts: outcome.

mwkaufma

I like a scathing critique of overly-hyped chatbots as much as the next guy, but leading with the pseudoscience of IQ scores has the persuasive impact of a farting noise.

spwa4

Hasn't Tiktok already done that?

Oh, and there's the extreme brain drain the West imposed on everyone else, from South Africa to China, leaving no available "brains", let's say, in those countries, while in the rich countries the only brains available aren't invested in making Westerners smart, along with a disdain among the existing populations for professions that require brains.

FrankyHollywood

I don't know if TikTok is the problem; a generation ago (some) kids mindlessly watched cartoons for hours a day.

I think this is mostly about learning to think and develop grit.

As a kid, when I wanted to play a game I had to learn DOS commands, know how to troubleshoot a non-functioning Sound Blaster, etc. Sometimes it took me days to fix.

Doing this develops understanding of a domain, problem-solving skills and grit.

My kid just opens Steam and everything works. Any question he has, he asks AI. I am really curious what effect this will have on this generation. It is tempting to quickly say "they will be brain-dead zombies" but that seems too simplistic.

In 20yrs we'll know!

Fire-Dragon-DoL

I keep seeing that people ask questions of AI.

That sounds like a dedicated teacher though, not that bad?

Like, asking questions, and learning how and what questions to ask, is an amazing skill.

spwa4

> My kid just opens steam and everything works. Any question he has he asks AI

(... and then presumably he applies what the AI tells him, occasionally asking why)

Frankly, this is a much better and targeted way to learn. If this is what happens, great!

I mean, I'd give him an intro on how to pirate games, because

1) it's a technical challenge with a built-in reward

2) AIs (especially Gemini, but more and more ChatGPT too) refuse to help with it

So a truly excellent pursuit for learning!

But I do feel it's very different from what happens with smartphones, and that is desperately bad.

johnfn

Yes, yes, it's certainly not social media, or the plethora of apps that cater to and in some ways create an ever-shortening attention span (Reddit, TikTok, Facebook, Instagram, ...). It's definitely that thing you can use to research and learn anything you could ever want -- that is the thing which will unquestionably make our children stupid.

forgetfreeman

If it takes a few thousand pages of textbooks or other reference material to gain competence with a given topic how is consuming superficial summaries provided by AI expected to produce comparable results?

NeutralCrane

> If it takes a few thousand pages of textbooks or other reference material to gain competence

This is a huge assumption and not one I'm sure holds up. In my experience, gaining competence is often more a matter of hands-on experimentation and experience, and the thousands of pages of reference material are there to get you to the point where you can start getting hands-on experience, and to debug your experiments when they don't work. If AI can meaningfully cut back on that by more efficiently getting people to the experimentation stage, it absolutely will be more effective. And so far in my limited experience, it seems extremely promising.

grugagag

Stupid is a continuum. TikTok-stupid may pale in comparison if AI is blindly implemented at all levels of education.

chneu

I think blaming AI isn't quite right.

I think the current mentality of "Make every process in life as easy and time-efficient as possible" is the problem.

AI is just a tool. What someone does with it is up to them. The current desire to not do anything, however, means people will abuse AI to make their lives more segregated from the work that enables them.

As technology progresses, people are less connected to the how and why of life. This leads to people not understanding how to do basic things. Nobody can do anything on their own and they have to pay money to someone for really basic stuff. People can hardly go grocery shopping anymore as it takes too much time. Peak capitalism?

Really, just watch Idiocracy. AI isn't the problem; people's desire to do as little as possible is the problem.