The average CPU performance of PCs and notebooks fell for the first time
242 comments · February 12, 2025 · Sweepi
londons_explore
I feel like it could simply be that some big PC recycler, perhaps in a poorer nation reselling Windows Vista machines for $15 a pop, has decided to run this benchmark as part of their recycling process and it has skewed all the results.
silvestrov
To me it sounds more like cheap Chromebooks, because that is enough for most people's needs.
Very little in education needs more computing power than that.
We've just told ourselves that computing power will always grow and that we will always need that growth.
That might not be true, and computers might be like cars: we don't need (or want) faster cars, we want cheaper and safer cars. Top speed no longer sells.
echoangle
Are they currently selling new Chromebooks with 1366 x 768 screens?
jayd16
Explain the desktop and server numbers.
elif
Ehhh
Having tried to install a modern Linux on a 2012 laptop, I can say it simply is not as cozy as your memory records it.
For reference, even a Linux distro designed for old hardware takes about 6 minutes to boot, and even just using the console lags.
iforgot22
Apparently Windows XP is the dominant desktop OS in Armenia. They were savvy enough not to go to Vista I guess. (Not that the country is anywhere near large enough to skew global benchmarks.)
babypuncher
It has to be something like this. I could believe slower machines growing in market share during high inflation, but nobody's making new laptops or monitors with 1366 x 768 panels.
DiscourseFan
As another commenter said, it probably has more to do with the growth of computing in the developing world. There's very little labor now that can't be done by someone living in a country where breakfast costs less than 25 cents. Theoretically, at least. Most of those people probably aren't learning from professors who spend their whole day teaching and doing research, and that kind of activity can only be supported by a relatively wealthy society. But the internet has also democratized a lot of that activity. Well, it's complicated either way.
keyringlight
It seems like a similar challenge to looking at the Steam hardware survey stats: it's probably the best public source of information, but it's too much of an overview to draw anything but the most general conclusions from. Within that data there are going to be millions of different stories if you could divide it up a bit - what are common combinations of hardware, what are common combinations in different age ranges, what is used for playing games of different ages, what's common in different locations in the world. Valve could possibly tie hardware to software purchasing activity as well, and to what is played.
taneq
> Most of those people probably aren't learning from professors who spend their whole day teaching and doing research, and that kind of activity can only be supported by a relatively wealthy society.
Nope, as your next sentence says, they're learning off forums, Stack Overflow, YouTube and ChatGPT like the rest of us. :P
JansjoFromIkea
Possibly a shift from bulkier older laptops to tablet hybrids?
1366 x 768 definitely sounds like it's just an increase in older machines though.
starkparker
1366x768 laptops I can find in 10 minutes of searching, all being sold as new, ranging from $150 to $900:
HP 15-fd0023dx ($280, 12th-gen i3, Win 11 S, 8 GB RAM, 15.6" touchscreen)
HP 15-fc0025dx ($580, Ryzen 5 7520U, Win 11 S, 8 GB RAM, 15.6" touchscreen)
HP 15-fd0067nr, 15-fd0077nr ($510–$900, 13th-gen i5 or i7, Win 11 Home or Pro, 8 GB RAM, 15.6" non-touch)
HP 14-DQ0052DX Stream 14 ($230–$280, N4120, Win 11 Home or Pro, 4 or 8 GB RAM, 14" non-touch)
Lenovo IdeaPad 3 ($350, 11th-gen i3, Win 11 Home, 8 GB RAM, 15.6" touchscreen)
ASUS Chromebook CM14 ($155–$280, MediaTek Kompanio 520, 4 GB RAM, ChromeOS, 14" non-touch)
Sweepi
"12th-gen i3" is longer then "i3-1215U" or "1215U", but removes all important Information[1]. Why do you (and people in general) do this? I dont get it.
[1] a search engine will give you the intel ark page if you search for "1215U" which tells you thats a 2P+4E 15W Alder Lake CPU, however "12th-gen i3" could be a 4P+0E, but also a 2P+8E or a 4P+4E, and anything between 9W and 60W Basepower.
Isamu
As a developer I have not felt the need to buy a performance machine anymore, everything happens on remote machines and build farms.
wiredfool
The chart is updated bi-weekly, but the only data point that may change is the one for the current year. The first few days or weeks of a new year are less accurate compared to the end of a year.
moffkalast
January 2024: https://web.archive.org/web/20240123174954/https://www.cpube...
January 2023: https://web.archive.org/web/20230130185431/https://www.cpube...
Still, both previous years were notably up after the first month already. This is different and perhaps notable regardless. Either the release schedule is skewed forward or the hardware is genuinely stagnating. Or perhaps the benchmark is hitting some other CPU-unrelated bottleneck.
lolinder
http://web.archive.org/web/20240221001449/https://www.cpuben...
They have more than twice as many samples on their Feb 10 update this year (47810) as they did last year on Feb 14 (22761). They have shown some growth year on year in sample size, but nowhere near doubling.
That suggests this month has been an outlier in having a strangely large number of samples already, which could all be related—maybe their software started being bundled with a specific OEM or was featured on a popular content creator or whatever.
As a sibling notes, less accurate just means less accurate, not necessarily skewed upward. There simply is less data to draw conclusions on than there will be later, so any medium-sized effect that adds 20k extra samples will have a larger effect now than later.
hsuduebc2
I had the same thought. The common reasons for such a decline, like a change of CPU architecture prioritizing energy efficiency over raw performance or limitations in manufacturing processes, wouldn't produce such a steep drop. So I would guess it's either a change of benchmarking methodology or, as you said, weaker CPUs in general being measured.
moffkalast
So if I understand right, more old hardware makes it into the sample now than before, increasing the unreliability of early data? That makes sense I guess.
palijer
This is still just a sample size of three, for a data period that is stated to be "less accurate", not "has lower values".
Just because the error isn't presented the same way here doesn't mean it's error-free...
ellisv
Really makes you wish the chart showed some measure of variance
throwaway287391
Or just have the last data point include everything from the full 12-month period before (as the title "year on year" would suggest) and maybe even put it in the correct place on the x-axis (e.g. for today, Feb 12, 2025, about 11.8% of the full year gap width from the (Dec 31) 2024 point).
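A quick way to sanity-check that 11.8% figure (a minimal Python sketch, using only the dates mentioned above):

    from datetime import date

    # Fraction of the 2024 -> 2025 gap that has elapsed by Feb 12, 2025,
    # i.e. where the current point would sit if plotted at its actual date
    # instead of at the end-of-year tick.
    elapsed = (date(2025, 2, 12) - date(2024, 12, 31)).days    # 43 days
    full_gap = (date(2025, 12, 31) - date(2024, 12, 31)).days  # 365 days
    print(f"{elapsed / full_gap:.1%}")  # -> 11.8%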
contravariant
And then write an article every month about how the year on year difference has(n't) gone up/down yet.
SubiculumCode
Variance. Have you noticed that the mainstream public-level discussion of almost any topic never progresses farther than a point estimate? Variance implies nuance, and nuance is annoying to those who'd rather just paint a story. Variance isn't even that much nuance, because it is also just a point estimate for the variability of a distribution, not even its shape. Public discourse is stuck at the mean point estimate, and as such, is constantly misled. This is all an analogy, but it feels very true.
michaelt
> Have you noticed that the mainstream public-level discussion of almost any topic never progresses farther than a point estimate? [...] Variance isn't even that much nuance, because it is also just a point estimate for the variability of a distribution, not even its shape.
Next time you're doing something that isn't a professional STEM job, see how far you can get through your day without adding or multiplying.
Unless you're totting up your score in a board game or something of that ilk, you'll be amazed at how accommodating our society is to people who can't add or multiply.
Sure, when you're in the supermarket you can add up your purchases as you shop, if you want to. But if you don't want to, you can just buy about the same things every week for about the same price. Or keep a rough total of the big items. Or you can put things back once you see the total at the till. Or you can 'click and collect' to know exactly what the bill will be.
You don't see mainstream discussion of variance because 90% of the population don't know WTF a variance is.
contravariant
Sometimes even an average is too much to ask. One of my pet peeves is articles about "number changed". Averages and significance never enter the discussion.
Worst are the ones where the number is the number of times some random event happened. In that case there's a decent chance the difference is less than twice the square root, so assuming a Poisson distribution you know the difference is insignificant.
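A minimal sketch of that rule of thumb (Python, with made-up counts; for two independent Poisson counts, the spread of their difference is roughly the square root of their sum):

    from math import sqrt

    def change_is_significant(before: int, after: int) -> bool:
        """Rough two-sigma check for a change in event counts,
        assuming both are independent Poisson-distributed draws."""
        return abs(after - before) > 2 * sqrt(before + after)

    # A headline like "incidents rose from 95 to 110" is well within noise:
    print(change_is_significant(95, 110))  # False: |15| < 2 * sqrt(205) ~= 28.6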
mcmoor
Seems like there's only about 4 bits of bandwidth that can be reserved for this, and variance couldn't make the cut. That's actually already generous, since sometimes only 1 bit can barely make it through: whether something is "good" or "bad".
KeplerBoy
This reminds me of the presidential election, where most credible institutions talked about 50/50 odds and were subsequently criticized after Trump's clear victory in terms of electoral votes.
Few people bothered to look at the probability distributions the forecasters published which showed decent probabilities for landslide wins in either direction.
kqr
We still know the current data point has about three times the standard error of the previous point, but I agree it's hard to say anything useful without knowledge of the within-point variance.
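The factor of three presumably falls out of the 1/sqrt(n) scaling of the standard error: the 2025 point currently covers roughly six weeks of submissions versus a full year for 2024 (a sketch assuming a similar submission rate and spread in both years):

    from math import sqrt

    # Standard error of a mean scales as sigma / sqrt(n), so a point built
    # from ~6 weeks of submissions has about sqrt(52 / 6) times the standard
    # error of a point built from a full 52 weeks.
    weeks_current, weeks_full_year = 6, 52
    print(round(sqrt(weeks_full_year / weeks_current), 1))  # 2.9, i.e. roughly 3x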
TheSpiceIsLife
So they have a dataset, and they've cherry-picked a period-to-period comparison, and not done any location-based sorting etc., to suit a narrative.
Whatever pays the bills, I suppose.
RachelF
I wonder why? Some possibilities:
1. Extra silicon area being used by NPUs and TPUs instead of extra performance?
2. Passmark runs under Windows, which probably adds more overhead with each new version?
3. Fixes for speculative execution vulnerabilities?
4. Most computers are now "fast enough" for the average user, no need to buy top end.
5. Intel's new CPUs are slower than the old ones.
jamesy0ung
> 1. Extra silicon area being used by NPUs and TPUs instead of extra performance?
I'm not an expert in silicon design, but I recall reading that there's a limit to how much power can be concentrated in a single package due to power density constraints, and that's why they are adding fluff instead of more cores.
> 2. Passmark runs under Windows, which probably adds more overhead with each new version?
This is a huge problem as well. I have a 2017 15" MacBook Pro and it's a decent computer, except it is horribly sluggish doing anything on it, even opening Finder. That is with a fresh install of macOS 13.7.1. If I install 10.13, it is snappy. If I bootcamp with Windows 11, it's actually quite snappy as well and I get 2 more hours of battery life somehow despite Windows in bootcamp not being able to control (power down) the dGPU like macOS can. Unfortunately I hate Windows, and Linux is very dodgy on this Mac.
gaudystead
That is WILD that Windows 11 runs faster than the machine's native OS...
Could this suggest that modern Windows is, dare I say it, MORE resource efficient than modern macOS?! That feels like a ludicrous statement to type, but the results seem to suggest as much.
bboygravity
My first thought is that Apple is throttling performance of older machines on purpose (again). As they did with the phones.
Would explain why Windows runs faster.
iforgot22
Modern macOS might be optimized for Apple Silicon CPUs. But even when it was Intel only, there were probably times when Windows was lighter, albeit bad in other ways.
imglorp
The pessimistic viewpoint is the hardware vendor would not mind if you felt your machine was slow and were motivated to upgrade to the latest model every year. The fact that they also control the OS means they have means, motive and opportunity to slow older models if their shareholders demanded.
jamesy0ung
I'm just as surprised. Also, I was using Windows 11 LTSC 2024, not the standard version, which could impact the validity of my comparison.
taurknaut
This doesn't feel terribly surprising to me. macOS has always had impressively performant parts, but its upgrades generally lower responsiveness. On modern hardware it's less perceptible, obviously, and they want to sell machines, not software. But the last release that felt like it prioritized performance and snappiness was Snow Leopard, now more than fifteen years ago.
I will say the problem was a lot worse before the core-count explosion. It was easy for a single process to drag the entire system down. These days the computer is a lot better at adapting to variable loads.
I love my Macs, and I'm about 10x as productive on them as on a Windows machine after decades of daily usage (and probably a good 2x compared to Linux). Performance, however, is not a good reason to use macOS—it's the fact that the keybindings make sense and are coherent across the entire OS. You can use readline (emacs) bindings in any text field across the OS. And generally speaking, problems have a single solution that's relatively easy to google for. Badly behaved apps aside (looking at you, Zoom and Adobe), administering a Mac is straightforward to reason about, and for the most part it's easy to ignore the App Store, download apps, and run them by double clicking.
I love Linux, but I will absolutely pay to not have to deal with X11 or Wayland for my day-to-day work. I also expect my employer to pay for this or manage my machine for me. I tried Linux full-time for a year on a ThinkPad and never want to go back. The only time that worked for me was working at Google, when someone else broadly managed my OS. Macs are the only Unix I've ever used that felt designed around making my life easier and allowing me to focus on the work I want to do. Linux has made great strides in the last two decades, but the two major changes, systemd and Wayland, both indicate that the community is solving different problems than the ones that will get me to use it as a desktop. Which is fine; I prefer the Mac style to the IBM PC style they're successfully replacing. KDE is very nice and usable, but it models the computer and documents and apps in a completely different way than I am used to or want to use.
flomo
I have a 2016 MBP-15 (sticky keyboard). I suspect Apple changed something so the fans no longer go into turbo vortex mode. Normally it isn't sluggish at all, but when it overheats, everything grinds to a halt.[1] (Presumably this is to keep the defective keyboard from melting again.[0]) Perhaps old OS/bootcamp still has the original fan profiles.
[0] Apple had an unpublicized extended warranty on these, and rebuilt the entire thing twice.
[1] kernel_task suddenly goes to 400%, Hot.app reports 24%. Very little leeway between low energy and almost dead.
torginus
I think it's that the headline measure of modern CPU performance, multithreaded performance, is worthless and has been worthless forever.
Most software engineers don't care to write multithreaded programs, as evidenced by the two most popular languages, JS and Python, having very little support for it.
And it's no wonder: even when engineers do know how to write such code, most IRL problems outside of benchmarks don't really lend themselves to multithreading, and because the parallelizable part is limited, IRL gains are limited.
The only performance that actually matters is single thread performance. I think users realized this and with manufacturing technology getting more expensive, companies are no longer keen on selling 16 core machines (of which the end user will likely never use more than 2-3 cores) just so they can win benchmark bragging rights.
whilenot-dev
> The only performance that actually matters is single thread performance. I think users realized this and with manufacturing technology getting more expensive, companies are no longer keen on selling 16 core machines (of which the end user will likely never use more than 2-3 cores) just so they can win benchmark bragging rights.
How can you state something like this in all seriousness? One of the most used software applications has to be the browser, and right now Firefox runs 107 threads on my machine with 3 tabs open. gnome-shell runs 22 threads, and all I'm doing is reading HN. It's 2025 and multicore matters.
torginus
Those threads don't necessarily exist for performance reasons - there can be many reasons to start a thread, from processing UI events to IO completion etc. I very much doubt Firefox has an easy time saturating your CPU with work outside of benchmarks.
silverlinedjik
>multithreaded performance is worthless and has been worthless forever.
I have very much the opposite opinion; single-threaded performance only matters up to the point where any given task isn't unusable. Multithreaded performance is crucial for keeping the system from grinding to a halt, because users always have multiple applications open at the same time. Five browser windows with 4-12 tabs each, on three different browsers, 2-4 Word instances, and some Electron(-equivalent) comms app is much less unusual than I'd like it to be. I have used laptops with only two cores, and it gave a new meaning to slow when you tried doing absolutely anything other than waiting for your one application to do something. Only having one application open at a time was somewhat usable.
znpy
> The only performance that actually matters is single thread performance.
Strong disagree, particularly on laptops.
Having some Firefox thread compile/interpret/run some JavaScript 500 microseconds faster is not going to change my life very much.
Having four extra cores definitely will: it means I can keep more stuff open at the same time.
The pain is real, particularly on laptops: I've been searching for laptops with a high-end many-core CPU but without a dedicated GPU for years, and still haven't found anything decent.
I do run many virtual machines, containers, databases and such. The most "graphics-intensive" thing I run is the browser. Otherwise I spend most of my time in terminals and emacs.
adrian_b
The fact that multithreaded performance is worthless for you does not prove that this is true for most computer users.
During the last 20 years, i.e. during the time interval when my computers have been multi-core, their multithreaded performance has been much more important for professional uses than the single-threaded performance.
The performance with few active threads is mainly important for gamers and for the professional users who are forced by incompetent managers to use expensive proprietary applications that are licensed to be run only on a small number of cores (because the incompetent managers simultaneously avoid cheaper alternatives while not being willing to pay the license for using more CPU cores).
A decent single-threaded performance is necessary, because otherwise opening Web pages bloated by JS can feel too slow.
However, if the single-threaded performance varies by +/- 50% I do not care much. For most things where single-threaded performance matters, any reasonably recent CPU is able to do instantaneously what I am interested in.
On the other hand, where the execution time is determined strictly by the multithreaded performance, i.e. at compiling software projects or at running various engineering EDA/CAD applications, every percent of extra multithreaded performance may shorten the time until the results are ready, saving from minutes to hours or even days.
anal_reactor
Lots of problems can be nicely parallelized, but the cost of doing so usually isn't worth it, simply because the entity writing the software isn't the entity running it, so the software vendor can just say "get a better PC, I don't care". There was a period when having high requirements was a badge of honor for video games. When a company needs to pay for the computational power, suddenly all the code becomes multithreaded.
jart
Yes, but look at the chart in the article. Both multi-threaded and single-threaded performance are dropping on laptops. With desktops, multi-threaded is dropping and single-threaded is staying the same.
moffkalast
SMT (two threads sharing a core) is largely pointless, yes, given that memory access between threads is almost always a pain, so two of them rarely process the same data and can't take advantage of sharing a single core's cache at the same time. It doesn't even make much sense in concept.
But multicore performance does matter significantly unless you're on a microcontroller running only one process on your entire machine. Just Chrome launches a quarter million processes by itself.
PeterStuer
Please open your task manager and see how much stuff is running.
The days when you saw one core pegged and the rest idle are a decade behind us.
dbtc
Programmers failing to manufacture sufficient inefficiency to force upgrade.
johnnyanmac
Can't force an upgrade with money people don't have. For a non-enthusiast, even the lowest specs are more than good enough for light media consumption, and the economy affects what they're willing to invest in.
rpcope1
As an experiment, I've tried working with a Dell Wyse 5070 with the memory maxed out. Even for development work, outside of some egregious compile times for large projects, it actually worked ok with Debian and XFCE for everything including some video conferencing. Even if you had money for an upgrade, it's not clear it's really necessary outside of a few niche domains. I still use my maxed out Dell Precision T1700 daily and haven't really found a reason to upgrade.
tourmalinetaco
As an enthusiast, I’ve found mid-tier hardware from over a decade ago can run the majority of games on medium/high without much problem. And the majority of people in my life only need a laptop that can consistently stream 1080p and run a modern browser, maybe with some extra bits like Microsoft Office (although most younger users use G Suite, presumably because their schools preferred it growing up).
brokenmachine
I very much doubt that is the cause.
anal_reactor
I think it's reason #4. I bought a PC 12 years ago, the only upgrade I did was buying an SSD, and now it's being used by my father, who's perfectly happy with it, because it's fast enough to check mail and run MS Office. My current gaming PC was bought 4 years ago, and it's still going strong, I can play most games on high settings in 4k (thank you DLSS). I've noticed the same pattern with smartphones - my first smartphone turned into obsolete garbage within a year, my current smartphone is el cheapo bought 4 years ago, and it shows no signs of needing to be replaced.
OK, I lied: my phone is only two years old, but what happened is that two years ago my phone underwent a sudden drop test onto concrete followed by a pressure test under a car tyre, and it was easier to buy a new one with nearly identical specs than to repair the old one.
bobthepanda
There's #4, which has a couple of different components:
* Battery life is a lot better than it used to be, and a lot of devices have prioritized efficiency.
* We haven't seen a ton of changes at the high end for CPUs. I struggle to think of what the killer application requiring more would be right now; every consumer advancement is more GPU-focused.
AnthonyMouse
COVID. People start working from home or otherwise spending more time on computers instead of going outside, so they buy higher-end computers. Lockdowns end, people start picking the lower-end models again. On multi-threaded tasks, the difference between more and fewer cores is larger than the incremental improvements in per-thread performance over a couple years, so replacing older high core count CPUs with newer lower core count CPUs slightly lowers the average.
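A toy illustration of that averaging effect, with assumed numbers only (a 2020-era 16-core replaced by a newer 8-core whose cores are ~25% faster):

    # Purely illustrative relative throughputs, not benchmark data.
    old_cores, old_per_core = 16, 1.00
    new_cores, new_per_core = 8, 1.25

    single_thread_ratio = new_per_core / old_per_core
    multi_thread_ratio = (new_cores * new_per_core) / (old_cores * old_per_core)

    print(single_thread_ratio)  # 1.25  -> single-thread score goes up
    print(multi_thread_ratio)   # 0.625 -> multi-thread score drops, pulling the average down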
Macha
I don't really buy this. COVID to now is less than one laptop replacement cycle for non-techie users, who will usually use a laptop until it stops functioning, and I don't think the techie users would upgrade at all if their only option is worse.
AnthonyMouse
A lot of laptops stop functioning because somebody spills coffee in it or runs it over with their car. There are also many organizations that just replace computers on a 3-5 year schedule even if they still work.
And "worse" is multi-dimensional. If you shelled out for a high-end desktop in 2020, it could have 12 or 16 cores, but a new PC would have DDR5 instead of DDR4 (and possibly more of it), a faster SSD, and a CPU with faster single-thread performance. You want that but can't justify 12+ cores this time, so you get 6 or 8, and everything is better except the multi-thread performance, which is worse, but only slightly.
The reluctance to replace them also works against you. The person who spilled coffee in their machine can't justify replacing it with something that nice, so they get a downgrade. Everyone else still has a decent machine and keeps what they have. Then the person who spilled their drink is the only one changing the average.
nubinetwork
> (increased overhead with newer versions of Windows)
> Fixes for speculative execution vulnerabilities?
I don't know if they'll keep doing it, but Hardware Unboxed has been doing various tests with Windows 10 vs 11, and mitigations on vs off, as well as things like SMT on vs off for things like AMD V-Cache issues, or P-cores vs E-cores on newer Intels... it's interesting to see how hardware performs 6-12 months after release, because it can really go all over the place for seemingly no reason.
xen2xen1
It's either the Chromebook effect: people are just buying slower computers, possibly in poorer countries.
Or: the average speed of new computers just isn't going up. People are just buying lower-end computers because they're good enough. I think it's mostly that: the average is just trending a bit lower because lower is fine. The thirst for the high end is disappearing for most segments, because people don't need it.
lynguist
My hunch is that more people in countries like India buy computers and try out cpubenchmark, and that in countries like those, lower-performing laptops outsell more powerful laptops by a large enough margin that it shows.
It doesn’t have to be literally India, it’s an example for illustration.
baq
Or the ‘slow CPUs’ are actually truly Fast Enough. Intel N100 and N150 systems are low key amazing value.
bearjaws
I bought into the N100 hype train because of Reddit. Sure, it's an amazing value, but I hope anyone reading this isn't convinced it's remotely fast. I ended up going with a Minisforum Ryzen for about 2x the price and 4x the performance.
I was a bit bummed since I wanted to use it as a kind of streaming box, and while it can do it, it is definitely slow.
baq
I got a 16GB RAM 512GB SSD N100 minipc for less than $150 last month. Yes it could be faster (always true for any computer), but I feel I got way more than I paid for. Certainly a much better deal than an RPi 5.
PS. If you need more power than that, I have a hard time coming up with a better deal than an M4 Mac Mini, perhaps with some extra USB storage. It's possibly the only non-workstation computer that is worth buying in the not-inexpensive category (in the base version, obviously).
dartharva
Sounds possible, office laptops in India are generally shit-tier. But the largest non-bulk retail buyers of laptops in India seem to be (from my armchair market observation) school/college kids, and school/college kids increasingly prefer gaming laptops (India is the largest consumer of gaming laptops after the US) that will likely not give below-average benchmark results.
bjourne
Could it be related to world-wide inflation and decreasing real wages? Salaries have not kept up, hence people can't afford as powerful hardware as they used to. It also seems that top-of-the-line hardware doesn't depreciate as fast as it used to. Many several-year-old GPUs are still expensive, for example.
lolinder
Someone already gave the answer:
https://news.ycombinator.com/item?id=43030465
It's caused by people failing to read the fine print on the graph.
rahimnathwani
I had the same instinct as you, but this comment changed my mind:
lolinder
Nah, I read that but I'm not convinced. The reply to them from palijer is correct: inaccurate does not mean lower, it just means inaccurate. Fewer samples means less data means more room for weird sampling errors skewing the data. There's nothing inherent in that that suggests we ought to assume an upward skew.
The best we can say is that the sampling skew this month is different than the sampling skew in past Januaries. That could be due to some interesting cause, it could be totally random, and there's zero point in speculating about it until we have more data.
ekianjo
> Many several-year-old GPUs are still expensive,
They are propped up by demand and the fact that most of the new GPUs are marginally better than previous ones.
ttt3ts
New GPUs are quite a bit better than previous ones, but perf per dollar has been flat for a while now.
Also, if you're talking gaming GPUs, old ones work fine, given there hasn't been a new PlayStation or Xbox in many years. The minimum spec for many games is 5-year-old tech.
teaearlgraycold
For gaming GPUs the 5090 is something like 20-50% better than the 4090 in raster performance.
Rury
Sure, they are better... but once you factor in die size, core count, wattage and so on, the improvements being made are less impressive than they first seem.
epolanski
That would not change the expectation of having faster hardware with time at the same/lower price.
slyall
Discussion from a day or so ago. 59 comments:
kqr
Given the strong prior pattern, I forecast this is due to
(a) changes in methodology/non-random sampling, with 60 % probability,
(b) random sampling effects that go away on longer timelines, with 20 % probability,
(c) any actual change in CPUs, with 20 % probability.
In my experience, when a metric shows weird behaviour, these are usually the reasons, in roughly those proportions.
varispeed
A friend of mine bought a Zephyrus G16 with an Intel 185H and 32 GB RAM. He thought this would help him with his studies, and he chose it because some software he has to use is only available for Windows. He called me to have a look, as the laptop has been sluggish for him.
This thing can barely handle Office, Teams and the browser. Hot, noisy as hell, and performance-wise I see no difference from laptops from a decade ago. Tragic.
To be fair, I don't think there was anything I could do. Task Manager showed the CPU as underutilised, yet the fans were blasting and editing a Word document looked like a slideshow.
ASUS tools showed no problems.
I don't know, feels like people are getting scammed with these new laptops.
I still have M1 Max and never experienced anything like this.
magicalhippo
Had something similar with my work laptop today. Lenovo X1 Carbon, couple of years old. Got reinstalled very recently.
Been fine, but suddenly it was slow af. Near zero CPU, GPU and disk usage in Windows Task Manager, but I could feel it was burning hot, which it wasn't just 15 minutes prior.
Did a reboot and all was fine again.
Surely some firmware that messed up, though no idea why.
Anyway, I'd start by removing crap; that goes for third-party anti-virus and vendor tools especially. Use something like Bulk Crap Uninstaller[1] or Revo[2], and then reinstall the drivers.
Totally agreed on the sad state of laptops these days.
[1]: https://www.bcuninstaller.com/ (open source)
[2]: https://www.revouninstaller.com/products/revo-uninstaller-fr...
odo1242
What were the disk specs? HDD or SSD? GPU?
Windows is a heavy operating system compared to the others and can cause that kind of problem, but it's likely not Windows alone causing it (bloatware / other AV solutions could also have something to do with it).
varispeed
1TB NVMe and 4060 GPU
iforgot22
Sounds a lot like a dedicated GPU is running and heat-throttling the whole thing, since you don't see CPU usage.
I wanted to say gaming laptops are a scam, but the older Intel MBPs with dedicated GPUs suffered too. More trouble than they're worth.
varispeed
One thing I noticed is that Google Drive was using 15% of the Arc GPU (the laptop also has an Nvidia 4060). When I shut down that process, it cooled down a bit after a while, but the system is still sluggish and the fans run all the time, even if I switch it to "Silent" mode.
iforgot22
Does it have an option to shut off the dGPU entirely and rely on integrated? Just as a test.
truekonrads
Uninstall all security except for Windows defender and see how it feels.
speed_spread
It's an Intel-powered gaming laptop with a discrete GPU running Windows. Of course it's gonna heat up like crazy. But that box can also do things an M1 can't do. Like play games... and heat up a small room.
varispeed
The discrete GPU is barely used, so it shouldn't be getting hot.
speed_spread
Yeah, that's the Intel part. They do that. AMD would have been much better although still not M1 cool.
dusted
Interestingly, so did server CPU thread performance. I guess we're now starting to see more direct prioritization of core count over core performance, which I kind of understand: it's superficially simpler to think about horizontal scaling, and for very many workloads it's ideal, and for many more it's at least doable, though I fear the invisible overhead in both compute and complexity.
walrus01
I have a theory that CPU performance in laptops plateauing is something we'll see more of in the future, as manufacturers further optimize for battery life and thinness. 15 years ago I wanted a very powerful MacBook Pro. Now I'm fine with a MacBook Air, and if I need to do something that requires heavy lifting, I have remote access (ssh, VNC over ssh, etc.) into a loud, beefy hypervisor located somewhere else with 512 GB of RAM in it.
On the consumer level and for non-technical end users, as more functions are offloaded to "the cloud" with people's subscriptions to Office 365, Google Workspace, iCloud, whatever, having a ton of CPU power in the laptop also isn't as necessary anymore. Along with things like H.265 and AV1 video encode and decode being done with GPU assist rather than purely in software.
zeroq
Some 15 years ago I was playing DotA on what you'd today call a gaming laptop (an HP HDX9200), which would occasionally overheat and BSOD, and every time it happened it was a Hail Mary, because booting took around 5 minutes, which was exactly how long it took for the servers to flag me as a leaver and ban me from future games.
These days I have two main machines which boot to Windows in about a minute.
Somehow I remember my old 486 machine being able to boot to DOS in the snap of a power button.
aboardRat4
My experience with recent hardware is that it is very poorly designed, not as "computational hardware", but as a "consumer product".
Buying anything new has been out of the question for me for a few years now, because it just cannot be guaranteed to survive half a year, so I always buy second-hand; that way at least it was tested by the previous owner.
But even so, modern hardware is hellishly unreliable.
I bought a Dell 7 series laptop (17 inches), and:
1. I've already had to replace the battery back panel three times, just because it is held in by tiny plastic hooks which are easily torn off.
2. Its disassembly-reassembly process is a nightmare (which did not use to be the case for Dell).
3. Already had to replace the fans (the laptop is not even 5 years old).
But bear with me, the rest is more fun: 4. When running with a smaller (official) battery, it CANNOT RUN THE GPU AT FULL SPEED EVEN WHEN PLUGGED INTO THE MAINS. Really? This is just ridiculous. A laptop drawing on the battery while plugged in is just insane.
5. You MUST use a smaller battery to install a 2.5" SATA drive. So you must choose: either a SATA drive or a full-speed GPU.
6. The power on/off button does not work if your BIOS battery has low charge. This is just maddening! What is the connection?
Maybe it's just me and one single laptop?
Well, in my experience, everything is getting similarly fragile.
My phone has its thermal sensor installed ON THE BATTERY, so if you replace the battery, you have to replace the sensor as well, which is, oh well, a harder thing to manufacture than a battery, and which on many non-official batteries always returns 0 (0 Kelvin, that is: -273 Centigrade). The amount of cacti you have to swallow to get used to modern hardware is just staggering.
What also happened in the same time frame, according to this website: see [1], [2], and [3].
So it comes down to: more old (ancient?) machines in the dataset. Why? Unknown, but probably not an indication that the hardware people use in the real world (TM) has changed.
[1] https://www.pcbenchmarks.net/displays.html
[2] https://www.pcbenchmarks.net/number-of-cpu-cores.html
[3] https://www.memorybenchmark.net/amount-of-ram-installed.html
[from 3dcenter.org: https://www.3dcenter.org/news/news-des-12-februar-2025-0 (German)]