The average CPU performance of PCs and notebooks fell for the first time

zabzonk

Could not get to the page.

But, I dunno. I just bought an Asus Zenbook with an Intel Core Ultra 9 with Arc graphics, 32 GB RAM, and a 1 TB SSD, and it seems pretty nifty to me, especially at the price.

wiredfool

  As the chart is updated bi-weekly, the only data point that may change is the one for the current year. The first few days or weeks of a new year are less accurate compared to the end of a year.

ellisv

Really makes you wish the chart showed some measure of variance

throwaway287391

Or just have the last data point include everything from the trailing 12 months (as the title "year on year" would suggest), and maybe even put it in the correct place on the x-axis (e.g. for today, Feb 12, 2025, about 11.8% of the full-year gap width past the (Dec 31) 2024 point).
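
(A rough sketch of that placement in Python, assuming the raw per-sample scores and dates were available; the function name and data format are made up. Feb 12 falls 43 days, i.e. about 11.8%, past the Dec 31 tick.)

  from datetime import date, timedelta

  def trailing_year_point(samples, today=date(2025, 2, 12)):
      # samples: iterable of (date, score) pairs -- hypothetical format
      window = [s for d, s in samples if today - timedelta(days=365) < d <= today]
      avg = sum(window) / len(window)  # trailing 12-month average
      frac = (today - date(today.year - 1, 12, 31)).days / 365  # 43/365 ~ 0.118
      return avg, frac  # plot avg at `frac` of the gap past the Dec 31 point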

moffkalast

January 2024: https://web.archive.org/web/20240123174954/https://www.cpube...

January 2023: https://web.archive.org/web/20230130185431/https://www.cpube...

Still, both previous years were notably up after the first month already. This is different and perhaps notable regardless. Either the release schedule is skewed forward or the hardware is genuinely stagnating. Or perhaps the benchmark is hitting some other bottleneck unrelated to the CPU.

lolinder

http://web.archive.org/web/20240221001449/https://www.cpuben...

They have more than twice as many samples on their Feb 10 update this year (47810) as they did last year on Feb 14 (22761). They have shown some growth year on year in sample size, but nowhere near doubling.

That suggests this month has been an outlier in having a strangely large number of samples already, which could all be related: maybe their software started being bundled by a specific OEM or was featured by a popular content creator or whatever.

As a sibling notes, less accurate just means less accurate, not necessarily skewed low and due to be revised upward. There is simply less data to draw conclusions from than there will be later, so any medium-sized effect that adds 20k extra samples will have a larger impact now than it would later.

palijer

This is still just a sample size of three, for a data period that is stated to be "less accurate", not "has lower values".

Just because the error isn't presented the same here doesn't mean it's error free...

bjourne

Could it be related to worldwide inflation and decreasing real wages? Salaries have not kept up, hence people can't afford as powerful hardware as they used to. It also seems that top-of-the-line hardware doesn't depreciate as fast as it used to. Many several-year-old GPUs are still expensive, for example.

epolanski

That would not change the expectation of having faster hardware with time at the same/lower price.

lolinder

Someone already gave the answer:

https://news.ycombinator.com/item?id=43030465

It's caused by people failing to read the fine print on the graph.

rahimnathwani

I had the same instinct as you, but this comment changed my mind:

https://news.ycombinator.com/item?id=43030614

lolinder

Nah, I read that but I'm not convinced. The reply to them from palijer is correct: inaccurate does not mean lower, it just means inaccurate. Fewer samples means less data means more room for weird sampling errors skewing the data. There's nothing inherent in that that suggests we ought to assume an upward skew.

The best we can say is that the sampling skew this month is different than the sampling skew in past Januaries. That could be due to some interesting cause, it could be totally random, and there's zero point in speculating about it until we have more data.

ekianjo

> Many several-year-old GPUs are still expensive,

They are propped up by demand and the fact that most of the new GPUs are marginally better than previous ones.

ttt3ts

New GPUs are quite a bit better than previous ones, but perf per dollar has been flat for a while now.

Also, if you're talking gaming GPUs, old ones work fine given that there hasn't been a new PlayStation or Xbox in many years. Min spec for many games is 5-year-old tech.

teaearlgraycold

For gaming GPUs the 5090 is something like 20-50% better than the 4090 in raster performance.

RachelF

I wonder why? Some possibilities:

1. Extra silicon area being used by NPUs and TPUs instead of extra performance?

2. Passmark runs under Windows, which is probably using increasing overhead with newer versions?

3. Fixes for speculative execution vulnerabilities?

4. Most computers are now "fast enough" for the average user, no need to buy top end.

5. Intel's new CPUs are slower than the old ones.

jamesy0ung

> 1. Extra silicon area being used by NPUs and TPUs instead of extra performance?

I'm not an expert in silicon design, but I recall reading that there's a limit to how much power can be concentrated in a single package due to power density constraints, and that's why they are adding fluff instead of more cores.

> 2. Passmark runs under Windows, which is probably using increasing overhead with newer versions?

This is a huge problem as well. I have a 2017 15" MacBook Pro and it's a decent computer, except that it's horribly sluggish at doing anything, even opening Finder. That is with a fresh install of macOS 13.7.1. If I install 10.13, it is snappy. If I Boot Camp into Windows 11, it's actually quite snappy as well, and I somehow get 2 more hours of battery life despite Windows under Boot Camp not being able to control (power down) the dGPU like macOS can. Unfortunately I hate Windows, and Linux is very dodgy on this Mac.

gaudystead

That is WILD that Windows 11 runs faster than the machine's native OS...

Could this suggest that modern Windows is, dare I say it, MORE resource efficient than modern macOS?! That feels like a ludicrous statement to type, but the results seem to suggest as much.

iforgot22

Modern macOS might be optimized for Apple Silicon CPUs. But even when it was Intel only, there were probably times when Windows was lighter, albeit bad in other ways.

jamesy0ung

I'm just as surprised. Also, I was using Windows 11 LTSC 2024, not the standard version, which could impact the validity of my comparison.

dbtc

Programmers failing to manufacture sufficient inefficiency to force upgrade.

johnnyanmac

Can't force upgrade with money you don't have. For a non-enthusiast, even the lowest specs are more than good enough for light media consumption and the economy would affect what they invest in.

rpcope1

As an experiment, I've tried working with a Dell Wyse 5070 with the memory maxed out. Even for development work, outside of some egregious compile times for large projects, it actually worked ok with Debian and XFCE for everything including some video conferencing. Even if you had money for an upgrade, it's not clear it's really necessary outside of a few niche domains. I still use my maxed out Dell Precision T1700 daily and haven't really found a reason to upgrade.

brokenmachine

I very much doubt that is the cause.

AnthonyMouse

COVID. People start working from home or otherwise spending more time on computers instead of going outside, so they buy higher-end computers. Lockdowns end, people start picking the lower-end models again. On multi-threaded tasks, the difference between more and fewer cores is larger than the incremental improvements in per-thread performance over a couple years, so replacing older high core count CPUs with newer lower core count CPUs slightly lowers the average.
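
(A toy illustration of that mixture effect, with invented numbers rather than real Passmark data: every individual model gets faster, yet the average drops because the mix shifts back toward the low end.)

  def avg(mix):
      return sum(share * score for share, score in mix.values())

  # {model tier: (sales share, average score)} -- invented numbers
  lockdown_mix = {"high_end": (0.4, 20_000), "low_end": (0.6, 10_000)}
  current_mix  = {"high_end": (0.2, 22_000), "low_end": (0.8, 11_000)}  # every model ~10% faster

  print(avg(lockdown_mix))  # 14000.0
  print(avg(current_mix))   # 13200.0 -- lower, even though both tiers improved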

Macha

I don't buy this really. COVID to now is less than 1 laptop replacement cycle for non-techie users who will usually use a laptop until it stops functioning, and I don't think the techie users would upgrade at all if their only option is worse.

bobthepanda

There's #4, which has a couple of different components:

* battery life is a lot higher than it used to be and a lot of devices have prioritized efficiency

* we haven't seen a ton of changes at the high end for CPUs. I struggle to think of what the killer application requiring more CPU is right now; every consumer advancement is more GPU-focused

Legend2440

You’re forgetting the most likely possibility: it’s an artifact of data collection or a bug in the benchmarking software.

dehrmann

4 has been true for over a decade.

gotoeleven

Literally the only reason you need a computer less than 10 years old for day-to-day tasks is to keep up with the layers of JavaScript that are larded onto lots of websites. It's sad.

Edit: oh, and Electron apps. Sweet lord, shouldn't we have learned in the 90s with Java applets that you shouldn't use a garbage-collected language for user interfaces?

iforgot22

I think even a blank Electron app is rather heavy because it's basically a whole web browser, which btw is written in C++. BUT my 2015 baseline MBP still feels fine.

pmontra

My laptop is 11 years old and handles Slack quite well, and Jira. I eventually maxed it out at 32 GB and that probably helps. It's only an i7-4xxx though.

Java in the 90s was really slow. It got much faster with the JIT compiler. JavaScript and browsers got many optimizations too. My laptop feels faster now than it was in 2014 (it has always run some version of Linux.)

The other problem of Java was the non native widgets, which were subjectively worse than most of today's HTML/JS widgets, but that's a matter of taste.

whatever1

What is wrong with garbage collection and UI?

bitwize

I use a GC'd language for UI every day -- Emacs -- and it runs well even on potato-tier hardware. The first GUI with all the features we'd recognize was written in a GC'd language, Smalltalk.

GCs introduce some overhead but they alone are not responsible for the bloat in Electron. JavaScript bloat is a development skill issue arising from a combination of factors, including retaining low-skill front end devs, little regard for dependency bloat, ads, and management prioritizing features over addressing performance issues.

bentcorner

But has it been true for the type of person who runs CPU benchmarks?

kome

I use a 10-year-old MacBook Air and honestly I can do everything I need: statistics, light programming, browsing, and photo editing. Perfect.

I'll wait for it to break before moving on; changing for the sake of changing feels like waste.

skyyler

I use a 10-year-old MacBook Pro and the only things I dislike are the battery life and the heat it generates.

I simply can't afford the replacement 16" MBP right now. Hopefully it lasts another couple of years.

p_ing

> 2. Passmark runs under Windows, which is probably using increasing overhead with newer versions?

Shouldn't be an issue. Foreground applications do get a priority boost; I don't know if Passmark raises its own priority, though. That's provided there isn't any background indexing occurring, i.e. you let the OS idle after install.

NBJack

As someone who has been obsessing over small performance gains for his processor recently, I can say there is definitely overhead to consider. Note that by default you now get things like Teams, Copilot, the revised search indexing system, Skype, etc. Passmark in many tests will max out all cores at once; this tends to make it sensitive to 'background' processes (even small ones).

slyall

Discussion from a day or so ago. 59 comments:

https://news.ycombinator.com/item?id=43017612

altairprime

It’s remarkable how carefully the page is worded to not incorporate Apple’s desktop and mobile CPU performance changes over time - but they screwed up because they said Linux, which doesn’t exclude Apple Silicon like it used to thanks to Asahi.

markhahn

The average reflects the population, of course. The max graph shows no such fall; laptops are even going up.

How about this interpretation: desktops are fast enough; laptops are getting faster, but people tend to buy more usable laptops, not faster ones.

nxpnsv

Arguably, a majority of the improvements in 2025 haven't happened yet; it seems premature to conclude anything in February.

layer8

The chart also shows that single-thread performance hasn’t improved that much in the past twelve or so years, compared to Moore’s law. And this is compounded by Wirth’s law [0].

[0] https://en.wikipedia.org/wiki/Wirth%27s_law
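
(For scale: Moore's-law-style doubling every two years compounds to 2^6 = 64x over twelve years; per the chart, the single-thread line over the same span is nowhere near that.)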

Osiris

Looking at the top-end chart, which has flatlined, it seems the biggest contributor would be a slower release cycle for the most performant chips. It looks like the scores of the top-end chips were dragging up the average; with the top spot not changing, the average is falling.

I'm curious what median and std dev look like.
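
(If the raw scores were available, Python's statistics module would answer that in a few lines; the scores below are invented purely to show how a handful of very fast chips pull the mean around far more than the median.)

  import statistics

  scores = [9_000, 10_000, 11_000, 12_000, 60_000, 62_000]  # made-up sample
  print(statistics.mean(scores))    # ~27333
  print(statistics.median(scores))  # 11500
  print(statistics.stdev(scores))   # ~26100 (sample standard deviation)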

vladms

People got smarter and buy only what they need?

kelvinjps10

I'm using a ThinkPad T480 and it runs everything fine.