
Apple M5 chip


601 comments

October 15, 2025

mumber_typhoon

The M5 MacBook Pro still gets the Broadcom WiFi chip but the M5 iPad Pros get the N1 and C1X (Sweet).

All in all, Apple is doing some incredible things with hardware.

The software teams at Apple really need to get their act together. The M1 itself is so powerful that nobody really needs to upgrade it for most of the things most people do on their computers. Tahoe, however, makes my M1 Air feel sluggish doing the exact same tasks I've been doing for the last couple of years. I really hope this is not intentional on Apple's part to make me upgrade. That would be a big letdown.

seunosewa

My M1 Air got very sluggish after upgrading to Tahoe but then it started behaving normally after a couple of days. Hopefully, you'll experience the same soon.

kokada

> Tahoe, however, makes my M1 Air feel sluggish doing the exact same tasks I've been doing for the last couple of years.

I have a work-provided M2 Pro with 32GB of RAM. After the Tahoe upgrade it feels like one of the sluggish PCs in the house. It's the only machine where I sometimes see the mouse cursor teleport when I move it fast. And this is after disabling transparency in the Accessibility settings, mind you; it was even worse before.

speedgoose

Do you have a few Electron-powered apps that haven't been updated yet?

Electron used to override a private macOS function, which makes macOS sluggish on Tahoe, and apparently no one at Apple uses Electron apps while testing.

kokada

I keep my applications pretty much up to date, but I didn't check the release notes of every Electron application I have to make sure they're all updated. I still think this is a failure of macOS: one misbehaving application shouldn't bring the whole environment to a crawl.

What I can say is that while the situation is much better than on day one, the whole Tahoe experience is still not as fluid as Sequoia.

Also, it doesn't really matter to me whether this was a private function or not; if this were Windows or GNOME/KDE, people would blame the desktop's developers instead.

placatedmayhem

The check script I've been recommending is here:

https://github.com/tkafka/detect-electron-apps-on-mac

About half of the apps I use regularly have been fixed. Some might never be fixed, though...
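That script works by checking each app bundle for the Electron framework; here's a minimal Python sketch of the same idea (the bundle layout is the standard Electron one, but `find_electron_apps` is a hypothetical helper, not the linked script):

```python
from pathlib import Path

def find_electron_apps(apps_dir: str) -> list[str]:
    """Return names of .app bundles that ship the Electron framework.

    Electron apps bundle "Electron Framework.framework" under
    Contents/Frameworks, which is the marker this kind of check keys on.
    """
    hits = []
    for app in sorted(Path(apps_dir).glob("*.app")):
        marker = app / "Contents" / "Frameworks" / "Electron Framework.framework"
        if marker.exists():
            hits.append(app.name)
    return hits

# e.g. find_electron_apps("/Applications")
```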

michelb

Even the OS and stock apps are much slower in Tahoe, and UI updates/interactions are slower too. I'm lucky I only upgraded my least-used machine, and that's a well-specced M2.

EasyMark

This is why I stay on the previous release until at least the .2 or .3 point update, to let them work out the bugs so I don't have to deal with them. There was nothing in 26 that felt pressing enough for me to update.

kobalsky

My tinfoil-hat theory is that on each OS iteration Apple adds new features that leverage the latest chips' hardware-acceleration capabilities, with software-only fallback implementations for older chips.

They Ship-of-Theseus the crap out of their OS, replacing parts with ones that need these new hardware features and that run slowly on older chips due to the software-only fallbacks.

I got the first-generation iPad Pro, which is e-waste now, but I use it as a screen for my CCTV. It can't even display the virtual keyboard without stuttering like crazy, it lags when switching apps, and there's a delay for everything. This thing was smooth as butter at release.

thewebguyd

I have the 4th-gen (2020) iPad Pro with the A12Z Bionic, the same chip they put in the Apple Silicon transition dev kits. With iPadOS 26 it's become barely usable, despite being as performant as ever on iPadOS 18. I'm talking about a huge drop in performance, with stutters and slowdowns everywhere.

I was considering just replacing the battery and keeping it for several more years, but now I feel forced to upgrade, which has me reconsidering whether I still want/need an iPad. I'd also have to buy a new Magic Keyboard since they redesigned it, and they bumped the price ($1,299 now vs. $999 when I got the 4th gen), so I'd be looking at about $1,700. I'm trying to hold out for an iPad Air with ProMotion.

I may be in the minority here, but I think 5 years is too short a lifespan for these devices at this point. In the early days, when things were advancing like crazy, sure. But now? I have 8-year-old computers that are still just fine, and with the M-series chips I'd expect at least 10 years of usable life at minimum (battery notwithstanding).

tsunamifury

Disabling transparency adds another draw layer that is opaque on top, making it even worse than when transparency is on.

ExoticPearTree

26.0.1 fixed the sluggishness. 26.0 was pretty unstable - felt like a game dropping frames.

kokada

26.0.1 is better, but I can still get sluggishness in a few specific cases.

I just saw one example while moving the mouse quickly across my Dock (I still use the magnify animation): I can clearly see it drop a few frames. This never happened in macOS 15.

fersarr

same here

SkyPuncher

There are so many software related things that drive me absolutely loony with Apple right now.

* My iPhone as a remote for my Apple TV randomly decides it can't control the volume - despite the "Now Playing" UI offering an audio control that works.

Their auth screens drive me crazy:

* Why can I not punch in a password while Face ID is still trying? If I'm skiing, I know Face ID isn't going to work; stop making me wait.

* Likewise, on Apple TV the parental control input requires me to explicitly choose to enter a Pin Code. Why? Just show me the Pin Code screen. If I can approve from my device, I will.

  * Similarly, if I use my phone as a remote, why do I need to manually click out of the remote to get to the parental control approval screen. I'm literally using my phone. Just auto-approve.

sotix

Why can I not use my password manager for my Apple ID but can use it for any other password field? Instead I have to switch to my password manager, copy the password, reopen the App Store, select get app, and paste the password in the Apple ID login pop up in the 10 seconds before my password clears from my clipboard.

strbean

> * Why can I not punch in a password while Face ID is still trying? If I'm skiing, I know Face ID isn't going to work; stop making me wait.

Funny, a similar thing has been driving me crazy on my Ubuntu 20.04 laptop with fingerprint login. When unlocking, I can either enter a password or use fingerprint. On boot, I am not allowed to enter a password until I fail with fingerprint. If I use fingerprint to log in on boot, I have to enter my password anyways once logged in to unlock my keychain.

I should probably just figure out a way to disable fingerprint on boot and only use it for the lock screen.

JumpCrisscross

> Tahoe, however, makes my M1 Air feel sluggish

Counterpoint: my M1 Pro was a turtle for a few weeks and then stopped doing nonsense in the background and is back to its zippy self. (Still buggy. But that would be true on new hardware, too.)

random3

This needs benchmarks.

Sad if true. I've felt my M1 Max getting sluggish too lately, after bragging that this was the longest-lived work machine I've had and thinking I'm good to wait for the M6. This is not good for business, but IMO you need more than raw power to justify upgrades, even for professional use: form factor, screen quality, battery, etc.

I think they bet a lot of hardware money on AI capabilities but failed to deliver the software, so there was no real reason to upgrade for the AI features in the chip (which is literally what they boast about in the first line of the announcement - yet nobody cares about making more cute faces).

It's not 100% their fault. Everyone got onto the LLM bandwagon like it's "the thing", so even if they didn't believe in it they still needed something. Except an OS is not a chat interface, and LLMs do suck at stricter things.

Insanity

Yeah, I love my M1 iPad Pro, but the "Liquid Glass" update made it feel slower. Really only the unlock feels slower; once I'm using it, it's fine. But it's slightly annoying and does make me want to upgrade to the M5 this year.

But it's a glorified Kindle and YouTube box, so I'm hesitating a little bit.

asimovDev

my dad's got a pre AS iPad Pro and it's so bad after updating to 26. My 6th gen iPad on iOS 17 felt faster than this

knowitnone3

"make me want to update this year to the m5." Then Apple software devs did what they were told

ksec

The Broadcom Wi-Fi chip supports 320 MHz while the N1 is stuck at 160 MHz. There were reports of the N1 not supporting 4096-QAM as well, but I didn't check.

ExoticPearTree

> The Broadcom Wi-Fi chip supports 320 MHz while the N1 is stuck at 160 MHz.

I was at a Wi-Fi vendor presentation a while back, and they said that 160 MHz is pretty improbable unless you live alone with no wireless networks around you. And 320 MHz even less so.

In real life, the best you can probably get is 80 MHz in a really good wireless environment.

amluto

I would believe that MLO or similar features could make it a bit more likely that large amounts of bandwidth would be useful, as it allows using discontiguous frequencies.

Wi-Fi does not currently get anywhere near the bandwidth that these huge channels advertise in realistic environments.

shadowpho

For which band? I run 160/160 on 5/6 GHz and it's nice; they are short-range enough to work. For 2.4 GHz, yeah, 20 MHz only.

zdw

From Apple's support docs:

https://support.apple.com/guide/deployment/wi-fi-ethernet-sp...

No devices support 320 MHz bandwidth, and 160 MHz is only supported on the 6 GHz band on MacBooks and iPads. Some iPhones support 160 MHz on 5 GHz as well.

MrAlex94

Does it? If it's the same Wi-Fi chip used in other M4 Macs, then it's still limited to 160 MHz:

https://support.apple.com/en-gb/guide/deployment/dep268652e6...

HumblyTossed

"stuck".

An infinitely small percentage of people can take advantage of 320Mhz. It's fine.

londons_explore

Today. But in 3 years time it'll be widespread and your Mac will be the one with the sluggish WiFi connection that jams up the airwaves for all other devices too.

t-3

I doubt the number of people in both "has no neighbors" and "owns Apple hardware" camps are significant at all.

MrBuddyCasino

I don’t think 4096 QAM is realistic anyway, except if your router is 10 cm away from your laptop.
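For a sense of scale, the per-stream PHY rate grows linearly with channel width (data subcarriers) and with bits per QAM symbol. A back-of-the-envelope sketch, with data-subcarrier counts, 5/6 coding, and a 0.8 µs guard interval taken as the top-rate assumptions from the 802.11ax/be specs:

```python
from math import log2

# Approximate data-subcarrier counts per channel width (802.11ax/be)
DATA_SUBCARRIERS = {80: 980, 160: 1960, 320: 3920}
SYMBOL_US = 12.8 + 0.8  # OFDM symbol + short guard interval, microseconds

def phy_rate_mbps(width_mhz: int, qam: int, coding: float = 5 / 6,
                  streams: int = 1) -> float:
    """Rough per-link PHY rate in Mbit/s for one OFDM configuration."""
    bits_per_symbol = DATA_SUBCARRIERS[width_mhz] * log2(qam) * coding * streams
    return bits_per_symbol / SYMBOL_US

# 160 MHz + 1024-QAM, 1 stream: ~1201 Mbit/s (Wi-Fi 6 top rate)
# 320 MHz + 4096-QAM, 1 stream: ~2882 Mbit/s (Wi-Fi 7 top rate)
```

Doubling the channel width doubles the rate, but going from 1024-QAM to 4096-QAM only adds 20% (10 → 12 bits per symbol) while demanding a much cleaner signal, which is why it's only realistic at very short range.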

lawlessone

>The M1 itself is so powerful that nobody really needs to upgrade it for most of the things most people do on their computers

A rant on my part, but a computer from 10 years ago would be fine for what most people do on their computers, if not for software bloat.

hannesfur

It’s unfortunate that this announcement is still unspecific about what they improved in the Neural Engine. Since all we know about the Neural Engine comes from Apple papers or reverse engineering efforts (https://github.com/hollance/neural-engine), it’s plausible that they addressed some quirks to enable better transformer performance. They have written quite interesting papers on transformers on the Neural Engine:

- https://machinelearning.apple.com/research/neural-engine-tra...

- https://machinelearning.apple.com/research/vision-transforme...

Things have definitely gotten better with MLX on the software side, though it still seems they could do more in that area (let’s see what the M5 Max brings). But even if they made big strides here, it won’t help previous generations, and the main thing limiting Apple Intelligence (in my opinion) will continue to be the 8 GB of unified memory they still insist on.

trymas

> the main thing limiting Apple Intelligence (in my opinion) will continue to be the 8 GB of unified memory they still insist on.

As you said - it won’t help previous generations, though since last year (or two??) all macs start with 16GB of memory. Even entry level macbook airs.

hannesfur

That's true! I was referring to their wider lineup, especially the iPad, where users will expect the same performance as on the Macs (they paid for an Mx chip). And they sold me an iPad Air this year that comes with a really fast M3 and still only 8 GB of RAM (you only get 16 GB on the iPad Pro, by the way, if you go with at least 1 TB of storage on the M4 model).

moi2388

Why would you expect the same performance on iPad and MacBook Pro?

The latter has up to 128GB of memory?

raverbashing

I bet Cook authorized the upgrade through gritted teeth, and I was all for it.

liuliu

Faster compute helps for things like vision language models that require a bigger context to be prefilled. My understanding is that the ANE is still optimized for convolution loads and compute efficiency, while the new neural accelerators are optimized for flexibility and performance.

zozbot234

The old ANE enabled arbitrary statically scheduled multiply-adds, of INT8 or FP16. That's good for convolution but not specifically geared toward it.

liuliu

I am not an expert on the ANE, but I think it is related to the size of the register files and how they're smaller than what we need for GEMM on modern transformers (especially these fat ones with MoE).

hannesfur

That would be an interesting approach if true. I hope someone gets to the bottom of it once we have hardware in our hands.

fooblaster

MLX still doesn't use the Neural Engine, right? I wish they would abandon that unit and just center everything around Metal and tensor units on the GPU.

hannesfur

Oh, I overlooked that! You are right. Surprising… since Apple has shown that it’s possible through CoreML (https://github.com/apple/ml-ane-transformers)

I would hope that the Foundation Models (https://developer.apple.com/documentation/foundationmodels) use the neural engine.

fooblaster

The Neural Engine not having a native programming model makes it effectively a dead end for external model development. It seems like a legacy unit that was designed for CNNs with limited receptive fields, and it just isn't programmable enough to be useful for the full set of models and operators available today.

hannesfur

Edit: Foundation Models use the Neural Engine. They are referring to a Neural Engine compatible K/V cache in this announcement: https://machinelearning.apple.com/research/introducing-apple...

zozbot234

Wrt. language models/transformers, the neural engine/NPU is still potentially useful for the pre-processing step, which is generally compute-limited. For token generation you need memory bandwidth so GPU compute with neural/tensor accelerators is preferable.

fooblaster

I think I'd still rather have the hardware area put into tensor cores for the GPU instead of this unit that's only programmable via ONNX.

llm_nerd

MLX is a training/research framework, and the work product is usually a CoreML model. A CoreML model will use any and all resources that are available to it, at least if the resource fits for the need.

The ANE is for very low power, very specific inference tasks. There is no universe where Apple abandons it, and it's super weird how much anti-ANE rhetoric there is on this site, as if there can only be one tool for an infinite selection of needs. The ANE is how your iPhone extracts every bit of text from images and subject matter information from photos with little fanfare or heat, or without destroying your battery, among many other uses. It is extremely useful for what it does.

>tensor units on the GPU

The M5 / A19 Pro are the first chips with so-called tensor units, i.e. matmul on the GPU. The ANE used to be the only tensor-like thing in the system, albeit, as mentioned, designed to be super efficient for very specific purposes. That doesn't mean Apple is going to abandon the ANE; instead they made it faster and more capable again.

zozbot234

> ...and it's super weird how much anti-ANE rhetoric there is on this site, as if there can only be one tool for an infinite selection of needs

That seems like a strange comment. I've remarked in this thread (and other threads on this site) about what's known re: low-level ANE capabilities, and it seems to have significant potential overall, even for some part of LLM processing. I'm not expecting it to be best-in-class at everything, though. Just like most other NPUs that are also showing up on recent laptop hardware.

almostgotcaught

> the work product is usually a CoreML model.

What work product? Who is running models on Apple hardware in prod?

zuspotirko

Of course it's true. Unified memory is always less than VRAM, and my 16 GB of VRAM isn't enough.

But I think it's also a huge issue that Apple makes storage so expensive. If Apple wants local AI to answer your questions, it should be able to take your calendar, emails, text messages, photos, journal entries, etc. into account. It can't do that as nicely as long as customers opt for only 256GB or 1TB devices due to cost.

JKCalhoun

I can only guess that significant changes in hardware have longer lead times than software (for example). I suppose I am not expecting anything game-changing until the M6.

paxys

M5 is 4-6x more powerful than M4, which was 5x more powerful than M3, which was 4x more powerful than M2, which was 4x more powerful than M1, which itself was 6x faster than an equivalent Intel processor. Great!

Looking at my Macbook though, I can say with utmost certainty that it isn't 4000x faster than the Intel one I had 5 years ago.

So, where is the disconnect here? Why is actual user experience not able to keep up with benchmarks and marketing?

quitit

You wrote:

>Looking at my Macbook though, I can say with utmost certainty that it isn't 4000x faster than the Intel one I had 5 years ago. So, where is the disconnect here?

They wrote:

> Together, they deliver up to 15 percent faster multithreaded performance over M4

The problem is comprehension, not marketing.

CryptoBanker

I think you’re the one misreading here. The 15% refers to CPU speed while the 6x, etc. multiples refer to GPU speed

thebitguru

Apple has also seemingly stopped caring about the quality and efficiency of their software. You can see this especially in the latest iOS/iPadOS/macOS 26 versions of their operating systems. They need their software leadership to match their hardware leadership, otherwise good hardware with bad software still leads to bad product, which is what we are seeing now.

heresie-dabord

> Apple has also seemingly stopped caring about the quality and efficiency of their software.

Hardware has improved significantly, but it needs software to enable me to enjoy using it.

Apple is not the only major company that has completely abandoned the users.

The fastest CPUs and GPUs with the most RAM will not make me happier being targeted by commercial surveillance mechanisms, social-media applications, and hallucinating LLM systems.

taf2

I think 15.6.1 (24G90) will be my last macOS... Omarchy is blazing fast.

drcongo

I see this sentiment a lot, but I've found the OS26 releases to be considerably better than the last few years' OS releases, especially macOS which actually feels coherent now compared to the last few years of janky half baked UI.

cmcaleer

It is frankly ridiculous how unintuitive it was to add an email account to Mail on iOS. This is possibly the most basic functionality I would expect an email client to have. One would expect that they go to their list of mailboxes and add a new account.

No. You exit the mail app -> Go to settings -> apps -> scroll through a massive list (that you usually just use for notification settings btw) to go to mail -> mail accounts -> add new account.

Just a simple six-step process after you’ve already hunted for it in the mail app.

jrmg

There’s an “Accounts...” entry in the main “Mail” menu.

You can also click the “+” button at the bottom of the list of accounts in the “Accounts” panel in Mail's settings window.

ant6n

I think the most basic email integration I want from Apple is to be able to set another email program besides Mail as the default, without having to set up Mail first.

random3

The disconnect is that you're reading sideways.

First line on their website:

> M5 delivers over 4x the peak GPU compute performance for AI compared to M4

It's the GPU not the CPU (which you compare with your old Intel) and it's an AI workload, not your regular workload (which again is what you compare)

bangaladore

And they are comparing peak compute. Which means essentially nothing.

random3

There was a time when Apple decided that throwing around random technical numbers shouldn't be the news (those were the days that followed the megahertz-counting era). Times have been changing post-Steve Jobs. That said, this is a chip announcement rather than a product announcement, so maybe that is the news.

tempodox

Do not trust any statistics you did not fake yourself.

potatolicious

Because there's more to "actual user experience" than peak CPU/GPU/NPU workload.

Firstly, the M5 isn't 4-6x more powerful than M4 - the claim is only for GPU, only for one narrow workload, not overall performance uplift. Overall performance uplift looks like ~20% over M4, and probably +100% over M1 or so.

But there is absolutely a massive sea change in the MacBook since Intel 5 years ago: your peak workloads haven't changed much, but the hardware improvements give you radically different UX.

For one thing, the Intel laptops absolutely burned through the battery. Five years ago the notion of the all-day laptop was a fantasy. Even relatively light users were tethered to chargers most of the day. This is now almost fully a thing of the past. Unless your workloads are very heavy, it is now safe to charge the laptop once a day. I can go many hours in my workday without charging. I can go through a long flight without any battery anxiety. This is a massive change in how people use laptops.

Secondly is heat and comfort. The Intel Macs spun their fans up at even mild workloads, creating noise and heat - they were often very uncomfortably warm. Similar workloads are now completely silent with the device barely getting warmer than ambient temp.

Thirdly is allowing more advanced uses on lower-spec and less expensive machines. For example, the notion of rendering and editing video on an Intel MacBook Air was a total pipe dream. Now a base-spec MacBook Air can do... a lot that once forced you into a much higher price point/size/weight.

A lot of these HN conversations feel like sports car fans complaining: "all this R&D and why doesn't my car go 500mph yet?" - there are other dimensions being optimized for!

cj

I’m not sure I see the disconnect.

At our company we used to buy everyone MacBook Pros by default.

After the M-series chip, the MBPs are just too powerful and no longer necessary for the average white collar worker (they seem like “actual” pro machines, now) to the point where we now order MacBook Airs for new employees.

I feel like until recently, you really needed a MBP to get a decent UX (even just using chrome). But now there doesn’t seem to be a major compromise when buying an Air for half the price, at least compared to 3-5 years ago.

charliebwrites

Anecdotal, but I switched to an M3 MBA from an M1 MBP for my iOS and other dev related work

I’ve had zero problems with lag or compile time (prior to macOS 26 anyway)

The only thing it can't do is run Ableton with low latency without heavily changing the defaults.

You press a key on the keyboard to play a note and half a second later you hear it

Other than that, zero regrets

hartator

> After the M-series chip, the MBPs are just too powerful and no longer necessary for the average white collar worker (they seem like “actual” pro machines, now) to the point where we now order regular MacBooks (not Pro’s) for new employees

Regular MBs are not really a thing anymore. You mean Airs?

cj

Yes, sorry :) just edited to fix.

wlesieutre

What's crazy about that to me is the Macbook Air doesn't even have a fan. The power efficiency of the ARM chips is really something.

condiment

It's GPU performance.

Spin up ollama and run some inference on your 5-year-old intel macbook. You won't see 4000x performance improvement (because performance is bottlenecked outside of the GPU), but you might be in the right order of magnitude.

blihp

Not possible given the anemic memory bandwidth [1]... you can scale up the compute all you want but if the memory doesn't scale up as well you're not going to see anywhere near those numbers.

[1] The memory bandwidth is fine for CPU workloads, but not for GPU / NN workloads.
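The bandwidth point can be made concrete: decode speed on a memory-bound LLM is capped by how fast the weights stream from memory, regardless of compute. A rough sketch (the bandwidth and model-size numbers in the comment are illustrative, not measured):

```python
def max_tokens_per_sec(bandwidth_gb_s: float, model_size_gb: float) -> float:
    """Upper bound on LLM decode speed when memory-bound: each generated
    token must stream every weight from memory once, so the ceiling is
    bandwidth divided by model size (ignoring KV cache and activations)."""
    return bandwidth_gb_s / model_size_gb

# Say ~40 GB/s on an Intel-era laptop and a 4 GB quantized model:
# the ceiling is ~10 tok/s, and 4x the compute changes nothing
# unless the memory bandwidth scales with it.
```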

jandrese

Comparing GPU performance to some half-decade-old Intel iGPU seems like lying with statistics.

"Look how many times faster our car is! [1]"

[1] Compared to a paraplegic octogenarian in a broken wheelchair!

umanwizard

Well, Apple isn’t making that comparison, the OP was.


0x457

Well, if you read the very next thing after 4x, you will notice it says "the peak GPU compute performance for AI compared to M4".

The disconnect here is that you can't read. Sorry, no other way to say it.

semiinfinitely

All those extra flops are spent computing light refraction in the liquid glass of the ui

gcr

So how many hardware systems does Apple silicon have for doing matrix multiplies now?

1. CPU, via SIMD/NEON instructions (just dot products)

2. CPU, via AMX coprocessor (entire matrix multiplies, M1-M3)

3. CPU, via SME (M4)

4. GPU, via Metal (compute shaders + simdgroup-matrix + mps matrix kernels)

5. Neural Engine via CoreML (advisory)

Apple also appears to be adding a “Neural Accelerator” to each core on the M5?
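Every unit in that list accelerates the same primitive. As a reference point, here is the plain loop nest they all replace, in pure Python with no vendor API assumed:

```python
def matmul(a, b):
    """Naive reference matrix multiply: C[i][j] = sum_k A[i][k] * B[k][j].

    NEON dot products, AMX, SME, Metal simdgroup-matrix, and the Neural
    Engine are all, at bottom, faster ways to evaluate this loop nest.
    """
    rows, inner, cols = len(a), len(b), len(b[0])
    assert all(len(row) == inner for row in a), "inner dimensions must match"
    return [[sum(a[i][k] * b[k][j] for k in range(inner)) for j in range(cols)]
            for i in range(rows)]
```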

nullbyte

Thankfully, I think libraries like PyTorch abstract this stuff away. But it seems very convoluted if you're building something from the ground up.

oskarkk

Would it be possible to use all of them at the same time? Not necessarily in a practical way, just for fun. Could the different CPU approaches be exercised to some extent by one core simultaneously, given that it's superscalar?

staticfloat

This is a very old answer about the M1, but yes what you’re saying is possible: https://stackoverflow.com/a/67590869/230778

hannesfur

I inferred that by "neural accelerators" they meant the Neural Engine cores, or it could be a bigger/different AMX (which really should become a standard, btw).

toddmorey

The modern Apple feels like its hardware teams are way outperforming the software teams.

linguae

This is not the first time this has happened in Apple’s history. The transition from the 68k architecture to the PowerPC brought major performance improvements, but Apple’s software didn’t take full advantage of it. If I remember correctly, even after the PowerPC switch, core elements of the classic Mac OS still ran in emulation as late as Mac OS 9. Additionally, the classic Mac OS lacked protected memory and preemptive multitasking, leading to relatively frequent crashes. Taligent and Copland were attempts to address these issues, but they both faced development hell, culminating with the purchase of NeXT and the development of Mac OS X. But by the time Mac OS X was released, PowerPC was becoming less competitive than the x86, culminating with the Intel switch in 2006. At this point it was Apple’s software that distinguished Macs from the competition, which remained the case until the M1 Macs were released five years ago.

mikepurvis

Sixteen years ago, John Gruber wrote:

> Hardware and software both matter, and Apple’s history shows that there’s a good argument to be made for developing integrated hardware and software. But if you asked me which matters more, I wouldn’t hesitate to say software. All things considered I’d much prefer a PC running Mac OS X to a Mac running Windows.

https://daringfireball.net/2009/11/the_os_opportunity

At the time I'd only been a Mac user for a few years and I would have strongly agreed. But definitely things have shifted— I've been back on Windows/WSL for a number of years, and it's software quality/compatibility issues that are a lot of what keeps me from trying another Mac. Certainly I'm far more tempted by the hardware experience than I am the software, and it's not even really close.

selectodude

That’s so wild to me - my personal laptop is still a Mac but I’m in windows all day for work. Some of the new direction of macOS isn’t awesome but the basics are still rock solid. Touchpad is perfect, sleep works 100% of the time for days on end, still has UNIX underneath.

KeplerBoy

I bet most people around here would prefer fully supported linux over mac os on their apple silicon.

klooney

Advertisements in Windows seem like a deal breaker to me, but I've been gone for a while.

qwertytyyuu

These days I'd rather have a MacBook running Windows than macOS running on a standard Windows laptop of the same form factor, purely for the efficiency of Apple Silicon.

lenkite

Windows would have beaten macOS if Microsoft had just done one small, teeny-weeny thing: left the OS alone after Win 10.

lotsofpulp

Seeing my wife deal with BSODs, tedious restarts for Windows updates, and myriad other issues just to use Teams/Excel makes me think the software problems are far worse on the Windows side.

Not once in 10 years have I had to troubleshoot her personal macOS machine, but a Dell Latitude laptop in 2025 still can't just "open lid, work, close lid".

And it’s slower. And eats more battery.

larodi

Curiously, every big player/vendor doing something remotely relevant to GPUs/NPUs/APUs etc. is seeing massive growth. Apple's M-processors are much better in terms of price/value ratio for current ML pipelines. But Apple has no server line, which seems to be a massive problem for their products, even though those products actually compete with Nvidia in the consumer market - a very substantial position, software or not.

AMD was also lagging with drivers, but now we see OpenAI swearing they're going to buy loads of their products, which so many people were not in favor of just 5-7 years ago.

samwillis

Software is very easy to bloat, expand scope, and grow to do more than really needed, or just to release apps that are then forgotten about.

Hardware is naturally limited in scope due to manufacturing costs, and doesn't "grow" in the same way. You replace features and components rather than constantly add to them.

Apple needs someone to come in and aggressively cut scope in the software, removing features and products that are not needed. Pare it down to something manageable and sustainable.

pxc

> pare down products and features

macOS has way too many products but far too few features. In terms of feature-completeness, it's already crippled. What OS features can macOS afford to lose?

coredog64

I would say it's less about losing and more about focus. Identify the lines of business you don't want to be in and sell those features to a third party who can then bundle them for $1/$10/$20. A $2T company just doesn't care, but I would bet that those excised features would be good enough for a smaller software house.

(I have the same complaint about AWS, where a bunch of services are in KTLO and would be better served by not being inside AWS)

6SixTy

macOS has like no features already, and they keep removing more.

panick21_

If you think hardware can't bloat, I suggest you look into the history of Intel's attempts to replace x86. Or the VAX. Not to mention the tons of minicomputer companies that built ever more complex minis, and the supercomputer startup bubble.

geodel

Well, besides software that runs in data centers/the cloud, most other software is turning to crap. And the people who think this crap is fine have now reached positions of responsibility at a lot of companies, so things will only get worse from here.

sho_hn

Except community-developed open source software, which (slowly, perhaps) keeps getting better and has high resistance to enshittification.

geodel

The OSS that keeps getting "better" is the kind that accepts a lot of user feature requests and/or implementations; otherwise the maintainers are hostile to users. And when they do accept most of those requests and code, we all know how it goes.

Noaidi

This right here is moving me back to GrapheneOS and Linux. I was lucky enough to be able to uninstall Liquid Glass before the embargo. I will miss the power efficiency of my M1, but the trade-offs keep looking better and better.

Being poor, I need to sell my MacBook to get money to pay off my 16e, then sell the 16e and use that money to buy a Pixel 9, then probably buy a ThinkPad X1 Carbon. Just saying all that to show you the lengths I am going to in order to boycott/battle the enshittification.

Aperocky

Remember log4j? I don't share your enthusiasm.

At least its open source and free I guess.

whitehexagon

I dunno, didn't they already crack the 400GB/s memory bandwidth some years ago? This seems like just another small bump to handle the latest OS effects sludge.

Now the M1 range, that really was an impressive 'outperform' moment of engineering for them, but otherwise this is just a clock-work MBA driven trickle of slightly better over-hyped future eWaste.

To outperform during this crisis, hardware engineers worth their salt need to be designing long-lived boxes with internals that can be easily repaired or upgraded. "Yeah, but the RAM connections are fiddly." Great, now that sounds like a challenge worth solving.

But you are right about the software. Installing Asahi makes me feel like I own my computer again.

astroflection

https://asahilinux.org/

"Linux on Apple Silicon: Asahi Linux aims to bring you a polished Linux® experience on Apple Silicon Macs."

Why the "®" after Linux? I think this is the first time I've seen this.

utf_8x

The Linux "brand" is trademarked by Linus Torvalds, presumably to stop things like "Microsoft® Linux®" from happening...

alexanderson

Apple has always been a hardware company first - think of how they sell consumers computers with the OS for free, while Microsoft primarily just sells the OS (when comparing the consumer business; I don’t want to get into all the other stuff Microsoft does).

Now that they own the SoC design pipeline, they’re really able to flex these muscles.

ViktorRay

Steve Jobs himself said that Apple sees itself as a software company

https://youtu.be/dEeyaAUCyZs

The above link is a video where he mentions that.

It is true that Apple’s major software products like iOS and MacOS are only available on Apple’s own hardware. But the Steve Jobs justification for this (which he said in a different interview I can’t find right now so I will paraphrase) is that he felt Apple made the best hardware and software in the world so he wanted Apple’s customers to experience the best software on the best hardware possible which he felt only Apple could provide. (I wish I could find the exact quote.)

Anyway according to Steve Jobs Apple is a software first company.

alt227

Apple has always been a software first company, and they only sell the hardware as a vehicle to their software. They regularly say this themselves and have always called themselves a software company. Compare their hardware revenues with that of the app store and icloud subscriptions, you will see where they make most of their money.

EDIT: I seem to be getting downvoted, so I will just leave this here for people to see I am not lying:

https://www.businessinsider.com/tim-cook-apple-is-not-a-hard...

achierius

> Compare their hardware revenues with that of the app store and icloud subscriptions, you will see where they make most of their money.

Yes, it's $70B a year from iPhones alone and $23B from the totality of the Services org. (including all app store / subscription proceeds). Significantly more than 50% of the company's total profits come from hardware sales.

dylan604

Apple has always? Sure, maybe today, with its collection of a percentage of app sales, it looks like a software company. But if there were no iDevices, there'd be no need for an App Store. Your link is all about Cook, yet he was not always the CEO. Woz didn't care what software you ran; he just wanted the computer to be usable so you could run whatever software. Jobs wanted to restrict things, but it was still about running the hardware. Whatever Cook thinks Apple is now doesn't mean it has always been that way, as you claim.

RossBencina

Apple has been calling themselves a consumer electronics company since at least 2006.

jsnell

Sure, let's compare.

Apple's product revenue in this fiscal year has been $233B, with a gross margin of $86B.

Their services revenue is $80B with $60B gross margin.

ksec

It goes back even further, Steve Jobs said Apple is a software company, you just have to buy its hardware to use it. It is the whole experience.

wat10000

I did that comparison and they make the vast majority of their money on hardware. Half of their revenue is iPhone, a quarter is services, and the remaining quarter is divided up among the other hardware products.

Regardless of revenue, Apple isn't a hardware company or a software company. It's a product company. The hardware doesn't exist merely to run the software, nor does the software exist merely to give functionality to the hardware. Both exist to create the product. Neither side is the "main" one, they're both parts of what ultimately ships.

HumblyTossed

Tim is the CEO, he's going to say whatever he needs to in the moment to drive investment.

Apple is and always has been a HW company first.

bombcar

Tim Apple is notoriously misinformed about his own company.

Hamuko

Not really. Back in the day you wouldn't buy a MacBook because it was powerful. Most likely it had a very shitty Intel CPU with not a lot of cores and with thermal challenges, and the reason you bought it was because macOS.

hamdingers

Nope, many bought it in spite of macOS because it was a durable laptop with an excellent screen, good keyboard, and (afaik still) the only trackpad that didn't suck.

chasil

And in many decades past, OpenStep was slowly moving its GUI from Next hardware to software sales on various UNIX platforms and Windows NT.

And this would eventually evolve into MacOS.

https://en.wikipedia.org/wiki/OpenStep

fnord123

The Intel laptops also grounded through the user. I still can't believe they didn't have a recall to sort that out.

alt227

> very shitty Intel CPU with not a lot of cores and with thermal challenges

Very often the Intel chips in MacBooks were stellar; they were just seriously inhibited by Apple's terrible cooling designs and so were permanently throttled.

They could never provide decent cooling for the chips coupled with their desire to make paper thin devices.

qwertytyyuu

Not just macOS, but also the decent keyboard and actually good display, guaranteed.

SCdF

I don't think it's the modern Apple, I think that's just Apple.

I remember using iTunes when fixing the name of an album was a modal blocking function that had to write to each and every MP3, one by one, in the slowest write I have ever experienced in updating file metadata. Give me a magnetised needle and a steady hand and I could have done it faster.

A long time ago they had some pretty cool design guides, and the visual design has often been nice, but other than that I don't think their software has been notable for its quality.

fidotron

What I would do for Snow Leopard on the M class hardware.

RossBencina

You could run it in an emulator.

asimovDev

do you mean literally 10.6 on AS or do you mean something as good as it was

fidotron

Something that good.

It was coherent, (relatively) bug free, and lacked the idiot level iOSification and nagging that is creeping in all over MacOS today.

I haven't had to restart Finder until recently, but now even that has trouble with things like network drives.

I'm positive there are many internals today that are far better than in Snow Leopard, but it's outweighed by user visible problems.

It shouldn't surprise you I think that Android Jelly Bean was the best phone OS ever made as well, and they went completely in the wrong direction after that.

tyrellj

This seems to be pretty true in general. SBC companies are not competing with Raspberry Pi because their software is quite a bit behind (boot loaders, linux kernel support, etc). Particle released a really cool dev board recently, but the software is lacking. Qualcomm struggled with their new CPU launch with poor support as well. It sometimes takes a while for new Intel processor features to be supported in the toolchains, kernel, and then get used in software.

Aside from that, I think of Apple as a hardware company that must write software to sell their devices, maybe this isn't true anymore but that's how I used to view them. Maintaining and updating as much software as Apple owns is no small task either.

alberth

Apple is binning the iPad Pro chips:

   Storage      CPU
   ≤ 512GB      3 P-cores (and 6 E-cores)
   1TB+         4 P-cores (and 6 E-cores)
https://www.apple.com/ipad-pro/specs/

xangel

[flagged]

mohsen1

First time I've seen Apple using "AI" in their marketing material... It was "Machine Learning" and "Apple Intelligence" before...

mentalgear

Unfortunately, they have also succumbed to the AI hype machine. Apple, calling it by its actual name "machine learning" was about the only thing I still liked about Apple.

rpdillon

Wait, didn't they try to backronym their way into "Apple Intelligence" last cycle?

https://www.apple.com/apple-intelligence/

kryllic

Probably don't want to draw more attention to their ongoing lawsuits [1]. Apple, for all its faults, does enjoy consistency, and the unruly nature of LLMs is something I'm shocked they thought they could tame in a short amount of time. The fallout from the hilariously bad news/message "summaries" was more than enough to spook Apple from letting that go much further.

>Built into your iPhone, iPad, Mac, and Apple Vision Pro* to help you write, express yourself, and get things done effortlessly.** Designed with groundbreaking privacy at every step.

The asterisks are really icing on the cake here.

---

[1] https://news.bloomberglaw.com/ip-law/apple-accused-of-ai-cop...

kgwgk

> actual name "machine learning"

Yesterday’s hype is today’s humility.

vessenes

I like sniping - but I could make a product call here to support the messaging - when it's running outside diffusion models and LLMs (as per the press release) we could call that AI. Agreed that they should at least have mentioned Apple Intelligence in their PR though

vayup

I am sure by AI they mean Apple Intelligence:-)

null

[deleted]

low_tech_punk

Not all is lost: AI can still be acronym for Apple Intelligence.

outcoldman

Marketing:

M5 announcement [1] says 4x the peak GPU compute performance for AI compared to M4. I guess in the lab?

Both the iPad and MBP M5 pages [2][3] say "delivering up to 3.5x the AI performance". But all the examples of AI (in [3]) are only 1.2-2.3x faster than M4. So where is this 3.5x coming from? What tests did Apple run to show that?

---

1. https://www.apple.com/newsroom/2025/10/apple-unleashes-m5-th...

2. https://www.apple.com/newsroom/2025/10/apple-unveils-new-14-...

3. https://www.apple.com/newsroom/2025/10/apple-introduces-the-...

storus

M5 is supposed to support FP4 natively which would explain the speed up on Q4 quantized models (down from BF16).
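If that's the explanation, the arithmetic is straightforward: BF16 stores 2 bytes per parameter, FP4 stores 0.5, so weight traffic drops 4x, which matters a lot for memory-bandwidth-bound inference. A rough sketch (ignoring quantization overhead like scales and zero-points, and using a hypothetical 8B-parameter model):

```python
# Back-of-envelope weight-memory footprint per precision.
# Assumes dense storage with no quantization overhead.
BYTES_PER_PARAM = {"fp32": 4.0, "bf16": 2.0, "int8": 1.0, "fp4": 0.5}

def weight_gib(n_params: float, dtype: str) -> float:
    """GiB needed to hold n_params weights at the given precision."""
    return n_params * BYTES_PER_PARAM[dtype] / 2**30

n = 8e9  # hypothetical 8B-parameter model
print(f"bf16: {weight_gib(n, 'bf16'):.1f} GiB")  # ~14.9 GiB
print(f"fp4:  {weight_gib(n, 'fp4'):.1f} GiB")   # ~3.7 GiB
```

Reading a quarter of the bytes per token is exactly the kind of thing that shows up as a headline multiple in marketing numbers, even before any extra FP4 compute throughput.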

relativeadv

It's not uncommon for Apple and others to compare against two generations ago rather than the immediately preceding one.

outcoldman

Everything I referenced compares against the M4. I left out the comparison with the M1.

yalogin

It feels like Apple is "a square peg in a round hole" when it comes to AI, at least for now.

They are not the hardware provider like Nvidia, and they don't do the software and services like OpenAI or even Microsoft/Oracle. So they are struggling to find a foothold here. I am sure they are working on a lot of things, but the only way to showcase them is through their phone, which ironically feels like not the best path for Apple.

Apple's best option is to put LLMs locally on the phone and claim privacy (which is true), but they may end up in the same situation as Siri vs. others, where Siri is always the dumber one.

It will be interesting to see how this plays out.

mirekrusin

Being late to the AI race, or not entering it from the training side, is not necessarily bad; others have burned tons of money. If Apple enters with their hardware first (only?), it may disrupt the status quo from the consumer side. It's not impossible that they'll produce hardware everybody will want for running local models that are on par with closed ones. If that happens, it may change where real money flows (as opposed to investor money based on imaginary valuations that can evaporate).

mft_

They are the leader in manufacturing consumer systems with sufficient high-bandwidth memory to enable decent-sized LLMs to be run locally with reasonable performance. If you want to run something that needs >=32GB of memory (which is frankly bottom-end for a somewhat capable LLM) they're your only widely-available choice (otherwise you've got the rare Strix Halo AI Max+ 395 chip, or you need multiple GPUs, or maybe a self-build based around a Threadripper.)

This might not be widely recognised, as the proportion of people wanting to run capable LLMs locally is likely a rounding error versus the people who use ChatGPT/Claude/Gemini regularly. It's also not something that Apple market on, as they can't monetize it. However, as time goes on and memory and compute power gradually decrease in price, and also maybe as local LLMs continue to increase in ability (?) it may become more and more relevant.
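The "fits in unified memory" question above reduces to simple arithmetic. A sketch under stated assumptions (the 0.56 bytes/param figure is a hypothetical allowance for 4-bit weights plus quantization overhead, and the 8 GiB headroom for KV cache and the OS is likewise a guess):

```python
# Back-of-envelope: does a quantized model fit in unified memory?
# bytes_per_param and headroom_gib are illustrative assumptions,
# not measured figures.

def fits(n_params: float, ram_gib: float,
         bytes_per_param: float = 0.56, headroom_gib: float = 8.0) -> bool:
    """True if quantized weights plus reserved headroom fit in ram_gib."""
    weights_gib = n_params * bytes_per_param / 2**30
    return weights_gib + headroom_gib <= ram_gib

print(fits(70e9, 32.0))  # 70B @ 4-bit: ~36.5 GiB of weights alone -> False
print(fits(70e9, 64.0))  # comfortably fits on a 64 GiB machine -> True
```

This is why the 32GB floor mentioned above matters: a quantized 70B-class model is out of reach at 32GB but workable at 64GB and up, which is exactly the range Apple's unified-memory configurations cover.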

null

[deleted]

mattray0295

They push these new generations out so quick, and with crazy performance boosts. Impressive

whitepoplar

Any word on whether this chip has "Memory Integrity Enforcement" capability, as included in Apple's A19/A19 Pro chips?

https://security.apple.com/blog/memory-integrity-enforcement...

SG-

it's the same core, so more than likely yes.

bfrog

The big win would be a Linux-capable device. I don't have any interest in macOS, but the Apple M parts always seem amazing.

In theory this is where Qualcomm would come in and provide something, but in practice they seem to be stuck in Qualcomm land, where only lawyers matter and actual users and developers can get stuffed.

cogman10

Yeah, this is the biggest hole in ARM offerings.

The only well supported devices are either phones or servers with very little in between.

Even common consumer devices like wifi routers will have ARM SoCs pinned to the kernel version they shipped with, which gets supported for 1 to 2 years at most.

mrkeen

I have a pretty good time on Asahi Fedora (MacBook Air M1). It supposedly also supports M2 but nothing higher.

And it's a PITA to install (it needs to be started from within macOS, using scripts, with the partitions already in a good state).

Gethsemane

If I was less lazy I could probably find this answer online, but how do you find the battery life these days? I'd love to make the switch, but that's the only thing holding me back...

2OEH8eoCRo0

How's Thunderbolt and display port alt mode?

walterbell

Apparently the Windows exclusivity period has ended, so Google will support Android and ChromeOS on Qualcomm X2-based devices, https://news.ycombinator.com/item?id=45368167