
Intel Arc Celestial dGPU seems to be first casualty of Nvidia partnership

bobajeff

Suddenly, what Intel's CEO meant by new products having to deliver 50% gross profit [1], and by it being too late to catch up on AI [2], is starting to become clearer.

[1]: https://www.tomshardware.com/tech-industry/semiconductors/in...

[2]: https://www.tomshardware.com/tech-industry/intel-ceo-says-it...

baq

A CEO of a silicon company saying his business is "too late for AI" is a CEO without either vision or guts; an accountant, the safe option. If it's anywhere close to true, Intel is looking to sell itself for parts.

BeetleB

There's a whole backstory to this.

When he joined only a few months ago, he set the vision of making Intel a worthy participant in the AI space.

Then just a few months later, he announced "we cannot compete".

What happened in the middle? Recent articles came out about the conflict between him and Frank Yeary, the head of the Intel board. He wanted to acquire a hot AI startup, and Frank opposed it. Two factions were formed in the Board, and they lost a lot of time battling it out. While this was going on, a FAANG came in and bought the startup.

I think his announcement that Intel cannot compete was his way of saying "I cannot do it with the current Intel board."

horsawlarway

Feels like a fair statement.

My read is basically that Intel's board is frustrated they can't part the company out, take their cash, and go home.

I'd also be incredibly frustrated working with a board that seems dead set on actively sabotaging the company for short-term gains.

Scramblejams

What startup was it?

tester756

The real quote is:

>"On training, I think it is too late for us,"

Not too late for AI, but too late for training; meanwhile there's an inference opportunity, or something like that.

aDyslecticCrow

Too late for the AI boom if they have to spend another two years and more manufacturing investment to get a product out for the segment. We are over-inflated on AI hype. Its relevance will remain, but betting a company on it isn't a wise idea.

checker659

> of a silicon company

With their own fabs, at that

freedomben

Agreed: either their business situation is far more critical than we know, this is a gross indictment of their R&D, or this is malpractice on the part of the leadership.

h2zizzle

Or, a sly way of calling the AI bubble.

moralestapia

Well, but if it's true and there's a better strategy, why wouldn't he do it?

Seems like you'd prefer yet another +1 selling AI snake oil and promises ...

phkahler

Cutting products that don't have 50 percent margins seems like a bad choice when their goal should be filling their advanced fabs. Keeping that investment at or near capacity should be their goal. They said they'd have to cancel one node if the foundry business couldn't get enough customers, and yet they're willing to cut their own product line? Sure they need to make a profit, but IMHO they should be after volume at this point.

KronisLV

Even the Arc B580 could have been a bigger win if it had ever actually been in stock at MSRP; I say that as someone who owns one in my daily driver PC. Yet it seemed oddly close to a paper launch, or at least nowhere near demand, to the point where prices were so far above MSRP that the value was really bad.

Same as how they messed up the Core Ultra desktop launch, of their own volition - by setting prices so high that they can't even compete with their own 13th and 14th gen chips, to say nothing of Ryzen CPUs that are mostly better both in absolute terms and in price/perf. A sidegrade isn't the end of the world, but a badly overpriced sidegrade is dead on arrival.

Idk what Intel is doing.

risho

I will note that their source appears to be Moore's Law Is Dead, a speculative YouTube channel with a long history of being wrong about the death of Arc. The guy has been predicting the imminent death of Arc since the first one released years ago. It wouldn't surprise me if this did lead to the death of Arc, but it certainly isn't because this moron predicted it.

chao-

It is very hard to put any belief in the rumor mill surrounding Intel's discrete desktop GPUs. Already this year, there have been at least three "leaks" saying "It's canceled!", and every time, a counter-rumor comes out saying "It isn't canceled!"

By all accounts I have seen, their single SKU from this second-generation consumer lineup has been well received. Yet the article says "what can only be categorized as a shaky and often rudderless business", without any justification.

Yes, it is worth pondering what the Nvidia investment means for Intel Arc Graphics, but "rudderless"? Really?

belval

Honestly, the rumor mill surrounding Intel is very similar to AMD in 2015-2016, pre-Zen (not saying they will see the same outcome). I swear I saw the same "the x86 license is not transferable, [other company] might sue them" and "Product Y will be discontinued" stories 9 years ago.

When it comes to GPUs, a $4T company probably couldn't care less what their $150B partner does in their spare time, as long as they prioritize the partnership. Especially when the GPUs in question are low-end units, in a segment where Nvidia has no competition and isn't even shipping that many. If they actually asked them to kill it, it would be 100% out of pettiness.

Sometimes I wonder if these articles are written for clicks and these "leakers" are actually just the authors making stuff up and getting it right from time to time.

chao-

From a corporate strategy perspective, cancel Arc or keep Arc, I can see it both ways.

Intel has so many other GPU-adjacent products, and they will doubtless continue most of them even if they don't pursue Arc further: Jaguar Shores, Flex GPUs for VDI, and of course their Xe integrated graphics. I could possibly see Intel not shipping a successor to Flex? Maybe? I cannot see a world where they abandon Xe (first-party laptop graphics) or Jaguar Shores ("rack scale" datacenter "GPUs").

With all of that effort going into GPU-ish designs, is there enough overlap that the output/artifacts from those products support and benefit Arc? Or if Arc only continues to be a mid-tier success, is it thus a waste of fab allocation, a loss of potential profit, and an unnecessary expense in terms of engineers maintaining drivers, and so forth? That is the part I do not know, and why I could see it going either way.

I want to acknowledge that I am speaking out of my depth a bit: I have not read all of Intel's quarterly financials, nor followed every zig and zag of every product line. Yet while I can see it both ways, in no world do I trust these supposed leaks.

belval

> From a corporate strategy perspective, cancel Arc or keep Arc, I can see it both ways.

Me too, I just really really doubt that it would come from Nvidia

cubefox

Yeah that is bizarre. They have been very focused and even managed to upstage AMD by several years in the ML acceleration department (XeSS).

jeffbee

There has never been any information conveyed by the "Moore's Law Is Dead" account. If you want to know whether Intel has cancelled their next dGPU, you might as well flip a coin.

TiredOfLife

Source is the "Moore's Law Is Dead" YouTuber. A coin toss is more reliable than him.

gregbot

Really? I've been following him for years and he has always been 100% accurate. What has he been wrong about?

dralley

I agree that he's not that bad, but he's definitely not 100% accurate, in particular with respect to Intel.

Notably this is about the 3rd time in 2 years that he's reported that the Intel dGPU efforts are being killed off.

Even on the latest developments the reporting is contradictory, so someone is wrong and I suspect it's him. https://www.techpowerup.com/341149/intel-arc-gpus-remain-in-...

gregbot

So far everything he said in that video has happened, and he did not say that Intel would never release another dGPU, just that it would be a token release, which is exactly what has happened.

nodja

They had videos saying Intel was gonna cancel the dGPU division and focus on datacenter pretty much since the Intel cards came out. Amongst many other things they've said. I used to follow them too, but they speak with too much confidence about things they know nothing about.

They're a channel focused on leaks, but most of their leaks are just industry insider gossip masked as factual to farm clicks. Their leaks are useless for any sort of predictions, but may be interesting if you'd like to know what insiders are thinking.

A quick Google search also yielded this[1] 2-year-old Reddit thread that shows videos they deleted because their predictions were incorrect. There are probably many more. (That subreddit seems to be dedicated to trashing MLID.)

[1] https://www.reddit.com/r/BustedSilicon/comments/yo9l2i/colle...

gregbot

> gossip masked as factual to farm clicks

Instead of invectives, could you just say which specific leak of his was inaccurate? Everything he said about the Intel dGPU has happened exactly as he said it would. Have you watched his video about it yourself?

carlhjerpe

Yeah, all the videos I saw where he was right had 100% accuracy, which you'll be reminded of in the next video; the times he was wrong won't be advertised the same way.

gregbot

Why don't you just say what he's been wrong about?

flufluflufluffy

Can someone explain what the heck Battlemage means in this context?

nodja

Intel Arc - Intel's dedicated GPUs. Each GPU generation has a name in alphabetical order, taken from nerd culture.

Alchemist - First-gen GPUs: the A310 is the low end, the A770 the high end. Powerful hardware for cheap, very spotty software at release. Got fixed up later.

Battlemage - Second gen (current gen); only the B570 and B580 came out. They said they weren't going to release more Battlemage GPUs after these because they wanted to focus on Celestial, but probably went back on it seeing how well the B580 was reviewed, and the B770 is due to be released by the end of the year.

Celestial - Next-gen GPUs, expected for release in early 2026. This article claims it was cancelled, but personally I think it's too late to cancel a GPU this far into production. Especially when they basically skipped a generation to get it out faster.

daemonologist

Battlemage, aka Xe2, is Intel's current and second-generation GPU architecture. (Like RDNA 4 for AMD or Blackwell for Nvidia.)

ripbozo

Codename of the Intel Arc B-series GPU lineup.

2OEH8eoCRo0

I think we overestimate desktop GPU relevance. Are gaming GPUs really that lucrative?

mrweasel

If they weren't, why would Nvidia keep making them? They do seem like an increasingly niche product, but apparently not so niche that Nvidia is willing to just exit the market and focus on the datacenters.

They aren't just for gaming, there's also high-end workstations, but that's probably even more niche.

MostlyStable

I'm honestly curious why they keep making them. As far as I can tell, NVIDIA can sell literally as many datacenter AI chips as they can produce, and that would probably continue to be true even if they significantly increased prices. And even without increasing prices, the datacenter products are considerably higher margin than the consumer GPUs. Every consumer GPU they sell is lost revenue in comparison to using that fab capacity for a datacenter product.

The only reason I can imagine for them leaving the money on the table is that they think that the AI boom won't last that much longer and they don't want to kill their reputation in the consumer market. But even in that case, I'm not sure it really makes that much sense.

Maybe if consumer GPUs were literally just datacenter silicon that didn't make the grade or something, it would make sense but I don't think that's the case.

tempest_

Have you seen the latest generation of Nvidia gaming cards? They are increasingly looking like an afterthought.

eYrKEC2

Intel has always pursued agglomeration into the main CPU. They sucked up the math co-processor. They sucked up the frontside bus logic. They sucked up the DDR controllers more and more. They have sucked in integrated graphics.

Everything on-die, and with chiplets in-package, is the Intel way.

Default, average integrated graphics will continue to "satisfice" for a greater and greater portion of the market as integrated graphics continue to grow in power.

carlhjerpe

Intel made fun of AMD for "taping chips together". Intel did everything on a monolithic die for way too long.

The smaller the node, the lower the yield; chiplets are a necessity now (or architectural changes like Cerebras).

eYrKEC2

Running tests and then fusing off broken cores or shared caches helps recover a lot of yield on bigger chips. Certain parts of the silicon are not redundant, but Intel's designs have redundancy for core pieces and for chunks that are very large and hence probabilistically more prone to a manufacturing error.
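
To put rough numbers on the yield point in the two comments above, here's a back-of-the-envelope sketch in Python assuming the classic Poisson defect model; the defect density and die areas are made-up illustrative figures, not real process data:

    import math

    def poisson_yield(area_cm2, defect_density_per_cm2):
        """Fraction of dies that come out defect-free under a simple Poisson model."""
        return math.exp(-area_cm2 * defect_density_per_cm2)

    D0 = 0.2  # defects per cm^2 -- illustrative assumption only

    y_monolithic = poisson_yield(6.0, D0)  # one big 6 cm^2 die   -> ~30% defect-free
    y_chiplet = poisson_yield(1.5, D0)     # one 1.5 cm^2 chiplet -> ~74% defect-free

    # A single defect scraps (or at best degrades) the whole monolithic die, but
    # only one small chiplet; good chiplets from anywhere on the wafer can be
    # packaged together, and fusing off broken cores recovers part of the rest.
    print(f"defect-free monolithic dies: {y_monolithic:.0%}")
    print(f"defect-free chiplets:        {y_chiplet:.0%}")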

nodja

They're a market entry point. CUDA became popular not because it was good, but because it was accessible. If you need to spend $10k minimum on hardware just to test the waters of what you're trying to do, that's a lot to think about, and possibly tons of paperwork if it's not your money. But if you can test it on $300 hardware that you probably already own anyway...

justincormack

Gaming GPUs make up 7% of Nvidia's business, 93% is datacentre. So, no.

hhh

No. It used to be more even between datacenter and gaming for NVIDIA, but that's not been the case for a few years. Gaming has brought in less money than networking (Mellanox) since '24 Q4.

https://morethanmoore.substack.com/p/nvidia-2026-q2-financia...

vlovich123

But the same thing that makes GPUs powerful at rendering is what AI needs - modern gaming GPUs are basically supercomputers that provide the hardware and software to do programmable, embarrassingly parallel work. That covers modern game rendering but also AI and crypto (and various science and engineering workloads), which is the second revolution Intel completely missed (the first one being mobile).

patagurbon

AI (apparently) needs much lower precision in training, and certainly in inference, than gaming requires though. A very, very large part of the die on modern datacenter GPUs is effectively useless for gaming.
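
A small illustration of that precision gap, using NumPy's FP16 and FP32 as stand-ins (datacenter parts push to even narrower FP8-class formats that rendering has no use for); the numbers are purely for demonstration:

    import numpy as np

    pi32 = np.float32(np.pi)  # FP32: ~7 decimal digits, the usual baseline for game shaders
    pi16 = np.float16(np.pi)  # FP16: ~3 decimal digits, common for AI inference
    print(pi32, pi16)         # 3.1415927 3.14

    # Accumulating many tiny, gradient-like updates shows where low precision bites:
    acc16, acc32 = np.float16(0.0), np.float32(0.0)
    for _ in range(10_000):
        acc16 = acc16 + np.float16(1e-4)
        acc32 = acc32 + np.float32(1e-4)
    print(acc16, acc32)       # FP16 stalls around 0.25; FP32 lands near 1.0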

jlarocco

I don't think anybody is using gaming GPUs to do serious AI at this point, though.

pjmlp

Depends on whether one cares about a PlayStation/Xbox-like experience or a Switch-like one.

gpderetta

They kept Nvidia in business for a long time, until the datacenter breakthrough.

anonym29

The value proposition of Intel's graphics division wasn't the current generation of gaming GPUs; it was the internal growth of talent that could target higher- and higher-end chips at a much lower price than Nvidia, until they were knocking on the door of A100/H200-class chips - the chips that Nvidia produces for $2k and then sells for $40k.

Not to mention that vertical integration gave Intel flexibility, customization, and some cost-saving advantages that Nvidia doesn't have as much of, Nvidia being a fabless designer that is itself a customer of another for-profit fab (TSMC).

If TFA is true, this was an anticompetitive move by Nvidia to preemptively decapitate their biggest competitor in the 2030s datacenter GPU market.