
Looking Ahead at Intel's Xe3 GPU Architecture

Tsiklon

I’m really hopeful for the future of Intel’s GPU pipeline. B580 proved there is an audience for their parts.

If a C770 or C970 card proves to be a contender at its respective price point, the way the B580 is, then people will buy them.

Tsiklon

From some of the interviews I saw at the B580's launch, it seems like Intel knows the shortcomings of the Alchemist and Battlemage architectures, but wasn't able to change them before launch.

Of course, these are parts that have been in the works for several years now. They've had time to see what Nvidia and AMD are doing with their competing products.

Venn1

Looking ahead is challenging. Intel launched the B580 on December 13th, and it sold out within hours. We're still waiting on a restock.

tylerchurch

I've been tracking it for weeks, and it comes back in stock only to sell out minutes later. Two or three times I thought I'd got one, only to receive the order-cancellation email.

I can’t tell if there’s just no stock, or huge demand, but either way it seems like a great product. If only you could buy it.

cptskippy

I hope this means they're going to continue to invest in the GPU line. Competition is good.

immibis

Where do they get all this information about things like registers? I thought GPU ISAs were treated like trade secrets?

wmf

Intel and AMD GPUs have public documentation and open source drivers.

mandevil

That's a privilege the dominant players in the market can get away with. If you're #1, everyone will do whatever it takes to run on your hardware. But the players desperately trying to get programmers (and customers) to pay attention to them need to make things as easy as possible to use.

CyberDildonics

That didn't answer the question at all.

brcmthrowaway

It seems these chips have all sorts of hardcoded and heuristic knowledge literally baked into them. Disappointing.

Why can't AI come up with some kind of fast universal computation machine that doesn't need the siliconized version of ifdefs?

ok_computer

Yep, the crowning achievement of machine complexity in recent human history is easily generalized in computational efficacy by what are effectively Markov-chain chatbots trained on Project Gutenberg, Reddit comments, and GitHub Python and JavaScript repos.

wegfawefgawefg

There's lots of AI other than LLMs.

ok_computer

Ok fair point.

One such method was an improvement in matrix-matrix multiplication in 2022. So I'm sure there are many efficient numerical algorithms waiting to be uncovered with machine learning.

https://www.nature.com/articles/s41586-022-05172-4
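That Nature paper describes AlphaTensor, which searched for matrix-multiplication schemes automatically (reportedly finding, for example, a 47-multiplication algorithm for 4x4 matrices in modular arithmetic, beating the 49 you get from applying Strassen's scheme recursively). As a concrete illustration of the kind of algorithm being searched for, here is a sketch of Strassen's classic 1969 construction, the original hand-discovered example of the family; the function name and test values are my own, not from the paper:

```python
def strassen_2x2(A, B):
    """Multiply two 2x2 matrices (lists of lists) using Strassen's
    7 scalar multiplications instead of the naive 8. Applied
    recursively to block matrices, this yields O(n^2.807) instead
    of the schoolbook O(n^3)."""
    a, b, c, d = A[0][0], A[0][1], A[1][0], A[1][1]
    e, f, g, h = B[0][0], B[0][1], B[1][0], B[1][1]
    # The seven products -- the "baked-in" cleverness AlphaTensor
    # rediscovers and extends by tensor-decomposition search.
    m1 = (a + d) * (e + h)
    m2 = (c + d) * e
    m3 = a * (f - h)
    m4 = d * (g - e)
    m5 = (a + b) * h
    m6 = (c - a) * (e + f)
    m7 = (b - d) * (g + h)
    # Recombine into the four entries of the product matrix.
    return [[m1 + m4 - m5 + m7, m3 + m5],
            [m2 + m4, m1 - m2 + m3 + m6]]

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
print(strassen_2x2(A, B))  # [[19, 22], [43, 50]], same as naive A @ B
```

The search space AlphaTensor explores is exactly the space of such recombination coefficients, framed as a low-rank tensor decomposition.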

My parent comment was a knee-jerk response to the overconfidence and wishful thinking of people in computing who expect a magic AI box to solve all the hard physical problems as if the solution were a simple conversation prompt away.

acchow

Do you mean FPGAs?

Because fixed silicon inevitably has fixed "baked in" choices.

immibis

This is, like, the entirety of computing. Did x86 spring fully-formed from Zeus's forehead or did people bake in their knowledge of what programmers wanted their programs to do?

When you write programs do you not use heuristics? And tests?

Jotalea

AI doesn't "create"; it "modifies" existing data.

rileymat2

Isn't a collage a creative work?