Jensen Huang keynote at CES 2025 [video]

ksec

This keynote actually made me more upbeat about the potential of AI than any other demonstration or presentation has.

Blackwell is still based on N4, a derivative of 5nm. We know we will have N3 GPUs next year, and they should be working with TSMC on capacity planning for N2. Currently Blackwell is pretty much limited by TSMC, HBM, or packaging capacity. And unless the AI hype dies off soon (which I doubt will happen), we should still have at least another 5 years of these GPU improvements. We will have another 5-10x performance increase.
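
As a rough sanity check on what a 5-10x gain over 5 years would imply per year, here is a back-of-the-envelope sketch with assumed compounding rates, not anything from the keynote:

    # Hypothetical yearly gains and the 5-year totals they compound to.
    # The rates are assumptions for illustration, not NVIDIA roadmap data.
    for yearly_gain in (1.38, 1.45, 1.58):
        total = yearly_gain ** 5
        print(f"{yearly_gain:.2f}x per year -> {total:.1f}x after 5 years")
    # ~1.38x/year compounds to ~5x, ~1.58x/year to roughly 10x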

And they have foreshadowed their PC entry with the MediaTek partnership (I wonder why they don't just acquire MediaTek), and maybe even a smartphone or tablet with GeForce GPU IP.

The future is exciting.

dehrmann

I was taking these sorts of improvements for granted. They'll certainly make AI cheaper, but most of the cost is up-front for generative models, and the models don't seem to be getting linearly better.

brookst

My guess is MediaTek's margins are much lower, so an acquisition would tank Nvidia's stock price by lowering returns. That, and/or not wanting the distraction of operating MediaTek's many businesses that aren't aligned with Nvidia's core competence.

bloomingkales

There was something eerily epic about his assertions. To suggest that they are all of those companies is pretty wild.

AI is bewitching.

AlotOfReading

The same industry opposition that killed the ARM acquisition would kill a MediaTek acquisition.

behnamoh

I liked this part:

    "one small step at a time, and one giant leap, together."
I didn't like this part:

    5090 for $2000, about $500 more than 4090 when it was announced.
They didn't mention VRAM amount though, and I doubt it's more than 24GB. If Apple M4 Ultra gets close to 1.8 TB/s bandwidth of 5090, it'll crush GeForce once and for all (and for good).

Also nitpick: the opening video said tokens are responsible for all AI, but that only applies to a subset of AI models...

jsheard

It's 32GB; that was leaked well before the event.

> If Apple M4 Ultra gets close to 1.8 TB/s bandwidth of 5090

If past trends hold (Ultra = 2x Max) it'll be around 1.1 TB/s, so closer to the 4090.
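
Rough arithmetic behind that estimate, assuming Apple keeps the M4 Max's published bandwidth and the usual two-Max-die Ultra layout:

    # The full M4 Max configuration lists 546 GB/s of memory bandwidth; an
    # Ultra has historically been two Max dies fused together, so roughly 2x.
    # This is an extrapolation, not an announced Apple spec.
    m4_max_bandwidth = 546                  # GB/s
    m4_ultra_estimate = 2 * m4_max_bandwidth
    print(m4_ultra_estimate / 1000)         # ~1.09 TB/s, vs 1.8 TB/s on the 5090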

behnamoh

Jensen didn't talk about it. Maybe he knows it's embarrassingly low. Everyone knows Nvidia won't give us more VRAM, to avoid cannibalizing their enterprise products.

aseipp

The official specs for the 5090 have been out for days on nvidia.com, and they explicitly state it's 32GB of GDDR7 with a 512-bit bus, for a total of 1.8TB/s of bandwidth.
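
For anyone who wants to check the math, that figure follows directly from the bus width and the per-pin data rate (28 Gbps is the commonly reported GDDR7 speed for this card, so treat it as an assumption):

    # Bandwidth = bus width (bits) * per-pin data rate (Gbps) / 8 bits per byte
    bus_width_bits = 512
    data_rate_gbps = 28                  # reported GDDR7 per-pin rate
    bandwidth_gb_s = bus_width_bits * data_rate_gbps / 8
    print(bandwidth_gb_s)                # 1792 GB/s, i.e. ~1.8 TB/s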

ksec

For what the 5090 is offering with 32GB of RAM, I thought it's a pretty decent price compared to the 4090. The whole lineup seems really well priced to me.

toshinoriyagi

edit: 32GB is confirmed here https://www.nvidia.com/en-us/geforce/graphics-cards/50-serie...

Supposedly, this image of an Inno3D 5090 box leaked, revealing 32GB of VRAM. It seems like the 5090 will be more of a true halo product, given the pricing of the other cards.

https://www.techpowerup.com/330538/first-nvidia-geforce-rtx-...

magicalhippo

AMD barely mentioned their next-gen GPUs, while NVIDIA came out swinging right from the start. AMD announced two new models which, by their own cryptic slide, wouldn't even compete with their current top end. Then NVIDIA came and announced a 4090-performance GPU for $549...

If that's not just hot air from NVIDIA, I totally get the business decision from AMD, but man, I would love some more competition at the high end.

Permik

It totally depends on how you define "hot air" here. Later in the keynote there's a mention that the $549 card achieves 4090-like rendering performance with DLSS, i.e. not with raw graphics number-crunching horsepower.

Personally? It's a no for me, dawg; DLSS unfortunately doesn't replace the need for raw GPGPU crunch.

For the average layman and consumer, though? Nvidia will be selling those GPUs like hotcakes.
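
To illustrate why that comparison leans on DLSS rather than raw throughput: DLSS 4's multi frame generation inserts up to three AI-generated frames per rendered frame, so the displayed frame rate can be several times the natively rendered one. A rough sketch with made-up frame rates, not benchmark numbers:

    # Placeholder numbers: displayed fps with "3x frame generation" is roughly
    # rendered fps * (1 + generated frames per rendered frame).
    rendered_fps = 30            # hypothetical natively rendered frame rate
    generated_per_rendered = 3   # three AI frames inserted per rendered frame
    displayed_fps = rendered_fps * (1 + generated_per_rendered)
    print(displayed_fps)         # 120 fps shown, from 30 fps of raw rendering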

y-c-o-m-b

I kind of agree with this. On the one hand, when DLSS works well, it's amazing. I didn't notice any issues at all playing Senua's Sacrifice and Senua's Saga; both were as good as playing with everything maxed out without DLSS. On the other hand, with Jedi Survivor it's a very janky experience. You can clearly see artifacts and the image trying to "self-correct" with DLSS on versus off.

nomel

> Then NVIDIA came and announced a 4090-performance GPU for $549

Never trust vendor performance claims (these specifically rely on 3x frame generation), and never assume cards will be available at MSRP.

belter

Correct: "Debunking the CUDA Myth Towards GPU-Based AI Systems" - https://arxiv.org/abs/2501.00210

ekianjo

> Then NVIDIA came and announced a 4090-performance GPU for $549...

That's obviously a marketing hoax. Spray on DLSS and other tricks and you can make this kind of claim, while the raw power is clearly on the side of the 4090.

behnamoh

AMD has strategically decided to abandon the enthusiast market [0]. It's really sad that Nvidia's monopoly has become even more solidified. That's one of the reasons I contribute to Apple's MLX instead of CUDA.

0: https://www.techradar.com/computing/gpu/amd-announces-new-ra...

Update: typical HN behavior. Someone is downvoting all my comments one by one...

jiggawatts

Someone with a lot of NVIDIA shares, no doubt…

blitzar

More likely someone holding the AMD bag

edit: confirmed.

ksec

> Update: typical HN behavior. Someone is downvoting all my comments one by one...

You can't downvote older comments, i.e. comments older than 1-2 days.

blitzar

[flagged]

tim333

I'm not sure what you mean by that? Jensen seemed quite normal in his presentation. Some of the prerecorded video was a bit iffy - "this is how intelligence is made" totally ignores natural intelligence. Coked-up behaviour is more like Will Smith punching Chris Rock.

I thought the synthetic data generation for self-driving was interesting: https://youtu.be/k82RwXqZHY8?t=4369 - I could have used something like that myself when learning to fly.

beernet

Not sure about that, but resentfulness, arrogance and hubris are more present than ever on HN.

seaofliberty

[flagged]

behnamoh

He tried so hard to be funny, but it didn't land. At one point he said the 4090 is like taking home the equivalent of a $10,000 PC, and he expected the crowd to clap, but no one, literally no one, clapped. Everyone knows GeForce prices have been monopoly prices for years, and that the 4090 is intentionally crippled by Nvidia drivers to be less capable than it really is (see George Hotz's in-depth analysis).

Jach

To me it seemed like he tried to justify its price as if it were a small part of a $10k setup with lots of RGB lighting, but for some reason mainly for watching movies. No one buys one just for watching movies, and it likely costs more than the rest of the gaming PC build combined, monitor possibly included.

moogly

Someone should tell Mr. Huang that it's not 20 years ago and that their products now cost twice as much as all the other components in that "PC entertainment system control center" combined. He seems not to know, out-of-touch billionaire that he is.

seaofliberty

Haha. So true. I'm writing a program to map facial geometry, and the similarity between him and Jackie Chan is too good to be true.

HeatrayEnjoyer

How is that relevant to anything at all?