Nvidia emulation journey, part 1: RIVA 128/NV3 architecture history and overview
26 comments · February 27, 2025
rayiner
Is this an especially odd architecture? What do other GPUs look like?
phendrenad2
> Note: Documents wanted
> If you are in possession of any of:
> NVIDIA RIVA 128 Programmers’ Reference Manual
> NVIDIA RIVA 128 Customer Evaluation Kit (we have the NV1 CEK version 1.22)
> NVIDIA RIVA 128 Turnkey Manufacturing Package
> Source code (drivers, VBIOS, etc) related to the NVIDIA RIVA 128
> Any similar documents, excluding the well-known datasheet, with technical information about a GPU going by the name “NV3”, “STG-3000”, “RIVA 128”, “NV3T”, “RIVA 128 Turbo” (an early name for the ZX) or “RIVA 128 ZX”
> Any document, code, or materials relating to a graphics card by NVIDIA, in association with Sega, Helios Semiconductor or SGS-Thomson (now STMicroelectronics) codenamed “Mutara”, “Mutara V08”, or “NV2”, or relating to a cancelled Sega console codenamed “V08”
> Any documentation relating to RIVA TNT
> Any NVIDIA SDK version that is not 0.81 or 0.83
I feel this. A lot of information has been lost.
Sjonny
> 5.0 came out late during development of the chip, which turned out to be mostly compliant, with the exception of some blending modes such as additive blending which Jensen Huang later claimed was due to Microsoft not giving them the specification in time.
Not sure if this is the same thing I had, but on my Riva 128 the alpha blending wasn't properly implemented. I distinctly recall playing Unreal Tournament, and when I fired the rocket launcher there were big black squares with a smoke texture on them slowly rotating :D I couldn't see where I was shooting :D
pavlov
Yes, that would be an artifact of missing additive blending.
It simply means that each newly rendered polygon's RGB values are added to the pixel values already in the frame buffer. It's good for lighting effects, although it isn't a very realistic simulation of light's behavior unless your frame buffer stores linear light rather than gamma-corrected values, which effectively requires floating-point RGB and wasn't available on gaming cards until 2003.
starfrost013
Iirc, Riva 128 only supports 8 of the 32 D3D5 blend modes, or something like that. Usually, when a GPU doesn't support all blending modes, the Direct3D HAL will try to compensate by substituting a different blending mode, or just give up and render the pixels opaque. The results are usually pretty ugly. Riva 128 is one of the better ones for the era in this regard.
msk-lywenn
I doubt it required floating-point RGB, and iirc that came (for real) much later. GPUs used fixed-point math behind the scenes for a long time. The only thing you need for proper additive blending is saturation, so that you don't overflow, like on the N64.
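Something like this rough sketch in C (my own illustration, not actual N64 or NV3 code) is all that's needed per 8-bit channel:

    #include <stdint.h>

    /* Additive blend of one 8-bit color channel: the new fragment is
       added to the value already in the frame buffer, and the result
       is clamped ("saturated") at 255 so it cannot wrap around. */
    static uint8_t blend_additive_u8(uint8_t dst, uint8_t src)
    {
        unsigned sum = (unsigned)dst + (unsigned)src;
        return (uint8_t)(sum > 255u ? 255u : sum);
    }

    /* Without the clamp, 200 + 100 would wrap to 44 and a bright
       additive effect would show up as a dark artifact. */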
userbinator
> What may be called graphics commands in other GPU architectures are instead called graphics objects in the NV3 and all other NVIDIA architectures.
I think this choice of terminology reflects both the era in which it was chosen (OOP was a huge trend back then), and the mindset of those who worked on the architecture (software-oriented). In contrast, Intel calls them commands/instructions/opcodes, as did the old 8514/A, arguably the one that started it all.
> A specialized hardware accelerator for the manner by which Windows 95’s GDI (and its DIB Engine?) renders text.
Drawing text (from bitmap font data) is a very common 2D accelerator feature.
Dwedit
Riva 128 + Pentium II 233MHz + Corn Emulator = Mario 64 at full speed on your PC.
hex4def6
+ Bleem for your PS1 emulation needs :)
bilegeek
Amazing that they're able to make progress with so little public documentation.
userbinator
I'd assume any driver source code, which the Linux world has produced a lot of, can serve as a source (pun intended) of documentation.
There's also https://envytools.readthedocs.io/en/latest/hw/intro.html
tiahura
The expectation around the Riva 128 was intense. 16-bit color, integrated 2D/3D, and a reasonable price were going to doom 3dfx. It was a little underwhelming, and it wasn't until the TNT, TNT2, and GeForce 256 that it really became obvious that these guys were on a path to rule the market.
It really would be cool if someone could get a sit-down with Jensen to reminisce about the Riva 128 period.
Who else bought NVDA back in '99?
raphlinus
You might enjoy this talk by Erik Lindholm (now retired), who talks about Riva 128 and many of the other early Nvidia cards: https://ubc.ca.panopto.com/Panopto/Pages/Viewer.aspx?id=880a...
corysama
I bought a Riva 128. If I had bought NVDA instead, I’d be a lot better off! xD
ajross
Not sure I'd characterize NV3 as a "success". It probably made money, and kept the company above water. But they didn't have a genuinely "successful" product until the TNT shipped in 1998. At this stage, 3dfx completely owned the market, to the extent that lots of notionally "Direct3D" games wouldn't generally run on anything else. NVIDIA and ATI were playing "chase the game with driver updates" on every AAA launch trying to avoid being broken by default.
Which makes it, IMHO, a weird target to try to emulate. NV2 was a real product and sold some units, but it's otherwise more or less forgotten. Like, if you were deciding on a system from the early 70's to research/emulate, would you pick the Data General Nova or the PDP-11?
starfrost013
Most 3dfx cards are already emulated. I'm just a crackhead. NV2 was not a real product; it was cancelled. You are talking about NV1.
ajross
Heh, no, I was talking about NV3. It's just a typo in the last paragraph.
starfrost013
Ohhh. Actually, quite a lot of NV3s were sold... You can see that just from Nvidia's revenue totals, and from reviews at the time. Note that standards for image quality increased very quickly, so it was described as decent in 1997 but awful in 1999.
qingcharles
Were there any games or apps specifically tied to these cards, or did everything go through D3D at this point?
I remember some earlier titles that were locked to specific cards such as the Matrox ones and didn't support any other accelerators.
pcwalton
3dfx's proprietary Glide was still very popular in 1997. UltraHLE wouldn't come out until 1999, and that was famously Glide-only.
ielillo
From memory, Unreal Tournament and other FPS titles that used the same engine had support for OpenGL, Glide, D3D, S3TC, and software rendering. It was one of the most compatible rendering engines.
kevingadd
In this era there were lots of competing APIs like Glide, S3 MeTaL, D3D, OpenGL. So you could end up with games that only supported particular APIs and not the one(s) supported by your GPU. At the time D3D and OpenGL were kind of a lowest common denominator, unless the game happened to support your particular vendor's OpenGL extensions (if they had any).
As a designer of Weitek's VGA core, I find this a very interesting read. I had no idea how valuable the core was to Nvidia. As Weitek was going under, I also remember interviewing with 3dfx and thinking how arrogant they were. I'm not surprised they eventually lost.