How I learned Vulkan and wrote a small game engine with it (2024)
33 comments · November 21, 2025 · tombert
thegrim33
SDL 3.0 introduced its GPU API a year or so ago, an abstraction layer on top of Vulkan and other backends; you might want to check it out.
Although after writing an entire engine with it, I ended up wanting more control, more performance, and freedom from the lowest-common-denominator limits of the various backends, so I switched back to a Vulkan-based engine.
However, I took a lot of lessons from the SDL GPU code, such as its approach to synchronization, a pattern that solved a lot of problems in my Vulkan engine and made things a lot easier and nicer to work with.
ryandrake
I'm working with SDL GPU now, and while it's nice, it hasn't quite cracked the cross-platform nut yet. You still need to maintain and load platform-specific shaders for each incompatible ecosystem, or you need a set of "source of truth" HLSL shaders that your build system compiles into platform-specific shaders through a set of disparate tools that you have to download from all over the place, when it really should be one tool. I have high hopes for SDL_shadercross to one day become that tool.
HexDecOctBin
SDL GPU is extremely disappointing in that it follows the Vulkan 1.0 model of static pipelines and rigid workflows. Using Vulkan 1.3 with a few extensions is actually far more ergonomic beyond a basic "Hello, World" than using SDL GPU.
ryandrake
As someone who did OpenGL programming for a very, very long time, I fully agree with you. With OpenGL no longer being maintained, we are missing a critical "middle" drawing API. We have the very high-level game engines, and very low-level things like Vulkan and Metal, which are basically thin abstractions on top of GPU hardware. But we are missing that fun "draw a triangle" middle API that lets you pick up and learn 3D graphics (as opposed to the very different "learn GPU programming" goal).
If I was a beginner looking to get a basic understanding of graphics and wanted to play around, I shouldn’t have to know or care what a “shader” is or what a vertex buffer and index buffer are and why you’d use them. These low level concepts are just unnecessary “learning cliffs” that are only useful to existing experts in the field.
Maybe unpopular opinion: only a relative handful of developers working on actually making game engines need the detailed control Vulkan gives you. They are willing to put up with the minutiae and boilerplate needed to work at that low level because they need it. Everyone else would be better off with OpenGL.
phendrenad2
OpenGL is still being maintained; it just isn't being updated. We've had vertex and fragment shaders since OpenGL 2.0. As a non-AAA developer, I can't imagine anything else I'd really need.
BTW: If anyone says OpenGL is "deprecated", laugh in their face.
ryandrake
OK, maybe OpenGL is not "unmaintained", but the major OS and hardware vendors have certainly handed it its hat.
engeljohnb
If I were starting a new project, would it be unwise to just use OpenGL? It's what I'm used to, but people seem to talk about it as if it's deprecated or something.
I know it is on Apple, but let's just assume I don't care about Apple specifically.
rabf
OpenGL is still the best for compatibility, in my opinion. I have been able to get my OpenGL software to run on Linux, Windows, old and new phones, Intel integrated graphics, and Nvidia. Unless you have very specific requirements, it does everything you need, and with a little care it's plenty fast.
simonask
`wgpu` in Rust is an excellent middle ground, matching the abstraction level of WebGPU. More capable than OpenGL, but you don’t have to deal with things like resource barriers and layout transitions.
The reason you don't is that it does a fair amount of bookkeeping for you at runtime, supports only a single general queue per device, and has several other limitations that only matter when you want to max out the capabilities of the hardware.
Vulkan is miserable, but several things are improved by using a few extensions supported by almost all relevant vendors. The misery mostly pays off, but there are a couple of cases where the API asks you for a lot of detail which all major drivers then happily go ahead and ignore completely.
raincole
How easy is it to integrate wgpu if the rest of your game is developed with a language that isn't rust? (e.g. C# or C++)
tombert
I'll definitely give wgpu a look. I don't need to make something that competes with Unreal 5 or anything, but I do think it would be neat to have my own engine.
foltik
Could you say more about which extensions you’re referring to? I’ve often heard this take, but found details vague and practical comparisons hard to find.
attheicearcade
Not the same commenter, but I'd guess: enabling some features for bindless textures, plus Vulkan 1.3 dynamic rendering to skip the render pass and framebuffer juggling.
user____name
If you don't need 4K PBR rendering, a software renderer is a lot of fun to write.
tombert
Interesting. I wouldn't actually mind learning how to do that; any tips on how/where to get started?
rabf
Tsoding has been live streaming the development of a software renderer as of late: https://www.youtube.com/watch?v=maSIQg8IFRI
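If you're curious what a software renderer actually involves, here's a minimal sketch of its core operation: filling a 2D triangle using an edge-function (half-space) test, essentially the same per-pixel inside/outside test GPUs perform. Python is used purely for brevity (the logic ports directly to C or Rust), and the function names are just illustrative.

```python
# Minimal software-rasterizer sketch: fill one 2D triangle into a
# framebuffer using edge functions (half-space tests).

def edge(ax, ay, bx, by, px, py):
    # Signed area of the parallelogram spanned by (a->b, a->p);
    # positive when p lies to the left of the edge a->b.
    return (bx - ax) * (py - ay) - (by - ay) * (px - ax)

def rasterize_triangle(w, h, v0, v1, v2, color=1):
    fb = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            # Sample at the pixel center.
            px, py = x + 0.5, y + 0.5
            w0 = edge(*v1, *v2, px, py)
            w1 = edge(*v2, *v0, px, py)
            w2 = edge(*v0, *v1, px, py)
            # Inside test: all three edge functions share the same sign
            # (handles both clockwise and counter-clockwise winding).
            if (w0 >= 0 and w1 >= 0 and w2 >= 0) or \
               (w0 <= 0 and w1 <= 0 and w2 <= 0):
                fb[y][x] = color
    return fb

# A right triangle covering the lower-left half of an 8x8 grid:
# exactly 36 of 64 pixel centers fall inside or on the edge.
fb = rasterize_triangle(8, 8, (0, 0), (8, 0), (0, 8))
covered = sum(map(sum, fb))
```

From here a real renderer adds barycentric interpolation of vertex attributes, a depth buffer, and perspective-correct texturing, but this inner loop is the heart of it.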
junon
Getting a triangle on the screen is the hello world of 3D applications; there are many such guides for your backend of choice. From there it becomes learning how the shaders work and internalizing projection matrices (if you're doing 3D), which takes a bit of thinking; then, as you build up enough abstractions, it slowly turns back into a more "normal" data structures problem surrounding whatever it is you're actually building. But it's broad, so be prepared for that.
Definitely recommend starting with a more "batteries included" framework, then trying your hand at OpenGL; after that, Vulkan will at least make a bit more sense. SDL is a decent place to start.
A lot of the friction is due to the tooling and debugging, so learning how to do that earlier rather than later will be quite beneficial.
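To make the projection-matrix step above concrete, here's a small sketch of the standard OpenGL-style perspective matrix and the perspective divide. The math is API-agnostic; Python is used for brevity, and the names `perspective` and `project` are just illustrative.

```python
import math

def perspective(fov_y_deg, aspect, near, far):
    # OpenGL-convention projection matrix (camera looks down -Z):
    # after the perspective divide, visible depth maps to z in [-1, 1].
    f = 1.0 / math.tan(math.radians(fov_y_deg) / 2.0)
    return [
        [f / aspect, 0.0, 0.0, 0.0],
        [0.0, f, 0.0, 0.0],
        [0.0, 0.0, (far + near) / (near - far), (2.0 * far * near) / (near - far)],
        [0.0, 0.0, -1.0, 0.0],
    ]

def project(m, x, y, z):
    # Multiply [x, y, z, 1] by m, then divide by w (the perspective divide).
    v = [sum(m[r][c] * p for c, p in enumerate((x, y, z, 1.0))) for r in range(4)]
    return [c / v[3] for c in v[:3]]

m = perspective(90.0, 1.0, 0.1, 100.0)
# A point on the near plane lands at z ~= -1 in normalized device coords.
ndc = project(m, 0.0, 0.0, -0.1)
```

Note that this is the OpenGL depth convention; Vulkan, Metal, and D3D map depth to [0, 1] instead, which changes the two entries in the third row.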
diath
If you want something like SDL but for 3D, check out Raylib.
gyomu
To this day, the best 3D API I’ve used (and I’ve tried quite a few over the years) is Apple’s SceneKit. Just the right levels of abstraction needed to get things on the screen in a productive, performant manner for most common use cases, from data visualization to games, with no cruft.
Sadly 1) Apple only, 2) soft deprecated.
fingerlocks
Trying to write a ground-up game engine in Metal is a very serious exercise in self-discipline. Literally everything you need is right at your fingertips with RealityKit / old SceneKit. It's so tempting to cheat or take a few shortcuts. There's even a fully featured physics engine in there.
rudedogg
RealityKit is pretty cool and seems to be the replacement. Still Apple only, though, and I find the feedback loop slow and frustrating due to Swift.
I find SDL3 more fun and interesting, but it's a ton of work to get going.
anvuong
Vulkan was one of the hardest things I've ever tried to learn. It's so unintuitive and tedious that it seemingly drains the joy out of programming. Tiny brain =(
ryandrake
You don't have a tiny brain. Vulkan is a low-level chip abstraction API, and is about as joyful to use as a low-level USB API. For a more fun experience with very small amounts of source code needed to get started, I'd recommend trying OpenGL (especially pre-2.0, before shaders were introduced and the API started down the GPU-programming path), but the industry is dead-set on killing OpenGL for some reason.
zffr
Does anyone know why the industry is killing OpenGL?
raincole
I think anyone who has ever looked at typical Vulkan code examples would reach the same conclusion: it's not for application/game developers.
I really hope SDL3 or wgpu can be the abstraction layer that settles all this down. I personally bet on SDL3, just because they have support from Valve, a company that has reasons to care about cross-platform gaming. But I would look into wgpu too (...if I were better at Rust, sigh)
bsder
You don't have a tiny brain--programming Vulkan/DX12 sucks.
The question you need to ask is: "Do I need my graphics to be multithreaded?"
If the answer is "No"--don't use Vulkan/DX12! You wind up with all the complexity and absolutely zero of the benefits.
If performance isn't a problem, use anything else: OpenGL, DirectX 11, game engines, etc.
Once performance becomes the problem, then you can think about Vulkan/DX12.
jesse__
> Starting your engine development by doing a Minecraft clone with multiplayer support is probably not a good idea.
Plenty of people make Minecraft-like games as their first engine. As far as voxel engines go, a Minecraft clone is "hello, world."
jesse__
I love that it's becoming kind of cool to do hobby game engines. I've been working on a hobby engine for 10 years and it's been a very rewarding experience.
gnabgib
(2024) At the time (625 points, 260 comments) https://news.ycombinator.com/item?id=40595741
nodesocket
I am fascinated with 3D/gaming programming and watch a few YouTubers stream while they build games[1]. Honestly, it feels insanely more complicated than my wheelhouse of webapps and DevOps. As soon as you dive in: pixel shaders, compute shaders, geometry, linear algebra, partial differential equations (PDEs). Brain melt.
jakogut
Note that I have retained the original title of the post, but I am not the author.
My opinions of Vulkan have not changed significantly since this was posted a year ago https://news.ycombinator.com/item?id=40601605
I'm sure Vulkan is fun and wonderful for people who really want low-level control of the graphics stack, but I found it completely miserable to use. I still haven't found a graphics API that works at the level I want and that I enjoyed using. I would like to get more into graphics programming, since I do think it would be fun to build a game engine, but I'll admit that even getting started with the low-level Vulkan stuff is still scary to me.
I think what I want is something like how SDL does 2D graphics, but for 3D. My understanding is that for 3D in SDL you just drop into OpenGL or something, which isn't quite what I want.
Maybe WebGPU would be something I could have fun working on.