
Subpixel Snake [video]

63 comments

January 24, 2025

Sohcahtoa82

The linked Subpixel Zoo article taught me that PenTile is actually still incredibly popular.

My first Pentile screen was on my Motorola Droid 4 phone, and it was awful. Small text was often impossible to read depending on the color of the text and its background. The massive gaps between colors made solid red, green, or blue areas of the screen look like a checkerboard. It basically had a screen-door effect before VR became mainstream and made "screen-door effect" a household term.

So it came as a surprise to learn that PenTile is still popular and in use today. I guess it's just gotten better? Smaller gaps between the subpixels? Maybe higher resolutions and pixel densities hide the weaknesses shown in my Droid 4?

mdasen

Early PenTile displays often had a different arrangement: https://en.wikipedia.org/wiki/PenTile_matrix_family#/media/F...

You can see that it's blue, green, red, green along the horizontal/vertical axis, so each red subpixel is separated by two green and one blue subpixels.

Modern PenTile displays usually use a triangular layout: https://static1.xdaimages.com/wordpress/wp-content/uploads/w...

I'm not an expert at text rendering, but it seems like you'd be able to get an RGB subpixel combination a lot closer together with this triangular layout than with the linear one.

But also, the Droid 4 was simply lower resolution. Apple moved to 330 pixels per inch in 2010 and the Droid 4 was 275 PPI in 2012. So the Droid 4 had poor resolution even for its time, and the PenTile layout made it worse still by removing a third of the subpixels.

Today, the Galaxy S25 has 416 PPI and the iPhone 16 has 460 PPI, so they pack dramatically more pixels.
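
(For reference, PPI is just the diagonal pixel count divided by the diagonal size. A quick sketch with the nominal panel specs plugged in as assumptions:)

  // Pixel density from nominal resolution and diagonal size.
  const ppi = (w: number, h: number, diagonalInches: number): number =>
    Math.hypot(w, h) / diagonalInches;

  console.log(ppi(540, 960, 4.0).toFixed(0)); // Droid 4 (qHD, 4.0"): ~275
  console.log(ppi(640, 960, 3.5).toFixed(0)); // iPhone 4 (3.5"): ~330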

Pixel density would have the largest impact, but I think the triangular layout that modern displays use probably also helps. You mention the screen-door effect, and I feel like the triangular layout wouldn't have as much of that issue (but I'm not an expert on subpixel rendering).

wffurr

>> Maybe higher resolutions and pixel densities hide the weaknesses shown in my Droid 4?

That's exactly it. Droid 4 resolution was low enough that the subpixel arrangement was clearly visible. Newer displays are dense enough that subpixels aren't visible at all.

tantalor

The snake moves weird because the subpixels aren't square.

I would increase the snake's horizontal speed (per subpixel) relative to its vertical speed, so the apparent speed in either direction is the same from the user's perspective in the actual viewport.
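
A rough sketch of that compensation, assuming a hypothetical game loop that advances one subpixel per tick, and subpixels one third as wide as they are tall:

  // One pixel is 3 subpixels across but only 1 subpixel down, so a
  // horizontal step covers 1/3 the physical distance of a vertical step.
  const SUBPIXEL_ASPECT = 1 / 3;      // subpixel width / height (assumed)
  const VERTICAL_DELAY_MS = 300;      // ms per vertical step (assumed)

  function stepDelayMs(direction: "horizontal" | "vertical"): number {
    // Scale the tick delay by the distance moved so the on-screen speed
    // looks the same in both directions.
    return direction === "horizontal"
      ? VERTICAL_DELAY_MS * SUBPIXEL_ASPECT
      : VERTICAL_DELAY_MS;
  }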

mrandish

As an obsessive retro arcade gamer who custom-built a high-end emulation cabinet with a 27-inch quad-sync analog RGB CRT - I approve of this video! As soon as he described running into the green-pixels problem, I knew he was going to learn something interesting. Subpixel structure, phosphor chromaticity, etc. are such a fun rabbit hole to dive down. And it's still highly relevant in today's OLED, QLED, etc. displays.

Also, a tip for when you play classic arcade or console retro games from the 80s and 90s: they will look much better (and be more authentic) if you play on a CRT - or, if playing via emulation, just turn on a CRT emulation pixel shader (CRT Royale is good). These pixels are art that the original devs and artists intentionally hand-crafted to exploit the way CRTs blend colors and scanlines anti-alias naturally. You deserve to see them as they were intended to be seen. Just look at what you've been missing: https://i.redd.it/9fmozdvt6vya1.jpg

playa1

I'm a fan of authentic retro hardware. That sounds like an awesome cabinet; I would love to spend hours and pockets full of quarters playing on it.

This post caused me to go down a rabbit hole about CRT simulation.

Looks like it is a thing.

https://github.com/blurbusters/crt-beam-simulator

mrandish

> That sounds like an awesome cabinet

Why, yes! Yes it is. :-)

If it sounds good, you should consider one of your own. My goal was not the retro nostalgia of recreating my parents' shitty 1970s living room TV, but instead to create a cabinet that would let me explore these games, each in its authentically correct resolution and frame rate, at the maximum quality and fidelity possible. That's why I chose an analog RGB monitor and the last, fastest GPU ever made with a native analog RGB signal path (R9 380X). I run a special version of the MAME emulator called GroovyMAME, made just to enable technically correct native display via analog signals on a CRT. http://forum.arcadecontrols.com/index.php/board,52.0.html

I created this cabinet over ten years ago, before emulation pixel shaders were a thing. If you don't want to go to the effort, expense and time of acquiring and calibrating a high-end CRT, good pixel shaders can now get you >98% of the way there far more easily and cheaply.

mrandish

Someone else asked for more info, so I wrote a much more detailed post here: https://news.ycombinator.com/item?id=42823507

azinman2

I’d love to hear more about your setup. Do you have a blog or anything documenting it?

mrandish

I probably should put a permanent post up somewhere but I'll just give you a recap of info that may matter if you're interested in having your own emulation arcade cabinet. I highly recommend this forum to deep dive because it has sub-forums for controls, monitors, cabinets, etc. http://forum.arcadecontrols.com/

First, you need to understand your goal in creating a cabinet. Unlike some people, my goal was NOT recreating long-ago nostalgia like playing console games on my parents' living room TV. There's nothing wrong with that, but as someone who wrote games professionally back in the 80s and later became a video engineer, I wanted to play these games in their original native resolutions, aspect ratios and frame rates. But my goal went beyond original authenticity to ideal authenticity. Even back in the day, an arcade cabinet's monitor would be left on 100 hours a week in an arcade and after five years be pretty thrashed. The joystick mechanisms would be worn and imprecise. Sometimes arcade manufacturers would cut corners by sourcing an inferior button instead of the top of the line. I had no interest in recreating that. I wanted to create the experience of playing the originals in their most ideal form: how they looked (or would have looked) with a brand-new, top-of-the-line, period-correct monitor, perfectly calibrated, and pristine high-quality controls. What the manufacturer would have made in the 80s or 90s with no corners cut.

My cab is based around a 27" Wells Gardner D9200 quad-sync analog RGB CRT. Wells Gardner made high-quality industrial CRTs specifically to go in arcade cabinets (Atari, Sega, Namco, etc.). The D9200 is one of the last and best monitors they made, and I bought it new from them shortly before they went out of business. It's very flexible, as it scans four frequency ranges: 15 kHz, 24 kHz, 31.5 kHz and 38 kHz. This covers very nearly all the resolutions and frequencies of any raster CRT arcade machine ever made by global arcade manufacturers. 38 kHz supports resolutions up to 800x600 non-interlaced, which is what I run my game-selection interface in. Scanning to higher frequencies is also nice for running games from some later consoles like the Dreamcast, which could display 480p natively. This lets me run all the classic arcade games in their native resolutions, frame rates and frequencies. No scaling, averaging or interpolation.

For my CRT to switch between all these frequencies on the fly, it must be sent a properly formatted signal. Doing this natively is tricky and requires a GPU with native analog RGB output. I use the last, fastest native analog RGB GPU - the Radeon R9 380X. The real magic, however, is using special display drivers and a special version of the MAME emulator called GroovyMAME to generate precisely correct horizontal and vertical frequencies matching the game code written for each arcade cabinet's original display hardware. GroovyMAME and the community around it have done remarkable work crucial to accurate historical preservation through precise emulation. Much of their work has now been mainstreamed into MAME, making it more accurate than ever. Dive into that rabbit hole here: http://forum.arcadecontrols.com/index.php/board,52.0.html
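
The arithmetic behind a modeline is simple even if the plumbing isn't; here's a rough sketch with typical (assumed) blanking totals for a 15 kHz standard-res game:

  // A modeline pins down three numbers: total pixel clocks per scanline,
  // total scanlines per frame (blanking included), and vertical refresh.
  function modelineFrequencies(hTotal: number, vTotal: number, vRefreshHz: number) {
    const hFreqHz = vTotal * vRefreshHz;    // scanlines drawn per second
    const pixelClockHz = hFreqHz * hTotal;  // pixel rate the GPU must hit
    return { hFreqHz, pixelClockHz };
  }

  // e.g. a 320x240@60 game with ~408 total clocks and ~262 total lines
  // lands right in the 15.7 kHz range a standard-res arcade CRT expects.
  console.log(modelineFrequencies(408, 262, 60)); // hFreqHz: 15720, ...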

To be clear, my high-end monitor and highly tuned signal chain probably allow most of these games to look better than the original monitor in the original cabinet. While perfectly authentic, they aren't exactly 'historically accurate', because an average cabinet in an average arcade in the 1980s probably looked worse due to age, use and abuse. However, intentionally degrading original content to match some historical average jank seems wrong to me. It's true some of the original monitors were connected with composite video, not component. Some of the cabinets had cheap, poorly shielded cables, while mine has a double-shielded broadcast studio cable with ferrite cores at both ends to eliminate cross-talk and noise. So I'm playing the original game code, but presented as the people who made these games would have wanted their own personal cabinet - if they could have taken one home. However, I draw the line at modern revisionism like AI upscaling or frame generation, because that's no longer the original pixels and frames in their most ideal form.

Next is choosing your controls. Fortunately, many of the manufacturers of original arcade cabinet controls are still around, like Happ (buttons) and Wico (joysticks). My cabinet has controls for two players as well as a trackball for games like Marble Madness and a counter-weighted spinner for games like Tempest. These are all interfaced to the emulation PC in the cabinet through USB control boards made by companies like Ultimarc. Each button is also backlit by an RGB LED, and the button colors change to match the original cabinet - for example, when playing Joust, player 1 is yellow and player 2 is blue. This also indicates which controls are active in each game.

Selecting games is done via a joystick-driven menu. Software to do this is called a frontend, and there's a variety ranging from open source to commercial. I use a commercial one called Launchbox because it handles calling various emulators, interfacing with control boards, and organizing and maintaining a game library of thousands of titles across a dozen platforms very well. I actually use the BigBox mode of Launchbox, which is made for dedicated emulation cabinets. Another nice touch is its integration of various databases arcade historians have created. While browsing the game library, it's fascinating to read the history of how a game was made and see the original arcade cabinet and launch advertisement along with the usual game logo, title screen and gameplay video. Linked data like this lets you follow the evolution of game types, companies and franchises over time, from their origin to their end point.

CONCLUSION: All of the above is, admittedly, pretty obsessive. If you want a terrific arcade/console emulation cabinet you DO NOT need to do what I did (or even half of it). However, I recommend not just buying a cheap mini cabinet from Costco. To be fair, while the worst cheapies are awful, the best of that class isn't that bad. But you can do much better with just a little more money, thought and care. Things like authentic arcade controls and rolling your own cheap, used PC will let you run a frontend you can add other platforms and games to - and, MOST IMPORTANTLY, run a CRT emulation pixel shader on the output. I recently upgraded the PC in my cabinet with a used corporate PC from eBay for less than $100 delivered. It's more than fast enough to emulate everything up to the PS2 perfectly, and I have no interest in emulating later consoles on a CRT cabinet because that's when games started being written for flat screens.

I love my CRT but I'm not a purist. CRTs are expensive, hard to maintain and finicky analog gear. As a video engineer, I have to admit recent versions of the best CRT emulation shaders, like CRT Royale running on a high-end flat screen, are very impressive. If I were building my cabinet today, I might go with a very carefully selected high-end flat screen instead of a CRT. Frankly, the kind of flat screen I'd want might cost more than a very good used CRT, but it would provide some flexibility to do things a CRT can't. And there would be some trade-offs versus my best-ever-made CRT, but engineering is all about trade-offs, and nothing is ever going to be perfectly ideal on every dimension someone like me cares about.

azinman2

Thank you for all this. Quite the dedication! How often do you play it?

starfezzy

Not a fan of the blurry LCD look - it's like someone hacked a .ini to set antialiasing 4x higher than its limit, then placed a screen door in front of the monitor.

I gamed during the transition from CRTs to LCDs. Nobody preferred the "graphics" of a CRT.

The real downgrade was when gaming shifted from PC to console and killed dedicated servers. We used to pick servers with 10ms latency. Now people think 60-100ms+ is fine.

crazygringo

> You deserve to see them as they were intended to be seen.

I've never bought that argument, and I grew up playing games on CRTs.

The reality is that different CRTs had wildly different characteristics in terms of sharpness, color, and other artifacts -- and it also varied tremendously depending on the type of video output/input. Were you hooked up to a TV? A cheap monitor? An expensive monitor?

About the only thing you can say for sure is that CRTs were blurrier. And the comparison image you provide is completely misleading, because the brightnesses are totally different, which suggests the LCD/LED version isn't using correct gamma. On a random CRT her skin tone also had a good chance of turning greenish or whatever, because that's just what CRTs did -- the color artifacts could be atrocious.

I definitely appreciate wanting to blur the image in an emulator to remove the jaggies, and the retro CRT effects are a cool novelty. But I just don't buy the argument that it's "how the games were intended to be seen", because there was just way too much variation and the screen quality was so low. It's like only listening to music from the '90s on cheap speakers over the roar of the road because that's how the music was "intended to be heard". No it wasn't. It was just the best you had at the time.

amiga386

> I just don't buy the argument that it's "how the games were intended to be seen"

I do, though.

At its most extreme, compare how this CGA imagery looks on an NTSC television with the crisp, clean signal that generated it. The demo makers here absolutely intend you to see this via NTSC; it will look like complete trash if you don't.

https://int10h.org/blog/img/1k16_flowergirl_cga_1024_colors....

(from https://int10h.org/blog/2015/04/cga-in-1024-colors-new-mode-...)

This article gives you more examples: https://www.datagubbe.se/crt/

And it links to this video with yet more examples: https://www.tiktok.com/@mylifeisanrpg/video/7164193246401383...

There's no mistaking it. The artists on those 80s/90s games worked with the expectation of how their work would look on display hardware of the time. The actual pixels, rendered in complete precision on a vastly higher-resolution display, look like shit. Or they look "retro".

crazygringo

Your first link is using weird color hacks that may have worked on some specific hardware, but nothing like that was used in any average popular video game of the time, as far as I know.

So that's not an example of how regular video game artists were working; it's an example of how some (current-day?) demoscene people are trying to push specific vintage hardware to its limits.

And like I said -- you can apply a blur filter (or basic interpolation) to get rid of jaggies; that's totally understandable. The pixels weren't meant to be sharp square blocks, just blobs of color. But a lot of these pages showing how CRTs supposedly looked so much better are doing a lot of cherry-picking -- the reality is that they looked like blurry, color-distorted, wavy, jittery messes just as often. There just wasn't any kind of consistency between dramatically different displays. Artists couldn't plan for some highly specific amount of horizontal smear to look "just right", because there was gigantic variance.

wang_li

I don't buy it either. Showing something that came out 30 years after the time in question does not support the argument. People wrote games and made game art on CRTs; they just developed on what they had. No one was sitting down and factoring in blur, scanlines, phosphor persistence, etc.

mrandish

GP here. I don't want to repeat the lengthy technical explanation I already posted in another response downthread, so please refer to that: https://news.ycombinator.com/item?id=42817006

> The reality is that different CRT's had wildly different characteristics in terms of sharpness, color, and other artifacts -- and it also varied tremendously depending on the type of video output/input.

As a video engineer old enough to have started my career in the analog era, I fully agree composite video displayed on consumer TVs could vary wildly. I already explained the technical point about decoding the signal information properly in my other post but you're making a different point about variability, so I'll add that just because TVs could be mis-adjusted (or even broken) doesn't mean there's not a technically correct way to display the encoded image data. This is why we used color bars to calibrate TVs.

> I definitely appreciate wanting to blur the image in an emulator to remove the jaggies

But that's not my point; blur was an undesirable artifact of the composite video standard. 80s and 90s 2D pixel art was hand-crafted knowing that the blur would blend some colors together, minimize interlace artifacts and soften hard edges. However, I only use shaders that model a small amount of blur, and I run my CRT in analog RGB instead of composite, which can be quite sharp. My goal is not nostalgia for the analog past, nor to degrade a game's output as much as my parents' shitty 1970s living room TV did. I had to engineer analog video in that past - and I hated its shortcomings every day. When I play 80s and 90s video games, whether via shaders or on my analog RGB CRT, it probably looks quite a bit sharper and clearer than the original artists ever saw it - but that's not due to some subjective up-res or upscaling - it's due to accurately decoding and displaying the original content to the technical standard it was created to comply with (even if many consumer TVs didn't live up to that standard).

In the 90s I worked at a TV station and after hours we'd bring in consoles just to play them on the $3,000 Sony BVM broadcast reference monitor. And they looked great! That's what I'm after. Accurately reflecting the original artist's intent in the maximum possible quality - without slipping over the line into editorializing colors or pixels that were never in the original data in the first place. I want to play the game as it would have looked back in the day on the best video output available, through the best cable available and on the best screen money could buy. And via emulation and shaders, now everyone can have that experience!

dahart

You have valid points: variation in CRTs was very high back in the day, and the example image does have a gamma/brightness discrepancy; I agree with that. Back when CRTs were dominant, gamma and brightness were all over the map; almost nobody knew what those were. You couldn't even count on where the visible edges of the screen were. And you're right that saying "the way it was intended" is perhaps slightly hyperbolic, or maybe isn't quite meant the way you're taking it. It's not that using CRTs or not was a choice, but it is fair to say artists used CRTs when creating game art and intended for it to look as good as it could on CRTs, and they did not intend for the pixels to turn into multi-pixel solid-color blocks.

Yes exactly CRTs were blurrier, and that alone affects artistic choices. It is fair to say that CRT art looks different than LCD art because CRTs are blurrier. Games developed on CRTs with low resolutions don’t look as good when displayed on high res LCDs with up-resing and nearest neighbor sampling. The problem with using a solid 2x2, 3x3, 4x4 block of LCD pixels to represent a low res CRT pixel is that it’s a poor reconstruction, introduces unwanted high frequencies, and looks very different from the original. It’s true from a signal processing perspective that 4-up LCD reconstruction of old CRT art is quite wrong and bad.

This does extend into music, kinda. We can look at music from the 30s and 50s for an even stronger example - early recorded music was both technically limited to, and also artistically designed for, a frequency range of, I don’t know, like 500-3k Hz. Some audiophiles do argue that using an old record player to play old vinyl is a superior experience to listening to a digitized copy on modern hardware, and often with the same argument - that the old stuff is the way it was intended to be heard.

However, the analogy to music is slightly broken since today’s digital music - unlike LCD up-resing of old games - never tried to reconstruct old music using nearest neighbor sampling. When you do that with audio, you can instantly hear it’s all wrong. If you were actually comparing nearest-neighbor audio reconstruction to blurry reconstruction, you would 100% agree that the blurry reconstruction was the ‘way it was intended to be heard’. The biggest problem with this whole argument that neither you nor the parent addressed is that LCD nearest-neighbor reconstruction is crappy, and as long as we try to blur when using LCDs, most of this discussion is moot.

So anyway, in many ways I think your argument already does agree with the idea that games designed on CRTs look better on CRTs than, e.g., 4-up reconstructions on LCDs. The entire sticking point in your argument might hinge on how you interpret the word “intended”. I’m saying the original argument isn’t necessarily claiming that the intent was conscious or explicit, it’s merely saying that the intent was a byproduct of having used CRTs during the creation process. In that sense, it’s a valid argument.

mrandish

I largely agree with your points, especially about 4-up reconstruction.

> variation in CRTs was very high back in the day

I wanted to add some more info around this point. In the case of home consoles this is true (because they hooked up to whatever TV you had), but there's one very large case where it's not true - and it's a case that matters quite a bit, especially from a historical-preservation perspective.

Most arcade cabinets were made on factory assembly lines and used bare industrial CRTs. These CRTs were made by a handful of companies and arcade manufacturers selected the CRT model for a game by its specifications, which often differed from CRTs designed for use in consumer TVs. We know exactly which CRT (or CRTs) were used in most arcade cabinets and the detailed manufacturer specifications and schematics for those CRTs are preserved and online. When researching the proper modeline frequencies to set my quad-sync monitor to (because it's a chameleon), I look up the specifications of the original CRT in the original cabinet. The game developers usually had one of these industrial CRTs on their desk, so that they were developing for the exact CRT that would be in their game's arcade cabinet.

But it's even more precise than that. Many game ROMs have a set of hidden factory calibration screens with alignment grids and color bars. On the manufacturer's assembly line, after installing and connecting the CRT, workers fired up the game, went into these screens and adjusted the internal controls of the CRT so the horizontal & vertical positions and sizes of the grids were correct as well as the color bars via the tint control. I use these calibration screens to this day to properly set up my CRT to match the adjustments of the CRTs in the original cabinets (which the game ROM was written and tested against). Because my monitor handles so many ranges of frequencies, it stores and recalls these horiz/vert/tint adjustments for each unique scanning frequency (along with other adjustments like pincushion, skew, bow, etc). Historians have even managed to preserve some of the instruction sheets written for the factory floor workers to use when adjusting the CRTs to the intended spec.

Fun photo of the Ms. Pacman assembly line: https://thedoteaters.com/tde/wp-content/uploads/2013/03/pac-...

gwbas1c

I'm pretty sure those are similar, but different, images.

Having grown up with CRTs, very few games look "better" on them; mostly games that used interlacing to create flashing effects. (Edit: Forgot that light guns need CRTs due to timing.)

Otherwise, CRTs are like vinyl: Some people will invent all kinds of reasons why they are better, but in practice, they aren't.

pdpi

The argument isn’t “CRTs are better”. You’re right — they’re not. The argument is that pixel art from that era was designed around the specific limitations of CRTs, and takes advantage of the specific way that CRTs would mess with the pixels.

This is similar to what happened with electric guitars — you can make cheap amps with barely any distortion these days, but that sucks horribly for playing music composed around the presence of distortion. E.g. amp distortion tends to add a bunch of harmonic content that makes major/minor triads sound pretty bad, which is why power chords are popular. On the other hand, power chords sound pretty terrible without distortion, because they need that extra harmonic content to sound good!

sim7c00

Nice points. The parallel with guitar really made it click for me, thx =) Makes total sense!

cubefox

CRTs are definitely much better than OLED or LCD in one major aspect: motion clarity. OLED and LCD are sample-and-hold screens, meaning they display a frame for the entire frame time, like 1/60th of a second at 60 FPS. A CRT displays each frame for just a fraction of the frame time (the rest of the time the screen is dark), which prevents the perceptible blur that sample-and-hold displays produce when our eyes track moving objects - which they do all the time. More details here:

https://news.ycombinator.com/item?id=42604613
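
The size of the smear is easy to estimate: while your eye tracks a moving object, the frame sits still for the hold time, so blur width is roughly speed times hold time. A back-of-the-envelope sketch (the numbers are illustrative):

  // Perceived smear (in pixels) when the eye tracks motion across a display.
  const blurPx = (speedPxPerSec: number, holdSec: number): number =>
    speedPxPerSec * holdSec;

  const speed = 960;                   // px/s, a moderate pan
  console.log(blurPx(speed, 1 / 60));  // 60 Hz sample-and-hold: ~16 px smear
  console.log(blurPx(speed, 1 / 240)); // 240 Hz sample-and-hold: ~4 px
  console.log(blurPx(speed, 0.001));   // ~1 ms CRT phosphor flash: ~1 px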

paulbgd

As a user of a CRT PC monitor and a 240 Hz OLED, the motion clarity of the OLED is pretty darn close now. I'd bet 480 Hz is the point where the smoothness of modern panels finally catches up to CRTs.

sim7c00

The vinyl comparison doesn't hold because music isn't composed on vinyl. Saying vinyl is better is like saying JPEG images are better than PNG or something... it's the storage format/medium. It does impact the sound, but not the way people composed, afaik.

CRTs, on the other hand, were the medium these artists composed on. They saw their art come to life on them and found ways to optimise for that.

stavros

There was a post on here a few weeks ago that claimed that this isn't true, and that artists created the images on much better displays, that didn't have the limitations that the average CRT of the time had. Unfortunately, I can't find the post.

mrandish

I agree with you about vinyl but I think you're misunderstanding my point about CRTs. I'm not claiming CRTs are inherently "better" either technically or aesthetically. In general, they're not. I'm not like some audiophiles who argue vinyl, tube amplification and "magic copper" cables are better - denying both signal theory (Nyquist et al) and objective data (double blind A/B/X tests). Modern digital formats and devices are almost always better overall. The cases where they aren't are rare, very specific and, even then, 'better-ness' is only in certain ways and not others.

My background is in video engineering and the point I'm making here is very specific. It only applies to hand-crafted retro game pixel art created in the CRT era. And my point isn't even about CRTs! It's about how the RS-170A composite video standard that CRTs used encodes color. The "A" in RS-170A added color to the existing black and white television standard while remaining compatible with old B&W TVs. It was sort of a clever analog-era compression hack. I'll skip the technical details and history here (but both are fascinating) and simplify the takeaway. Broadly speaking, in digital representations of composite video color encoding, the correct color of a pixel can only be known relative to the current phase of the pixel clock and the pixels adjacent to it. Sometimes it's a fairly subtle difference but other times it can change a pixel from blue to red.

To be clear, this wasn't "better" in any way (other than allowing optional color). The "hack" of encoding chroma information at half the frequency of luma and only in relation to a sub-carrier frequency came with trade-offs like chroma fringing on high frequency edges, ringing and other spurious artifacts. However, it was the only video we had and video game creators of the 80s & 90s used the tech they had to create the best images they could. For example, we would put a pixel of a certain color next to a pixel of another color to intentionally change the color of the second pixel (and NOT just due to blurring the two together, it literally decoded to a different third color). I did this myself in the 1980s, intentionally positioning a white pixel next to a black pixel on an even numbered column so they would show as a single red pixel on the screen. Using this encoding technique, I could display 9 different colors from a computer that only had two bits per pixel. That's why just displaying a naive RGB output of the game isn't properly decoding a signal that was hand-encoded to have more and different data than what's in the naive RGB.
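
To make that encoding trick concrete, here's a heavily simplified sketch of the decode side (not any real tool's code): quadrature demodulation, assuming a pixel clock of exactly 4x the 3.58 MHz subcarrier and omitting all filtering:

  // One chroma cycle = 4 samples. A 1-bit luma stream decodes to *color*
  // depending on where the set bits fall within the cycle.
  function decodeNtsc(bits: number[]): number[][] {
    const rgb: number[][] = [];
    for (let start = 0; start + 4 <= bits.length; start += 4) {
      let y = 0, i = 0, q = 0;
      for (let k = 0; k < 4; k++) {
        const s = bits[start + k];
        const phase = (Math.PI / 2) * k; // subcarrier phase (origin assumed)
        y += s / 4;                      // luma: mean over the cycle
        i += (s * Math.cos(phase)) / 2;  // in-phase chroma
        q += (s * Math.sin(phase)) / 2;  // quadrature chroma
      }
      rgb.push([                         // standard YIQ -> RGB matrix
        y + 0.956 * i + 0.621 * q,
        y - 0.272 * i - 0.647 * q,
        y - 1.106 * i + 1.703 * q,
      ]);
    }
    return rgb;
  }

  // Same number of lit pixels, shifted by one sample: a different hue.
  console.log(decodeNtsc([1, 0, 0, 0])); // decodes reddish
  console.log(decodeNtsc([0, 1, 0, 0])); // shifted phase, different color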

So I recommend using a CRT shader not because it emulates a glass tube with phosphors, but because it includes the decoding logic to correctly display the originally encoded content. Personally, I never use the shaders that add noise, blurring, cross-talk or other degradation. That would be as dumb as adding the pops and clicks of a dirty vinyl LP to a pristine signal; it would make the result less accurate. My goal as an engineer is to be more accurate. It's fine if adding spurious crap tickles somebody's nostalgia bone from when they were a kid - but I'd never do that. I want to see the original pixels and colors the artists saw and intended their audiences to see. That requires properly decoding the signal. And the example I posted demonstrates just how different an image can appear when properly decoded.

dylan604

>It was sort of a clever analog-era compression hack.

Also known as technical debt around these parts. The repercussions of that clever hack are still being dealt with to this day. I've spent a good deal of my career specializing in the proper handling of video sources that are painful to deal with, all because of this clever hack.

color burst, front porch, 1.001, ugh!!!!

lukevp

Very interesting! Learned a lot about color space and how it applies to subpixels, glad I watched this!

Gameplay-wise, I think it should be a bigger game board, and there should be accounting for the speed of the snake through each subpixel (when traveling left to right, going from R to G is less horizontal movement than going from B to R, and traveling vertically, each step is massive compared to the horizontal movement). This should be pretty easy to do by ratio'ing the speed of each animation step based on which subpixel boundary it's crossing. That would make it feel a lot more polished, I think.
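
A sketch of that ratio'ing, with made-up stripe geometry (even thirds within a pixel, plus a small gap between pixels):

  // Physical distance of each horizontal subpixel step, in pixel widths.
  // R->G and G->B stay inside one pixel; B->R crosses the inter-pixel gap.
  const GAP = 0.15; // assumed gap between adjacent pixels, in pixel widths
  const STEP_DISTANCE = { RtoG: 1 / 3, GtoB: 1 / 3, BtoR: 1 / 3 + GAP };
  const BASE_DELAY_MS = 100; // ms for a plain 1/3-pixel step (assumed)

  // Hold each animation frame in proportion to the distance moved, so the
  // snake's apparent speed stays constant across the stripe pattern.
  function delayFor(step: keyof typeof STEP_DISTANCE): number {
    return BASE_DELAY_MS * (STEP_DISTANCE[step] / (1 / 3));
  }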

modeless

This is awesome. I was able to play it with a headband magnifier[1] on a 1440p monitor, slowed down 10x. Anything higher density would probably need an actual microscope.

[1] https://www.amazon.com/ProsKit-MA-016-Personal-Headband-Magn...

blibble

QBasic Nibbles did the same using the half-block text characters.

There was a text character with only half of the vertical "cell" filled in.

This, along with clever use of foreground/background colours, allowed double the vertical resolution (in text mode!)
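
The same trick still works in any modern terminal. A sketch using the upper half block character, where each character cell carries two stacked "pixels" via its foreground and background colors:

  // "▀" (U+2580) paints the top half in the foreground color and leaves
  // the bottom half to the background, so one text row shows two pixel rows.
  function renderDoubled(rows: number[][]): string {
    const lines: string[] = [];
    for (let y = 0; y + 2 <= rows.length; y += 2) {
      let line = "";
      for (let x = 0; x < rows[y].length; x++) {
        const top = rows[y][x];        // ANSI 256-color palette indices
        const bottom = rows[y + 1][x];
        line += `\x1b[38;5;${top}m\x1b[48;5;${bottom}m\u2580`;
      }
      lines.push(line + "\x1b[0m");    // reset colors at the end of each line
    }
    return lines.join("\n");
  }

  // A 4x4 color grid becomes just two text rows.
  console.log(renderDoubled([
    [196, 196, 46, 46],
    [196, 21, 21, 46],
    [226, 21, 21, 201],
    [226, 226, 201, 201],
  ]));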

FriedPickles

If you're as dumb as me and try to actually play this, note that the "Snake speed" value is inverted.

hatthew

The speed is in ms per frame.

grayhatter

ms/frame isn't a speed... it's a delay? maybe a rate?

kaoD

> it's a delay? maybe a rate?

I think period?

hatthew

Yeah, frames per second probably would have made more sense. That said, I think it's fine to colloquially refer to time/distance as speed - e.g. my walking speed is 15 minutes per mile - but it should probably be specified that that's the unit in use. But also, this isn't a carefully designed game, it's a small tech demo, so ¯\_(ツ)_/¯

kbelder

Ah, "Speed of Time"

htk

Who's old enough to remember the joys of tweaking ClearType on Windows XP?

It was a great workaround for rendering smoother text on low-DPI LCD displays.
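
(The core idea, roughly: treat a pixel's R, G and B stripes as three separate horizontal samples, so glyph edges can land at 1/3-pixel positions. A toy sketch, ignoring the color-fringe filtering real ClearType layers on top:)

  // Map glyph coverage rendered at 3x horizontal resolution onto stripes:
  // sample i lands on subpixel (i % 3) of pixel floor(i / 3).
  function subpixelRow(coverage: number[]): number[][] {
    const pixels: number[][] = [];
    for (let px = 0; px * 3 < coverage.length; px++) {
      const [r, g, b] = [0, 1, 2].map(c => coverage[px * 3 + c] ?? 0);
      pixels.push([1 - r, 1 - g, 1 - b]); // black text on a white background
    }
    return pixels;
  }

  // A stem edge between pixels dims a single stripe instead of snapping a
  // whole pixel on or off -- that's where the extra smoothness comes from.
  console.log(subpixelRow([0, 0, 1, 1, 1, 0, 0, 0, 0]));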

shmeeed

You can still do that on Windows 10! It's just not that much fun anymore.

leeoniya

The easiest way to see a subpixel is to put a drop of water on the display. You probably get 100x magnification this way :)

bawolff

For the zooming out to make one CSS pixel equal one real device pixel, I wonder if you could just use units like 0.25px instead. Or maybe divide by window.devicePixelRatio in JS to make it dynamic.
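
A sketch of the dynamic version, assuming a hypothetical --cell CSS variable that the game would use to size one logical pixel:

  // Make one game cell land on exactly one device pixel by dividing out
  // the device-pixel-to-CSS-pixel ratio (0.25px when the ratio is 4).
  function setCellSize(): void {
    const cssPx = 1 / window.devicePixelRatio;
    document.documentElement.style.setProperty("--cell", `${cssPx}px`);
  }

  setCellSize();
  // devicePixelRatio tracks page zoom, so re-derive when it changes.
  window.addEventListener("resize", setCellSize);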

yuvalr1

This great video goes into a bit more detail about pixels. It also shows that there's an interesting difference in how the color green is handled, not only in monitors but also in the camera sensors that detect it:

https://youtu.be/PMIPC78Ybts?list=PLplnkTzzqsZTfYh4UbhLGpI5k...

I can recommend it, and all the other videos by Cem Yuksel. He is really great at presenting!

kbelder

People who did it: We did it this way.

People who didn't do it: You couldn't have done it that way.