Why the original Macintosh had a screen resolution of 512×324
130 comments
· May 27, 2025 · pavon
analog31
A lot of those old machines had clock speeds and video pixel rates that meshed together. On some color machines the system clock was an integer multiple of the standard colorburst frequency.
The Timex Sinclair did all of its computation during the blanking interval which is why it was so dog slow.
implements
There’s an interesting blog post about how a far simpler machine generates its video signal, if people are curious about the signals involved:
http://blog.tynemouthsoftware.co.uk/2023/10/how-the-zx80-gen...
“The CPU then only produces a TV picture when BASIC is waiting for input (or paused). At other times it does not bother to produce a video picture, so the CPU can run the program at full speed.”
krige
The Commodore Amigas had their 68k clock speeds differ by region due to the colour carrier frequency difference (more specifically, 2x the carrier frequency for NTSC, 1.6x for PAL, which resulted in almost the same, but not quite, clock speed).
It's interesting how the differing vertical resolutions between these two (200p/400i vs 256p/512i) also had some secondary effects on software design: it was always easy to tell if a game was made in NTSC regions or with global releases in mind, because the bottom 20% of the screen was black in PAL.
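For anyone who wants to check that, here's a quick back-of-the-envelope in Python using the commonly cited colour subcarrier frequencies (treat the exact constants as assumptions, not a hardware reference):

    # Colour subcarrier frequencies (Hz), commonly cited values
    NTSC_CARRIER = 3_579_545          # ~315/88 MHz
    PAL_CARRIER = 4_433_618.75

    ntsc_cpu = 2.0 * NTSC_CARRIER     # NTSC Amiga 68k clock
    pal_cpu = 1.6 * PAL_CARRIER       # PAL Amiga 68k clock

    print(f"NTSC: {ntsc_cpu / 1e6:.5f} MHz")  # ~7.15909 MHz
    print(f"PAL:  {pal_cpu / 1e6:.5f} MHz")   # ~7.09379 MHz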
ido
To save the curious a search: the Timex Sinclair is the American variant of the ZX Spectrum.
rwmj
The ZX Spectrum had (primitive) video hardware. The GP commenter means the ZX80 and ZX81 which used the Z80 CPU to generate the display and so really were unable to both "think" and generate the display at the same time. On the ZX81 there were two modes, SLOW mode and FAST mode. In FAST mode the Z80 CPU prioritized computations over generating the display, so the display would go fuzzy grey while programs were running, then would reappear when the program ended or it was waiting for keyboard input.
pragma_x
It's also interesting to look at other architectures at the time to get an idea of how fiendish a problem this is. At this time, Commodore, Nintendo, and some others, had dedicated silicon for video rendering. This frees the CPU from having to generate a video signal directly, using a fraction of those cycles to talk to the video subsystem instead. The major drawback with a video chip of some kind is of course cost (custom fabrication, part count), which clearly the Macintosh team was trying to keep as low as possible.
jnaina
Both of the key 8-bit contenders of yore, the Atari 8-bit series and the Commodore 64, had custom graphics chips (ANTIC and VIC-II) that "stole" cycles from the 6502 (or 6510 in the case of the C64) when they needed to access memory.
I remember writing CPU-intensive code on the Atari and using video blanking to speed up the code.
MBCook
Plus those weren’t raw bitmaps but tile based to help keep memory and bandwidth costs down.
stefan_
Displays are still bandwidth killers today; we kept scaling them up with everything else. Today you might have a 4K 30 bpp 144 Hz display, and just keeping that fed takes 33 Gbit/s purely for scanout, not even composing it.
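Rough numbers for that claim, as a sketch that ignores blanking intervals and any link encoding overhead (so it lands in the same ballpark as the figure above rather than matching it exactly):

    width, height = 3840, 2160   # "4K" UHD
    bpp = 30                     # 10 bits per colour channel
    refresh = 144                # Hz

    bits_per_second = width * height * bpp * refresh
    print(f"{bits_per_second / 1e9:.1f} Gbit/s")  # ~35.8 Gbit/s raw scanout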
danudey
I have a 4k 60Hz monitor connected to my laptop over one USB-C cable for data and power, but because of bandwidth limitations my options are 4k30 and USB 3.x support or 4k60 and USB 2.0.
I love the monitor, it's sharp and clear and almost kind of HDR a lot of the time, but the fact that it has a bunch of USB 3.0 ports that only get USB 2.0 speeds because I don't want choppy 30Hz gaming is just... weird.
wmf
Everything is amazing and nobody's happy.
tveyben
That's the exact reason I ditched my dock and connected the monitor directly to my laptop. 30 Hz is way too low, I need 60 (or maybe 50 would have been enough - I'm in the PAL part of the world ;-)
wkat4242
Should have gone for thunderbolt :)
monkeyelite
4k jumped the gun. It's just too many pixels and too many cycles. And unfortunately it was introduced when pixel shaders started doing more work.
Consequently almost nothing actually renders at 4k. It's all upscaling - or even worse, your display is wired to double up on inputs.
Once we can comfortably get 60 FPS, 1080p, 4x msaa, no upscaling, then let’s revisit this 4k idea.
justinrubek
Make it 120fps and I could agree.
account42
WTF are you talking about, 60 FPS for 4K isn't even that challenging for reasonably optimized applications. Just requires something better than a bargain bin GPU. And 120+ FPS is already the new standard for displays.
bobmcnamara
We see this in embedded systems all the time too.
It doesn't help if your crossbar memory interconnect only has static priorities.
hulitu
And marketing said, when LCDs were pushing CRTs out of the market, that you don't need to send the whole image to change a pixel on an LCD; you can change only that pixel.
p_l
except DVI is essentially VGA without the digital-to-analog part, and original HDMI is DVI with encryption, some predefined "must have" timings, and extra data stuffed into the empty spaces of a signal whose timing was designed for blasting at a CRT.
I think partial refresh capability only came with some optional extensions to DisplayPort.
nothercastle
Why did they need 60 Hz? Why not 50 like Europe? Is there some massive advantage to syncing with the AC frequency of the local power grid?
MBCook
If you’re used to seeing 60 Hz everywhere like Americans are, 50 Hz stands out like a sore thumb.
But mostly I suspect it’s just far easier.
kragen
Conventional wisdom a few years after the Macintosh was that 50Hz was annoyingly flickery. Obviously this depends on your phosphors. Maybe it was already conventional wisdom at the time?
I feel like the extra 16% of screen real estate would have been worth it.
kwertyoowiyop
A white background at 50 Hz was pretty flickery, at least to my NTSC-experienced eyes, on a TV. Maybe they could have used longer-persistence phosphors in their monitor, but then motion and animation would blur.
msephton
Speaking of phosphors, early Photoshop had a setting to pick what you were using. These adjusted RGB balance according to published or measured values. https://twitter.com/gingerbeardman/status/154371808662010265...
ajross
Exactly. Like the Apple ][, the original Mac framebuffer was set up with alternating accesses, relying on the framebuffer reads to manage DRAM refresh.
It looks like DRAM was set up on a 6-CPU-cycle period, as 512 bits (32 16-bit bus accesses) x 342 lines x 60 Hz x 6 cycles x 2 gives 7.87968 MHz, which is just slightly faster than the nominal 7.83 MHz, the remaining .6% presumably being spent during vblank.
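The arithmetic in that guess, spelled out (this just reproduces the numbers above; the 6-cycle DRAM period and the x2 interleave factor are the parent's assumptions, not documented hardware figures):

    words_per_line = 512 // 16   # 32 16-bit bus accesses per scan line
    lines = 342
    refresh = 60                 # Hz
    cycles_per_access = 6        # guessed DRAM period, in CPU cycles
    interleave = 2               # alternating video/CPU slots

    clock = words_per_line * lines * refresh * cycles_per_access * interleave
    print(f"{clock / 1e6:.5f} MHz")                       # 7.87968 MHz
    print(f"vs 7.8336 MHz: {clock / 7.8336e6 - 1:+.2%}")  # about +0.6%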
meatmanek
But why 342 and tune the clock speed down instead of keeping the clock speed at 8MHz and having floor(8e6/2/6/60/32) = 347 lines?
I suspect kmill is right: https://news.ycombinator.com/item?id=44110611 -- 512x342 is very close to 3:2 aspect ratio, whereas 347 would give you an awkward 1.476:1 aspect ratio.
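Same arithmetic run the other way, keeping a hypothetical clean 8 MHz clock fixed and solving for lines (the 2/6/32 factors are carried over from the parent's guess):

    clock = 8e6    # hypothetical "clean" 8 MHz CPU clock
    lines_at_8mhz = int(clock / 2 / 6 / 60 / 32)
    print(lines_at_8mhz)         # 347
    print(f"{512 / 347:.3f}")    # ~1.476:1
    print(f"{512 / 342:.3f}")    # ~1.497:1, i.e. close to 3:2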
kragen
You could reduce the gain on the horizontal deflection drive coil by 2% to get back to 3:2. In fact, I doubt that it was precise to within 2%.
ajross
That doesn't sound right. The tube the mac was displaying on was much closer to a TV-style 4:3 ratio anyway, there were significant blank spaces at the top and bottom.
If I was placing bets, it was another hardware limitation. Maybe 342 put them right at some particular DRAM timing limit for the chips they were signing contracts for. Or maybe more likely, the ~21.5 kHz scan rate was a hard limit from the tube supplier (that was much faster than TVs could do) and they had a firm 60 Hz requirement from Jobs or whoever.
bobmcnamara
It's like dual porting but twin half duplex!
johnklos
The title is incorrect, because b&w Macs have 512×342 resolution, not 512x324.
It wouldn't've been too crazy had Apple gone with 64K x 4 chips, so they'd've just needed four of them to get 128 KB at a full 16 bits wide.
512x342 was 16.7% of 128 KB of memory, as opposed to 18.75% with 512x384. Not much of a difference. But having square pixels is nice.
jerbear4328
It looks like it's just the HN submitted title which is wrong (currently "Why the Original Macintosh Had a Screen Resolution of 512×324"). The article's title is "Why the Original Macintosh Had a Screen Resolution of 512×342", and "324" doesn't appear anywhere on the page.
bscphil
Looks like someone is reading Hacker News comments and editing the page - archive.org captured the page probably mid-edit, and it says "324" in one place: https://web.archive.org/web/20250527202300/https://512pixels...
ChuckMcM
Oh that's priceless. Real time HN feedback loops.
90s_dev
> wouldn't've
Really, John? You really had to make me parse that word?
webstrand
It's a great word, I use it all the time.
kragen
You shouldn't've tho. Who'd've complained if you hadn't've?
kevin_thibedeau
It usually isn't transcribed with Klingon orthography.
90s_dev
I bet you also work for the IRS don't you
brookst
Your version of shouldn’t’ve’s punctuation isn’t like that?
kstrauser
Who’d’ve thought?
JKCalhoun
Worth adding? The (almost [1]) omnipresent menu bar ate 20 pixels of vertical space as well, so you could say the application had 322 usable rows.
[1] To be sure, many games hide the menu bar.
jhallenworld
CRTs are very forgiving: 512x342 vs 512x384 would have made very little difference. You could still get square pixels by minor adjustments to vertical and horizontal size.
My question is: what are the htotal and vtotal times in pixels and lines? Maybe there was a hardware saving in having vtotal exactly equal to 384 (which is 128 times 3). Perhaps they saved one bit in a counter, which may have resulted in one fewer TTL chip.
bane
The answer is something that's harder and harder to do these days with all the layers of abstraction -- set a performance target and use arithmetic to arrive at specifications you can hit while still achieving your performance goal.
It's a bit of work, but I suspect you can arithmetic your way through the problem. Supposing they wanted 60 Hz on the display and a framebuffer, you need 196,608 bits / 24,576 bytes / 24 KB [below] on a 1-bit display at 512x384.
The Mac 128k shipped with a Motorola 68k at 7.8336 MHz, giving it 130,560 cycles per frame @ 60 fps.
IIRC the word length of the 68k is 32 bits, so imagining a scenario where the screen was plotted in words, at something like 20 cycles per fetch [1], you can get about 6528 fetches per frame. At 32 bits a fetch, you need 6144 or so fetches from memory to fill the screen. You need a moment for horizontal refresh so you lose time waiting for that, thus 6528-6144 = (drumroll) 384, the number of horizontal lines on the display.
I'm obviously hitting the wavetops here, and missing lots of details. But my point is that it's calculable with enough information, which is how engineers of yore used to spec things out.
1 - https://wiki.neogeodev.org/index.php?title=68k_instructions_...
below - why bits? The original Mac used a 1-bit display, meaning each pixel used 1 bit to set it as either on or off. Because it didn't need 3 subpixels to produce color, the display was tighter and sharper than color displays, and even at the lower resolution appeared somewhat paperlike. The article is correct that the DPI was around 72. Another way to think about it: what the Mac was targeting was pre-press desktop publishing. Many printing houses could print at around 150-200 lines per inch. Houses with very good equipment could hit 300 or more. Different measures, but the Mac, being positioned as a WYSIWYG tool, did a good job of approximating the analog printing equipment of the time. (source: grew up in a family printing business)
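The wavetop arithmetic above as a runnable sketch (the 20-cycles-per-fetch figure is the rough number from [1], not a measured value):

    width, height = 512, 384        # the hypothetical 4:3 variant being discussed
    bits = width * height           # 1 bit per pixel
    print(bits, bits // 8, bits // 8 // 1024)   # 196608 bits, 24576 bytes, 24 KB

    cpu_hz = 7_833_600              # 7.8336 MHz
    cycles_per_frame = cpu_hz // 60             # 130,560
    cycles_per_fetch = 20                       # rough figure from [1]
    fetches_available = cycles_per_frame // cycles_per_fetch  # 6528
    fetches_needed = bits // 32                 # 32-bit words to fill the screen: 6144
    print(fetches_available - fetches_needed)   # 384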
p_l
The Motorola 68000 used had 16 data lines and 24 address lines, so it took at least two bus cycles just to transfer a full CPU word (disregarding timings on address latches etc.).
AFAIK some of the graphics code used fancy multi-register copies to increase cycle efficiency.
As for screen, IIRC making it easy to correlate "what's on screen" and "what's on paper" was major part of what drove Mac to be nearly synonymous with DTP for years.
wmf
In typography there are 72 points per inch so they made 1 pixel = 1 point.
simne
Impressed to see how many people read the whole article and not just the one phrase: "We don’t need a lot of the things that other personal computers have, so let’s optimize a few areas and make sure the software is designed around them".
The Mac was not a cheap machine, and Apple at the time was not rich enough to build anything unnecessary - they really needed to make a hit this time, and they succeeded.
And yes, it is true they were limited by bandwidth; it is also true they were limited by the speed of the semi-32-bit CPU.
But the Mac was a real step ahead at the moment, and had significant room to grow when new technology arrived. That is what I think the PCs of that time lacked.
hyperhello
The article really didn’t explain why they picked that number.
kmill
I don't know, but I can do some numerology: a 3:2 aspect ratio that's 512 pixels wide would need 341 and a third lines, so round up and you get 512 by 342.
The later 384 number corresponds to an exact 4:3 aspect ratio.
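The numerology in two lines of Python (nothing deeper than rounding going on here):

    print(512 / 1.5)      # 341.333... -> round up to 342 for roughly 3:2
    print(512 * 3 / 4)    # 384.0      -> exactly 4:3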
bryanlarsen
For efficient graphics routines on a 32-bit machine, it's important that the resolution along the scan line direction (aka horizontal for normally mounted CRTs) be a multiple of 32, preferably a power of 2.
The article mentions the desire for square pixels. So presumably they chose the horizontal resolution first and then chose the vertical resolution that gave them square pixels for a 512 pixel horizontal resolution.
nssnsjsjsjs
It was 32bit?!
mayoff
The data and address registers of the 68000 were 32 bits wide.
kzrdude
That reminds me of this old system settings panel https://lowendmac.com/2015/32-bit-addressing-on-older-macs/
I remember the "enable 32-bit addressing" part (but it's not pictured..)
tom_
The 68000 is 16 bit internally, and can access memory only 16 bits at a time, but the instruction set was designed with future iterations in mind, and most instructions can operate on 32 bit quantities - with a performance penalty. (Because in essence it has to do the work in 2 stages.)
Whether this is enough to make it count as actually 32 bits is one for the philosophers.
edwinjm
The article says: In short, there’s no easy answer to explain why early compact Macs ran at a screen resolution of 512×342. Rather, Apple was doing what it does best: designing a product with the right trade-offs for performance, ease of use, and cost.
detourdog
It was noticeably better than anything else I had ever seen.
badc0ffee
It doesn't say exactly why 512x342 was chosen. But I'm more interested in why it was changed to 512x384 on later Macs. Is it just to fill the full 4:3 screen?
Beyond that, this article really wants to tell you how amazing that resolution was in 1984. Never mind that you could get an IBM XT clone with "budget" 720x348 monochrome Hercules graphics that year and earlier.
fredoralive
The 512x384 models are Macintosh LC adjacent, so the original LC monitor (the LC itself can do 640x480), or the Colour Classics. AFAIK it was partly in order to make the LC work better with the Apple IIe card (although the IIe software uses a 560x384 mode).
A Hercules card, whilst nice, does suffer from the same non-square-pixel issue as the Lisa, so it's not as nice for creating a GUI.
badc0ffee
> although the IIe software uses a 560x384 mode
Nice, that's line doubled from the //e's 560x192 and would probably look crisp.
rasz
Both MDA and Hercules were 50 Hz. The real mid-eighties king of cheap crisp displays would be the 12 inch 640x400@71Hz Atari SM124 monitor. You could buy an Atari ST + SM124 + Atari SLM804 laser printer + the Calamus DTP package for the price of just the Apple laser printer alone :)
badc0ffee
I had an XT clone + Hercules at the time (and SIMCGA for games), and the 50 Hz refresh wasn't as bad as you'd think - the MDA CRTs were designed with slow-decay phosphors to reduce flicker.
I actually had no idea that Atari made a laser printer. Everyone I knew with a ST (admittedly, not many people) was either doing MIDI or playing video games.
phendrenad2
I always assumed it was a compromise between memory usage, refresh speed, and the GUI that they wanted. Don't forget that the Macintosh was preceded by the Lisa (800x364) and the IIGS (640x200), so they probably had a good sense for what was comfortable given a certain resolution.
dragonwriter
> Don't forget that the Macintosh was preceded by the Lisa (800x364) and the IIGS (640x200),
Lisa was January 1983
Macintosh was January 1984
Apple IIgs was September 1986
sgerenser
The IIGS came out well after the original Macintosh: https://en.wikipedia.org/wiki/Apple_IIGS
wkat4242
The Lisa was also about twice as expensive as the Macintosh which is why it failed hard. So the price limited the hardware and that caused this display bandwidth constraint.
kristianp
The folklore link they reference: https://www.folklore.org/Five_Different_Macs.html
The 1st edition of MacWorld; notably, the first page is an advert for Microsoft's products: the Multiplan spreadsheet, Word, etc. https://archive.org/details/MacWorld_8404_April_1984_premier...
The original floppy used on the mac was a single-sided 400KB disk. I imagine that was another set of trade-offs. https://folklore.org/Disk_Swappers_Elbow.html
kalleboo
> The original floppy used on the mac was a single-sided 400KB disk. I imagine that was another set of trade-offs
Originally they planned on using custom 870K drives, but they were too unreliable so at the last minute they switched to the Sony 400K 3.5" disks
kazinator
> but given the name of this website, it was pretty embarrassing.
Why, the name of the website is 512pixels.net not 342pixels.net; he nailed the 512 dimension. :)
dtgriscom
I remember in the early '80s using a computer (a Xerox Star, perhaps?) that used the CPU to generate the display. To speed up CPU-intensive tasks, you could blank the screen.
p_l
Alto had its entire display control in microcode, IIRC.
Out of similar tricks, the Symbolics 3600 (at least the first model) had major portions of the disk driver implemented as one of the tasks in microcode (yes, the microcode was a multi-tasking system with preemption). I don't know how much of the MFM wrangling was going on there, but ultimately it meant that reading or writing a page from/to disk was done by means of a single high-level instruction.
Reason077
> “To minimize CRT flicker, Apple worked to achieve a vertical refresh rate of 60 Hz”
… a limitation that many Macs, and even some iPhones, are still stuck with over 40 years later!
perching_aix
It's always surprising for me to see people regard 60 Hz CRT as "flicker-free", or "minimal flicker", etc. Whenever I saw a CRT running at 60 Hz, I'd immediately be able to tell. Always used at minimum 75 Hz but preferably 85 Hz at home (early 2000s, Windows).
bluGill
Have you ever seen something running at 30 Hz? Or even 15? The difference in flicker between 30 and 60 is much, much larger than the difference between 60 and 120! Yeah, 60 isn't flicker-free, any finite number is not (there are probably quantum limits), but realistically you reach a point where you can't really tell. For most purposes 60 Hz is close enough, though you can still tell.
perching_aix
I don't remember frankly. For what it's worth, TV sets would always be 50 Hz here (PAL) (unless they did some tomfoolery I'm not aware of and ran at 100 Hz "in secret" or something) and evidently I could watch those on end without too many holdups for years and years, so clearly it wasn't a dealbreaker. But on monitors, yeah, I just wouldn't tolerate it, whereas 85 Hz felt perfect (no discernible flicker for me that I'd recall).
pezezin
I have recently been playing with CRTs again, and something that I have noticed is that for fast-paced games running at 60 or 70 Hz* I don't notice the flicker much, but for text anything less than 85 Hz is headache inducing. Luckily the monitor I got can do 1024x768 at 100 Hz :)
* The original VGA and thus most MS-DOS games ran at 70 Hz.
p_l
I remember when I got my first computer for myself, instead of sharing with others, it was "obvious requirement" that the screen runs at least 72Hz, preferably higher. Which was why 15" CRT had to run at 800x600.
Later on, and with graphic card that had more than 2MB of RAM, I remember experimenting a lot with modelines to pull higher refresh rates and higher resolution on the 17" CRT I inherited when my father switched to a laptop :)
kragen
On a green ZnS:Cu phosphor, even 20Hz is minimal flicker.
wkat4242
Me too. I'm also really sensitive to PWM. I tried using 85Hz on my VGA monitor but the higher signal bandwidth and cheap hardware made the video noticeably blurrier. 70 wasn't a great compromise either.
Since TFTs came along I was bothered a lot less by it because of the lack of flicker (though some cheap 4-bit TN LCDs still had it with some colours).
npunt
Monochrome CRT phosphors like P4 (zinc sulfide w silver) have longer persistence than ones used in color CRTs, so flicker is less noticeable.
Suppafly
>Whenever I saw a CRT running at 60 Hz, I'd be immediately be able to tell. Always used at minimum 75 Hz but preferably 85 Hz at home (early 2000s, Windows).
Same, I remember installing some program that would let you quickly change the display settings on basically every computer I ever interacted with. It was especially bad if the crt was in a room with fluorescent lighting.
bluGill
If your lighting and display flicker at a mathematical ratio you will notice, unless the frequency is extremely high. 1:1 is most likely because it is easy to sync lights and the CRT to the AC line frequency, which is 60 Hz in the US (50 Hz in Europe). 1:2 (used to be somewhat common) or 4:5 ratios would also cause issues.
Though now that I think of it, the CRT should be syncing with the signal and there is no reason that sync needs to be related to the AC line, but it does anyway (all the computers I know of generate their own sync from a crystal, I have no idea where TV stations get their sync but I doubt AC line frequency).
hollerith
But there is less need because LCDs do not flicker (except some designed for videogames that strobe the backlight for some strange reason IIUC).
I know I found the flicker of CRTs annoying even at 60 Hz.
kragen
Strobing the backlight seems like it would allow you to not illuminate the new frame of video until the liquid crystals have finished rotating, so you only have to contend with the persistence of vision on your retina instead of additionally the persistence of the liquid crystals.
hollerith
My "for some strange reason" was the wrong choice of words. I don't wish to imply that I disapprove of the reason that gaming monitors do it, just that I haven't done the work to try to understand.
johnb231
MacBook Pro is 120 Hz
The article didn't nail down an exact reason. Here is my guess. The quote from Andy Hertzfeld suggests the limiting factor was the memory bandwidth not the memory volume:
> The most important decision was admitting that the software would never fit into 64K of memory and going with a full 16-bit memory bus, requiring 16 RAM chips instead of 8. The extra memory bandwidth allowed him to double the display resolution, going to dimensions of 512 by 342 instead of 384 by 256
If you look at the specs for the machine, you see that during an active scan line, the video is using exactly half of the available memory bandwidth, with the CPU able to use the other half (during horizontal and vertical blanking periods the CPU can use the entire memory bandwidth)[1]. That dictated the scanline duration.
If the computer had any more scan lines, something would have had to give, as every nanosecond was already accounted for[2]. The refresh rate would have to be lower, or the blanking periods would have had to be shorter, or the memory bandwidth would have to be higher, or the memory bandwidth would have had to be divided unevenly between the CPU and video, which was probably harder to implement. I don't know which of those things they would have been able to adjust and which were hard requirements of the hardware they could find, but I'm guessing that they couldn't do 384 scan lines given the memory bandwidth of the RAM chips and the blanking times of the CRT they selected, if they wanted to hit 60 Hz.
[1]https://archive.org/details/Guide_to_the_Macintosh_Family_Ha...
[2]https://archive.org/details/Guide_to_the_Macintosh_Family_Ha...
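A sketch of that timing argument using the commonly cited figures for the 128K Mac (15.6672 MHz dot clock = 2x the CPU clock, 704 pixel times per line, 370 lines per frame; treat those constants as assumptions pulled from the Guide linked above rather than something re-derived here):

    dot_clock = 15_667_200                 # Hz; 2x the 7.8336 MHz CPU clock
    cpu_clock = dot_clock / 2

    pixels_per_line = 704                  # 512 visible + 192 blanking pixel times
    lines_per_frame = 370                  # 342 visible + 28 blanking lines

    line_rate = dot_clock / pixels_per_line      # ~22.25 kHz
    frame_rate = line_rate / lines_per_frame     # ~60.15 Hz
    print(f"{line_rate / 1e3:.2f} kHz, {frame_rate:.2f} Hz")

    # During the visible part of a line, video fetches one 16-bit word per
    # 16 pixels, i.e. one fetch every 8 CPU clocks. A 68000 bus cycle takes
    # 4 clocks, so video and the CPU alternate slots: half the bandwidth each.
    fetches_per_line = 512 // 16                        # 32 word fetches per visible line
    cpu_clocks_per_fetch = 16 * cpu_clock / dot_clock   # 8.0
    print(fetches_per_line, cpu_clocks_per_fetch)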