
Blue95: a desktop for your childhood home's computer room

metadat

This looks nice and easy to use.

My hypothesis is that today's "modern" OS user interfaces are objectively worse from a usability perspective, obfuscating key functionality behind layers of confusing menus.

It reminds me of these "OS popularity since the 70s" time lapse views:

https://youtube.com/watch?v=cTKhqtll5cQ

The dominance of Windows is crazy. Even today, Mac desktops and laptops are comparatively niche.

hi_hi

As a kid, the OSes supported me in learning. They were simple, intuitive and rewarding. I'd click around and explore, and discover cool things like a Weezer music video, or engaging puzzle games.

There was no one who could help me when I got stuck, beyond maybe an instruction manual. I just had to figure it out, mostly by trial and error. I learned so much, eventually being able to replace hardware, install and upgrade drivers, re-install the entire OS and partition the hard drive, figure out networking and filesystems. It built confidence.

Now my kid sits in front of an OS (Windows, Mac, it doesn't really matter) and there's so much noise. Things popping up, demanding attention. Scary-looking warnings. So much choice. There are so many ways to do simple things. Actions buried deep within menus. They have no hope of building up a mental model or understanding how the OS connects them to the foundations of computing.

Even I'm mostly lost now if there's a problem. I need to search the internet, find a useful source, filter out the things that are similar to my problem but not the same. It isn't rewarding any more, it's frustrating. How is a young child meant to navigate that by themselves?

This looks like a step in the right direction. I look forward to testing it out.

accrual

> Things popping up

This is one of my biggest frustrations with modern GUI computing. It's especially bad with Windows and Office, but it happens on iOS and macOS too to an extent. Even though I've had Office installed for weeks I still get a "look over here at this new button!" pop-up while I'm in the middle of some Excel task. Pop-up here, pop-up there. It's insane the number of little bubbles and pop-ups and noise we experience in modern computing.

askvictor

Even on GNOME, I regularly have applications stealing focus when they decide they're the most important thing. As well as being really annoying, it's a security risk if an application steals focus while you're typing your password or OTP key.

overgard

Apple has kind of made things worse in the recent macOS, where my phone's notifications show up on the desktop now. Like, man, I was already drowning in them before anyway, I don't want them on two screens now.

mrob

The worst example of things popping up I've seen is Youtube's sponsored content warnings. These immediately appear above the active click target (video thumbnail) when you move your mouse over it, hijacking the expected action there. If you click things as a single action like every skilled mouse user does (rather than pointing and then clicking as two separate actions), it's physically impossible to react in time to avoid clicking them. And because only a part of the click target gets hijacked it's too inconsistent to learn to avoid it by intuition. I've clicked them accidentally several times and every time it's been disturbing because it feels like some serious and unexpected error.

fragmede

Omfg I do not need national political news shoved in my face on the left side of my taskbar while I'm trying to focus on work, thank you Microsoft, k thx bye!

Andrex

Humans hate being bored but only dick around and learn things like this when they're bored. Speaking personally, I guess.

Since the 90s we've found "better" ways of "curing" our boredom. Put this UI on a modern OS in front of a kid today and they would just download Steam, Chrome and Discord. And be assured, they're very proficient at the ins and outs of those platforms.

Just some random thoughts I had, not sure any of it tracks...

jon_richards

I used to think I watched TV or scrolled Reddit because I didn’t have the energy to pursue more interesting things. I blocked Reddit and TV. Turns out I have plenty of energy, those were just stealing it from me.

_carbyau_

I think part of it is that ubiquitous internet access.

Take that away so you have a standalone computer that you "run programs on" and it becomes simpler.

whatevertrevor

Yeah I agree. The trial and error mentioned in GP needed a good amount of focused time commitment, the sort of thing at a premium in the modern attention economy of tech.

girvo

> I learned so much, eventually being able to replace hardware

As a young teenager in the early-mid 2000s, I learned the hard way what the little standoffs are for by killing a motherboard by screwing it directly into the steel case :')

Never made that mistake again, that's for sure. And I share all the same experiences as yourself

xandrius

There should be a club for people like us: we learnt the hard way to double-check and never ever fully trust ourselves, especially with hardware connections.

The first mobo I ever purchased with my own money was insta-fried exactly like that; it still hurts a little to think about it.

interludead

I did the exact same thing - mounted a shiny new board straight onto the case, powered it on, and… nothing. Spent hours troubleshooting before realizing I'd basically shorted the whole thing

brulard

I share your nostalgia. I did learn a lot by exploring Amiga OS and later Windows 98 without anyone to help or an internet connection. It was fun, but we had time to spend back then. Today time feels much more scarce and I no longer appreciate having to learn by trial and error. Now it feels like you have to keep up with tech progress, which is crazy fast. And today's youth have too many more appealing things to do, like social media, YouTube, loads of games, etc. For them, exploring an OS or some piece of software is not attractive anymore.

underlipton

There're also performance issues. Building muscle memory (which means offloading tasks from working memory, leaving it open for learning) can't happen if you're constantly trying to figure out when the system is going to actually respond to your input.

titzer

We largely abandoned an unbelievably efficient form of human input in favor of big fat dumb slow touchscreens. Can you imagine where we'd be as a species if we had gotten our shit together 25 years ago, standardized a few of the most important keyboard shortcuts and layouts, and made that the default everywhere? I won't advocate terminal-only, but even classical GUIs with windows and icons and all could have been much more efficient if keyboard input and navigation were given priority instead of the comedy of using a pixel-precise indirect pointer to fart around a virtual screen and select some button when... there are buttons under my fingers.

crims0n

This is so true. The most frustrating thing in the world to me is waiting for the UI to catch up to my actions… that should just never happen in 2025. Not only is it frustrating to wait, but as you elegantly stated it forces the menial task to enter working memory.

hulitu

> Actions buried deep within menus.

Maybe they are taught in CS school that every layer of abstraction is better. I don't see any other explanation for this level of stupidity.

mananaysiempre

> This looks nice

These kinds of things almost always give me an uncanny-valley feeling. Here I'm looking at the screenshot and can’t help noticing that the taskbar buttons are too close to the taskbar’s edge, the window titles are too narrow, the folders are too yellow, and so on and so forth. (To its credit, Wine is the one exception that is not susceptible to this, even when configured to use a higher DPI value so the proportions aren’t actually the ones I’m used to.) I’m not so much criticizing the theme’s authors as wondering why this is so universal across the many replicas.

mouse_

Computing is largely a cargo cult thing these days.

The problem is that the interfaces these bootleg skins draw "inspiration" from were designed on the back of millions of pre-inflationary dollars' R&D from only the best at Golden-Age IBM, Microsoft, Apple, etc.. BeOS, OS/2, Windows 95-2000 do not look the way they do because it looks good, they look the way they do because it works good, countless man hours went into ensuring that. Simply designing an interface that looks similar is not going to bring back the engineering prowess of those Old Masters.

mananaysiempre

I’m less inclined to attribute it to “these days”, as I remember the contemporary copycat themes in e.g. KDE and Tk looking off as well. Even Swing with the native look-and-feel didn’t quite look or feel right, IIRC.

As a (weak) counterpoint to supplicating ourselves to the old UI masters, I submit Raymond Chen’s observations from 2004[1] that the flat/3D/flat cycle is largely fashion, e.g. how the toolbars in Office 97 (and subsequent “coolbars”) had buttons that did not look like buttons until you hovered over them, in defiance of the Windows 95 UI standard. (Despite Chen’s characteristic confident tone, he doesn’t at all acknowledge the influence of the limited palettes of baseline graphics adapters on the pre-Win95 “flat” origins of that cycle.)

Also worth noting are the scathing critiques of some Windows 95 designs[2,3] in the Interface Hall of Shame (2000). I don’t necessarily agree with all of them (having spent the earlier part of my childhood with Norton Commander, the separate folder/file selectors in Windows 3.x felt contrived to me even at the time) but it helps clear up some of the fog of “it has always been this way” and remember some things that fit badly at first and never felt quite right (e.g. the faux clipboard in file management). And yes, it didn’t fail to mention the Office 97 UI, either[4,5]. (Did you realize Access, VB, Word, and IE used something like three or four different forks of the same UI toolkit, “Forms3”, among them—a toolkit that looked mostly native but was in fact unavailable outside of Microsoft?..)

None of that is meant to disagree with the point that submitting to the idea of UI as branding is where it all went wrong. (I’ll never get tired of mentioning that the futuristic UI of the in-game computers of the original Deus Ex, from 2000, supported not only Tab to go between controls and Enter and Esc to submit and dismiss, but also Alt accelerators, complete with underlined letters in the labels.)

[1] https://devblogs.microsoft.com/oldnewthing/20040728-00/?p=38...

[2] http://hallofshame.gp.co.at/file95.htm

[3] http://hallofshame.gp.co.at/explore.htm

[4] http://hallofshame.gp.co.at/visual.html#VISUAL36

[5] http://hallofshame.gp.co.at/visual.html#VISUAL38

charcircuit

>they look the way they do because it works good

In modern times telemetry can show how well new designs work. The industry never forgot how to measure and do user research for UI changes. We've only gotten better at it.

goosedragons

It can look better. This is basically a distro with Chicago95 out of the box and not well configured. If you take the time it can look more like 95. The Chicago95 screenshots IMO look better:

https://github.com/grassmunk/Chicago95

int_19h

Fonts make the biggest difference here. Tahoma would also be decent (if not quite right).

Tade0

To be fair, at least the title bar height was configurable, and I recall at least one original Windows theme taking advantage of that.

zestyping

Ouch. That screenshot is uncomfortable to look at. The window title bars are painfully narrow, the frame borders have inconsistent thicknesses, the Start menu overlaps the taskbar, the vertical centering of text is wrong.

The answer to your question is that these replicas are of low quality. This one looks like the whole thing was made by someone (or a committee of people) lacking attention to detail.

bowlofhummus

The text is the worst. The icons are nice and pixelly, but the fonts are baby-butt-smooth and anti-aliased.

j45

Someone who has spent a lot of time using the original would likely notice these things in a replica.

Still, it is hopefully a nice introduction for some.

voidfunc

I got in an argument with an accessibility engineer about this recently...

The whole UI as branding thing has utterly killed usability.

burnte

> The whole UI as branding thing has utterly killed usability.

This is caused by a change in who is hired as UI/UX developers. In days past it was HCI experts and engineers; now it's graphic designers. "Pretty" is the order of the day, not "useful". "There are too many menu items" is now answered with "So let's hide them" when it used to be "How can we organize them in the UI in a simple, discoverable manner?" But then that "overflow" menu (really? Needed menu commands are now OVERFLOW?) gets crowded, so they start just removing features so the UI stays nice.

girvo

Having worked with amazing HCI experts over the years, you've hit the nail on the head. It's wild how much design is done for design's sake at my work, with nary a nod to HCI given. The a11y team try to patch over it as best they can, but we end up with a mess, and I'm treated like a pariah for pushing back on some of it.

ivan_gammel

>This is caused by a change in who is hired as UI/UX developers.

„UX/UI developers“ is a strange name for it.

In the 2000s the web enabled more sophisticated presentation designs and there was a push from client-server to web-based applications using incredibly strange technologies for building UIs — HTML, CSS and JavaScript, which gave rise to UX design as an interdisciplinary job (HCI plus digital graphic design). By 2010 the internet of applications had kicked off, and in the mid-2010s it moved to mobile, dramatically increasing the demand for UX designers. By then it actually mattered more who was hiring designers, not who was hired. Since only a relatively small fraction of hiring managers understands the scope of this job even now, they even started calling it „UX/UI designers“ or „Product designers“ as if that name change could help, still judging design work by often-fake screenshots on Behance rather than by case studies in the portfolio. Even HCI professionals are often reduced to mere graphic designers by managers who skip research and apply „taste“ to a science-based discipline. At the same time, since UX design is one of the best-paid and least stressful creative jobs, a lot of people switched to it without proper education or experience, having no idea what statistical significance is or how to design a survey. And voilà, we are here.

hyperbrainer

It's interesting, especially because companies today seem to pour tens of millions into "accessibility", but I never see plain usability, in the sense of simple, easy-to-do-what-I-want UX, fall into the same category.

cenamus

Even just simple UX testing with people that have never seen or used your software seems to be a lost art.

rafram

One is required by law and/or contract terms, the other is just nice to have.

cosmic_cheese

It’s a completely predictable result if you think about it.

Old style UI was developed with the findings of countless man-hours of UX research performed by field experts, while branded UI is typically whipped together purely based on trends and vibes in an evening by a visual designer who’s probably never performed an ounce of serious research or user trials. It’s natural that the latter is only going to be good at the most superficial of purposes. UI as branding is the McMansion of UX.

bri3d

I think it’s worse from a time wasting standpoint, really - a lot of modern UX does have thousands of hours of UX research dumped into it, but with faulty metrics driven goal seeking and internal politics bolted on. I agree that Vibe Branding killed UX in the way you describe in the 2000s (remember when every company had some abominable Flash site?!), but now, we’ve come full circle: from the ashes we’ve allowed warring factions of UX researchers to return to create hundreds of carefully constructed disparate systems with no consistency.

Lorkki

It's also repeating what the hellscape of inconsistent skinned UIs did in the late 90s and early 2000s. People are looking back at those times with a rather selective memory.

Gormo

The themed UIs of that era were very superficial -- if they applied to serious software at all, they were just a cosmetic layer on top of an otherwise well-engineered interface, and could be easily disabled. Most people I knew, for example, disabled the theming engine that shipped with Windows XP. Most applications that supported UI skinning still had a default or fallback UI that adhered well enough to modern conventions.

Not so much anymore. The abandonment of any coherent organizing principle to UI layout in favor of pure aesthetics has been a massive regression. Reasonably complex software often doesn't even include menu bars anymore, for example.

xandrius

People forget having to use IE with 12 toolbars when going over to some friend's house.

pwg

"It is difficult to get a man to understand something, when his salary depends on his not understanding it.”

https://www.goodreads.com/quotes/21810-it-is-difficult-to-ge...

WarOnPrivacy

> The whole UI as branding thing has utterly killed usability.

Imagine if Active Desktop had taken over.

I eventually came up with a not-awful use for AD but that was a few years after it went away.

anthk

It did, in several ways, ever since Windows 98 SE, when Explorer and IE were merged.

leonidasv

When I used XFCE as my daily driver, I once tried installing Chicago95 just for nostalgia and it stuck as my daily driver for almost a year! The UI is less distracting than modern UIs and there's something to it that makes it easier to just know which window is open over which window that's lacking in modern UIs (I think it's the over-reliance on soft shadows and the borderless windows).

Eventually, I stopped using it because: 1- it was always annoying to send a screenshot to someone and have to explain that no, I wasn't using Windows 95, and why; 2- the grey-ish look of everything started to bother me over time; 3- I wanted a more integrated desktop experience and moved to KDE Plasma. Still, I configured my Plasma to work like old Windows: window titles on the taskbar, zero to no animations, etc.

caseyy

I also dailied it on XFCE. The UI was very utilitarian and purposeful. I suppose aesthetically it is unimpressive and not streamlined, but it serves the purpose of being a good interface to do a task.

Same as you say, people have asked me a lot about it and even asked me if I could set it up for them. The theme is evangelizing Linux a little bit, and that is interesting. In the right hands, these UI principles could convert many people to some product.

P.S. You can now change the grey-ish look with Win95-style theming support. I've not used it, but here's more info: https://github.com/grassmunk/Chicago95/blob/master/Plus/READ...

keyringlight

>and there's something to it that makes it easier to just know which window is open over which window that's lacking in modern UIs (I think it's the over-reliance on soft shadows and the borderless windows).

I think this started with Vista. I remember watching a video criticizing the new love of glass effects on UI chrome, as it got rid of or minimized the color/shading difference between focused and unfocused windows. The example the video used was six Notepad windows, asking you to pick which one was focused; the main cue you had to look for was that the window with focus had its close button colored red.

Thin borders and minimalist/hiding scrollbars are another thing that annoys me; give me something graphical for my gaze to grasp.

fc417fc802

This entire comment section has me repeatedly thinking to myself "you don't run into that problem with i3-alikes" over and over. My choice not to put up with modern UX bullshit is feeling strongly reaffirmed right now. Not that it needed to be.

> it was always annoying to send an screenshot to someone and have to explain that no, I wasn't using Windows 95

That's not a negative, that's a fringe benefit as an endless source of entertainment.

breadwinner

Agree that "modern" OS user interfaces are objectively worse from a usability perspective. That's thanks to Flat UI, mostly.

In my opinion, nothing beats the 35-year-old NeXTSTEP interface (which W95 is a weak imitation of):

https://www.gnustep.org/carousel/PC_1300x650.png

esafak

Microsoft Windows programs hid functionality under layers of menus and the registry. MacOS, at least, surfaces much less functionality, because it offers sensible defaults. I never had to do anything akin to fiddling with the Windows Registry.

I did like some Windows things, though, like the ribbon, and reconfigurable UIs. Today's UIs are more immutable, for the worse.

zamadatix

I'd agree macOS surfaces much less functionality but I feel like it's more "because they don't want you to feel like there is a choice to make in the first place" rather than "because the defaults are ideal for everyone". Over time it feels like "layers of menus" have definitely made their way into Apple's software anyways.

The replacement for the registry seems to be half "magic CLI incantations for settings which can't be found in the GUI for some reason" and half "here's a $4.99 app to 3-finger-tap to close tabs".

p_l

And the defaults system is just registry by another name

cosmic_cheese

It’s not a 1:1 mapping, but much power user functionality in macOS is designed to progressively reveal itself as the user becomes more technically capable, a type of design known as progressive disclosure. This allows newbies to not feel overwhelmed while also allowing power users to feel at home.

The problem is that way too many people approach macOS with the Windows way of doing things firmly planted in their minds as “correct”, which interferes with this process. For example, over the years I’ve encountered numerous posters complaining about how macOS can’t do X thing, after which I point out that X thing is right there as an easy to find top level menu item, but the poster in question never bothered to take a look around and just assumed the functionality didn’t exist since it wasn’t surfaced the same way as under Windows or KDE or whatever they were coming from.

Of course there are things macOS just doesn’t do, but there’s plenty that it does if users are willing to set their preconceptions aside for a moment.

exiguus

If you approach macOS the Linux or BSD way, it feels like Windows PowerShell. Of course you can use brew and the like, set up your dev environments, etc. But when it comes to system settings, it's bad, very bad. Also, stuff like Docker and k8s suffers in both performance and usability.

int_19h

One thing that I always hated about macOS is the menu bar placement.

Ironically, in the long run, it has proven to be an asset for the simple reason that any macOS app has to have a main menu with commands in it if it doesn't want to look silly. So this whole modern trend of replacing everything with "hamburger" menus that don't have the functionality isn't killing UX quite so badly there.

Although some apps - Electron ones, especially - stick a few token options there, and then the rest still has to be pixel-hunted in the window. Some don't even put "Settings" where it's supposed to be (under the application menu). Ugh.

SlackSabbath

As somebody who recently had to switch to Mac for work, my experience has been the exact opposite of this. Every other OS I've used since Windows 95 I've been able to get to grips with the same way: start off using the mouse to find my way around the UI, and introduce keyboard shortcuts as and when I find them useful. Eventually I get to the point of being able to use either exclusively keyboard or exclusively mouse for most tasks.

MacOS seems to _require_ some unergonomic combination of both from the get go. Some basic things are easy with the keyboard but hard/impossible with the mouse and vice versa. The Finder app doesn't even have a button to go 'up' a directory for god's sake.

diggan

> I never had to do anything akin to fiddling with the Windows Registry

If I recall correctly, when I got my first MacBook, I had to edit plist files or something similar in order to do basic things like permanently showing hidden files, showing the full path in Finder, showing file extensions for all file types, and increasing the animation speed so the computer didn't feel slow as molasses.

Maybe these things are now easier to configure via GUI on macOS?
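
For what it's worth, the Terminal incantations usually passed around for the Finder-related ones look roughly like this (domain and key names from memory, so treat them as assumptions; they may differ between macOS versions):

    # Show hidden files in Finder
    defaults write com.apple.finder AppleShowAllFiles -bool true

    # Show the full POSIX path in the Finder window title
    defaults write com.apple.finder _FXShowPosixPathInTitle -bool true

    # Show file extensions for all file types
    defaults write NSGlobalDomain AppleShowAllExtensions -bool true

    # Relaunch Finder so the changes take effect
    killall Finder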

baq

One thing that is weird is that you're expected to look around the menu bar while holding the Option key, as the menu contents change when it is pressed (this also applies to tray icon menus; e.g. the WiFi icon shows a lot of extra detail when Option-clicked). IIRC some of what you mention can be toggled with Option menu items.

cosmic_cheese

Toggling hidden file visibility in Finder and open/save dialogs has been doable with the key shortcut Command-Shift-. for quite some time now.

bbqfog

MacOS is pretty cursed. The equivalent to registry fiddling is doing anything in ~/Library/Application Support

It still has "Services" as a holdover from NeXT that is completely broken and unused (but still present in every app for some reason). Now you also have the joy of diving deep into Settings every time an app needs some sort of permission.

I'd say something about .DS_Store files, but that's not really UI.

wpm

I'd much rather work with plain-text human readable property list files with straightforward `defaults` commands than the hive of Hell called the registry.
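
A quick illustration of that readability point (the Finder domain here is just an example, not anything special):

    # Dump an entire preference domain as readable key/value pairs
    defaults read com.apple.finder

    # Or pretty-print the backing plist file directly
    plutil -p ~/Library/Preferences/com.apple.finder.plist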

anthk

NeXTStep/GNUStep/Cocoa 'defaults' commands.

burnte

> I never had to do anything akin to fiddling with the Windows Registry.

I don't believe you. You have never, EVER, NOT ONCE run a terminal command to change an option on MacOS? I just refuse to believe anyone on HN hasn't altered preferences in the terminal on MacOS.

esafak

Not with anything near the frequency of Windows. The last such thing I remember doing is restarting the locate indexing service with launchctl. I do lots of things in the command line, of course, but not so much to configure MacOS itself.

brandon272

There is an entire ecosystem of free and paid Mac apps meant to augment the Mac experience because MacOS does not provide functionality or configuration needed for a sensible computing experience out of the box.

exiguus

I think the main difference between macOS and Windows is that Windows allows third-party drivers and macOS does not. Drivers also mean hardware, so on Windows you can build your own PC, same as with Linux.

This is Apple's secret of success, IMO. No third-party drivers and hardware means it will just work, and no one will blame you for stuff third parties messed up.

But it's also like: there is only a red and a blue t-shirt. Choose. No gray, no white, no yellow, no prints.

exiguus

Selecting the appropriate tool for the task at hand is crucial, in my opinion. However, I believe the choice is often influenced by companies mandating the use of Microsoft and Mac systems due to cost and maintenance considerations, rather than allowing employees to choose between Mac, Windows, or Linux based on their preferences. Proprietary software that only runs on Mac or Windows was never a real argument, because you can just stream remote desktop apps via RDP or use the browser.

happyopossum

> My hypothesis is today's "modern" OS user interfaces are objectively worse from a usability perspective, obfuscating key functionality behind layers of confusing menus.

I think if you went back and actually tried to use these old UIs you would realize that one of the reasons stuff isn't hidden behind layers of menus is that in a lot of cases the 'hidden' features just didn't exist back then.

epolanski

Mobile too. I'm sick of those OSes updating every 18 months because some product person, along with marketing, decides the wheel has to be reinvented or there will be no buzz.

I have a harder and harder time navigating both iOS and Android as time goes on; it should be the opposite.

Same for Windows or MacOs.

haswell

Lately I've been strongly considering helping migrate my parents to Linux. Their needs are primarily web-based with some basic productivity tools mixed in, and Windows has just been getting more and more hostile. On top of this, they're at an age where they're now more susceptible than ever to various scams/attacks, and shutting down an entire category of problems by removing Windows from the picture is increasingly attractive.

I had forgotten that Chicago95 exists, but this might be exactly the right thing. They'd immediately find it familiar, and while the theme isn't the whole story, this would go a long way in easing the transition I think.

I miss this era of computing.

ianmcgowan

A chromebox mounted behind the monitor did the trick for me. Haven't had an emergency wipe/reinstall in years. Also, a tablet with keyboard takes some of the pressure off having a "computer" and you can go with iOS or Android depending on what phone they use.

haswell

I've been evaluating a Lenovo ThinkCentre m920q tiny I picked up for not very much money on eBay (the m720q models are even cheaper) and they seem like perfect machines for the task.

My parents use some tools and hardware that require a full OS so the tablet route isn't an option, but I'm starting to really like the idea of deploying a couple of these micro PCs.

mixmastamyk

Out of the frying pan, into the fire. Exposing your parents to total surveillance (from one corp to another) is not what I'd characterize as safe or friendly. Linux is fine these days if the hardware is supported, and you can use an immutable distro if extra reliability is warranted.

hattmall

Strong disagree. I mean, tracking is what it is, but it's happening regardless, and if they are using Chrome to browse and Google services they are being tracked anyway.

ChromeOS seems to work really well though, and it is dead simple and intuitive. It used to be incredibly awesome with Crouton, but that's mostly dead. Crostini is acceptable though. I would absolutely recommend ageing people get a Chrome device for security and simplicity.

Plus, running Android apps on the desktop gives even more software options than any other desktop for most consumers.

fc417fc802

You also can't properly back up the system unless you install a custom ROM and take on the associated maintenance burden. (At least on Android. I'm not sure to what extent these things are left up to the whims of the developer on iOS.)

arcmechanica

My parents run Linux because mom likes coupon websites and I can't repair the thing every week.

veqq

What are coupon websites? Can you get free coupons from them?

thoughtpalette

retailmenot, etc. There's some pretty dubious ones that come up in search results, e.g. try searching "levis.com coupon codes" or something.

txdv

I installed Ubuntu for my mother; she just needs to download and read PDFs, look at images, and use Gmail. Sometimes she opens a document with LibreOffice, but no power usage.

It seems to work, and the maintenance is now super easy: SSH in, update. Something wrong and she needs support? I SSH in, open a tunnel, and connect to her desktop via Remmina to explain.

I had a situation once where Ubuntu would no longer boot into the desktop environment, but all I did was update and upgrade the packages and it started working again.
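
For anyone curious, a minimal sketch of that tunnel setup, assuming a VNC server is listening on her machine on the default port (hostname, user and port are placeholders):

    # Forward her VNC port to the same port locally, over SSH
    ssh -L 5900:localhost:5900 mom@her-machine

    # Then, while the SSH session stays open, point Remmina
    # (or any VNC client) at localhost:5900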

dpflug

IMHO, Fedora's Atomic Desktops[^1] are the way to go for that. Automatic upgrades you can roll back if something breaks? Yes, please.

Universal Blue[^2] has some spins that got a glow up, but their dev team gives a bit of the "everything old is bad" vibe.

OpenSUSE's MicroOS[^3] desktops aren't ready for nontechnical people, but their atomic upgrade strategy is much faster and simpler (btrfs snapshots). I'm keeping an eye on it.

^1: https://fedoraproject.org/atomic-desktops/

^2: https://universal-blue.org

^3: https://microos.opensuse.org

haswell

Good call on Fedora's Atomic options.

My daily driver is NixOS and part of me really wants that level of predictability and rollback for them. For a brief period, I had started thinking through what it might look like to remotely manage this for them. But my ultimate goal is to help them achieve autonomy, and only step in when necessary.

cryptoegorophy

Why not just iPad?

haswell

I may have understated their needs somewhat. Most of what they do is browsing and document editing, but a few key use cases make a real computer necessary (or at least highly desirable):

- Document scanning

- Label printing (my mom buys/sells stuff on eBay)

- My dad still works and writes proposals/manages invoices/does complex taxes

At a minimum, they need a full desktop environment. Most of these things have decent 1:1 Linux alternatives, but one or two might necessitate a single-purpose Windows VM when all else fails.

Two pretty decent used micro PCs will also cost less than a single iPad.

doright

I like themes like this. The only thing that hampers the authenticity for me, and this isn't the fault of the author really, is the super high resolution fonts compared to what was available back then. There's just something charming about low resolution fonts that are readable enough on screen, probably nostalgia.

I think any type of pixel font authentic to a couple decades ago won't look good on a 4K monitor, unfortunately. It got to the point where I ordered a 1024x768 monitor just to play old games with a period system.

jeroenhd

Pixel fonts don't accurately represent 90s UIs, because we don't use CRTs anymore. The poor souls buying the very first terrible flat-screen monitors may have used computers like that, but most of that era was experienced on smudgy, edge-blurring CRTs.

You could probably create a CRT-filter-based font for high resolution screens (though you'd probably still need to optimise for subpixel layout for accuracy, even on 4k monitors).

Gormo

Most people vastly overstate the effect that CRT displays had on the appearance of legacy software.

Yes, very early on, when people used TVs or cheap composite monitors as the display devices for their computers, there were blurry pixel edges, bloom effects, dot crawl, color artifacting, and all the rest.

But by the '90s, we had high-quality monitors designed for high-resolution graphics with fast refresh rates, with crisp pixel boundaries and minimal artifacting. CRT filters overcompensate for this a lot, and end up making SVGA-era graphics anachronistically look like they're being displayed on composite monitors.

zozbot234

CRT monitors did not have "crisp pixel boundaries". A CRT pixel is a Gaussian-blurred dot, not a "crisp" square as it is on modern displays. What "high-quality" CRT monitors did have was higher resolutions, even as high as 1600x1200, where individual pixels are basically not distinguishable.

dfox

Another issue with modern recreations of old UIs is that the dimensions are usually subtly wrong, which for me ruins the feeling. Some of that is related to the fonts having different heights, but in many cases something is just one pixel off and looks wrong. For the 95-style UI the common issue is the control borders (especially the highlight side of "3D" controls), of which there are plenty of examples in the screenshot.

wobfan

I actually think of this less as a look back into the past and more, hopefully, as a real alternative to the current DEs, which obviously then needs to have high-res fonts. That would be nice.

selfhoster11

I wouldn't say that's so "obvious". I for one would prefer the original pixel fonts, but size adjusted to fit my screen density. By hand, if required.

zozbot234

If we're talking Windows 9x, the "original fonts" could also be TrueType, hence arbitrarily resized. Yes, the original Windows 95 included a pixel font for the UI but then TrueType fonts like Verdana and Tahoma were added soon after that and were commonly used.

selfhoster11

For 4K monitors, why not just pixel-double? Integer scaling will solve many issues introduced by pixel fonts.

doright

You're right in that there's nothing stopping one from doing so (I even use an integer scaler for old games on my main computer), it's just a tradeoff between "doing what's possible" and "having the most authentic experience one can".

If we're talking about the subjective experience of recreating "a child's bedroom computer" from the mid 90s-early 00s, a widescreen aspect ratio alone would be jarring, since my conception of a monitor for such a system is a 4:3 CRT. So for me, little else would reach that level except a system with the same aspect ratio and a similar DPI.

Not only that, but UI design itself has undergone many shifts since that era to account for the types of monitors those UIs are being designed for. There's not as much of a need for pixel-perfect design when vector-based web UIs dominate the desktop application space nowadays, relegating those that go back to older UI paradigms to enthusiasts who still remember earlier times. Or maybe people who develop for fantasy consoles.

I should mention while I'm at it that those sort of faux-pixel art shaders used in some games come off as quite jarring to me since I expect UIs to be meticulously laid out for the original screen size, not just for blowing up arbitrary content 2x or 4x on a huge widescreen monitor. I sometimes feel those are meant to represent a nostalgic feeling of some kind, being pixelated and all, but really it just makes me wish there were some alternate reality in which people still designed games and desktop applications for 800x600 or 1024x768 monitors again.

It's interesting at present how there's stuff like 4K and then there's the playdate with a relatively tiny handheld resolution, but relatively little interest for new content for those types of resolutions in-between.

Gormo

> If we're talking about the subjective experience of recreating "a child's bedroom computer" from the mid 90s-early 00s

Is that what this project is going for? I understood it to be attempting to apply design elements from that era to create a superior UI for a modern "child's bedroom computer".

creatonez

libpango's removal of bitmapped fonts in 2019 did serious harm to retro theming.

interludead

I love that you went all-in with a 1024x768 monitor

MarkusWandel

Three modern desktop environments that I use:

- Windows 10/11. Especially in 11, it's easiest just to type the start of an app's name into the search box. As opposed to the two clicks it takes to get to the "traditional" menu where you still have to scroll to find it.

- Gnome (only on fresh Linux installs, usually replaced with Mate pretty soon). Has a smartphone-style app grid, but here, too, it's quickest just to type the start of the app's name.

- Mate: Modern, but still has the Windows 95 paradigm (easy enough to collapse the two toolbars into just one bottom one). Still my favourite desktop environment.

Not all fancy graphic stuff is good. And don't even get me started on how hard it is to drag an app window to another screen these days - on Windows. You really have to find the 2% or so of the top bar that's still draggable and not cluttered up by other stuff.

teamonkey

How do you get Windows to launch an installed app after you type the first few letters, instead of searching the web with Edge and Bing?

Mogzol

I used winaero tweaker [1] to disable web search, the search is infinitely better now.

You can do the same tweak by editing the registry [2] if you don't want to download an app for it (though the app includes a lot of additional useful tweaks).

[1] https://winaerotweaker.com/

[2] https://www.tomshardware.com/how-to/disable-windows-web-sear...
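
If memory serves, the registry flavor of that tweak is the Explorer policy value below. I'm going from memory rather than the linked article, so treat the exact key as an assumption, and note that a sign-out or Explorer restart is needed afterwards:

    :: Disable web/Bing suggestions in Start menu search
    reg add "HKCU\Software\Policies\Microsoft\Windows\Explorer" /v DisableSearchBoxSuggestions /t REG_DWORD /d 1 /f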

hn92726819

You can download OpenShell and it will replace the start menu with the one from whatever era you want (XP, 7, 8, I think 10 too). It's open source as well.

MarkusWandel

Type into the search box and when the app icon comes up, click on it.

arcmechanica

"I searched bing for you and here's the link to Word on the web"

wavemode

The decline in usability and organization of the Windows start menu over the years has been frankly staggering.

Whenever I see screenshots of the old menu I get pangs of nostalgia.

MarkusWandel

I think they drank the Macintosh Kool-Aid and expect you to have all your commonly used apps pinned. In Win11 this even looks sort of like a Mac dock. It still made no sense to ruin the usability of the start menu they invented.

arcmechanica

That's what happens when you are no longer the darling: PMs need to ship something new to get noticed, so they screw it all up.

leptons

I use two free programs to return my computer usability to Windows 10 days.

https://open-shell.github.io/Open-Shell-Menu/

https://github.com/valinet/ExplorerPatcher

Without these I would probably give up on computers and go live under a bridge.

MarkusWandel

At least you get the right-click menu that has a lot of handy stuff in the old format.

interludead

Sometimes I wonder if the people designing this stuff ever actually use dual displays day to day

emidln

This looks neat. I remember the various fvwm95 and icewm themes doing a similar number in the late 90s and early 2000s.

It would be fun to pair this with Gambas[0], a free VB6 clone that works with GTK.

[0] https://gambaswiki.org/website/en/main.html

ThinkingGuy

qvwm was another window manager that sought to emulate the look and feel of Windows:

https://qvwm.sourceforge.net/index_en.html

bsnnkv

This still remains the absolute pinnacle of cohesive desktop environment design in my book.

InsideOutSanta

I think the desktop operating systems of that era were at a sweet spot. They were technically advanced enough to render good-looking, crisp color user interfaces. However, most people were still novices at using computers, so OS designers consciously designed their operating systems to be as clear as possible. Applications tended to be written for each individual platform and to follow its UI guidelines.

Windows 95, NT, System 7 and System 8, BeOS, and NeXTSTEP all had really clear UX. You always knew where to drag a window, what was clickable, where to find settings, etc.

cosmic_cheese

An aspect of System 7/Mac OS 8/9 that I find criminally underrated is how flexible it is.

For those versions, a good bulk of the “system” isn’t part of the system proper but instead implemented by way of extensions and control panels loaded at startup. The OS itself is extremely minimal, basically just enough to provide a barebones desktop and act as a substrate for apps to run on. Everything else, including “core” functionality like audio and networking, was implemented in an extension.

This meant that you could pare back the OS to be extremely lean and only have the exact functionality you personally needed at that precise moment and nothing else, and doing so didn’t require you to recompile a kernel or anything like that — just create an extension set that only loaded what you needed. This was excellent for use cases like games and emulators where you wanted every last ounce of resources to go to those, and nice for single purpose machines too (no point in loading game controller support on a box that only ever runs Photoshop and Illustrator).

Of course the way it was implemented is awful by modern standards, but the concept is golden and I think there should be OS projects trying to replicate it.

InsideOutSanta

I remember creating different extension sets using the built-in Extension Manager and the third-party tool Conflict Catcher. I had sets for gaming, video editing, and normal usage. It was a simple matter of selecting the correct set and rebooting. Or you could hit shift on startup and start into a minimal system without any extensions.

There's a good reason the third-party extension manager was called "Conflict Catcher," but the power and flexibility such a system grants users is unmatched.

vanschelven

> However, most people were still novices at using computers

It has (to my surprise, initially) been my experience that "kids these days" are more novice at (desktop) computer-usage than the people of the 90s

stonogo

"I've come up with a set of rules that describe our reactions to technologies: 1. Anything that is in the world when you’re born is normal and ordinary and is just a natural part of the way the world works. 2. Anything that's invented between when you’re fifteen and thirty-five is new and exciting and revolutionary and you can probably get a career in it. 3. Anything invented after you're thirty-five is against the natural order of things." -- Douglas Adams

mfro

Is it really necessary to spin up an entirely new distro for an XFCE+GTK theme?

grayhatter

> Stop spending time on things I don't care about

It's ok for people to waste time building stuff they think is cool. Did it need to be a distro? No, but it also didn't need to exist at all. I'm glad it exists though; I think it really whips the llama's ass!

charcircuit

This attitude is why Linux based operating systems have such poor market share on the desktop. Opportunity costs are real. Friction is real. You don't see Windows creating a new OS for a single theme. You don't see macOS do it either.

keyringlight

I see it as one of the consequences of freedom, but perhaps also a gap in packaging where they can't bundle up their changes in a form to be applied onto another base.

That 'base' is one issue I've been thinking about with linux, I have similar concerns about the cost of everyone being able to make their own distro for their own slight variation on something else. It's not that I think it's a bad thing to pathfind in new areas, but the replication in building/supporting it all, getting users to pick between 4 similar variants of the same thing, and accounting for "you're using KustomLinux, which is 2 steps removed from CommonLinux" and all the little differences between them. It's an interesting contrast against standardization, but I can't help wondering how it would change the approachability of linux if the starting point was limited to one of the big distros and then variants are layered on top of that.

grayhatter

The reason Linux has poor desktop market share has nothing to do with a fun themed distro someone created as a side project.

> You don't see Windows creating a new OS for a single theme. You don't see macOS do it either.

I'd also consider the behavior of both Windows and OSX to be a warning to avoid, and not an example to emulate.

But the line doesn't always have to go up with every breath everyone takes. It's ok to do stuff just because it's fun. Not every single action needs to increase market share.

Gormo

Why would the Linux ecosystem -- a diverse community of lots of different individuals and organizations all pursuing their own particular goals -- be singularly concerned with increasing desktop market share of Linux as a whole, and all pursue that in the exact same way?

immibis

You don't see Windows themes at all.

corank

I wouldn't call it an entirely new distro. It's just a Fedora image bundled with the necessary changes to create the UX. It doesn't provide its own software repositories. It's more like an unofficial Fedora Spin.

mfro

I see; it still seems like the kind of project that would be much better suited to a DE package-group style release. I think very few people will want to reinstall their OS just to try it.

haunter

A better modern middle ground imo is the KD3 continuation project https://www.trinitydesktop.org/

haunter

*KDE3

bitbasher

It's not complete until you have a comet cursor and several IE toolbars that were somehow installed.

hybrid_study

Or Microsoft Bob somewhere

OsrsNeedsf2P

Does this project offer anything besides Chicago95's UI pre-installed?

fsiefken

It would be nice if it had Wine installed so it could run Windows apps where there is no good Linux alternative: XYplorer, Sumatra, IrfanView.

Perhaps also a shell where root is mapped to C:\

benrutter

I love the niche of enthusiasm that exists for the Windows 95 UI. It's not an original point that aside from nostalgia it's a really clear and usable design; but that leads me to wonder, are there any modern UIs/themes/etc that are inspired by (rather than necessarily directly mimicking) Windows 95?

Would be interesting to see what a modern version of Windows 95 would look like, or what general design lessons can be learned from its niceties.

vardump

My childhood home's computer said 38911 BYTES FREE.

jhbadger

And mine just said ]▩ as it waited for you to type an Applesoft command. It is always weird when people say something is from "your childhood" as opposed to theirs. I remember the 1990s, sure, but I was already an adult.

WarOnPrivacy

Computers from my childhood wouldn't fit in my bedroom but I did bring punchcards home.

probably_wrong

Mine said C:\>, because I was cool enough to have a 34MB hard drive.

ipcress_file

Was that a C64? My VIC-20 had about a tenth of that!

myself248

SYS 64738