The Trinary Dream Endures
39 comments
October 19, 2025
mikewarot
Transistors generally have their lowest static power dissipation when they are either fully on or fully off. The analog middle is great if you're trying to process continuous values, but then you're forced to use a bias current to hold them in the middle, which is fine if that's the nature of the circuit.
A chip with billions of transistors can't reasonably work if most of them are in the analog mode; it'll just melt to slag unless you have an amazing cooling system.
Also consider that there is only one threshold between values in a binary system. With a trinary system you would likely have to double the power supply voltage, and thus quadruple the power required, just to maintain noise margins.
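A back-of-the-envelope check of that quadrupling claim, using the standard CMOS dynamic-power relation P = C·V²·f (the capacitance and frequency below are illustrative placeholders, not figures from the comment):

```python
# Dynamic switching power of CMOS logic scales as P = C * V^2 * f,
# so doubling the supply voltage (e.g. to fit an extra logic level
# while keeping noise margins) quadruples the switching power.

def dynamic_power(c_farads, v_volts, f_hertz):
    """Classic CMOS dynamic power estimate: P = C * V^2 * f."""
    return c_farads * v_volts**2 * f_hertz

C, f = 1e-9, 1e9                       # illustrative: 1 nF switched, 1 GHz
p_binary = dynamic_power(C, 1.0, f)    # 1 V supply, one threshold
p_ternary = dynamic_power(C, 2.0, f)   # doubled supply for a second threshold

print(p_ternary / p_binary)  # quadrupled
```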
throw10920
This is a great point, and I'll extend it by claiming that there's a more general physical principle underneath: it's significantly easier to build bistable systems than tristable (or higher) systems, so much so that it makes up for the fact that you need more of them.
This is far more general than electronic systems (e.g. quantum computers follow the same principle - it's far easier to build and control qubits than qutrits/qudits).
(technically, it's even easier to build systems that have a single stable configuration, but you can't really store information in those, so they're not relevant)
foxglacier
Wouldn't you also get data loss using the linear region of transistors? The output would have some error relative to the input, and that error would propagate through the circuit, perhaps eventually drifting to on or off, where it would be stuck.
bastawhiz
Trinary is an efficient way of storing lots of -1/0/1 machine learning model weights. But as soon as you load it into memory, you need RAM that can store the same thing (or you're effectively losing the benefits: storage is cheap). So now you need trinary RAM, which, as it turns out, isn't great for doing normal general-purpose computation. Integers and floats and boolean values don't get stored efficiently in trinary unless you toss out power-of-two sized values. CPU circuitry becomes more complicated to add/subtract/multiply those values. Bitwise operators in trinary become essentially impossible for the average IQ engineer to reason about. We'd need all-new ISAs, assembly languages, compilers, languages that can run efficiently without the operations that trinary machines can't perform well, etc.
So do we have special memory and CPU instructions for trinary data that lives in a special trinary address space, separate from traditional data that lives in binary address space? No, the juice isn't worth the squeeze. There's no compelling evidence this would make anything better overall: faster, smaller, more energy efficient. Every improvement that trinary potentially offers results in having to throw babies out with the bathwater. It's fun to think about I guess, but I'd bet real money that in 50 years we're still having the same conversation about trinary.
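The -1/0/1 weight encoding described above is balanced ternary, and the storage win is easy to make concrete: 5 trits fit in one byte because 3^5 = 243 ≤ 256. A minimal packing sketch (illustrative only, not any particular model format):

```python
# Pack balanced-ternary digits (-1, 0, 1) into bytes, 5 trits per byte,
# since 3**5 = 243 <= 256. That's ~1.6 bits of information per weight
# instead of a full byte each.

def pack_trits(trits):
    """Pack a list of trits (each -1, 0, or 1) into bytes."""
    assert len(trits) % 5 == 0, "pad to a multiple of 5 for simplicity"
    out = bytearray()
    for i in range(0, len(trits), 5):
        value = 0
        for t in trits[i:i + 5]:
            value = value * 3 + (t + 1)   # map -1/0/1 -> 0/1/2
        out.append(value)
    return bytes(out)

def unpack_trits(data, n):
    """Inverse of pack_trits; n is the number of trits stored."""
    trits = []
    for byte in data:
        group = []
        for _ in range(5):
            group.append(byte % 3 - 1)    # map 0/1/2 -> -1/0/1
            byte //= 3
        trits.extend(reversed(group))
    return trits[:n]

weights = [-1, 0, 1, 1, -1, 0, 0, 1, -1, 0]
packed = pack_trits(weights)
assert unpack_trits(packed, len(weights)) == weights
print(len(packed))  # 10 trits -> 2 bytes, vs 10 bytes at one weight per byte
```

Note that doing arithmetic on weights stored this way requires unpacking first, which is exactly the RAM/ALU mismatch the comment is pointing at.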
Nevermark
Ternary is indeed an enticing, yet ultimately flawed dream.
Quaternary allows for:
True, “Yes”
False, “No”
Undetermined, “Maybe”, True or False
Contradiction, “Invalid”, True and False
Many people don’t know this, but all modern computers are quaternary, with 4 quaternit bytes. We don’t just let anyone in on that. Too much power, too much footgun jeopardy, for the unwashed masses and Python “programmers”. And the tricky thicket of web standards can’t be upgraded without introducing mayhem. But Apple’s internal-only docs reveal macOS and Swift have been fully quaternary compliant on their ARM since the M1.
On other systems you can replicate this functionality, at your own risk and effort, with two bits per. Until safe Rust support ships.
It will revolutionize computing, from the foundations up, when widely supported.
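Tongue-in-cheek or not, the four values above do correspond to a real system: Belnap's four-valued logic. A minimal sketch using exactly the "two bits per" encoding the comment mentions, where the bits record independent evidence for true and for false:

```python
# Belnap's four-valued logic, with each value as a pair
# (evidence_for_true, evidence_for_false):
#   True          -> (1, 0)
#   False         -> (0, 1)
#   Undetermined  -> (0, 0)   neither
#   Contradiction -> (1, 1)   both

T, F, U, B = (1, 0), (0, 1), (0, 0), (1, 1)

def and4(a, b):
    # true needs evidence on both sides; false needs it on either side
    return (a[0] & b[0], a[1] | b[1])

def or4(a, b):
    return (a[0] | b[0], a[1] & b[1])

def not4(a):
    # negation just swaps the two kinds of evidence
    return (a[1], a[0])

print(and4(T, B) == B, or4(U, B) == T, not4(B) == B)  # -> True True True
```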
Russell’s paradox in math is resolved. Given a set S = “The set of all sets that don’t contain themselves”, the truth value of “Is S in S” returns Contradiction, i.e. True and False. Making S a well formed, consistent entity, and achieving full set and logical completeness via full closure. So consistency is returned to Set theory and Russell’s quest for a unification of mathematics with just sets and logic becomes possible again. He would have been ecstatic. Gödel be damned!
Turing’s Incompleteness Theorem demonstrates that 2-valued bit machines are inherently inconsistent or incomplete.
Given a machine M, applied to the statement S = “M will say this statement is False”, or “M(S) = False”, it has to fail.
If M(S) returns True, we can see that S is actually False. If M(S) returns False, we can see that S is actually True.
But for a quaternary Machine M4 evaluating S4 = “M4(S4) = False”, M4(S4) returns Contradiction. Which indeed we can see S4 is.
Due to the equivalence of Undecidability limits and the Turing Halting Problem, quaternary machines are significantly more powerful and well characterized than binary machines. Far better suited for the hardest and deepest problems in computing.
nzeid
Not wrong, but I think the hope was more to have "8-trinit" bytes i.e. something with more states than a classic bit.
IndrekR
Most common quaternary storage system is probably DNA.
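As a toy illustration of that: each nucleotide carries two bits, so any byte string maps onto a base-4 A/C/G/T string. The mapping below is an arbitrary choice for illustration; real DNA-storage codecs add constraints (e.g. avoiding long homopolymer runs):

```python
# Two bits per nucleotide: A/C/G/T is a natural base-4 alphabet.
# This particular bit-pair -> base mapping is arbitrary, chosen
# only to show the encoding density.

BASE = {0b00: "A", 0b01: "C", 0b10: "G", 0b11: "T"}

def bytes_to_dna(data):
    """Encode bytes as a DNA string, 4 nucleotides per byte."""
    out = []
    for byte in data:
        for shift in (6, 4, 2, 0):       # most-significant pair first
            out.append(BASE[(byte >> shift) & 0b11])
    return "".join(out)

print(bytes_to_dna(b"Hi"))  # -> 'CAGACGGC'
```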
readthenotes1
I've liked true, false, unknown, unknowable, though I think there should be something somewhere for fnord.
bee_rider
> Trinary didn’t make any headway in the 20th century; binary’s direct mapping to the “on”/”off” states of electric current was just too effective, or seductive; but remember that electric current isn’t actually “on” or “off”. It has taken a ton of engineering to “simulate” those abstract states in real, physical circuits, especially as they have gotten smaller and smaller.
But, I think things are actually trending the other way, right? You just slam the voltage to “on” or “off” nowadays—as things get smaller, voltages get lower, and clock times get faster, it gets harder to resolve the tiny voltage differences.
Maybe you can slam to -1. OTOH, just using 2 bits instead of one... trit(?) seems easier.
Same reason the “close window” button is in the corner. Hitting a particular spot requires precision in 1 or 2 dimensions. Smacking into the boundary is easy.
hinkley
The lower voltage helps reduce leakage and capacitance in the chip as the wires get closer together.
But it does argue against more states due to the benefits of just making 1 smaller if you can and packing things closer. Though maybe we are hitting the bottom with Dennard scaling being dead. Maybe we increase pitch and double state on parts of the chip, and then generations are measured by bits per angstrom.
estimator7292
Once we invented CMOS this problem pretty much went away. You can indeed just slam the transistor open and closed.
Well, until we scaled transistors down to the point where electrons quantum tunnel across the junction. Now they're leaky again.
gyomu
> Trinary is philosophically appealing because its ground-floor vocabulary isn’t “yes” and “no”, but rather: “yes”, “no”, and “maybe”. It’s probably a bit much to imagine that this architectural difference could cascade up through the layers of abstraction and tend to produce software with subtler, richer values … yet I do imagine it.
You can just have a struct { case yes; case no; case maybe; } data structure and pepper it throughout your code wherever you think it’d lead to subtler, richer software… sure, it’s not “at the hardware level” (whatever that means given today’s hardware abstractions) but that should let you demonstrate whatever proof of utility you want to demonstrate.
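A minimal version of that struct in Python, with Kleene's strong three-valued connectives (AND as min, OR as max, NOT mirrored around maybe); the names here are illustrative:

```python
from enum import Enum

class Tri(Enum):
    """A yes/no/maybe value with Kleene's strong three-valued logic."""
    NO = 0
    MAYBE = 1
    YES = 2

    def __and__(self, other):
        return Tri(min(self.value, other.value))   # AND = min

    def __or__(self, other):
        return Tri(max(self.value, other.value))   # OR = max

    def __invert__(self):
        return Tri(2 - self.value)                 # NOT mirrors around MAYBE

# "maybe" propagates exactly where it should:
assert (Tri.YES & Tri.MAYBE) is Tri.MAYBE
assert (Tri.NO & Tri.MAYBE) is Tri.NO      # a definite NO dominates AND
assert ~Tri.MAYBE is Tri.MAYBE
```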
russdill
There are a ton of places in modern silicon where a voltage represents far more than just on or off, from the 16 levels of QLC flash to the various PAM technologies used by modern interconnects.
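PAM-4, for instance, recovers two bits from each sampled voltage. A toy slicer is sketched below; the normalized levels, thresholds, and Gray mapping are illustrative stand-ins for what a real receiver does adaptively with equalization:

```python
# Toy PAM-4 slicer: one sampled voltage carries two bits.
# Nominal levels are 0, 1/3, 2/3, 1; decision thresholds sit halfway
# between them. Gray coding means a one-level slicing error flips
# only a single bit.

GRAY = ["00", "01", "11", "10"]

def slice_pam4(v):
    """Map a normalized voltage in [0, 1] to a 2-bit symbol."""
    if v < 1/6:
        level = 0
    elif v < 1/2:
        level = 1
    elif v < 5/6:
        level = 2
    else:
        level = 3
    return GRAY[level]

samples = [0.02, 0.35, 0.64, 0.98]
print([slice_pam4(v) for v in samples])  # -> ['00', '01', '11', '10']
```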
hinkley
I’ve wondered any number of times if 4 level gates would be useful to increase cache memory in CPUs. They aren’t great for logic, but how much decoding would they need to expand an L3 cache?
DiggyJohnson
What is PAM in this context?
saxonww
Pulse amplitude modulation
DiggyJohnson
Thanks. That’s a deep rabbit hole upon initial glances to say the least
1970-01-01
Isn't quantum computing "all the aries"?
The quantum dream is also the trinary dream.
pumplekin
I've always thought we could put a bit of general purpose TCAM into general purpose computers instead of just routers and switches, and see what people can do with it.
I know (T)CAMs are used in CPUs, but I am more thinking of the kind of research being done with TCAMs in SSD-like products, so maybe we will get there some day.
hinkley
There’s a lot of tech in signaling that doesn’t end up on CPUs and I’ve often wondered why.
Some of it is ending up in power circuitry.
cyberax
TCAM still uses 2-bit binary storage internally, it just ignores one of the values.
jacobmarble
In digital circuits there’s “high”, “low”, and “high impedance”.
gblargg
There's low-impedance and high-impedance. Within low-impedance, there's high and low.
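That three-way distinction is what HDL simulators model: a wire is 0, 1, or Z (undriven), and a shared bus resolves multiple drivers. A toy resolution function in the spirit of Verilog's value set (the X "contention" value is a 4-state-simulator convention, added here for illustration):

```python
# Toy tri-state bus resolution. Z means "not driving"; if all active
# drivers agree, their value wins; disagreeing drivers are bus
# contention, which 4-state simulators report as X.

Z, X = "Z", "X"

def resolve(drivers):
    """Resolve the values driven onto one shared wire."""
    active = [d for d in drivers if d != Z]
    if not active:
        return Z                       # nobody driving: high impedance
    if all(d == active[0] for d in active):
        return active[0]               # all drivers agree
    return X                           # contention

assert resolve([Z, Z, 1]) == 1
assert resolve([Z, Z, Z]) == Z
assert resolve([0, Z, 1]) == X
```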
adamthegoalie
ChatGPT 5-Pro, What would it be like if we used trinary instead of binary computers? https://chatgpt.com/s/t_68f53bb9b15c8191b8d732f722243719
ChrisMarshallNY
I seem to remember reading about "fuzzy logic" (a now-quaint term), where a trinary state was useful.
zer00eyz
"One feature that sets certain rice cookers above the rest is “fuzzy logic,” or the ability of an onboard computer to detect how quickly the rice is cooking or to what level doneness it has reached, then make real time adjustments to time and temperature accordingly. " ... From: https://www.bonappetit.com/story/zojirushi-rice-cooker
It is a term that is still used quite a bit in marketing. I think in this case (Zojirushi) it isn't trinary, rather some probabilistic/Bayesian system to derive a boolean from a number of factors (time, temp, and so on).
I've never understood the fascination here. Apparently some expression relating the number of possible symbols to the length of a message is optimized closest to Euler's number. I don't see why the product of those things is worth optimizing for. The alphabet size that works best is dictated by the storage technology; more symbols usually means they're harder to disambiguate.
2 is the smallest number of symbols needed to encode information, and it makes symbols the easiest to disambiguate in any implementation. Good enough for me.
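For reference, the expression being half-remembered is radix economy: representing N takes about log_r(N) digits of r states each, for a cost proportional to r/ln(r), which is minimized at r = e ≈ 2.718. Among integers, base 3 narrowly wins on this (storage-agnostic) metric:

```python
import math

# Radix economy: cost(r) = r * log_r(N) = (r / ln r) * ln N.
# The N-independent factor r / ln(r) is minimized at r = e ~ 2.718,
# which is the usual argument for ternary being "optimal" on paper.

def radix_cost(r):
    return r / math.log(r)

for r in (2, 3, 4):
    print(r, round(radix_cost(r), 3))
# 2 -> 2.885, 3 -> 2.731, 4 -> 2.885: ternary is ~5% cheaper in theory,
# but as the comment says, this ignores how hard symbols are to
# disambiguate in an actual storage technology.
```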