Reversible computing with mechanical links and pivots

Animats

See Drexler's mechanical nanotechnology from 1989.[1]

There's a minimum size at which such mechanisms will work, and it's bigger than transistors. This won't scale down to single atoms, according to chemists.

[1] http://www.nanoindustries.com/nanojbl/NanoConProc/nanocon2.h...

kragen

It seems like you've misremembered the situation somewhat.

Merkle developed several of his families of mechanical logic, including this one, in order to answer some criticisms of Drexler's earliest mechanical nanotechnology proposals. Specifically:

1. Chemists were concerned that rod logic knobs touching each other would form chemical bonds and remain stuck together, rather than disengaging for the next clock cycle. (Macroscopic metal parts usually don't work this way, though "cold welding" is a thing, especially in space.) So this proposal, like earlier ones such as Merkle's buckling-spring logic, avoids any contact between unconnected parts of the mechanism, whether sliding or coming into and out of contact.

2. Someone calculated the power density of one of Drexler's early proposals and found that it exceeded the power density of high explosives during detonation, which obviously poses significant challenges for mechanism durability. You could just run them many orders of magnitude slower, but Merkle tackled the issue instead by designing reversible logic families which can dissipate arbitrarily little power per logic operation, only dissipating energy to erase stored bits.

So, there's nothing preventing this kind of mechanism from scaling down to single atoms, and we already have working mechanisms like the atomic force microscope which demonstrate that even intermittent single-atom contact can work mechanically in just the way you'd expect it to from your macroscopic intuition. Moreover, the de Broglie wavelength of a baryon is enormously shorter than the de Broglie wavelength of an electron, so in fact mechanical logic (which works by moving around baryons) can scale down further than electronic logic, which is already running into Heisenberg problems with current semiconductor fabrication technology.
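To put rough numbers on that (a back-of-the-envelope sketch in Python; taking the thermal energy kT at 300 K as the kinetic energy scale is my assumption, just to make the comparison concrete):

    import math

    h = 6.626e-34    # Planck's constant, J*s
    k = 1.381e-23    # Boltzmann's constant, J/K
    m_e = 9.109e-31  # electron mass, kg
    m_p = 1.673e-27  # proton mass, kg

    def de_broglie(m, E):
        # de Broglie wavelength of a particle with mass m and kinetic energy E
        return h / math.sqrt(2 * m * E)

    E = k * 300.0  # thermal kinetic energy scale at room temperature
    print(de_broglie(m_e, E))  # electron: ~7.6e-09 m, several nanometers
    print(de_broglie(m_p, E))  # proton:   ~1.8e-10 m, about a bond length

The ratio is sqrt(m_p/m_e) ≈ 43 per nucleon, which is where the nucleus's much smaller positional uncertainty comes from.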

Also, by the way, thanks to the work for which Boyer and Walker got part of the 01997 Nobel Prize in Chemistry, we probably know how ATP synthase works now, and it seems to work in a fairly similar way: https://www.youtube.com/watch?v=kXpzp4RDGJI

floxy

>mechanical logic (which works by moving around baryons) can scale down further than electronic logic, which is already running into Heisenberg problems with current semiconductor fabrication technology.

I think I must be missing something here; I thought this was working with atoms. Are you saying that someday mechanical logic could be made to work inside the nucleus? It seems like you'd be limited to ~200 nucleons per atom, and then you'd have to transmit whatever data you computed out of the nucleus to the nucleus in the next atom over. Or are we talking about converting neutron stars into computing devices? Do you have a good source for further reading?

kragen

No, no, not at all! That kind of thing is very speculative, and I don't think anybody knows very much about it. What I'm saying is that the position of a nucleus is very, very much more precisely measurable than the position of an electron, so it has a much weaker tendency to tunnel to places you don't want it to be, causing computation errors. That allows you to store more bits in a given volume, and possibly do more computation in a given volume, if the entropy production mechanisms can be tamed.

We routinely force electrons to tunnel through about ten nanometers of silicon dioxide to write to Flash memory (Fowler–Nordheim tunneling) using only on the order of 10–20 volts. That's about 60 atoms' worth of glass, and the position of each of those atoms is nailed down to only a tiny fraction of its bond length. So you can see that the positional uncertainty of the electrons is three or four orders of magnitude larger than the positional uncertainty of the atomic nuclei.
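(Checking that arithmetic, assuming a Si-O bond length of roughly 0.16 nm:)

    oxide = 10e-9        # m, order-of-magnitude tunnel oxide thickness in Flash
    bond = 0.16e-9       # m, approximate Si-O bond length
    print(oxide / bond)  # ~60 bond lengths of glass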

zozbot234

The interesting question is how much energy is lost to mechanical friction for a single logic operation, and how this compares to static leakage losses in electronic circuits. It should also be noted that mechanical logic may turn out to be quite useful for specialized purposes as part of ordinary electronic devices, such as using nano-relay switches for power gating or as a kind of non-volatile memory.

kragen

That's one of many interesting questions, and avoiding it is precisely why Merkle designed these reversible logic families with no sliding contact: no mechanical friction is involved. There are still potentially other kinds of losses, though.

gene-h

And why wouldn't it work? Linear-slide-like mechanisms consisting of a silver surface and a single molecule have been demonstrated [0]. The molecule only moved along rows of the silver surface, and it was demonstrated to stay in one of these grooves for up to 150 nm, a huge distance at this scale.

[0] https://www.osti.gov/servlets/purl/1767839

kragen

It can work (see my sibling comment) but it's tricky. The experiment you link was done under ultra-high vacuum and at low temperatures (below 7 K), using a quite exotic molecule which is, as I understand it, covered in halogens to combat the "sticky fingers" problem.

gradschool

You seem to be knowledgeable about this topic. The reversible component designs in the article appear to presuppose a clock signal without much else said about it. I get that someone might be able to prototype an individual gate, but is the implementation of a practical clock distribution network at molecular scales reasonable to take for granted?

gsf_emergency

Not entirely... terminal Br atoms were also required to keep the molecule on the silver tracks.

7373737373

I'd love to see a watch manufacturer try to build a watch-sized purely mechanical computer

kragen

That's clearly feasible; the mechanical complexity of a mechanical computer is on the order of a Curta calculator, and I outlined some promising approaches to macroscopic mechanical digital logic 15 years ago in https://dercuano.github.io/notes/mechanical-computers.html. Since then MEMS has advanced significantly and gone mainstream, and photolithography and reactive-ion-etching-based silicon fabrication have been put to other uses, including watchmaking: macroscopic silicon flexure components went first into the Zenith Oscillator developed by TAG Heuer's Guy Sémon for the Zenith Defy Lab https://www.hodinkee.com/articles/zenith-defy-lab-oscillator... and then into mainstream watches:

https://wornandwound.com/no-escapement-an-overview-of-obtain...

https://monochrome-watches.com/in-depth-the-future-of-silico...

https://www.chrono24.com/magazine/innovative-escapements-fro... (warning, GDPR mugging)

https://www.azom.com/article.aspx?ArticleID=21921

https://www.europastar.com/the-watch-files/those-who-innovat...

ahartmetz

Actually kinda impressive that a current CPU is "only" 9 orders of magnitude from the ridiculously low theoretical minimum energy needed per "floating point operation" (kinda fuzzy, but who's counting at 9 orders of magnitude?). The efficiency difference between the first computers and SOTA CPUs also seems to be in the ballpark of 9 orders of magnitude.
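Rough numbers behind the 9-orders-of-magnitude figure (a sketch; the ~1e-12 J per floating point operation for current hardware is an assumed ballpark, not a measured spec):

    import math

    k, T = 1.381e-23, 300.0            # Boltzmann's constant, room temperature
    landauer = k * T * math.log(2)     # minimum energy to erase one bit
    per_flop = 1e-12                   # assumed J/FLOP for a current chip

    print(landauer)                         # ~2.9e-21 J
    print(math.log10(per_flop / landauer))  # ~8.5, i.e. roughly 9 orders of magnitude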

qoez

For things like machine learning, I wonder how much extra performance could be squeezed out by simply working with continuous values at the analog level instead of encoding them as bits through a big indirect network of NANDs.

tails4e

This is something that has been tried: basically, constructing an analog matrix-multiply/dot-product unit, which gives reasonable power efficiency at int8-ish levels of precision. Pushing for more precision, the required analog accuracy leads to dramatic power-efficiency losses (each extra bit costs about 2x the power), so int8 is probably the sweet spot. The main issues are that it is pretty inflexible and costly to design versus a digital int8 MAC array, hard to port to newer nodes, etc.
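To make that scaling concrete (a toy model, taking the each-extra-bit-doubles-the-power rule of thumb above at face value):

    for bits in (4, 8, 12, 16):
        relative_power = 2.0 ** bits  # each bit of analog precision ~doubles power
        print(bits, "bits:", relative_power, "x relative power")

Sixteen-bit analog precision would cost ~256x the power of int8, which is the precision wall described above.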

hermitShell

I have wondered this and occasionally seen some related news.

Transistors can do more than switch on and off; there is also a linear region of operation where the gate voltage allows a proportional current to flow.

So you would be constructing an analog computer. Perhaps in operation it would resemble a meat computer (brain) a little more, since the activation potential of a neuron is an analog signal from another neuron. (I think? A weak activation might trigger half the outputs of a neuron, and a strong activation might trigger all of them.)

I don't think we know how to construct such a computer, or how it would perform set computations. The weights in the neural net would become something like capacitances at the gates of transistors. Computation would, I suppose, just be inference, or thinking?

Maybe with the help of LLM tools we will be able to design such things. So far as I know there is nothing like an analog FPGA where you program the weights instead of whatever you do to an FPGA… making or breaking connections and telling LUTs their identity

rcxdude

It's possible, but analog multiplication is hard and small analog circuits tend to be very noisy. I think there is a startup working on making an accelerator chip that is based on this principle, though.

grumbelbart

There are optical accelerators on the market that - I believe - do that already, such as https://qant.com/photonic-computing/

Calwestjobs

TLC, QLC, and MLC cells in SSDs are exactly this, so it's already in use, and it shows you the limits of current technology.

ziddoap

>TLC, QLC, MLC

For those unaware of these acronyms (me):

TLC = Triple-Level Cell

QLC = Quad-Level Cell

MLC = Multi-Level Cell
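For context, n bits per cell means the cell has to resolve 2^n distinct charge levels (a quick illustration, not a vendor spec):

    for name, bits in (("MLC", 2), ("TLC", 3), ("QLC", 4)):
        print(name, ":", bits, "bits/cell ->", 2 ** bits, "voltage levels")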

thrance

You lose a lot of stability. Each operation's result is slightly off, and the errors accumulate and compound. For deep learning in particular, many operations are carried out in sequence, and the error rate can become unacceptable.

Legend2440

Deep learning is actually very tolerant to imprecision, which is why it is typically given as an application for analog computing.

It is already common practice to deliberately inject noise into the network (dropout) at rates up to 50% in order to prevent overfitting.
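For reference, dropout in its usual inverted form is just this (a minimal NumPy sketch, using the 50% rate mentioned above):

    import numpy as np

    def dropout(x, p=0.5, training=True):
        # zero each activation with probability p during training,
        # rescaling survivors so the expected activation is unchanged
        if not training:
            return x
        mask = (np.random.rand(*x.shape) >= p).astype(x.dtype)
        return x * mask / (1.0 - p)

    print(dropout(np.ones((2, 4))))  # ~half the entries zeroed, the rest scaled to 2.0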

red75prime

Isn't it just for inference? Also, differentiating through an analog circuit looks... interesting. Keep the inputs constant, wiggle one weight a bit, store how the output changed, go to the next weight, repeat. Is there something more efficient, I wonder?
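The procedure described is coordinate-wise finite differences (a sketch; f stands for whatever scalar loss the analog circuit evaluates, and w is a 1-D float array of weights):

    import numpy as np

    def finite_difference_grad(f, w, eps=1e-3):
        # estimate df/dw by wiggling one weight at a time
        grad = np.zeros_like(w)
        for i in range(len(w)):
            w_plus, w_minus = w.copy(), w.copy()
            w_plus[i] += eps
            w_minus[i] -= eps
            grad[i] = (f(w_plus) - f(w_minus)) / (2 * eps)
        return grad

Each gradient estimate costs two circuit evaluations per weight; schemes like SPSA, which perturb all weights simultaneously with random signs, get a (noisy) estimate from a constant number of evaluations.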

PendulumSegment

This is very interesting because, according to one of the authors of the mechanical computing paper (personal communication), they never dynamically simulated the mechanisms; the analysis was purely kinematic. So this web-browser simulation is new work. Reversibility might disappear once dynamics are modelled.

mitthrowaway2

Indeed. The web simulation clearly applies damping, which is an irreversible element. A truly reversible process should probably be built around minimally-damped oscillating elements, so that the stored energy never needs to dissipate.
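To put numbers on that (a sketch using the standard Q-factor bookkeeping for a damped spring-mass oscillator; the damping values are arbitrary):

    import math

    # For m*x'' + c*x' + k*x = 0, the stored energy decays by exp(-2*pi/Q)
    # per cycle, with Q = sqrt(m*k)/c, so reversibility needs Q -> infinity.
    m, k = 1.0, 1.0
    for c in (0.1, 0.01, 0.001):
        Q = math.sqrt(m * k) / c
        loss = 1 - math.exp(-2 * math.pi / Q)
        print("Q =", Q, "->", round(100 * loss, 1), "% of stored energy lost per cycle")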

PendulumSegment

Even if damping is removed they might not be reversible. Logic gates that were individually reversible were found to have difficulty operating when connected in a circuit: https://core.ac.uk/download/pdf/603242297.pdf

mikepfrank

It's impossible to avoid incurring some losses at finite speed, but as far as I know there is nothing fundamental preventing one from approaching reversible operation when operating at a sufficiently slow (but nonzero) speed.

vonnik

For the silicon kind of reversible computing, see vaire.co. They're in tapeout.

rkp8000

A great pedagogical article on thermodynamic vs logical reversibility, for those interested: https://arxiv.org/abs/1311.1886 (Sagawa 2014).

mikepfrank

Sagawa was mistaken in this article; he failed to appreciate the role of mutual information in computing, which is the proper basis for understanding Landauer's principle. I discussed this in https://www.mdpi.com/1099-4300/23/6/701.

coumbaya

This is the concept behind the computers in The Diamond Age, right? Or am I mistaken?

fintler

It's very similar. The rod logic in The Diamond Age (Eric Drexler was the one who originally came up with it) moves linearly, not rotationally like this does. It's also reversible.

danbmil99

Now that I think of it, if it uses damped springs the system would not be reversible: energy is dissipated through the damping, so the system's entropy increases and it converges on a local energy minimum.

Another way of looking at it: there are 4 states going in (0 or 1 on 2 pushers) but there are only 2 states of the 'memory' contraption, so you lose a bit on every iteration (like classical Boolean circuits)
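In Landauer terms, that counting argument works out as follows (a worked check; room temperature assumed):

    import math

    k, T = 1.381e-23, 300.0       # Boltzmann's constant, room temperature
    states_in, states_out = 4, 2  # two input pushers -> one two-state output
    bits_erased = math.log2(states_in) - math.log2(states_out)  # = 1 bit
    print(bits_erased * k * T * math.log(2))  # >= ~2.9e-21 J dissipated per operation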

fintler

Classical reversible computing feels like it would be a good way to interface with a quantum computer (since it's also reversible in theory).

danbmil99

Quantum computation came directly out of reversible computing. Look for example at the Fredkin and Toffoli gates.

jstanley

> Specifically, the Landauer’s principle states that all non-physically-reversible computation operations consume at least 10^21 J of energy at room temperature (and less as the temperature drops).

Wow! What an absurd claim!

I checked the Wikipedia page and I think you actually meant 10^-21 J :)

tennysont

Fixed! Ty!

P.S. I once calculated the mass of the sun as 0.7 kg and got 9/10 points on the question.

godelski

FYI, total global energy production is a lot less than 10^21 J. It's south of 10^19 from what I can google...

kragen

Depends on which aspects of energy production you're concerned with and over what time period. Global marketed energy production is about 18 terawatts, which is about 10²¹ J every year and 9 months. The energy globally produced by sunlight hitting the Earth, mostly as low-grade heat, is on the order of 100 petawatts, which is 10²¹ J every hour and a half or so. Global agriculture is in between these numbers.
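Checking that arithmetic (a sketch; 18 TW of marketed production and ~170 PW of intercepted sunlight are the assumed round numbers):

    E = 1e21           # J
    marketed = 18e12   # W, global marketed energy production
    sunlight = 173e15  # W, solar power intercepted by Earth, order 100 PW

    print(E / marketed / 86400 / 365)  # ~1.76 years: a year and nine months
    print(E / sunlight / 3600)         # ~1.6 hours: an hour and a half or so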
