
Stochastic computing

4 comments · November 3, 2025

mikewarot

The key thing I would watch out for with real stochastic computing hardware is crosstalk[1], the inevitable coupling between channels that is bound to happen at some level. Getting hundreds or thousands (or millions?) of independent noise sources to avoid correlation is going to be one of the largest challenges in the process. For a small number of channels it should be manageable, but at LLM scale I think it's a deal killer.

[1] https://en.wikipedia.org/wiki/Crosstalk
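
To see why correlation matters: in unipolar stochastic computing, an AND gate multiplies the values carried by two bit streams, but only if the streams are independent. A minimal Python sketch (the names are illustrative, not from any particular library):

    import random

    def bitstream(p, n, rng):
        """n bits, each 1 with probability p (unipolar encoding)."""
        return [1 if rng.random() < p else 0 for _ in range(n)]

    def and_multiply(a, b):
        """AND gate: for independent streams, the output mean is ~p_a * p_b."""
        return sum(x & y for x, y in zip(a, b)) / len(a)

    rng = random.Random(42)
    n = 100_000
    s1 = bitstream(0.5, n, rng)
    s2 = bitstream(0.5, n, rng)

    print(and_multiply(s1, s2))  # ~0.25: independent streams multiply correctly
    print(and_multiply(s1, s1))  # ~0.50: full correlation (worst-case crosstalk)
                                 # turns the intended 0.25 into p itself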

kragen

If your random bit streams are generated by deterministic processes such as LFSRs, and you're combining them with things like NAND gates, you should easily be able to get the bit error rate down below 10⁻²⁰, I'd think? (And crosstalk would be a bit error.) How often do the gates in your CPU produce the wrong answer?
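
A sketch of that approach, assuming the textbook comparator-based stochastic number generator (0xB8 and 0x8E are known maximal-length tap sets for 8-bit Galois LFSRs):

    from itertools import islice

    def lfsr(seed, taps=0xB8):
        """Galois LFSR; 0xB8 encodes x^8 + x^6 + x^5 + x^4 + 1, a
        maximal-length polynomial, so states cycle through 1..255."""
        state = seed
        while True:
            lsb = state & 1
            state >>= 1
            if lsb:
                state ^= taps
            yield state

    def sng(value, states, width=8):
        """Stochastic number generator: emit 1 when the LFSR state is
        below value * 2**width, so the stream's mean approximates value."""
        threshold = int(value * (1 << width))
        for s in states:
            yield 1 if s < threshold else 0

    # Different polynomials keep the two deterministic streams decorrelated
    # (0x8E encodes x^8 + x^4 + x^3 + x^2 + 1, also maximal-length).
    n = 10_000
    a = sng(0.5, lfsr(seed=0xAC))
    b = sng(0.5, lfsr(seed=0x65, taps=0x8E))

    # In unipolar encoding, a NAND gate computes 1 - p_a * p_b.
    nand = [1 - (x & y) for x, y in islice(zip(a, b), n)]
    print(sum(nand) / n)  # ~0.75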

emil-lp

How is the use of randomness in stochastic computing connected to the way algorithms (e.g., in the complexity class BPP) use randomness to solve problems?

numbol

It seems that those two (actually three or four) ideas are parallel, and not always compatible with one another.

[please forgive my grammar]

1. There are noisy computers, which can work despite (or because of) some unreliable parts. Neural networks are quite OK with this, for example, so some people speculate that it will be possible to build specialized noisy circuits for specific networks.

2. There is stochastic computing, in which numerical values are represented as probabilities (the fraction of 1s in a bit stream), so that simple logic gates compute complicated numerical functions.

3. And then there is probabilistic computing, in which the state is randomly updated in accordance with some "temperature".

4. And finally there are randomized algorithms, which are closer to classical computer science but consume a stream of random input bits (see the sketch after this list). However, people like Avi Wigderson have successfully removed the "random" parts of those algorithms.
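
As a minimal sketch of item 4 (and of the BPP-style use of randomness emil-lp asked about), here is Freivalds' algorithm, which checks a claimed matrix product in O(n^2) randomized time with one-sided error:

    import random

    def freivalds(A, B, C, trials=20):
        """Randomized check that A @ B == C. Each trial picks a random
        0/1 vector x and compares A(Bx) with Cx, costing O(n^2) instead
        of the O(n^3) of recomputing the product. A wrong C escapes one
        trial with probability <= 1/2, so the error is <= 2**-trials."""
        n = len(A)

        def matvec(M, v):
            return [sum(M[i][j] * v[j] for j in range(n)) for i in range(n)]

        for _ in range(trials):
            x = [random.randint(0, 1) for _ in range(n)]
            if matvec(A, matvec(B, x)) != matvec(C, x):
                return False  # a mismatch is a definite "no"
        return True  # "yes" with high probability

    A = [[1, 2], [3, 4]]
    B = [[5, 6], [7, 8]]
    print(freivalds(A, B, [[19, 22], [43, 50]]))  # True (correct product)
    print(freivalds(A, B, [[19, 22], [43, 51]]))  # False w.p. 1 - 2**-20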

Plus, there are funny things with the non-associativity of floating-point numbers, which can lead to non-determinism when the order of execution (of a summation, for example) is arbitrary. But because neural networks are robust to noise to some degree, they will still work.
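
For example, in IEEE-754 double precision:

    a, b, c = 1e16, -1e16, 1.0
    print((a + b) + c)  # 1.0
    print(a + (b + c))  # 0.0 -- the 1.0 is absorbed, since the spacing
                        # between doubles near 1e16 is 2.0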

And the derandomization done by Avi Wigderson requires that computers work deterministically (apart from the random input stream), so it will not be very compatible with unreliable, noisy computation. However, it seems that stochastic, probabilistic, and noisy computation could be combined.