
Sandia turns on brain-like storage-free supercomputer

patcon

Whenever I hear about neuromorphic computing, I think about the guy who wrote this article, who was working in the field:

Thermodynamic Computing https://knowm.org/thermodynamic-computing/

It's the most high-influence, low-exposure essay I've ever read. As far as I'm concerned, this dude is a silent prescient genius working quietly for DARPA, and I had a sneak peek into future science when I read it. It's affected my thinking and trajectory for the past 8 years.

iczero

Isn't this just simulated annealing in hardware attached to a grandiose restatement of the second law of thermodynamics?

pclmulqdq

Yes. This keeps showing up in hardware engineering labs, and never holds up in real tasks.

ahnick

Is this what Extropic (https://www.extropic.ai/) is aiming to commercialize and bring to market?

afarah1

Interesting read, more so than the OP. Thank you.

lo_zamoyski

I will say that the philosophical remarks are pretty obtuse and detract from the post. For example...

"Physics–and more broadly the pursuit of science–has been a remarkably successful methodology for understanding how the gears of reality turn. We really have no other methods–and based on humanity’s success so far we have no reason to believe we need any."

Physics, which is to say, physical methods have indeed been remarkably successful...for the types of things physical methods select for! To say it is exhaustive not only begs the question, but the claim itself is not even demonstrable by these methods.

The second claim contains the same error, but with more emphasis. This is just off-the-shelf scientism, and scientism, apart from what withering refutations demonstrate, should be obviously self-refuting. Is the claim that "we have no other methods but physics" (where physics is the paradigmatic empirical science; substitute accordingly) a scientific claim? Obviously not. It is a philosophical claim. That already refutes the claim.

Thus, philosophy has entered the chat, and this is no small concession.

vlovich123

I’m not sure I understand what you’re trying to say. It’s not really questionable that science and math are the only things to come out of philosophy or any other academic pursuit that have actually shown us how to objectively understand reality.

Now, physics vs other scientific disciplines, sure. Physicists love to claim dominion just like mathematicians do. It is generally true, however, that physics = math + reality, and we don't actually have any evidence of anything in this world existing outside a physical description (e.g. a lot of physics combined = chemistry, a lot of chemistry = biology, a lot of biology = sociology, etc). Thus it's reasonable to assume that the chemistry in this world is 100% governed by the laws of physics, and transitively this is true for sociology too (indeed, game theory is one way we quantifiably explain the physical reality of why people behave the way they do). We also see this in math, where different disciplines have different "bridges" between them. Does that mean they're actually separate disciplines, or just that we've chosen to name features on the topology as such?

evolextra

Man, this article is incredible. So many ideas resonate with me, but I could never formulate them myself. Thanks for sharing; all my friends have to read this.

epsilonic

If you like this article, you’ll probably enjoy reading most publications from the Santa Fe Institute.

GregarianChild

I'd be interested to learn who paid for this machine!

Did Sandia pay list price? Or did SpiNNcloud Systems give it to Sandia for free (or at least at a heavily subsidised price)? I conjecture the latter. Maybe someone from Sandia is on the list here and can provide details?

SpiNNcloud Systems is known for making misleading claims, e.g. their home page https://spinncloud.com/ lists DeepMind, DeepSeek, Meta and Microsoft as "Examples of algorithms already leveraging dynamic sparsity", giving the false impression that those companies use SpiNNcloud Systems machines, or the specific computer architecture SpiNNcloud Systems sells. Their claims about energy efficiency (like "78x more energy efficient than current GPUs") seem sketchy. How do they measure energy consumption and trade it off against compute capacities: e.g. a Raspberry Pi uses less absolute energy than a NVIDIA Blackwell but is this a meaningful comparison?

I'd also like to know how to program this machine. Neuromorphic computers have so far been terribly difficult to program. E.g. have JAX, TensorFlow and PyTorch been ported to SpiNNaker 2? I doubt it.

floren

As an ex-employee (and I even did some HPC) I am not aware of any instances of Sandia receiving computing hardware for free.

prpl

No, but sometimes systems are provided for demonstration/evaluation, though that usually wouldn't make a press release.

bob1029

I question how viable these architectures are when considering that accurate simulation of a spiking neural network requires maintaining strict causality between spikes.

If you don't handle effects in precisely the correct order, the simulation becomes more about the architecture, the network topology, and how race conditions happen to resolve than about the model itself. We need to simulate the behavior of a spike preceding another spike in exactly the right way, or things like STDP will wildly misfire. The "online learning" promised land will turn into a slip-and-slide.

A priority queue using a quaternary min-heap implementation is approximately the fastest way I've found to serialize spikes on typical hardware. This obviously isn't how it works in biology, but we are trying to simulate biology on a different substrate so we must make some compromises.
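A minimal sketch of such a quaternary (4-ary) min-heap event queue, in Python. The class and field layout are my own illustration of the technique, not the commenter's actual implementation; a 4-ary heap trades slightly more comparisons per sift-down for a shallower tree and better cache behavior than a binary heap.

```python
class SpikeQueue:
    """4-ary min-heap of (time, neuron_id) spike events, flat-array layout.

    Children of node i live at indices 4*i+1 .. 4*i+4; the parent of
    node i is (i - 1) // 4. Popping always yields the earliest spike,
    which preserves causal ordering between spikes.
    """

    def __init__(self):
        self.h = []

    def push(self, t, neuron):
        self.h.append((t, neuron))
        i = len(self.h) - 1
        while i > 0:                      # sift the new event up
            p = (i - 1) // 4
            if self.h[p] <= self.h[i]:
                break
            self.h[p], self.h[i] = self.h[i], self.h[p]
            i = p

    def pop(self):
        h = self.h
        h[0], h[-1] = h[-1], h[0]         # move last event to the root
        item = h.pop()
        i, n = 0, len(h)
        while True:                       # sift the root back down
            kids = range(4 * i + 1, min(4 * i + 5, n))
            best = min(kids, key=lambda k: h[k], default=None)
            if best is None or h[i] <= h[best]:
                break
            h[i], h[best] = h[best], h[i]
            i = best
        return item
```

Pushing spikes in any order and popping them back yields a strict time-ordered serialization, which is the property STDP depends on.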

I wouldn't argue that you couldn't achieve wild success in a distributed & more non-deterministic architecture, but I think it is a very difficult battle that should be fought after winning some easier ones.

mycall

Artem Kirsanov provides some insights into the neurochemistry and types of neurons in his latest analysis [0] of distinct synaptic plasticity rules that operate across dendritic compartments. When simulating neurons in a more realistic approach, the timing can be deterministic.

[0] https://www.youtube.com/watch?v=9StHNcGs-JM

mikewarot

I see "storage-free"... and then learn it still has RAM (which IS storage) ugh.

John von Neumann's concept of the instruction counter was great for the short run, but eventually we'll all learn it was a premature optimization. All those transistors tied up as RAM, sitting idle most of the time just waiting to be used: a huge waste.

In the end, high speed computing will be done on an evolution of FPGAs, where everything is pipelined and parallel as heck.

thyristan

FPGAs are implemented as tons of lookup-tables (LUTs). Basically a special kind of SRAM.

mikewarot

The thing about the LUT memory is that it's all accessed in parallel, not just 64 bits at a time or so.
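To make that concrete, here is a toy Python model of a single 4-input LUT. The function names are my own illustration; on a real FPGA the truth table is baked in at bitstream load, and thousands of these lookups happen simultaneously every clock rather than one word at a time.

```python
def make_lut(fn):
    """Bake a 4-input boolean function into a 16-entry truth table.

    This models the small SRAM inside one LUT: entry i holds the
    function's output for the input bits of i.
    """
    return [fn((i >> 3) & 1, (i >> 2) & 1, (i >> 1) & 1, i & 1) & 1
            for i in range(16)]

def eval_lut(table, a, b, c, d):
    """One LUT 'read': the 4 input wires form the SRAM address."""
    return table[(a << 3) | (b << 2) | (c << 1) | d]

# Example: configure one LUT as a 4-input XOR (a parity bit).
xor4 = make_lut(lambda a, b, c, d: a ^ b ^ c ^ d)
```

The point of the thread: a CPU reads one RAM word per access, whereas an FPGA fabric performs this kind of lookup in every LUT at once.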

marsten

Interesting that they converged on a memory/network architecture similar to a rack of GPUs.

- 152 cores per chip, equivalent to ~128 CUDA cores per SM

- per-chip SRAM (20 MB) equivalent to SM high-speed shared memory

- per-board DRAM (96 GB across 48 chips) equivalent to GPU global memory

- boards networked together with something akin to NVLink

I wonder if they use HBM for the DRAM, or do anything like coalescing memory accesses.

timmg

Doesn't give a lot of information about what this is for or how it works :/

JKCalhoun

Love to see a simulator where you can at least run a plodding version of some code.

HarHarVeryFunny

The original intent for this architecture was for modelling large spiking neural networks in real-time, although the hardware is really not that specialized - basically a bunch of ARM chips with high speed interconnect for message passing.

It's interesting that the article doesn't say that's what it's actually going to be used for - just event driven (message passing) simulations, with application to defense.

Onavo

Probably Ising models, phase transitions, condensed matter stuff, all to help make a bigger boom.

shrubble

You don’t have to write anything down if you can keep it in your memory…

rahen

So if I understand correctly, the hardware paradigm is shifting to align with the now-dominant neural-based software model. This marks a major shift, from the traditional CPU + OS + UI stack to a fully neural-based architecture. Am I getting this right?

realo

No storage? Wow!

Oh... 138240 Terabytes of RAM.

Ok.

crtasm

>In Sandia’s case, it has taken delivery of a 24 board, 175,000 core system

So a paltry 2,304 GB RAM

SbEpUBz2

Am I reading it wrong, or does the math not add up? Shouldn't it be 138,240 GB, not 138,240 TB?

divbzero

You’re right, OP got the math wrong. It should be:

  1,440 boards × 96 GB/board = 138,240 GB
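Checked in code, using the board counts and the 96 GB-per-board DRAM figure quoted in this thread:

```python
GB_PER_BOARD = 96          # 96 GB DRAM per 48-chip board, per the article
FULL_SYSTEM_BOARDS = 1440  # the headline configuration
SANDIA_BOARDS = 24         # the system Sandia actually took delivery of

full_system_gb = FULL_SYSTEM_BOARDS * GB_PER_BOARD
sandia_gb = SANDIA_BOARDS * GB_PER_BOARD

print(full_system_gb)  # 138240 GB (i.e. 135 TB, not 138,240 TB)
print(sandia_gb)       # 2304 GB
```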

CamperBob2

Either way, that doesn't exactly sound like a "storage-free" solution to me.

Footpost

Well, since neuromorphic methods can show that 138,240 = 0, should it come as a surprise that they enable blockchain on Mars?

https://cointelegraph.com/news/neuromorphic-computing-breakt...

jonplackett

Just don’t turn it off I guess…

rbanffy

At least not while it's computing something. It should be fine to turn it off after the results have been transferred to another computer.

rzzzt

I hear Georges Leclanché is getting close to a sort of electro-chemical discovery for this conundrum.

throwaway5752

I feel like there is a straightforward biological analogue for this.

But in this case, one wouldn't be subject to macro-scale nonlinear effects arising from the uncertainty principle when trying to "restore" the system.

1970-01-01

The pessimist in me thinks someone will just use it to mine bitcoin after all the official research is completed.

hackyhacky

The title describes this machine as "brain-like" but the article doesn't support that conclusion. Why is it brain-like?

I also don't understand why this machine is interesting. It has a lot of RAM.... ok, and? I could get a consumer-grade PC with a large amount of RAM (admittedly not quite as much), put my applications in a ramdisk, e.g. tmpfs, and get the same benefit.

In short, what is the big deal?