Maybe coders should learn to love analog? (2024)
51 comments
· March 8, 2025
zabzonk
Well, if you can do it, do it. But in my experience, using an analog computer is nothing at all like digital. I used to have to maintain one when I worked at the University of London, back in the very early 80s (basically making sure plug-board wires hadn't gone bad). Programming one (if you can call it that) required a bit of mathematical nous (which I didn't have enough of, though I was pretty sharp at digital), and the academic I worked with (who did) used to spend a lot of time saying "f*ck" as he tried to set up things like predator-prey simulation demos for the students.
kragen
The author is not talking about using an "analog computer"; they are talking about designing analog circuitry:
> Among them are IoT nodes, sensors and front-ends, actuators including motors, test and measurement instrumentation, boards and interconnects, line drivers/receivers, motor drivers, physical-level wired/wireless links, MEMS devices, RF functions, power sources and systems, EMI, noise, thermal extremes…and that’s just a partial list. These roles and functions are not going away as long as the laws of physics, as we know them, remain in place.
(I also think it's misleading to use the term "computer" for things like differential analyzers, just as it's misleading to call a person who adds up numbers a "computer", even though both usages were well established before the invention of the inherently digital devices we call "computers" today. But that's a different discussion.)
stmw
I think the real point of the article is a job or occupation (even just writing code to support analog design work), not necessarily going all the way back to the wonders of using analog computers.
But that's a very cool story. Do you remember which model of analog computer it was?
zabzonk
Can't remember, I'm afraid. Some obscure British company I guess. Probably one that made music synths back then, as it's the same sort of tech.
SanjayMehta
Linn? They used to make electronic drum kits and briefly dabbled in computer design. Byte Magazine (I think) had a cover story on them, but as I recall, their system was object-oriented, not analog.
II2II
It seems as though analog doesn't mean what it used to; these days it's a stand-in for physical. The physical thing you make may be analog, yet it could very well be digital. What matters is that the product is physical, rather than a bundle of bits you ship to someone else who takes care of the hardware it runs on. The tone of the article leads me to think this is what the author is talking about.
kragen
The author is explicitly talking about designing analog electronics:
> Among them are IoT nodes, sensors and front-ends, actuators including motors, test and measurement instrumentation, boards and interconnects, line drivers/receivers, motor drivers, physical-level wired/wireless links, MEMS devices, RF functions, power sources and systems, EMI, noise, thermal extremes…and that’s just a partial list. These roles and functions are not going away as long as the laws of physics, as we know them, remain in place.
mikewarot
If you want to get your feet wet with analog electronics, I'd suggest getting an Arduino starter kit with a breadboard and some components, plus a cheap multimeter and oscilloscope, and just start playing with things. You can build up an intuition fairly quickly once you've got something you can get your hands on.
Once you get the hang of the basics at audio and low RF frequencies, you can set up GNU Radio, which works with your computer's audio I/O. Maybe add a $30 RTL-SDR dongle, and the next thing you know, you've got a bit of RF under your belt.
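A minimal sketch of that audio-as-RF idea, assuming Python with numpy and the third-party sounddevice package (sample rate and device setup vary by machine): grab a second of audio from the sound card and look at its spectrum.

```python
import numpy as np
import sounddevice as sd   # third-party; pip install sounddevice

fs = 48_000                                   # sample rate (Hz); adjust to your hardware
rec = sd.rec(fs, samplerate=fs, channels=1)   # record one second of audio
sd.wait()                                     # block until the recording finishes

spectrum = np.abs(np.fft.rfft(rec[:, 0]))
freqs = np.fft.rfftfreq(fs, d=1 / fs)
print(f"strongest component near {freqs[np.argmax(spectrum)]:.0f} Hz")
```

Whistle into the microphone while it runs and the printed peak should track your pitch.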
petermcneeley
I respect those who have come before, and I know there are still some places where analog is not only the superior option but the only option. However, for almost everything you want to do, the path is ADC in, digital processing, then DAC back out.
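A toy numpy illustration of that round trip (all parameter values made up, not from any real part): digitize a noisy waveform, filter it in the digital domain, and convert back.

```python
import numpy as np

fs, bits, vref = 10_000, 12, 3.3     # sample rate, ADC resolution, full scale (illustrative)
t = np.arange(fs) / fs
clean = 1.0 + 0.5 * np.sin(2 * np.pi * 50 * t)     # the "real" signal
analog_in = clean + 0.05 * np.random.randn(fs)     # plus sensor noise

# ADC: quantize to 12-bit codes
codes = np.clip(np.round(analog_in / vref * (2**bits - 1)), 0, 2**bits - 1)

# digital processing: crude moving-average low-pass filter
filtered = np.convolve(codes, np.ones(32) / 32, mode="same")

# DAC: back to volts
analog_out = filtered / (2**bits - 1) * vref

print(f"RMS error in:  {np.std(analog_in - clean):.4f} V")
print(f"RMS error out: {np.std(analog_out - clean):.4f} V")
```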
exmadscientist
Analog lives in a few niches:
- As you note, signal conditioning to stuff things into an ADC
- Anywhere firmware is viewed as a liability (often medical or other hi-rel stuff)
- Existing proven designs (do not underestimate this sector!)
- Anywhere the cost of the signal conditioning circuitry might be comparable to the cost of just doing it outright in analog. This is mostly the low-cost realm, more rarely ultra-low-power, but sometimes you see it in other places too.
- Line power supplies happen to be all of the above, so you see plenty of analog there
You used to see analog in high-performance stuff (ultra-high-speed/RF or ultra-high-bit-depth), but this has mostly gone into either digital or whackadoodle exotica. Like frontends for those 100GHz oscilloscopes, which are both!
bgnn
Most of analog design nowadays is getting the signal to the ADC, the ADC itself, and the clock generation for the ADC. The ADC is by far the most complex subsystem, and it's partially digital, partially analog, though the analog parts are also quite algorithmic (binary search, linear search, majority voting, LMS).
The reverse path (DAC) is less common; in maybe 10% of cases you need a good DAC for signal generation. It's more hardcore analog, and a good DAC is harder to design.
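That binary search is the heart of a SAR (successive-approximation) ADC. An idealized Python model, ignoring comparator noise and DAC mismatch:

```python
def sar_adc(vin, vref=1.0, bits=10):
    """Idealized SAR ADC: binary search against an internal DAC."""
    code = 0
    for i in reversed(range(bits)):
        trial = code | (1 << i)              # tentatively set the next bit
        vdac = trial / (1 << bits) * vref    # ideal internal DAC voltage
        if vin >= vdac:                      # comparator decision
            code = trial                     # keep the bit
    return code

print(sar_adc(0.7))   # 716, i.e. floor(0.7 / 1.0 * 1024)
```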
01100011
Lol, I wish. I'm a SWE because it pays much more than using my EE degree.
petermcneeley
many such cases...
Bjartr
Tangentially related:
You can play around with analog programming of a sort with modular synthesizers. It's a pretty neat way to dip your toe into analog signal processing.
kragen
This is a great suggestion!
A few more ways to get started with analog signal processing:
- Build an AM radio from transistors. There are lots of tutorials out there.
- Simulate circuits with Falstad's circuit.js. There are some interesting analog circuits already in the set of examples, like https://tinyurl.com/24gccg7p.
- Build an Atari Punk Console (rough timing numbers in the sketch below).
You can get very, very good op-amps very cheaply these days. Some of them even still come in through-hole packages. This makes it possible to build interesting audio synthesizer circuits for pennies that would have required a significant outlay in the 70s.
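For a feel for the numbers: the Atari Punk Console is just a 555 astable driving a 555 monostable, and the standard datasheet formulas are enough to sketch it. Component values below are illustrative, not from any particular build.

```python
# Standard 555 timing formulas (datasheet approximations).
def astable_hz(r1, r2, c):      # oscillator section
    return 1.44 / ((r1 + 2 * r2) * c)

def monostable_s(r, c):         # pulse-stretcher section
    return 1.1 * r * c

r1, r2, c1 = 4.7e3, 100e3, 10e-9    # astable: R1, R2 (a pot in the APC), C
r3, c2 = 500e3, 10e-9               # monostable: R (the second pot), C

print(f"oscillator: {astable_hz(r1, r2, c1):.0f} Hz")          # ~703 Hz
print(f"pulse width: {monostable_s(r3, c2) * 1e3:.1f} ms")     # ~5.5 ms
```

Sweeping the two pots moves those numbers around, which is exactly what playing the instrument does.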
karmakaze
That's been my pastime away from the digital screens of my day job. But I didn't go full modular, instead getting 3 analog monosynths. I need some structure while learning music at the same time and can't venture into the wild world of modular. My gear has lots of CV/Gate ins/outs for that later step though.
analog31
> Someone with programming experience could contribute in many of these areas, and still work exclusively at their keyboards and not even getting their hands dirty, if that’s their concern.
I could probably be described as living in the "analog" domain, as a physicist working for a company that makes measurement equipment. Naturally, this could be an ingrained bias, but I've formed the impression that something about getting your hands dirty confers the intuition needed to work productively in this domain. You need to experience being proven wrong by mother nature, over and over again.
Also, if you're sitting at your screen all day, nobody's going to pull you into the loop. It's quicker to just do that stuff ourselves than to explain it to someone.
So I agree with everything else in the article, because I love analog and love coding. But come on, join us in the lab.
a3w
What is analog? Voltages and circuits and currents, but not digital tubes and transistors?
Most coders in my vicinity are interested in woodworking, is that analog? I think not.
analog31
It's a matter of representation. Do the signals represent continuous or discrete quantities? A digital signal represents a discrete quantity such as an integer or symbol, or a sequence of those quantities. Digital systems possess the feature of "noise immunity," where a signal can be unambiguously interpreted due to rules that involve thresholds. For instance you can look up an oscilloscope trace of the signals on a USB or Ethernet cable, and they look horrid, but those signals can transmit information with virtually zero error.
To expand a bit, since my day job involves this stuff, physical stimuli are always analog. Even the discrete energy levels of an atom make their transitions in continuous time. Yet there are good reasons to do virtually all computation in the digital domain, where "noise immunity" allows processing to occur without the introduction of additional noise, and you enjoy all of the other benefits of computer programming.
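A toy numerical illustration of that noise immunity: bury a bit stream in noise, threshold it, and the bits come back essentially error-free, while an analog voltage just keeps all the noise.

```python
import numpy as np

rng = np.random.default_rng(0)
bits = rng.integers(0, 2, 100_000)
tx = np.where(bits == 1, 3.3, 0.0)          # send as 0 V / 3.3 V levels
rx = tx + rng.normal(0, 0.3, bits.size)     # additive channel noise

recovered = (rx > 1.65).astype(int)         # threshold at mid-rail
print("bit errors:", np.count_nonzero(recovered != bits))    # 0 at this SNR
print(f"analog path keeps ~{np.abs(rx - tx).mean():.2f} V of noise")
```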
These days, the job of the analog person is often to understand the physics of the quantity being measured, and the sensor, and to get a signal safely to the front end of an analog-to-digital converter.
Now, the irony is that I actually spend most of my time working in the digital domain. The reason is that analysis of the digital data stream is how I know that my analog stuff is working, and how I optimize the overall system. So if you watched me work for a week, you'd notice that I actually spend a fair portion of my time coding. I just don't write software for other people to use. That's another department, and their work usually starts after mine is done.
kragen
The author is specifically talking about designing analog electronic circuits:
> Among them are IoT nodes, sensors and front-ends, actuators including motors, test and measurement instrumentation, boards and interconnects, line drivers/receivers, motor drivers, physical-level wired/wireless links, MEMS devices, RF functions, power sources and systems, EMI, noise, thermal extremes…and that’s just a partial list. These roles and functions are not going away as long as the laws of physics, as we know them, remain in place.
Woodworking can be analog if the wood shapes and positions (and maybe velocities, etc.) are used to quantitatively represent something other than the wood itself, as an analogue to those quantities. For example, you can carve some wooden cams to drive a little automaton, or you can make a clock out of wood gears, where the angles of rotation of the gears represent the amount of time that has passed. But this article is specifically about electronics.
benatkin
Anything not digital.
Coders already do love it. Terrible premise.
tracerbulletx
These jobs are far fewer, pay less, and are no more resilient to AI progress, which makes this useless advice.
exmadscientist
Hardware might or might not be more resilient against AI in the long run, but for now, AI is sure doing a terrible job at hardware.
It is somewhat ironic that the single profession AI is best at replacing seems to be software engineering.
mirkodrummer
Not even software engineering, not even in the wettest dreams of the modern dysfunctional CEOs who use AI to justify layoffs.
from-nibly
I feel like we need to say this more, and louder. I'm getting pretty tired of all the breathless AI hype.
inetknght
Programming in analog won't pay anywhere near as well as programming in digital.
So telling people to move over to analog will depress that job market even further.
throw122323
I've got a relative who works at Analog Devices. They're on their third straight month of crunch time, working in 12 hour shifts through the weekends.
Why? Because the dipshits in leadership decided to project the revenue growth during the chip shortage as a straight line for the next 10 years.
Looks like those same dipshits decided the best course of action is to get their soft skulled alumni to write some blog posts to try to herd more cattle into the grinder.
beebaween
I love all things analog other than my macbook.
Smart things drive me completely insane and I find peace with things that just work without a wifi connection or firmware of any kind.
delfinom
As someone in the EE field: the jobs exist but are not plentiful. The physical engineering fields in the US have largely shrunk due to offshoring, centralization into major OEMs, and general efficiencies in doing work. "Analog" is a very cost-sensitive and optimized arena.