
RF Shielding History: When the FCC Cracked Down on Computers

EvanAnderson

Ahh, alternative futures...

If the FCC hadn't been so strict I think there's a good chance we'd be using computers with a lineage going back to Atari versus IBM today.

Commodore ate Atari's lunch with the C64 and pricing, but Atari could have launched the 400/800 at lower price points with more lax emission standards. They would have had lower peripheral price points, too, since the SIO bus and "smart peripheral" design was also an emissions strategy.

On the home computer front the Atari 8-bits beat the pants off of the PET for graphics and sound capabilities. More success in the late 70s might have enabled Atari to build down to a price point that would have prevented the C64 from even happening.

On the business side Atari R&D had interesting stuff going on (a Unix workstation, Transputer machines). Alan Kay even worked there! They were thinking about business computing. If the 8-bits had had more success I think more interesting future products could have been brought to market on the revenue they generated.

squeedles

I happened to buy an Atari 800 at the peak of this and was amazed at the metal castings that surrounded everything. That little 6502 could survive small arms fire! That shielding was far beyond anything else at the time.

And you make a good point about the SIO bus - this was when every other machine had unshielded ribbon cables everywhere. Their devotion to daisy chained serial really crippled them in terms of speed, and when USB finally arrived, I initially scorned it due to the prejudice formed by my experience with the Atari peripherals! It turns out they were on the right track all along!

EvanAnderson

You may not be aware, but Joe DeCuir, who worked for Atari on the VCS and 8-bit computers, also worked on the development of USB. Some of his Atari engineering notes helped fend off a patent troll who tried to claim USB was infringing. It's a neat story. There are a ton of interviews with him about it. He gave a nice presentation at VCF a few years ago where it was mentioned, too: https://youtube.com/watch?v=dlVpu_QSHyw

fidotron

> If the FCC hadn't been so strict I think there's a good chance we'd be using computers with a lineage going back to Atari versus IBM today.

And/or many of the other manufacturers of that era. I have encountered execs from that time who still believe the whole thing was some sort of shrouded protectionism.

MountDoom

The regulatory landscape here is pretty funny. In all likelihood, the worst RFI offenders in your home are LED lights, followed by major appliances. Both of these are regulated less than something like a computer mouse. For lightbulbs, I think the manufacturers just self-certify.

I guess there are two ways to look at it. Either the regulation was wildly successful, so the problems persist only in the less-regulated spaces. Or we spend a lot of effort chasing the wrong problem.

Flamingoat

If I turn my kettle or microwave on in my kitchen, it will kill any Bluetooth or Wi-Fi signal. My microwave is getting on for 15 years old, maybe newer ones are better, but the kettle was bought last year.

elevation

If you cannot change the microwave, consider trying a different wifi channel. I once had a 2012 Panasonic microwave that killed 802.11g channels 7 and 14 but not channel 0.

FuriouslyAdrift

Microwave ovens operate in and around the 2.4 GHz range, just like 802.11b/g. Switching to 5 or 6 GHz (802.11a/n/ac/ax, etc.) can help immensely.
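
A rough sketch of the overlap, in Python (the channel plan numbers are the standard 2.4 GHz allocations; treating the oven as a single line at its nominal 2.45 GHz is a simplification on my part):

    # Which 2.4 GHz Wi-Fi channels sit closest to a microwave oven's
    # nominal 2.45 GHz magnetron frequency. Real ovens smear energy over
    # tens of MHz, so this only shows the worst-placed channels.
    MICROWAVE_MHZ = 2450
    CHANNEL_WIDTH_MHZ = 20   # typical 802.11g/n channel width

    def channel_center_mhz(ch):
        # 2.4 GHz channel plan: ch 1-13 are 5 MHz apart; ch 14 is the Japan-only outlier
        return 2484 if ch == 14 else 2407 + 5 * ch

    for ch in range(1, 15):
        center = channel_center_mhz(ch)
        hit = abs(center - MICROWAVE_MHZ) < CHANNEL_WIDTH_MHZ
        print(f"channel {ch:2d}: {center} MHz" + ("  <-- near the oven frequency" if hit else ""))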

schoen

I remember being confused as a kid about the "this device must accept any interference received, including interference that may cause undesired operation" labels.

I kept reading "must accept" as a technical requirement, somehow like "must not be shielded against" or "must not use technical means to protect against", rather than what I now think is the intended legal sense "does not have any legal recourse against".

It's weird that they phrased it in terms of how the device itself must "accept" the interference, rather than the owner accepting it.

bitwize

I always thought it meant "must continue to function despite" such interference, i.e., it must not blow up or break permanently in the presence of such interference.

alwa

I was gratified by the last little tidbit: a nod to the Ohio “tinkerer” whose 2019 experiment in home automation interfered with neighbors’ 315 MHz-band devices to the point that the power company shut off the whole block in an attempt to isolate the interference.

https://www.nytimes.com/2019/05/04/us/key-fobs-north-olmsted...

(https://archive.is/aTWZ2)

Apparently the regulations work well enough to provoke an official response when garage door openers stop working over the area of a few houses… a level of reliability I’d long taken for granted

afandian

I'd heard, probably in a Centre for Computing History [0] interview or similar, that these regulations contributed to the BBC Micro never getting a good foothold in the USA and losing to Apple.

It had an amazing selection of ports, all unshielded and designed for flat ribbon cables. But that wouldn't fly in the USA.

[0] https://www.youtube.com/@ComputinghistoryOrgUk1

whartung

Anecdotes from the age.

When I would fire up my KIM-1, the TV would turn to snow.

There was a toy called the "Big Trak", a programmable ATV toy. If you ran that underneath the desk with a TRS-80 on it, it would crash.

The TRS-80 Model 1 was notorious for this, as you connected the computer to the expansion interface with a bare, 40(? ish?) pin ribbon connector. It was a beautiful broadcast antenna for computer signals.

The FCC was an impetus for the Model 3.

anjel

Back in the mid-century, we used to put an AM radio anywhere on the same desk next to the TRS-80 and the revealed cacophony was endlessly fascinating. As I recall, radio tuning was unnecessary.

whartung

We used to put them next to our TI-58/59 calculators. You could use the radio for sound effects in games.

bitwize

The first application developed for the MITS Altair, besides watching the blinkenlights blink, was playing music on a nearby AM radio.

transpute

"Using Deep Learning to Eavesdrop on HDMI from its Unintended Electromagnetic Emanations" (2024), https://news.ycombinator.com/item?id=41116682

"Tempest-LoRa: Cross-Technology Covert Communication via HDMI RF Emissions", https://news.ycombinator.com/item?id=44483262

mmastrac

For those of you who watch Adrian Black on YouTube, you might remember him angrily tearing out RF shielding from the older computers.

On the other hand, I have been struggling to get my IP KVM at home working and it turned out that the cause of its failure was some cheap PoE extractors that spew noise across the spectrum, especially interfering with VGA sync.

Modern equipment, assuming you aren't buying bargain-basement AliExpress junk (which I do, from time to time), is surprisingly good at RF rejection.

And, amusingly, this just popped up on Twitter: https://x.com/hakluke/status/1980479234159398989

jcalvinowens

The worst RFI I encounter in my day-to-day life is from Ethernet switches... I really wish the FCC would stop allowing the use of 125.0 MHz on airband. My local airport (KPAO) uses that as its ground frequency, and it's every bit as terrible as you'd expect :D
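
For context, a quick back-of-the-envelope in Python (the 118-137 MHz airband limits and the 125 MHz line clock for 100BASE-TX/1000BASE-T PHYs are standard figures; how strongly a given switch actually radiates is another matter):

    # Where the 125 MHz Ethernet PHY line clock and its low harmonics land
    # relative to the VHF aviation band (118-137 MHz).
    AIRBAND_MHZ = (118.0, 137.0)
    LINE_CLOCK_MHZ = 125.0   # 100BASE-TX / 1000BASE-T symbol clock per pair

    for n in range(1, 5):
        f = n * LINE_CLOCK_MHZ
        inside = AIRBAND_MHZ[0] <= f <= AIRBAND_MHZ[1]
        print(f"{n}x harmonic: {f:.1f} MHz" + ("  <-- inside the airband" if inside else ""))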

mrandish

Having lived through the early 8-bit home computer era as a teenage user, and then in the mid-80s working in tech startups making hardware peripherals for 16-bit computers (although not as a hardware designer), here's my perspective. Early digital devices definitely could occasionally cause interference with TVs and radios in their immediate area, so there needed to be some regulation to address the issue.

However, once designers were aware of the potential problems, it wasn't too hard or even very expensive to design hardware which avoided the most serious ones. Properly grounding components and a little bit of light shielding here and there would generally suffice to ensure most devices wouldn't cause noticeable issues more than two walls and 30 feet away. I think by the 90s the vast majority of hardware designers knew how to mitigate these issues, while the evolution of consumer device speeds and designs reduced the risks of actual interference on both the 'interferor' and 'interferee' sides.

Unfortunately, the FCC's regulatory testing requirements didn't similarly evolve. Hardware designers I worked with described an opaque process of submitting a product for FCC testing only to receive a "Pass/Fail" with no transparency into how it was tested. Sometimes the exact same physical product could be resubmitted a month later with zero changes and pass. This made things unpredictable and slow, which could be a lethal combination for small hardware startups.

So there emerged a sub-industry of "independent RF testing labs" you could pay to use their pricey gear and claimed expertise: they would test your device, tell you why it failed, let you make a change right there, and retest until you passed. This made things more predictable, but it could cost upwards of $10K (in 90s dollars), which was a serious hardship for garage startups. I was told a lot of the minor hardware changes made during such interactive testing probably did nothing to decrease actual interference in the real world and only served to pass the test.

Then came the era of "weaponizing" FCC certification. Small startups could avoid the costs and delays of FCC testing by filing their product as a "Class A" device (meant only for use in industrial/scientific environments) instead of as a "Class B" (consumer) device. The devices still had to not interfere, but their makers could self-certify based on their own internal tests without going through FCC testing.

When a new hardware startup threatened a large, established company's product with a cheaper, better product shipped as "Class A", BigCo would report them for interfering, or simply for being used in consumer environments - despite the device very likely not interfering with anything. This created a lot of problems for such startups: if their cool new product ended up even once in an arguably "retail distribution channel", they could get hit with big fines - all without ever causing any actual interference, and even if the device could have passed FCC testing and been certified as Class B.

It got especially ridiculous because a lot of cheaper products were simply generic designs, like a modem using the standard Rockwell chipset and reference design. These were often made on the same production line as other products which had all passed FCC testing, and sometimes even used the same circuit board in a different case. But if you didn't have your official "FCC Cert", you could get busted.

I left the hardware space in the early 2000s so I never heard if these regs were ever modernized, but it sure seemed like they were in need of it.

jaydenmilne

Tangentially related: once I bought a no-name Amazon HDMI switch that would cause FM interference, but only when the screen was mostly white: https://youtu.be/n2DPLEvwO-k

Another reason to use dark mode I guess

mmastrac

What's interesting is that HDMI is supposed to have a scrambling system that prevents any repeating pattern from causing EMI. I wonder if there was an unshielded, unscrambled raw data path somewhere in the switch.
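
As a toy illustration of the idea (this is a generic 16-bit LFSR whitener, not HDMI 2.0's actual scrambler polynomial or framing): XORing the pixel stream with a pseudo-random sequence turns a constant "all white" pattern into something with no strong repeating component on the wire.

    # Toy scrambler: XOR a constant ("all white") byte stream with a
    # pseudo-random LFSR sequence so the transmitted bytes stop repeating.
    # Generic maximal-length 16-bit Fibonacci LFSR, not the real HDMI one.
    def lfsr_bytes(seed=0xACE1):
        state = seed
        while True:
            byte = 0
            for _ in range(8):
                bit = ((state >> 0) ^ (state >> 2) ^ (state >> 3) ^ (state >> 5)) & 1
                state = (state >> 1) | (bit << 15)
                byte = (byte << 1) | (state & 1)
            yield byte

    plain = [0xFF] * 32                      # a run of white pixels: perfectly periodic
    prng = lfsr_bytes()
    scrambled = [p ^ next(prng) for p in plain]
    print("plain distinct byte values:    ", len(set(plain)))      # 1
    print("scrambled distinct byte values:", len(set(scrambled)))  # many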