A Tektronix TDS 684B Oscilloscope Uses CCD Analog Memory

Retr0id

Wow, great timing!

I recently got a TDS684A and also made the same discovery, and wrote half a blog post about it which remains unfinished/unpublished. I don't have much of an EE background (at least, not on this level) so my article was certainly worse.

Relatedly, I dumped the firmware of mine (referencing Tom's notes on a similar model) and started writing an emulator for it: https://github.com/DavidBuchanan314/TDS684A

It boots as far as the vxworks system prompt, and can start booting the Smalltalk "userspace" but crashes at some point during hardware initialization (since my emulation is very incomplete!) - some screenshots/logs: https://bsky.app/profile/retr0.id/post/3ljzzkwiy622d

Edit: heh, I just realised I'm cited in the article, regarding the ADG286D

bgnn

Nice work. Just a comment: the ADC clock is very sensitive to capacitive loading from the probe. It cannot drive that large a load, so it most likely fails to generate enough signal swing. It's not advisable to probe clock signals like that.

weinzierl

In the 90s, Conrad (think Germany's RadioShack) sold a little audio recorder module that I believe was CCD based. It could only store 16 seconds and the quality was terrible, but audio stored completely in solid state, without moving parts, was revolutionary back then.

Thinking about it, I might still have the device somewhere in the attic.

benob

> If you ever remove the interconnection PCB, make sure to put it back with the same orientation. It will fit just fine when rotated 180 degrees but the scope won’t work anymore!

I remember pulling a 486 out of its socket in the 1990s and putting it back with the wrong orientation. There was a pop and a bit of smoke. Something on the mainboard had burnt and it wasn't working anymore.

I used smell to locate the fault: a big trace on the PCB, which I soldered back, and like magic, it all worked again...

MarkusWandel

I have a story like that! Back in my lab rat days, I dropped a scope probe and its grounded sleeve skittered down the back side of a powered-up, prototype memory board (with 240 individual RAMs on it), leaving a trail of sparks. After that it was dead.

Uh oh. We needed that board. What to do? Well, it can't hurt to try. We had "freeze spray" for debugging purposes, so I frosted up the board really well on the component side, got a bottle of white-out handy (what's that?), powered up, and quickly marked the devices that defrosted notably quicker than the rest.

Got the solder station lady to replace all those parts and it worked again.

Old days...

codedokode

I didn't even know such things existed. When I wanted to measure the delays of 7400-series logic gates, I built a logic analyzer using an Arduino and stroboscopic effects, which provided resolution on the order of 1 ns. But with a CCD it seems it could be done much more easily, and it would allow measuring the exact voltage, not just logic levels.
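The stroboscopic trick is essentially equivalent-time sampling, and the idea is easy to simulate. A minimal sketch (the 1 MHz square wave and 1 ns step are made-up numbers, not codedokode's actual setup):

```python
import numpy as np

def equivalent_time_sample(signal, period, fine_dt, n_points):
    """Equivalent-time ("stroboscopic") sampling: sample a repetitive
    signal once per repetition, with the sample clock running slightly
    slower than the signal, so each pass lands fine_dt deeper into the
    waveform. Effective resolution is fine_dt, not the sample spacing."""
    t_eff = (np.arange(n_points) * (period + fine_dt)) % period
    order = np.argsort(t_eff)
    return t_eff[order], signal(t_eff[order])

# A 1 MHz square wave reconstructed with 1 ns effective resolution,
# even though consecutive real samples are a full microsecond apart.
square = lambda t: np.where((t % 1e-6) < 0.5e-6, 1.0, 0.0)
t, v = equivalent_time_sample(square, period=1e-6, fine_dt=1e-9, n_points=1000)
```

The catch, of course, is that this only works on repetitive signals; the CCD approach captures a single shot.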

nottorp

Unrelated but related:

I have an entry-level standalone oscilloscope that I got but never used. I once looked for tutorials and unpacked it, ready to test, but:

It's covered in that kind of plastic that goes all gooey if left unattended for a long time.

Any hints on how I can clean it up so I can touch it again?

pehtis

It's called soft-touch or "rubber oil" coating. It's basically silicone suspended in plastic, which gives that soft-touch feeling before it biodegrades into sticky goo. You can easily clean it off with isopropyl alcohol. You will be left with the clean hard plastic below that layer.

MarkusWandel

This isn't nearly as technologically radical, but a Tek scope of that era that we had in the lab had a sequential colour display: a monochrome CRT with a fast-changing electronic colour filter in front of it. Totally sharp (for the era) colour graphics with no subpixels. Until you moved your head, that is, and briefly saw the individual colour frames. The HP stuff was stodgy and boring and predictable by comparison.

The HP logic analyzers back then had a really neat touchscreen interface based on criss-crossing infrared beams in front of the CRT face. The only thing that I've ever used that felt even better than a capacitive touchscreen, though obviously lower resolution.

tverbeure

I'm pretty sure that this TDS 684B has the same kind of CRT. When I take a photo of the display, you can often clearly see it switching from one color field to the next in the middle of the screen.

As for the HP touch screen, I tore down a bunch of HP 16500A logic analyzers and reverse engineered the touch screen PCB. It uses a pretty simple LED/photo sensor matrix. You can see the PCB in one of the pictures here: https://tomverbeure.github.io/2022/06/17/HP16500a-teardown.h....

grishka

Modern video projectors also display color channels sequentially and also exhibit that effect. Also, this particular CRT technology was not only used in oscilloscopes: https://www.youtube.com/watch?v=z-q8ehzHeQQ

sgerenser

Not all video projectors, only color wheel DLP projectors (which were once the majority of the home projector market but not sure if that’s still the case). Always drove me nuts but apparently most people don’t notice it.


cmrdporcupine

Not any kind of EE, just a nerd: could this kind of CCD circuitry be used to model a neural network, but in analog-land?

bgnn

There are similar methods to model neural networks in analog. Switched-capacitor circuits like CCDs are semi-digital, as they are time-discrete. This makes them very interesting for analog computing. It never really got popular though, as capacitors are much larger than transistors, and you need an ADC and a DAC to interact with a digital computer. In fact you need a lot of them, and they consume even more power than the digital alternatives.

The biggest use case for this is sensor interfaces, where the signal is still analog (not passed through an ADC yet). Voice recognition is a typical example where analog neural networks are used with a certain level of success, and now people are pushing for image recognition, but the architecture of a digital camera isn't compatible with that, so I don't see much happening there.

Fun fact: these kinds of circuits have been used heavily in the analog portions of chips since the early 90s to implement rather complex calibration/training loops (correlation, LMS optimization, pattern recognition, etc.). There's a lot of analog computing happening in every SoC.

formerly_proven

> The input to the ADC is clearly already chopped into discrete samples, with a new sample every 120 ns. We can discern a sine wave in the samples, but there’s a lot of noise on the signal too. Meanwhile the TDS684B CRT shows a nice and clean 1 MHz signal. I haven’t been able to figure out how that’s possible.

Is this maybe using some form of correlated double sampling?
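For readers unfamiliar with the term: correlated double sampling takes two reads per sample, one of a reset/reference level and one of the signal, and subtracts them, cancelling any offset common to both. A toy model (purely illustrative; nothing here is known about the TDS684B's actual scheme):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000
signal = np.sin(2 * np.pi * np.arange(n) / 100)

# Each sample sees a random offset (think kTC/reset noise) that is the
# same in the reference read and in the signal read taken right after it.
offset = rng.normal(0.0, 0.5, n)
reference_read = offset
signal_read = signal + offset

# CDS: subtract the two reads; the correlated offset cancels exactly.
cds = signal_read - reference_read
```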

> It looks like the signal doesn’t scan out of the CCD memory in the order it was received, hence the signal discontinuity in the middle.

Or maybe the samples are also interleaved in the low-order bits in some way. This could be because the organization of the CCD isn't symmetric for the input and output paths, perhaps to reduce area or power, since only one path has to be fast. This would make sense because if you implement the CCD using n parallel CCD bucket brigades you only have to put a fast S&H and multiplexer in front of it, then you can drive the CCD brigades at a fraction of the actual sample rate, and the capacitive load of each of those clock phases is much lower as well.
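The grouping described above can be sketched: with n slow brigades behind a fast S&H and mux, readout returns the samples grouped by brigade rather than in capture order, and de-interleaving is just an index permutation. The brigade count below is an arbitrary assumption for illustration:

```python
import numpy as np

N_BRIGADES = 4           # hypothetical number of parallel CCD bucket brigades
samples = np.arange(20)  # capture order: sample k lands in brigade k % N_BRIGADES

# Readout drains one whole brigade at a time, so the output stream comes
# back as [0, 4, 8, ...], [1, 5, 9, ...], ... instead of capture order.
readout = np.concatenate([samples[b::N_BRIGADES] for b in range(N_BRIGADES)])

# Knowing the scheme, de-interleaving is just the inverse permutation.
perm = np.concatenate([np.arange(b, len(samples), N_BRIGADES)
                       for b in range(N_BRIGADES)])
recovered = readout[np.argsort(perm)]
```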

tverbeure

I've been doing more measurements after some discussions on discord after publishing the blog post. There's definitely some kind of interleaving going on, but even when applying a square wave at the input (that gets displayed perfectly fine on the scope CRT) there is a lot of noise on the ADC input that can't be explained away. I will update the blog post tonight after doing more measurements.

Some people have also suggested deliberate addition of a pseudo-random signal that gets removed after sampling to counteract some CCD issues. But I don't know how that would work either.

crote

Could it be some kind of fixed per-bucket offset?

I wouldn't be surprised if the CCD has all sorts of funky analog stuff going on internally which has different impacts on different samples, which would be incredibly hard to deal with on its own.

However, if this behaviour is merely a fixed offset, it would be fairly easy to compensate for this on the digital side: just do a calibration with a known signal, and the measured offset can be used to reverse its effect in future sampling.
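A minimal version of that calibration, assuming the offsets really are fixed per bucket (all values below are synthetic): capture a known DC level once, store the per-bucket residuals, and subtract them from later captures.

```python
import numpy as np

rng = np.random.default_rng(42)
n_buckets = 512
bucket_offset = rng.normal(0.0, 0.1, n_buckets)  # fixed per-bucket error

def ccd_capture(wave):
    """Toy CCD: every capture picks up the same per-bucket offset."""
    return wave + bucket_offset

# Calibration pass: capture a known DC level, record the residuals.
dc = 1.0
calibration = ccd_capture(np.full(n_buckets, dc)) - dc

# Any later capture can subtract the stored residuals.
ramp = np.linspace(0.0, 1.0, n_buckets)
corrected = ccd_capture(ramp) - calibration
```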

tverbeure

Could be. I've already measured with sawtooth and square waveforms to get a better idea about the interleaving and noise, but not yet with a pure DC input.

Another possibility is that there's some charge decay which you could calibrate for.

monster_truck

The shapes in the noise vaguely look like they are repeating. Perhaps something like time%sampleRate (or some other context dependent value)? Easy enough to filter out while still providing enough to know it's a coherent signal
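If the noise really does repeat with a fixed period, estimating and subtracting it is straightforward: average all samples that share the same phase modulo that period. A sketch with an assumed 8-sample period (the exact cancellation of the signal term relies on the capture spanning whole signal periods):

```python
import numpy as np

PERIOD = 8  # assumed repeat length of the noise pattern, in samples
rng = np.random.default_rng(1)
pattern = rng.normal(0.0, 0.2, PERIOD)  # a fixed, repeating "noise" shape

n = 800
clean = np.sin(2 * np.pi * np.arange(n) / 50)
noisy = clean + np.tile(pattern, n // PERIOD)

# Average all samples at the same phase (index % PERIOD). The repeating
# pattern survives the average; the sine's contribution at each phase
# sums to zero here because 800 samples span whole signal periods.
estimate = noisy.reshape(-1, PERIOD).mean(axis=0)
filtered = noisy - np.tile(estimate, n // PERIOD)
```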

formerly_proven

You mean like a signal added to combat / detect aliasing? I'm not sure how that would work. I know HP did something like this, but they did it by introducing a known jitter into the time base. I don't think it could work by merely adding a signal; you need to interact with the incoming signal in the time domain to do something about aliasing. I have no experience with these older CCD-based ones, but the later TDSes from the 90s could do aliasing. The time base jitter thing was probably patented by HP.
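A toy illustration of why timebase dither exposes aliasing (HP's actual scheme is surely more sophisticated; all numbers here are invented): an out-of-band sine looks like a clean in-band one under uniform sampling, but collapses into broadband noise once the sample instants are jittered, while a genuine in-band signal would survive the jitter.

```python
import numpy as np

rng = np.random.default_rng(7)
fs, f_in, n = 1e6, 4.1e6, 2000  # 4.1 MHz sine at 1 MS/s aliases to 100 kHz

t_uniform = np.arange(n) / fs
t_jitter = t_uniform + rng.uniform(-0.5, 0.5, n) / fs  # dithered timebase

uniform = np.sin(2 * np.pi * f_in * t_uniform)   # looks like a clean 100 kHz sine
jittered = np.sin(2 * np.pi * f_in * t_jitter)   # the alias smears into noise

# Crude detector: with dither, the energy at the apparent (aliased)
# frequency collapses; a real 100 kHz signal would barely be affected.
k = int(round(1e5 * n / fs))  # FFT bin of the 100 kHz alias
energy_uniform = abs(np.fft.rfft(uniform)[k])
energy_jitter = abs(np.fft.rfft(jittered)[k])
```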

dp-hackernews

I've not read the article, but from your response, is this similar to "oversampling" from the world of CD audio?

Oversampling Versus Upsampling: Differences Explained https://www.soundstagenetwork.com/gettingtechnical/gettingte...

amelius

Maybe this CCD/delay-line/bucket-brigade trick can finally inspire someone to make a cheap DIY device that can sample USB3 signals, useful e.g. to check signal integrity. A missing tool in my toolbox.

I mean, I think this would be a very nice project for someone with hardware skills and some time on their hands, and it would be useful too.

tverbeure

This whole exercise started because I wanted to see if it's possible to add a sniffer to the memory chips of the TDS 684B so that I can use it as the analog frontend of a USB scope. It's not possible because of this CCD trick.

I've also looked for CCD memory, but it doesn't seem to be a thing anymore; I couldn't find any modern chips like that.

fecal_henge

https://www.psi.ch/en/drs

Not quite the same, but similarly novel.

tverbeure

I didn't know that kind of product existed! I don't think it's useful to slow-sample a fast signal (e.g. they delay digital signals, not analog ones), but some of those chips have a BW of 1.5GHz.


topspin

The Thunderscope story had a comment about USB3 sampling as well. Here, you cite "check signal integrity" as a reason. I'm curious: is USB3 signal integrity really an active area of research or troubleshooting? I would have figured this has all been well characterized and mostly solved by device manufacturers selling working USB3 components by the boatload for years now.

crote

High-speed signals are incredibly fragile. The chips themselves are a solved problem, but you still need quite a bit of skill and experience to design a PCB for them. It's quite easy to mess up in subtle ways which look like they should work, but actually ruin your signal to the point of being completely unusable.

If you're designing a board, not being able to look at its signals is a major limitation. Is something wrong with the transmitter, receiver, cable, connector, PCB, firmware, or driver? Who knows! It doesn't work, and that's all you're going to get. Have fun randomly tweaking stuff in the hope that it magically starts working.

tverbeure

The ICs are a solved problem, but hobbyists like me designing their own USB3-capable PCBs is a different story. Whether a high-BW scope would help detect signal integrity issues isn't clear to me: measuring these kinds of signals is an art in itself.


blagie

Tektronix instrumentation from this era (as well as HP/Agilent, and many of the military-affiliated labs) used pretty magical engineering tricks.

In order to be able to design equipment, the instrumentation generally needs to outperform the equipment, sometimes by a significant margin. If I'm looking at the eye of a digital signal, I need to capture much faster than the signal.

It'd be fun to have a book of tricks from this era. At some point, it will fade into obscurity. Right now, it's a whole different bag of tricks for the state-of-the-art; they feel more textbook and less clever.

On the other hand, what's nice is that in 2025, decent equipment is cheap. There's a breakpoint around 100 MHz: below that you can't do basic work, and above it you can. That's roughly where FM pickup and a lot of oscillations sit. That used to cost a lot, but as technology progressed, we're at a point where a decent home lab can be had for well under a grand.

mycatisblack

> In order to be able to design equipment, the instrumentation generally needs to outperform the equipment, sometimes by a significant margin.

Flashback to my days as a beginning TLP engineer. I was subjecting ESD protection structures to kV pulses with ~nanosecond rise times. The oscilloscope measures the pulse as it enters and reflects. You'd increase the voltage until the device broke, then do a wafer mapping. I remember a conversation where I showed the setup to a colleague from a different department, telling him we were developing next-gen protection against static discharges.

To which he replies: why don’t we use what the guys from the oscilloscope are using?

scrlk

> It'd be fun to have a book of tricks from this era.

Though it isn't a book, the Hewlett-Packard Journal is a gold mine for this type of content: https://web.archive.org/web/20120526151653/http://www.hpl.hp...

E.g. An 8-Gigasample-per-Second, 8-Bit Data Acquisition System for a Sampling Digital Oscilloscope (October 1993): https://web.archive.org/web/20120526151653/http://www.hpl.hp...


floxy

>It'd be fun to have a book of tricks from this era.

I think you'll get a kick out of:

"Analog Circuit Design: Art, Science and Personalities"

https://www.amazon.com/Analog-Circuit-Design-Personalities-E...
