
How to reverse engineer an analog chip: the TDA7000 FM radio receiver

kens

Author here if you have questions on this chip...

magnat

The separate noise source is a bit of a surprise here. Why is it necessary? Wouldn't RF noise produce the same result?

kens

I'm not sure what the FM demodulator produces when it's mistuned, but I'm guessing that you'd get pretty much no output, rather than white noise (since there's no frequency for the demodulator to lock onto). The problem for the user is that you wouldn't know if your batteries are dead or if you just haven't found the station. By adding a "hiss" between stations, the radio becomes more usable.

wkat4242

It depends: if the RF frequency you use has a signal on it, then it won't be random, so it's not really noise. I wonder why they need a noise generator in a receiver chip, though. They're usually used for crypto stuff.

CamperBob2

It's to provide "comfort noise" when the correlator indicates a missing or mistuned signal.

Muting the audio would make more sense -- and would certainly have been familiar to the CB[1] radio operators of the day in the form of a squelch effect -- but this chip was targeted at consumers who expected it to behave like a conventional FM radio.

1: An early incarnation of social media, for better and worse

CamperBob2

In a conventional radio, yes, but I'll bet this approach would sound incredibly awful if mistuned.

contingencies

Hey Ken, great read as always. I wonder if in future you would consider doing an overview of the various early radio chips and their evolution. I recall recently reading some HAM projects and understanding that a lot of the later radio chips were clones of earlier designs. Given your suggestion that this earlier period of integrated radio innovation is 'low-hanging fruit' in terms of RE-friendliness, it should be an interesting read, and I'm sure a very large number of radio enthusiasts would love to see your insights.

CamperBob2

The correlator is interesting. I don't see how it works. In the perfectly-tuned case, how does delaying the signal by half an (IF?) period and inverting it yield a match for the original signal? Inversion isn't the same as a delay.

I guess the idea is that the 70 kHz IF is effectively sampled at twice the Nyquist rate needed for 15 kHz baseband audio. So the signal content half a period earlier can be relied upon to match after an inversion and delay, assuming it was (a) band-limited at the source (or by the clever deviation-reduction scheme), which it would be; and (b) tuned correctly.
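To see why the invert-and-delay trick discriminates a correctly tuned carrier: for a tone at exactly the IF, x(t - T/2) = -x(t), so negating the half-period-delayed copy reproduces the original, and the correlation is maximal; off-frequency, the phase shift is no longer π and the correlation falls. A minimal numerical sketch (my own illustration, not the chip's actual circuit; the 70 kHz IF is from the article, the 7 MHz simulation rate is chosen so half an IF period is an integer number of samples):

```python
import numpy as np

F_IF = 70e3                      # nominal TDA7000 IF
FS = 7e6                         # sim rate: half IF period = exactly 50 samples
t = np.arange(0, 2e-3, 1 / FS)   # 2 ms of signal
HALF = int(FS / (2 * F_IF))      # samples in half an IF period (50)

def match(f):
    """Normalized correlation of a tone at f with its
    inverted, half-IF-period-delayed copy."""
    x = np.cos(2 * np.pi * f * t)
    a = x[HALF:]                 # "live" signal
    b = -x[:-HALF]               # inverted, delayed copy
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

print(match(70e3))   # on-tune: delay is exactly half a period -> ~1.0
print(match(35e3))   # mistuned: delay is a quarter period -> ~0.0
```

At 35 kHz the 50-sample delay corresponds to a quarter of that tone's period, so cos and the inverted delayed copy are in quadrature and the correlator output collapses, which is the condition the chip uses to switch in the mute/noise source.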