
BPS is a GPS alternative that nobody's heard of

Lammy

I hope it will still be possible to receive a BPS timing signal privately and anonymously with ATSC 3 like one can with GPS. ATSC 3 has the Dedicated Return Channel because marketers “““need””” to spy on every-fucking-thing we do: https://www.atsc.org/wp-content/uploads/2024/04/A323-2024-04...

“Conventional linear TV services alone (albeit ultra-high-definition) may not be sufficient to sustain the terrestrial broadcasting business which requires a large amount of highly coveted spectrum resources. Intelligent media delivery and flexible service models that maximize the network Return on Investment (ROI) is of paramount importance to the broadcasting industry in the new era.”

That's a lot of fancy words to say ‘we're doing this because it makes us more money’ lol

“Recent studies have shown that interactivity between media customers and service providers and between users themselves will be one of the most important features in the next-generation media service. In this document, this unique opportunity is addressed by defining a Dedicated Return Channel (DRC) system for the next-generation broadcasting system.”

geerlingguy

Yeah... and that's one of the most innocuous new "features" in ATSC 3.0.

Almost everything I've seen (besides BPS, and maybe HDR if you're one of the few who has a really good home theater setup) is a benefit for broadcasters and advertisers, and a bit worse for consumers (especially requiring new hardware/decoders... and sometimes persistent Internet connections!).

1970-01-01

Same feeling here. ATSC 3.0 with DRM and persistent Internet requirements tells me this is going to be the downfall of OTA television. I can see ATSC 4.0 being a discounted ISP subscription paired with some OTA location checks via BPS.

throwing_away

There is 1 provider for ATSC3 DRM (which is already rolling out in major markets): Google Widevine.

There is 1 operating system for ATSC3 DRM: Android.

There are several SoCs that can be used for "Level 1 Widevine".

When a SoC is compromised and the key is leaked from the TEE, all models of that device with the key are now untrusted for Level 1.

I think people should just be aware of the state of play.

Alive-in-2025

So ATSC 3 won't work unless you have some continuous "I'm user xyz watching this channel at time x.y.z" report going back to the broadcaster?

Why would anyone use ATSC 3? It's not free over the air, and you can't spoof it?

philistine

This is so user-hostile that I don't believe there will be any adoption. It will have to be shoved down the throats of those who use broadcast, who are overwhelmingly older and more remote.

ATSC first generation will probably outlive this DRM-driven abomination.

threemux

You're not wrong, but don't forget about the upgraded modulation and coding schemes. That might actually help consumers on the edge of coverage receive broadcasts, and it will definitely be an improvement over creaky 8VSB.

pridkett

I live in an area that is on the edge of coverage and has lots of hills. On ATSC 1.0, CBS is hard to pick up. Frequently unwatchable - which means unreliable for sports. I picked up an HDHomerun Flex 4K a few years ago. Basically the same week that ATSC 3.0 went live.

For a few weeks it was glorious. I had no problem picking up CBS (it was broadcast from the same antenna as ATSC 1.0 - so it was the modulation that was helping out). And then, after a little over a month: whack! No more CBS. They turned on DRM. They are still the only network in my area with DRM. Ughh.

Under the previous administration I filed a few issues about this with the FCC from a public safety perspective - I live in an area with unreliable power. During severe weather, we often lose Internet and power (which knocks out cable TV too). Requiring working internet to watch TV to monitor the progress of a tornado in your area seems stupid and dangerous. Unfortunately, nothing happened then regarding the issue, and given the way that Brendan Carr is taking the FCC, I don’t think there will be any progress on this.

swores

> "That's a lot of fancy words to say ‘we're doing this because it makes us more money’ lol"

You say that as if they're using lots of words to obfuscate that fact, but the quote you pasted has them saying entirely directly "maximize the network Return on Investment", which is just normal business terminology (and only one word more than your "it makes us more money"!)

Obviously this has no impact on whether that's a good or bad thing, I'm just pointing out that they weren't using a lot of words to hide that fact.

Lammy

The utilization of an expanded lexicon, replete with polysyllabic and sesquipedalian terminology, engenders an ostensibly enhanced verisimilitude and an amplified capacity for rhetorical suasion in the articulation of one's propositional assertions.

swores

"Return on investment" isn't using uncommon words to try to be hard to interpret, it's a standard business term in English-speaking countries.

The fact that it seemed like its intention was to obfuscate is just a sign that you're not familiar with basic business terminology, nothing more.

(And there's no shame in that, nobody knows the common phrases in every area of society, if you've never taken any business classes nor been involved in running a business there's no reason you should know it - but that doesn't mean people who use those words are trying to hide what they mean.)

Edit: and unsurprisingly it's therefore also frequently seen on HN:

https://hn.algolia.com/?q=return+on+investment

https://hn.algolia.com/?q=ROI

dajtxx

Shorter, Milchick!

throw0101d

For anyone who wants to know about ATSC 3.0, the Antenna Man channel covers over-the-air (OTA) stuff in the US:

* https://www.youtube.com/watch?v=cw3W7MoafR4

* https://www.youtube.com/@AntennaMan/videos

ATSC 3.0 allows for DRM/encryption as the parent comment mentions.

xattt

I just realized the BPS is there to augment the return channel. Not only can the advertiser figure out what you are watching, but also where you are located.

lxgr

Wi-Fi geolocation has been around for a while now and is very accurate too, so if TV/receiver manufacturers choose to reveal your location to advertisers (whether out of their own greed or because it's required to receive ATSC 3.0 DRM keys), they can already do so without any problem.

Lammy

Thanks to the beamforming that's central to modern Wi-Fi and 5G cellular standards they can even see through your walls using the backscattered energy of the steered beam, like an airport security scanner for the entire planet:

https://arxiv.org/pdf/2301.00250.pdf

https://people.csail.mit.edu/fadel/papers/wivi-paper.pdf

karaterobot

> Recent studies have shown that interactivity between media customers and service providers and between users themselves will be one of the most important features in the next-generation media service. In this document, this unique opportunity is addressed by defining a Dedicated Return Channel (DRC) system for the next-generation broadcasting system.

Wow, that's one of the best uses of corporate-speak euphemism I've seen. Everybody who reads it knows what it really means, but if you just don't say it, it's fine. Recent studies indeed!

kmeisthax

Wait, to be clear, this 'dedicated return channel' is just for TVs to broadcast back to the station that they're watching the adverts? I thought ATSC 3.0 was going to rely on IP backhaul for that. Actually broadcasting back seems... impractical at best.

I mean, let's keep in mind, even ATSC 1.0 had really awful reception issues; compared to analog NTSC where there was enough redundancy that you could just tune into a garbage station from way too far away and see something. Now imagine trying to make that already unreliable channel bidirectional. I just really hope all the return channel stuff is optional, because it sure as hell isn't going to work without way more stations broadcasting on more channels, and OOPS you've reinvented LTE.

DoubleGlazing

Back in the late 90s when Ireland was starting to think about digital terrestrial TV, a system called DVB-RCT was considered. Basically your receiver could also transmit back to the television transmitter. The system could handle thousands of concurrent connections, albeit each one had very low bandwidth - around 1 kbit/s at peak time. That was considered good enough for very basic interactivity and for authorising PPV purchases etc. In quieter hours, or in areas where there were fewer receivers, that bandwidth would be much higher, but in reality that would be rare.

In the end the company that the government selected to start the rollout of DTT went bust, and I don't think the system was used anywhere else. The developer of the technology abandoned it in 2006 as other connection methods (broadband/mobile data) were preferred.

timewizard

Ironically there's no redundancy in NTSC. There are layers of information and they degrade downward until you have just a black and white picture with no sound.

In ATSC there are two types of forward error correction on the digital bitstream. The problem it faces is that it sits in the same channel allocations as NTSC while having to deliver significantly more information than NTSC. That, and the actual digital modulation used is not as ideal for receivers to capture.

kmeisthax

You can digitize and lossily compress an NTSC signal significantly without losing much detail; that's why I call it redundant. Compression removes redundancy.

In ATSC the tradeoff between compression and error correction is such that a noisy channel is far more likely to cut out or otherwise be unusable than it would have been in NTSC.

mindslight

I would think that the expectation is for the common option to be IP backhaul based on embedded LTE/5G modems. Having a separate communication scheme is probably more of a hedge for super rural areas that don't have LTE/5G or other IP coverage, especially as that is where broadcast TV will have more staying power.

Lammy

This is my understanding as well. You can see the usage-reporting scheme and data structure here: https://www.atsc.org/wp-content/uploads/2024/04/A333-2024-04...

“The fundamental record that captures consumption information is called a Consumption Data Unit (CDU). For a streaming A/V channel, each CDU identifies a reporting interval during which a service is accessed. Such a CDU includes the service identifier, the time the service access started and the time the service access ended. If any Applications are active during the report interval, it also records when the Applications are active (whether on a primary device or a “second screen”, companion device), including the Application Identifier, the time the Application started being active, and the time it stopped being active.”

“For services, events logged into a CDU shall correspond to all usage intervals of no less than 10 seconds and may correspond to shorter usage intervals. For Application activity, events logged into a CDU shall correspond to all usage intervals of no less than 5 seconds and may correspond to shorter usage intervals. The precision and accuracy of start times and end times in the CDUs should be within 1 second.”

The payload schema is a 4651-byte JSON structure, so I would imagine that a response-payload fitting this schema would be same-order-of-magnitude size. With 10-second granularity that works out to roughly a half-kilobyte-per-second data rate, and according to the DRC spec the maximum payload size of one DRC message is 2048 bytes.

It will also report when you play something back from a DVR:

“Component.SourceDeliveryPath – Delivery path used for or the source of the content component indicated by the parent Component element.

SourceDeliveryPath.type –

0 – Broadcast delivery (content component is delivered by broadcast)

1 – Broadband delivery (content component is delivered directly by broadband by broadcaster)

2 – Time-shift-buffer source (content source is local time shift buffer)

3 – Hard-drive source (content source is local hard drive)

4 – Delivery via direct connection (HDMI)

5 – Alternate IP delivery (content component is delivered via intermediary)”
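
Purely as an illustration of the kind of record described above - the key names and nesting below are my guesses from the quoted prose, not the normative A/333 schema - one CDU might look roughly like this as a Python dict:

    # Hypothetical CDU record sketched from the A/333 prose quoted above;
    # field names are illustrative guesses, not the actual schema.
    cdu = {
        "serviceId": "example-broadcast-service",
        "startTime": "2025-04-11T20:00:00Z",        # service access start (>= 10 s intervals)
        "endTime":   "2025-04-11T20:30:00Z",        # service access end
        "applications": [{
            "appId": "example-interactive-app",     # Application Identifier
            "activeStart": "2025-04-11T20:05:00Z",  # app activity (>= 5 s intervals)
            "activeEnd":   "2025-04-11T20:06:30Z",
        }],
        "components": [{
            "sourceDeliveryPath": {"type": 2},      # 2 = time-shift-buffer (DVR playback)
        }],
    }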

m463

Just like 5G, which provides unexpected connectivity for IoT devices.

Search for "miot" or "mmtc"

arghwhat

There is zero relation there.

OP is complaining about the spec incorporating usage monitoring.

5G was designed with a very public and explicit goal for IoT of allowing many more devices to connect than 4G could, and more conveniently. Nothing unexpected or user-harming, and nothing new as 4G was already used for IoT.

wkat4242

Yeah at my work we still get a lot of the old 5G propaganda. "It's groundbreaking for IoT". Beh screw that it's just the successor to 4G. A bit faster and more flexible. The rest is just stupid marketing. There's nothing it can do that 4G couldn't, only slightly less efficiently.

The one thing that was really new was support for super high density environments with mmWave. That would have been ideal for stadiums etc. With regular tech the networks get overwhelmed. mmWave offers more cells in a tiny area. But here in Europe that's been given up for good. Phones don't even come with mmWave antennas anymore.

timthorn

When EE launched 5G in the UK, they took out a full page ad with a massive headline saying simply "A real crowd pleaser"

One of the best adverts for a long time - if you knew about the technical advantages of 5G, it had a real double meaning.

If you knew 90s British dance music, it had a further humorous double meaning promoting the telco (an oblique reference to The Shamen's Ebenezer Goode)

Lammy

> Nothing unexpected or user-harming, and nothing new as 4G was already used for IoT.

https://www.fastcompany.com/90314058/5g-means-youll-have-to-...

elzbardico

We should create technology that deliberately feeds trash data to marketers, in mind-boggling volumes, drowning the signal in a biblical flood of noise.

We should make things as useless and annoying for them as they have made them for us.

geerlingguy

Note that this blog post (and the associated video) were a quick off-the-cuff thing while I was on the NAB show floor—I have been talking to a few of those involved in the testing at NIST, Sinclair, and Avateq (among others), and will hopefully have a lot more in a follow-up.

Right now it's in the experimental stage, with only 6 towers total deployed (only 5 were operational during NAB, and only one in Nevada... so timing, not navigation yet).

The ultimate plan—which is probably dependent on how well ATSC 3.0 rolls out (which has plenty of hurdles[1])—is to encourage broadcasters to add on the necessary timing equipment to their transmitter sites, to build a mesh network for timing.

That would allow the system to be 100% independent of GPS (time transfer could be done via dark fiber and/or ground-satellite-ground directly to some 'master' sites).

The advantages for BPS are coverage (somewhat) inside buildings, the ability to have line of sight nearly everywhere in populated areas, and resilience to jamming you can't get with GPS (a 100 kW transmitter signal 10 miles away is a lot harder to defeat than a weak GPS signal from satellites some 12,000 miles up).
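
A back-of-the-envelope free-space path loss comparison in Python shows why; the GPS-side numbers (roughly 25 W of transmit power from about 20,200 km) and the assumption of isotropic antennas on both ends are mine, so treat the outputs as order-of-magnitude only:

    import math

    def received_power_dbm(tx_power_w, distance_m, freq_hz):
        """Free-space received power, assuming isotropic antennas on both ends."""
        fspl_db = (20 * math.log10(distance_m) + 20 * math.log10(freq_hz)
                   + 20 * math.log10(4 * math.pi / 3e8))
        return 10 * math.log10(tx_power_w * 1000) - fspl_db

    # ~100 kW UHF broadcast transmitter 10 miles (~16 km) away, ~600 MHz
    tv = received_power_dbm(100_000, 16_000, 600e6)
    # GPS L1: ~25 W from ~20,200 km (assumed figures), 1575.42 MHz
    gps = received_power_dbm(25, 20_200_000, 1575.42e6)

    print(f"TV:  {tv:.0f} dBm")   # about -32 dBm
    print(f"GPS: {gps:.0f} dBm")  # about -139 dBm (the GPS spec guarantees ~-128.5 dBm
                                  # thanks to satellite antenna gain)

That's a gap of roughly 100 dB in received power, which is the gap a jammer has to close.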

The demo on the show floor was also using eLoran to distribute time from a site in Nevada to the transmitter facility on Black Mountain outside Vegas, showing a way to be fully GPS-independent (though the current eLoran timing was sourced from GPS).

[1] ATSC 3.0, as it is being rolled out in the US, doesn't even add on 4K (just 1080p HDR), and tacks on 'features' like 'show replay' (where you tap a button and an app can stream a show you're watching on OTA TV through the Internet... amazing! /s), DRM (at stations' discretion, ugh), and 'personalized ad injection' (no doubt requiring you to connect your TV to the Internet so advertisers can get your precise location too...). Because ATSC 3.0 requires new hardware, consumers have to be motivated to buy new TVs or converter boxes—I don't see anything that motivates me to do so. I feel like it may be a lot like the (forever ongoing) HD Radio rollout.

toast0

I bought an ATSC 3 tuner, and the experience turned me off of OTA TV. Things have since managed to get worse: when I was poking around, DRM wasn't in use, but now it is.

I was hoping to get better fidelity from the roughly 2x bitrate per channel combined with the video codec update. And, probably overly optimistically, I was hoping the 1080p feed source was progressive so there wouldn't be a deinterlacing step.

OTOH, local broadcasters use an audio codec I can't easily use, integration with MythTV is poor, and there's no sign anything is going to get better soon.

Maybe if I had a tv with an atsc 3 tuner, live tv would be an option, but I'm not buying a tv for that.

ATSC 1.0 took a while before gathering momentum, so maybe that's going to be the same here, and in another few years, it might make sense to consider a transition. OTOH, maybe the writing is on the wall and OTA broadcasting will die on this hill. I was an OTA enthusiast, but between ATSC 3 being terrible, and the reallocation of spectrum that means cellular base stations sometimes overwhelm my pre-amp, it's not much fun anymore. (I have a filter post-pre-amp but it'd be better if I got on the roof to put it pre-pre-amp, but roofs are scary) Maybe I'm just getting curmudgeonly though.

ksec

Why is US ATSC 3.0 so bad? It is nearly a decade since South Korea had it deployed and operational. The standard itself is no longer "next gen". Brazil's TV 3.0, which also uses ATSC 3.0, is so much better in every aspect.

Even if someone mandated it as a requirement for TVs sold next year, all the tech inside is at least 10 years old (HEVC?). Not to mention the rollout time. Do Americans only watch cable and Netflix, and not free-to-air TV? Which is what I believe most of the world still does to a large extent, other than Internet streaming.

They might as well look into the standards before mandating it.

michaelt

Broadcast TV modernisation is trapped between a load of enemies.

To the north, competition from a huge installed base of last-gen technology, which is mostly good enough.

To the south, streaming services, youtube and cable. These let people watch whenever they want (nobody has VCRs any more) and they've offered 4k for over a decade.

To the east, the industry's dumb decision to build the 'next gen' technology atop a patent minefield, and load it with DRM. So if you manufacture this tech, you can face huge surprise bills because in implementing the spec you've unknowingly infringed on some nonsense patent.

And to the west, the commercial reality that showing someone an advert in 4K isn't any more profitable than showing the advert in 1080p. If you're a broadcast TV station when you up your quality everything gets more expensive but you don't make any extra money. So why bother?

ryandrake

> If you're a broadcast TV station when you up your quality everything gets more expensive but you don't make any extra money. So why bother?

In a functioning, competitive market, the answer to this is "Customers choose a competing broadcast TV station with higher quality." Unfortunately what we have is far from that.

nyanpasu64

Nitpick: ATSC 1.0 only offers broadcast in 720p or 1080i... of course with overscan and all, nobody actually notices the resolution of TV.

extra88

Only "18% of U.S. TV households had at least one TV set enabled to receive free, broadcast programming."

https://www.nielsen.com/insights/2024/beyond-big-data-the-au...

ksec

Wow. Thanks. So when Americans say they are watching TV, I assume that mostly means watching Netflix or cable?

donatj

Broadcast TV after the digital rollout was so bad many people just stopped watching TV. Picking it up is such a hassle it's simply not worth the effort for some ad-laden TV.

At the time of the switchover in the early 2000s I lived about 40 miles from a major metropolitan area, Minneapolis, which is pretty close in US terms. We spent hundreds of dollars on different antennas (indoor and outdoor) and signal boosters and whatnot, and it was simply never reliable.

In 2008 I moved to my current location, three miles outside of downtown Minneapolis. Again I tried a number of antennas and still found operation to be anything but reliable. I gave up and began just watching Netflix.

The people who live close enough to the broadcasts to pick it up have easy access to cable TV. The people who live in the countryside who used to depend on it can't pick it up. There's just no place for the TV system we were given.

mmooss

> Broadcast TV after the digital rollout was so bad many people just stopped watching TV.

That is the first time I've heard that. Everything I've heard has been positive - people amazed that others aren't doing it. Are there any numbers on user satisfaction?

I used it myself once or twice and it worked simply with antennas that were relatively cheap (<$50 iirc). Maybe there was a problem in Minneapolis?

> The people who live close enough to the broadcasts to pick it up have easy access to cable TV.

Cable is expensive for many people and broadcast is free, of course. (Also, Broadcast is more private, for now.)

cozzyd

I live in downtown Chicago and get tons of channels, despite no line of sight due to buildings in the way. Though they tend to go out when the El passes by.

throw0101d

> The demo on the show floor was also using eLoran to distribute time from a site in Nevada to the transmitter facility on Black Mountain outside Vegas, showing a way to be fully GPS-independent (though the current eLoran timing was sourced from GPS).

There's been a consistent call by many people that there needs to be a diversity of options for navigation and timing:

* https://rntfnd.org/2025/02/04/pnt-gps-critical-issue-for-new...

China has GNSS (BeiDou, plus plans for LEO), plus terrestrial navigation (eLoran), plus a fibre-based network for accurate timing:

* https://rntfnd.org/2024/10/03/china-completes-national-elora...

* https://rntfnd.org/2024/03/01/patton-read-their-book-chinas-...

* https://rntfnd.org/2024/11/29/china-announces-plan-to-furthe...

Russia has a Loran-equivalent:

* https://en.wikipedia.org/wiki/CHAYKA

mschuster91

Well... there's also the electricity grid, which can cover timing needs that only have to be accurate to about a second, and in Europe there's DCF77 [1], which can be used not just as a timing standard but also as a frequency standard with a relative accuracy of about 2×10^-12.

[1] https://de.wikipedia.org/wiki/DCF77

lsaferite

Did you actually mention what BPS stands for in the article? I read the whole thing and don't recall reading that. Yes, I'm capable of searching and finding the information myself, but in an article about something esoteric like this, explaining the acronym would be useful.

Edit: Broadcast Positioning System for anyone that didn't figure it out.

The_Double

How does it solve for time without location? With GPS, location and time are one solution to an equation with 4 unknowns (x, y, z, t). Without location you won't know the time delay between you and the transmitter.
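
For context, the textbook GNSS pseudorange model (my paraphrase in LaTeX, nothing BPS-specific) is below: each measured pseudorange mixes geometry and the receiver clock error, so four measurements pin down the four unknowns, whereas a receiver that already knows its surveyed position can recover the clock offset from a single transmitter.

    \rho_i = \sqrt{(x - x_i)^2 + (y - y_i)^2 + (z - z_i)^2} + c\,\Delta t, \qquad i = 1, \dots, 4

    \text{known } (x, y, z): \quad c\,\Delta t = \rho_1 - \sqrt{(x - x_1)^2 + (y - y_1)^2 + (z - z_1)^2}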

fnordpiglet

The transmitters are at fixed terrestrial locations.

michaelt

So you set your clock up by telling it its own location, so it can offset for the signal's flight time?

namibj

The satellites are at known positions too, once you know the time.

lxgr

High-power, and ideally authenticated, alternatives to space-based GNSS are desperately needed, given the sharp uptick in jamming and spoofing incidents in many places.

In a true "end of history" moment, the US and other NATO members discontinued both of their ground-based systems (which are inherently harder to jam due to their much higher transmission power, since transmitters are not power limited) – Omega in the late 1990s and Loran-C in the early 2010s – in favor of GPS, while Russia kept their equivalent functional, and China completed an eLoran network last year.

Add to that the FAA's reduction of their ground-based VOR/DME station network that lets planes navigate when GPS is unavailable...

GPS jamming, and much more concerningly spoofing, will probably quickly come within reach of non-nation-states and smaller groups of all kinds, and ultimately individual actors, and that can't possibly end well for civil aviation if robust countermeasures don't become available very soon.

typewithrhythm

You can't really beat a jammer; sure, you can compete on power output, but there is no real way of stopping it.

Aircraft and military positioning concepts are evolving towards more map-based navigation and dead reckoning, lessening the benefit of GPS jamming.

zinekeller

The reason dead reckoning was inaccurate was clock and vector inaccuracies. Looking at the advances in clocks and gyroscopes over the years (both of these have benefited from the optical revolution), I am not shocked that dead reckoning is back in vogue.

cameldrv

Dead reckoning is also inaccurate due to unknown winds. Even if you take off with the best available forecast, it’s often wrong by 5mph+. After three hours your position is off by 15 miles. That’s not remotely good enough for most aviation purposes.

plextoria

Correct me if I’m wrong, but wouldn’t a jammer be very easy to disable kinetically?

A missile would simply have to follow the jammer’s signal.

bluGill

Missiles are expensive. Jammers can be farther away than a cheap drone can reach.

Jammers often move. Your missile often cannot maneuver well enough to hit. Jammers often turn off - if your missile is detected, they turn the jammer off and move it. They are often running more than one jammer, so getting one to turn off isn't going to matter.

petre

The obvious problem with this approach: destroying a jammer in a foreign country with a missile is an act of war.


firesteelrain

Be an interesting feature to add to the AGM-88E.

fpoling

Plus aircraft can use an Earth gravity map. Sensors have become accurate enough to detect minuscule changes in gravity strength to position an aircraft to within a few hundred meters.

bob1029

You can defeat a jammer if you are willing to use more spectrum than your opponent and/or can sacrifice information rate.

lxgr

True, but both are very hard to do in a system that's deployed to space and in the (heavily regulated) avionics of aircraft that get upgraded very infrequently, to say nothing of the billions of other civilian receivers.

amelius

What if you used directional antennas?

jeffbee

A University of Texas research group demonstrated more than ten years ago that they could spoof GPS in the vicinity of an automatically navigating UAV and force it to land at a point of their choosing. This has been within the reach of garage hackers for a long time.

Scoundreller

Now the UAV needs to track where the signal is coming from. “Hey, this isn’t coming from the sky, what gives”

touisteur

Gets trickier when you're using dGPS. And you need at least 2 antennas/receivers to know the direction of arrival. Beamforming in the direction(s) of the expected satellite(s) seems to help a lot with e.g. Starlink. But you'll need a phased array and a beamformer.

mindcrime

> GPS jamming, and much more concerningly spoofing, will probably quickly come within reach of non-nation-states and smaller groups of all kinds, and ultimately individual actors

It may already be so:

https://hal.science/hal-03456365v1

keithwinstein

You don't need ATSC 3.0 to do this kind of thing! The short-term stability of the oscillators they use for commercial DTV transmission is apparently good enough that just having one local reference to compare GPS vs. each TV station's phase (and distribute that data) can produce a pretty good positioning system. Rosum was doing this back in 2005: https://www.tvtechnology.com/news/tv-signals-used-for-geopos...
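
To make the idea concrete - this is my toy sketch of the general multilateration math, not Rosum's actual method - here's how you might solve for a 2D position plus a common receiver clock bias from pseudoranges to transmitters at known, fixed locations, using Gauss-Newton least squares. In a Rosum-style setup the pseudoranges would presumably come from each station's measured timing, corrected with the reference site's GPS-vs-TV comparison described in the article.

    # Toy 2D multilateration: solve for receiver position and a common clock
    # bias from pseudoranges to transmitters at known, fixed locations.
    import numpy as np

    towers = np.array([[0.0, 0.0], [30_000.0, 0.0],
                       [0.0, 40_000.0], [25_000.0, 35_000.0]])  # meters
    truth = np.array([12_000.0, 9_000.0])   # true receiver position (synthetic)
    bias_m = 150.0                           # true clock bias, expressed in meters

    # Synthetic pseudoranges = geometric range + clock bias + a little noise
    rng = np.random.default_rng(0)
    pr = np.linalg.norm(towers - truth, axis=1) + bias_m + rng.normal(0, 5, len(towers))

    x = np.array([1_000.0, 1_000.0, 0.0])    # initial guess: (x, y, bias)
    for _ in range(10):
        d = np.linalg.norm(towers - x[:2], axis=1)
        residual = pr - (d + x[2])
        # Jacobian rows: d(model)/d(x, y, bias)
        J = np.column_stack([(x[:2] - towers) / d[:, None], np.ones(len(towers))])
        dx, *_ = np.linalg.lstsq(J, residual, rcond=None)
        x += dx

    print("estimated position (m):", x[:2], " clock bias (m):", x[2])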

RyanShook

Slide deck of BPS (Broadcast Positioning System): https://www.gps.gov/governance/advisory/meetings/2022-11/mat...

louwhopley

Thanks for sharing this. It creates a clear picture of its use cases and rollout plans.

GPS is such a critical infrastructure component of modern society - knowing that a redundancy system like this is in the works is great.

LeoPanthera

ATSC 3.0 channels (there are some already) are encrypted. So no more free-to-air, no open standards, no more open source viewers. Watch TV using Kodi? VLC? Not anymore.

It's a travesty that this was ever approved.

hwpythonner

I’m trying to understand if this depends on something specific to FM or TV signals, or if it’s more of a protocol-level idea (i.e., any time-synced, known-location transmitters would work).

If it’s not intrinsic to FM, why not use existing cellular towers to do this? They’re everywhere, and phones already receive broadcast messages (like Amber Alerts) even without a SIM (I think) — so it feels like this could be done without needing new radios.

What makes this more accurate than cell tower triangulation today? Is the limitation in timing sync across towers, or something else in how cell networks are structured?

And for indoor use — how does this handle multipath? Reflections from walls or even atmospheric bounce seem like they’d throw off timing accuracy, similar to what messes with GPS in dense areas.

anonymousiam

Nice article on HackADay from yesterday covering this:

https://hackaday.com/2025/04/11/gps-broken-try-tv/

dieselerator

If planning/designing a timing system like this using existing antennas, why wouldn't you choose to use cellular base stations? The cellular network reaches most places with overlapping coverage and carries network time. The lowest cellular frequencies are adjacent to the upper broadcast TV channels. Aren't modern cellular receivers what we call software-defined radios? They can choose which channels to receive.

michaelt

Interestingly, cellular base stations are one of the major customers for high precision timing systems.

They use precise timing to coordinate timed broadcast slots between base stations with overlapping coverage.

master_crab

This sounds interesting but it most likely will only be of use in populated areas where there is enough signal overlap from broadcast towers. You’ll still need GPS in the countryside and on water.

bri3k

In a lot of cities the broadcast towers are concentrated in the same place, so I wonder how effective it could be.

lxgr

At least in Europe, when broadcasting went digital (DVB-T), a lot of cities started being covered by single frequency network transmission, which specifically leverages different transmitter locations transmitting the same signal to decrease dead zones.

Due to how OFDM works, I suppose the idea here is to intentionally send a heterogeneous signal on a few non-overlapping subcarriers (for single-frequency networks) or on different transponders at different locations (since single frequency networks aren't as common in the US due to how broadcasting evolved there, although ATSC 3.0 apparently also allows single frequency networks).

kristopolous

https://www.nab.org/bps/

for people who don't want to watch videos

geerlingguy

The OP link is a blog post, which includes links out to the primary resources (much more in depth than the BPS landing page). The video is a byproduct of my conversations at NAB, and both are just preliminary... I've been working on a more in depth look at GPS and BPS (and other alternatives).

kristopolous

I'm fighting through a cold so granted my reading comprehension is way down but at least in my diminished state I was reading through that and was baffled ...

I'm sure the average reader who deals with broadcast signal electronics knows what's going on here but I just walked away from it confused. It looks like terrestrial broadcasters sending out time codes for triangulation?

gblargg

I was amazed to not even find a Wikipedia article about BPS.

neuroelectron

What about RTK/PPS? Here's a module that implements them along with GPS and GNSS.

https://www.sparkfun.com/sparkfun-gps-rtk2-board-zed-f9p-qwi...

The datasheet: https://cdn.sparkfun.com/assets/f/8/d/6/d/ZED-F9P-02B_DataSh...