
Show HN: Real-time privacy protection for smart glasses

35 comments · August 11, 2025

I built a live video privacy filter that helps smart glasses app developers handle privacy automatically.

How it works: You can replace a raw camera feed with the filtered stream in your app. The filter processes a live video stream, applies privacy protections, and outputs a privacy-compliant stream in real time. You can use this processed stream for AI apps, social apps, or anything else.
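
To make the stream-replacement idea concrete, here is a minimal sketch of how an app might consume the filtered feed instead of the raw camera. The `apply_privacy_filter()` name and the file output are illustrative placeholders, not the project's actual API:

    import cv2

    def apply_privacy_filter(frame):
        # Placeholder for the project's privacy filter (face blurring, etc.).
        return frame

    cap = cv2.VideoCapture(0)  # raw camera feed
    out = cv2.VideoWriter(
        "filtered.mp4", cv2.VideoWriter_fourcc(*"mp4v"), 30.0, (1280, 720)
    )

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        frame = cv2.resize(frame, (1280, 720))  # match the 720p target
        out.write(apply_privacy_filter(frame))  # the app only sees filtered frames

    cap.release()
    out.release()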

Features: Currently, the filter blurs all faces except those of people who have given consent. Consent can be granted verbally by saying something like "I consent to be captured" to the camera. I'll be adding more features, such as detecting and redacting other private information, speech anonymization, and automatic video shut-off in certain locations or situations.
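
As a rough sketch of how verbal consent might be picked up from the audio, here is what the Faster Whisper side could look like. The consent phrases and the matching logic are simplified assumptions, not the project's actual implementation:

    from faster_whisper import WhisperModel

    CONSENT_PHRASES = ("i consent to be captured", "you can record me")  # illustrative

    model = WhisperModel("small", compute_type="int8")  # offline, CPU-friendly

    def detect_consent(audio_path: str) -> bool:
        # Transcribe the clip and check whether any segment contains a consent phrase.
        segments, _info = model.transcribe(audio_path)
        return any(
            phrase in segment.text.lower()
            for segment in segments
            for phrase in CONSENT_PHRASES
        )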

Why I built it: While developing an always-on AI assistant/memory for glasses, I realized privacy concerns would be a critical problem, for both bystanders and the wearer. Addressing this involves complex issues like GDPR, CCPA, data deletion requests, and consent management, so I built this privacy layer first for myself and other developers.

Reference app: There's a sample app (./examples/rewind/) that uses the filter. The demo video is in the README; please check it out! The app shows the current camera stream and past recordings, both privacy-protected, and will include AI features built on those recordings.

Tech: Runs offline on a laptop. Built with FFmpeg (stream decode/encode), OpenCV (face recognition/blurring), Faster Whisper (voice transcription), and Phi-3.1 Mini (LLM for transcription analysis).
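
For readers curious about the OpenCV piece, a minimal face-blur pass could look like the sketch below. It uses a stock Haar cascade rather than whatever detector the project actually ships, so treat it as illustrative:

    import cv2

    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
    )

    def blur_faces(frame, consented_boxes=()):
        # Blur every detected face except boxes already marked as consented.
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        for (x, y, w, h) in detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5):
            if (x, y, w, h) in consented_boxes:
                continue
            roi = frame[y:y + h, x:x + w]
            frame[y:y + h, x:x + w] = cv2.GaussianBlur(roi, (51, 51), 0)
        return frame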

I'd love feedback and ideas for tackling the privacy challenges in wearable camera apps!

sillystuff

I appreciate your intent, but...

This does nothing to alleviate my privacy concerns, as a bystander, about someone rudely pointing a recording camera at me. The only thing that alleviates these concerns about "smart" glasses wearers recording video is not having "smart" glasses wearers, i.e., not having people rudely walking around with cameras strapped to their faces, recording everyone and everything around them. I can't know or trust that there is some tech behind the camera that will protect my privacy.

A lot of privacy invasions have become normalized and accepted by the majority of the population. But I think (and hope) a camera strapped to someone's face, shoved into other people's faces, will be a tough sell. Google Glass wearers risked having the camera ripped off their faces or being punched in the face. I expect this will continue.

Perhaps your tech would have use in a more controlled business/military environment? Or, to post-process police body camera footage, to remove images of bystanders before public release?

cladopa

Agreed.

Something like this tool is ridiculous against companies like Google or Meta. With just their phone apps and OS-level control, given a video like the one displayed, those companies could know exactly who each person in the video is, what they are doing, and who they are with, and record that information forever.

In the video I see three young women, another woman near the zebra crossing, a young woman with a young man, a woman walking with two men at her sides, and another young couple. I know their heights, whether they are fat or slim, and their hair type, so an AI could know that too. With that information and a little more, such as someone in one of the groups having location enabled, a computer can automatically decode the rest.

If enough people wear those stupid glasses, everyone in a city is surveilled in real time with second and centimetre accuracy, including indoors in places like restaurants or malls.

This is too much power for any company or institution to have. If Meta or Google have the ability to do this, they will be required by the US government to hand that information over automatically under some excuse like "national security".

Roark66

I feel quite uneasy about footage being routinely recorded and sent to big corporations from cameras strapped to random people's faces. I'm much more bothered by the fact that this gets sent to a central location and processed than by the mere fact of being recorded without consent.

However, even with this uneasy feeling, one has to recognise that a street is a public space, and I don't see how one can have a reasonable expectation of complete privacy there. There is nothing rude about recording what you can see.

The privacy expectation I have is not that my picture will not be captured, but that such recordings from many unrelated people will not be aggregated to trace my movements.

So in summary, I think everyone has a right to photograph or record whatever they like in a public space, but the action of pooling all such recordings, and running face tracking on them to track individual people (or build a database of movements, whatever) is breaching these people's privacy and there should be laws against it.

troyvit

Seriously. There has been so much progress in the area of non-consensual recording and processing of data, and so little in the area of countermeasures. A web search for the former turns up tons of hardware and software that'll help you spy on folks; searching for adversarial design mostly turns up scientific papers. That implies there is little to no measurable demand for privacy (at least of this sort) in the marketplace.

Szpadel

I agree with all you said, but I don't believe there is any way you could protect yourself from being recorded.

The only way for this to work is legal regulation, but regulations can easily be dismissed as impossible to implement. So this is a good PoC to show what is possible and a way to discover how it could function. Without such an implementation, I don't believe you can convince anybody to start working on such regulations.

riffic

it's an ass-backwards approach to privacy, isn't it?

arkadiyt

I don't need something to protect the privacy of others from me; I need something to protect my privacy from others. The majority of people who use smart glasses are not going to be using this - where is the product that will protect me from them?

Stevvo

Masks work.

sxp

> Real-time processing – 720p 30fps on laptop

Have you tried running this on a phone or standalone smart glasses? 30fps is horrible performance on a laptop, given that it's probably 10-100x more powerful than the target device. And what laptop? Based on your screenshot, I'm guessing you're using a new Apple Silicon Mac, which is practically a supercomputer compared to smart glasses.

tash_2s

With proper implementation, on-device processing on a smartphone is feasible. On-glasses processing would be challenging, especially with battery constraints.

The 720p 30fps figure is from a PoC implementation, so there is still significant room for improvement. And yes, the demo is on an Apple Silicon M2.

throwawayoldie

Still much less effective than spray paint liberally applied.

sitkack

I think this is interesting research. I could see BLE beacons that announce what level of sharing one is comfortable with, not unlike the systems used at conferences to denote whether someone's picture can be taken and what can be done with it.
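
If anyone wants to prototype that beacon idea, one hypothetical encoding is a single preference byte carried in a BLE service-data field. Everything below (the UUID, the flag values) is made up for illustration and is not a registered standard:

    import struct

    PRIVACY_SERVICE_UUID = 0xABCD  # placeholder 16-bit UUID, not assigned by the Bluetooth SIG

    NO_RECORDING = 0x00   # do not record me at all
    BLUR_MY_FACE = 0x01   # record, but blur my face
    OK_TO_RECORD = 0x02   # full consent

    def build_service_data(preference: int) -> bytes:
        # AD structure: length byte, AD type 0x16 (Service Data, 16-bit UUID), UUID, flag.
        body = struct.pack("<BHB", 0x16, PRIVACY_SERVICE_UUID, preference)
        return bytes([len(body)]) + body

    print(build_service_data(BLUR_MY_FACE).hex())  # -> "0416cdab01"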

tash_2s

I think mainstream adoption of smart glasses could be slowed more by privacy concerns than by hardware limitations. Remember Google Glass? While the hardware keeps improving, I want to make sure we're also addressing the software side.

abcd_f

Your solution does absolutely nothing to address privacy concerns with "glassholes".

Szpadel

I'm curious how you handle the case where the camera sees multiple people and you detect consent.

If I understand correctly how this works, consent could come from the camera operator and be attributed to a recorded person.

tash_2s

When multiple people are in view and the system detects consent, the current implementation assumes the person closest to the camera is the one giving it. This is not ideal, so active speaker detection is planned.
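
For what it's worth, the "closest person" assumption can be approximated by face bounding-box area; a minimal sketch (with illustrative names, not the actual implementation) might be:

    def pick_consenting_face(face_boxes):
        # Assume the largest detected face (x, y, w, h) is the closest person,
        # and therefore the one who spoke the consent phrase.
        if len(face_boxes) == 0:
            return None
        return max(face_boxes, key=lambda box: box[2] * box[3])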

Verbal consent is just one example. Depending on the situation, other interfaces may work better, such as a predefined list of friends who have standing consent.

bargainbin

This is like creating a t-shirt that says “don’t shoot me”.

The problem isn’t consent, the problem is that the gun is being needlessly pointed at you in the first place.

netsharc

I wonder if I could make a device that recognizes Meta (etc.) glasses and aims and shoots a laser at the camera lens (and sends my users to jail after the device misses the lens and hits the glasshole's eye instead). If there are many glassholes, the laser would just shoot at them alternately at, e.g., 30 shots/second, making pew-pew noises each time.

VagabundoP

Using something like near infrared would dazzle the sensors in these devices and is invisible and safe for human eyes.

KaiserPro

Did you look at EgoBlur? It's a lot more effective at face detection than https://github.com/ageitgey/face_recognition. Granted, you'd have to do your own face matching to handle the consent exceptions.

tash_2s

I'll take another look at EgoBlur to see if it's a good fit, thanks. When I checked it briefly before, I thought it was focused on post-processing (non-realtime) and didn't integrate with matching easily, but it's definitely solid tech.

Laconicus

Reminds me of the Black Mirror episode "White Christmas".

https://en.wikipedia.org/wiki/White_Christmas_(Black_Mirror)

TakaJP

Smart solution for wearable privacy. Consent-by-voice and offline processing make it practical, and I’m curious how you’ll expand beyond face blurring.

tash_2s

Thanks! Beyond face blurring, I'm planning to detect and redact other private information like license plates, name tags, and sensitive documents.

Most of these features are aimed at protecting bystanders, but I'm also interested in exploring privacy protection for the wearer. For example, automatically shutting off recording in bathrooms, or removing identifiable landmarks from the video when they could reveal the wearer's location, depending on the use case.