
Homomorphic Encryption in iOS 18

23 comments · January 11, 2025

ted537

It's cool how neural networks, even convolutional ones, are one of the few applications that you can compute through homomorphic encryption without hitting a mountain of noise/bootstrapping costs. Minimal depth hurrah!

j16sdiz

> The two hops, the two companies, are already acting in partnership, so what is there technically in the relay setup to stop the two companies from getting together—either voluntarily or at the secret command of some government—to compare notes, as it were, and connect the dots?

The OHTTP scheme does not _technically_ prevent this. It increases the number of parties that need to cooperate to extract this information, in the hope that any collusion would be caught somewhere in the pipeline.
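
To make the split concrete, here is a rough data-flow sketch (no real cryptography; the addresses and field names are invented) of what each hop can see on its own:

    # What each party in the two-hop relay sees in isolation (illustrative only).
    request = {
        "client_ip": "198.51.100.7",                        # visible to the relay
        "payload": b"<query encrypted to the gateway's key>",
    }

    relay_view = {"client_ip": request["client_ip"]}        # knows who is asking
    gateway_view = {"source": "relay",                      # knows what is asked,
                    "query": "decrypted query contents"}    # but not by whom

    # Neither log alone links an identity to a query; joining the two logs does,
    # which is exactly the collusion the scheme cannot technically rule out.
    print(relay_view)
    print(gateway_view)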

GeekyBear

> One example of how we’re using this implementation in iOS 18, is the new Live Caller ID Lookup feature, which provides caller ID and spam blocking services. Live Caller ID Lookup uses homomorphic encryption to send an encrypted query to a server that can provide information about a phone number without the server knowing the specific phone number in the request.

Privacy by design is always nice to see.
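
For a rough feel of how such a lookup can work, here is a toy sketch built on Paillier encryption rather than Apple's actual BFV-based protocol; the phone numbers, spam labels, and key sizes are all made up and insecure:

    import math
    import random

    # Demo-sized Paillier keypair (insecure; real keys are thousands of bits).
    p, q = 10007, 10009
    n, n2 = p * q, (p * q) ** 2
    lam = math.lcm(p - 1, q - 1)
    mu = pow(lam, -1, n)                     # valid because we use g = n + 1

    def encrypt(m):
        r = random.randrange(2, n)
        while math.gcd(r, n) != 1:
            r = random.randrange(2, n)
        return pow(n + 1, m, n2) * pow(r, n, n2) % n2

    def decrypt(c):
        return (pow(c, lam, n2) - 1) // n * mu % n

    # Server-side table: phone number -> 1 if spam, 0 otherwise (hypothetical).
    table = {"5550001": 1, "5550002": 0, "5550003": 1}
    numbers = list(table)

    # Client: encrypt a one-hot selection vector for the number it cares about.
    wanted = "5550003"
    query = [encrypt(1 if num == wanted else 0) for num in numbers]

    # Server: homomorphically compute sum(selector_i * label_i). It only ever
    # sees ciphertexts, so it never learns which entry was selected.
    answer = 1
    for c, num in zip(query, numbers):
        answer = answer * pow(c, table[num], n2) % n2

    print(decrypt(answer))                   # -> 1, the label for the wanted number

The overall shape (encrypted query in, encrypted answer out, server blind to the selection) is the same idea, just with a lattice-based scheme and a far larger database in the real feature.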

vrtx0

Is the Apple Photos feature mentioned actually implemented using Wally, or is that just speculation?

From a cursory glance, the computation of centroids done on the client device seems to obviate the need for sending embedding vectors of potentially sensitive photo details — is that incorrect?

I’d be curious to read a report of how on-device-only search (using latest hardware and software) is impacted by disabling the feature and/or network access…

aeontech

According to this post on Apple's Machine Learning blog, yes, Wally is the method used for this feature.

https://machinelearning.apple.com/research/homomorphic-encry...

sillysaurusx

I was going to make my usual comment of FHE being nice in theory but too slow in practice, and then the article points out that there’s now SHE (somewhat homomorphic encryption). I wasn’t aware that the security guarantees of FHE could be relaxed without sacrificing them. That’s pretty cool.

Is there any concrete info about noise budgets? It seems like that’s the critical concern, and I’d like to understand at what point precisely the security breaks down if you have too little (or too much?) noise.

3s

SHE vs FHE has nothing to do with security. Instead, it’s about how many operations (e.g. homomorphic multiplications and additions) can be performed before the correctness of the scheme fails due to too much noise accumulating in the ciphertext. Indeed, all FHE schemes are also SHE schemes.

What typically makes FHE computationally expensive is a “bootstrapping” step that removes the noise that has accumulated after X operations and is threatening correctness. After bootstrapping you can do another X operations. Rinse and repeat until you finish the computation you want to perform.
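
If it helps to see the noise problem concretely, here is a toy somewhat-homomorphic scheme over the integers, in the spirit of DGHV, with tiny and completely insecure parameters: additions barely grow the noise, one multiplication is still fine, and a few multiplications push the noise past the secret modulus, at which point decryption stops being correct and you would need bootstrapping (or bigger parameters) to continue.

    import random

    p = random.randrange(2**30, 2**31) | 1      # secret odd modulus

    def enc(bit):
        q = random.randrange(2**40, 2**41)       # random multiple of the secret
        r = random.randrange(1, 2**8)            # small fresh noise
        return q * p + 2 * r + bit

    def dec(c):
        return c % p % 2                         # correct while the noise < p

    c1, c2, c3, c4 = enc(1), enc(1), enc(1), enc(1)

    print(dec(c1 + c2))            # addition: noise ~2^10, well below p -> 0
    print(dec(c1 * c2))            # one multiplication: noise ~2^18, still fine -> 1
    print(dec(c1 * c2 * c3 * c4))  # three multiplications: noise ~2^36 > p,
                                   # so the answer is no longer reliable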

ruined

it's incredibly algorithm-dependent. if you look into the thesis that originated the 'bootstrapping' technique for transforming SHE schemes into FHE, they determine the noise limit of their specific scheme in section 7.3 and then investigate expanding that limit in chapters 8 and 10.

(written in 2009) http://crypto.stanford.edu/craig/craig-thesis.pdf

some newer FHE schemes don't encounter a noise limit or don't use the bootstrapping technique.

Ar-Curunir

All known FHE schemes use bootstrapping

ruined

i expected that, but a search turned up several things claiming to implement fhe without bootstrapping. i didn't investigate and i can't say i'm familiar so maybe they're bogus

bawolff

I'm not an expert on this, but my understanding is that the "noise" is less a security breakdown and more that the entire system breaks down. That is where the "somewhat" comes in: unlike "full", where the system can do (expensive) things to get rid of noise, in "somewhat" the noise just accumulates until the system stops working. (Definitely talking out of my hat here.)

sillysaurusx

Interesting! Are there any security concerns with SHE? If not, it sounds like all of the benefits of FHE with none of the downsides, other than the noise overwhelming the system. If that’s true, and SHE can run at least somewhat inexpensively, then this could be big. I was once super hyped about FHE till I realized it couldn’t be practical, and this has my old excitement stirring again.

bawolff

My impression is that SHE schemes are still relatively expensive: not as crazy as FHE, but still slow enough to preclude many use cases. And the noise breakdown can happen relatively quickly, making them unworkable for most of the algorithms people want to use FHE for.

Ar-Curunir

Most FHE schemes are constructed out of SHE schemes. Also, there’s nothing preventing FHE from being practical; it’s just that existing constructions are not as fast as we would like them to be.

Ar-Curunir

SHE doesn’t relax security guarantees, it relaxes the class of supported computations

rkagerer

This would be even more exciting if there were some way to guarantee your phone, the servers, etc. are running untampered implementations, and that the proxies aren't colluding with Apple.

sneak

> This should be fine: vectorization is a lossy operation. But then you would know that Amy takes lots of pictures of golden retrievers, and that is a political disaster.

This downplays the issue. Knowing that Alice takes lots of screenshots of Winnie the Pooh memes means that Alice’s family gets put into Xinjiang concentration camps, not just a political disaster.

(This is a contrived example: iCloud Photos is already NOT e2ee and this is already possible now; but the point stands, as this would apply to people who have iCloud turned off, too.)

troad

Agreed. And for a less contrived example, people may have photos of political protests that they attended (and the faces of others present), screenshots that include sensitive messages, subversive acts, etc.

It's worth noting though that it's now possible to opt in to iCloud Photo e2ee with "Advanced Data Protection". [0]

[0] https://support.apple.com/en-us/102651

sneak

iCloud Photo e2ee still shares hashes of the plaintext with Apple, which means they can see the full list of everyone who has certain files, even if they all have e2ee enabled. They can see who had it first, and who got it later, and which sets of iCloud users have which unique files. It effectively leaks a social graph.
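
A toy illustration of that leak (the users and files are invented): the server never sees photo contents, but deterministic hashes of the plaintext collide across accounts, which is enough to reconstruct who holds which file and who uploaded it first.

    import hashlib
    from collections import defaultdict

    uploads = [                      # (upload order, user, plaintext bytes)
        (1, "alice", b"protest-flyer.jpg contents"),
        (2, "bob",   b"protest-flyer.jpg contents"),
        (3, "carol", b"cat.jpg contents"),
    ]

    holders = defaultdict(list)
    for order, user, plaintext in uploads:
        holders[hashlib.sha256(plaintext).hexdigest()].append((order, user))

    for digest, who in holders.items():
        if len(who) > 1:
            print("shared file held by", [u for _, u in sorted(who)],
                  "- first uploader:", min(who)[1])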

It’s also not available in China.

eviks

> There is no trust me bro element.

> Barring some issue being found in the math or Apple’s implementation of it

Yes, if you bar the "trust me bro" element in your definition, you'll by definition have no such element.

Reality, though, doesn't care about your definition, so in reality this is exactly the "trust me bro" element that exists.

> But we’re already living in a world where all our data is up there, not in our hands.

If that's your real view, then why do you care about all this fancy encryption at all? It doesn't help if everything is already lost.

rpearl

I mean if you'd like, you could reimplement the necessary client on an airgapped computer, produce an encrypted value, take that value to a networked computer, send it to the server, obtain an encrypted result that the server could not possibly know how to decrypt, and see if it has done the computation in question. No trust is required.

You could also observe all the bits leaving the device from the moment you initialize it and determine that only encrypted bits leave and that no private keys leave. That leaves only the gap of some side channel at the factory, but you could perform the calculation to check that the bits are encrypted only with the key you expect them to be encrypted with.

Krasnol

How is this comment useful to the OP's valid arguments?