Certificates for Onion Services
12 comments
August 28, 2025
xg15
Does external cert validation for onion domains even make sense? I thought the "domain name" was already the hash of some public key that is used in the normal encryption of the onion router - so there is already a mandatory cryptographic proof that the service you're talking with "owns" the domain. What additional security benefit would CA-signed certs bring?
maqp
>I thought the "domain name" was already the hash of some public key
With v3, the address is the base32 encoding of the service's ed25519 public key plus a checksum and a version byte.
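To make that concrete, here is a minimal sketch (following the v3 address format in Tor's rend-spec-v3: base32 of the 32-byte ed25519 public key, a 2-byte SHA3-256 checksum, and a version byte) of how a v3 onion address is derived from the key:

```python
import base64
import hashlib

def onion_v3_address(pubkey: bytes) -> str:
    """Derive a v3 onion address from a 32-byte ed25519 public key.

    Per rend-spec-v3:
      address = base32(pubkey || checksum || version) + ".onion"
      checksum = SHA3-256(".onion checksum" || pubkey || version)[:2]
    """
    assert len(pubkey) == 32, "ed25519 public keys are 32 bytes"
    version = b"\x03"
    checksum = hashlib.sha3_256(b".onion checksum" + pubkey + version).digest()[:2]
    return base64.b32encode(pubkey + checksum + version).decode("ascii").lower() + ".onion"
```

Since 35 bytes encode to exactly 56 base32 characters, this is why every v3 address is 56 characters long, and why connecting to the address already proves the service holds the corresponding private key.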
For something like a Cwtch address or your personal dissident blog criticizing Emutopia, it's enough that your contacts get the address from you personally, or that they find it some other way and pin the site to their bookmarks for TOFU.
But with a public service like the DuckDuckGo onion service, anyone can trivially spin up their own unique per-target MITM proxy instance and share the link with friends, bookmark it in their SO's Tor Browser to MITM their connections, poison link repositories or the official links on Wikipedia pages, etc.
Having a CA validate that you own the clearweb site first helps mitigate this to some extent. The remaining problem, of course, is whether users will know they should expect a cert on a page they're visiting for the first time.
(I wonder if Tor Browser could ship a list of pinned onion addresses with a "clearweb_equivalent_of" field for this, so you could easily check it from the site's security badge.)
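No such feature exists in Tor Browser today, but the idea above could be sketched as a simple lookup table keyed by clearweb domain. The `PINNED_ONIONS` map and `check_pin` helper are hypothetical; the DuckDuckGo address below is the widely published one, but should be verified out of band:

```python
# Hypothetical pin list: maps a clearweb domain to the onion address its
# operator has published, so a browser could flag mismatches.
# (The "clearweb_equivalent_of" idea from the comment above, inverted into a dict.)
PINNED_ONIONS = {
    # Widely published DuckDuckGo v3 address; verify independently before trusting.
    "duckduckgo.com": "duckduckgogg42xjoc72x3sjasowoarfbgcmvfimaftt6twagswzczad.onion",
}

def check_pin(clearweb_domain: str, onion_address: str) -> bool:
    """Return True if the onion address matches the pin, or no pin exists (TOFU)."""
    pinned = PINNED_ONIONS.get(clearweb_domain)
    return pinned is None or pinned == onion_address
```

A per-target MITM proxy would present a different (attacker-generated) onion address, so `check_pin` would return False for any pinned domain.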
xg15
This seems like a general problem of using search on onion. I don't really understand how this is supposed to work at all, honestly.
Either you already know the domain you want to visit or you don't.
If you do, you don't need search.
If you don't, how could you be sure the search results are for the real site and not an MITM proxy?
throw0101c
> Does external cert validation for onion domains even make sense? […] What additional security benefit would CA-signed certs bring?
Yes, and the page/documents explain some use cases:
> The two ACME-defined methods allowed by CA/BF described in Sections 3.1.2 and 3.1.3 (http-01 and tls-alpn-01) do not allow issuance of wildcard certificates. A ".onion" Special-Use Domain Name can have subdomains (just like any other domain in the DNS), and a site operator may find it useful to have one certificate for all virtual hosts on their site. This new validation method incorporates the specially signed Certificate Signing Request (CSR) (as defined by Appendix B.2.b of [cabf-br]) into ACME to allow for the issuance of wildcard certificates.
* https://datatracker.ietf.org/doc/html/rfc9799#name-new-onion...
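The wildcard case quoted above behaves the same for .onion names as for regular TLS hostnames: a `*.example.onion` certificate covers exactly one left-most label. A sketch of that one-label matching rule (per RFC 6125; `example.onion` here is a placeholder name):

```python
def wildcard_matches(cert_name: str, host: str) -> bool:
    """Check whether a certificate name covers a hostname.

    A wildcard like "*.example.onion" matches exactly one left-most
    label (RFC 6125), so "www.example.onion" matches but
    "a.b.example.onion" does not.
    """
    if cert_name.startswith("*."):
        suffix = cert_name[1:]  # ".example.onion"
        parts = host.split(".", 1)
        return len(parts) == 2 and "." + parts[1] == suffix
    return cert_name == host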
> Some Hidden Services do not wish to be accessible to the entire Tor network, and so they encrypt their Hidden Service Descriptor with the keys of clients authorized to connect. Without a way for the CA to signal what key it will use to connect, these services will not be able to obtain a certificate using http-01 or tls-alpn-01, nor enforce CAA with any validation method.
> To this end, an additional field in the challenge object is defined to allow the ACME server to advertise the Ed25519 public key it will use (as per the "Authentication during the introduction phase" section of [tor-spec]) to authenticate itself when retrieving the Hidden Service Descriptor.
* https://datatracker.ietf.org/doc/html/rfc9799#name-new-onion...
jontro
They write the following reason in the article: "But as the web and other internet technologies mature, certificates are starting to be a requirement in order to unleash functionalities, especially in web browsers, such as the faster connection protocol HTTP/2 and payment processing."
xg15
This seems really sad. But I guess it depends what the goal is. If you want to integrate onion purely on a DNS resolver and network interface level and then use a stock browser for accessing the services, yes, you'd need that.
(Then you'll also have to fight with the stock browser to make it use your special DNS resolver, to not leak info to Google, Cloudflare, or whoever else, etc., though.)
But don't most people use custom browsers with built-in support for onion anyway? If that's the case, the easiest solution would seem to just declare .onion a "secure origin" like localhost and patch the browser accordingly.
rnhmjoj
> But don't most people use custom browsers with built-in support for onion anyway? If that's the case, the easiest solution would seem to just declare .onion a "secure origin" like localhost and patch the browser accordingly.
Indeed, the use of the .onion TLD has been standardised in RFC 7686 [1], so browsers should really treat it as a secure origin and stop the usual plaintext-HTTP shenanigans.
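RFC 7686 also imposes behavior on the name-resolution side: stub resolvers and DNS APIs are expected to answer NXDOMAIN for .onion names rather than forwarding them upstream, which would leak the name the user tried to reach. A minimal sketch of that check:

```python
def should_resolve(hostname: str) -> bool:
    """Return False for .onion names, which per RFC 7686 must not be sent
    to the public DNS; a compliant stub resolver answers NXDOMAIN instead."""
    labels = hostname.lower().rstrip(".").split(".")
    return labels[-1] != "onion"
```

This is exactly the kind of plumbing you'd have to get right when bolting onion support onto a stock browser via a custom resolver, as the comment above describes.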
arvinjoar
"Doesn't make sense for us but mandated by policy" is a super common phenomenon that you'll sadly encounter all the time in the industry, especially when it comes to security. In this case it's at least motivated by something as peripheral as onion services wanting to fit in with the browser ecosystem. Which, fair: maybe it doesn't make sense for browsers to bloat their designs by taking onion services into account, so onion services have to adapt to modern browser standards instead.
potato3732842
>"Doesn't make sense for us but mandated by policy"
It's way worse in the physical world than in the software world IMO.
flotzam
The section Benefits (after Introduction) lists 9 reasons why it makes sense. Some of them are about working around a mismatch with existing standards, but not all.
"Automated Certificate Management Environment (ACME) Extensions for ".onion" Special-Use Domain Names", June 2025:
* https://datatracker.ietf.org/doc/html/rfc9799