Tor Browser and Onion Services - Challenges and Opportunities

Maintaining a browser like Tor Browser has its challenges but also its rewards. It allows us to accelerate adoption of important technologies like onion services, providing a more secure browsing experience for all Tor users. Improving the treatment of onion services on the browser side, however, comes with its own challenges for both users and service providers, and it is important to reflect on those as a prerequisite for future growth. Thus, we feel it is time to take stock in this blog post and outline the steps we have taken over the years to improve the user experience and adoption of onion services, the challenges we faced and continue to face, and what the future might look like.

What does this mean and how did we get here?

Onion services are self-authenticating and provide integrity and confidentiality by default. That means once you are connected to an onion service, you can be sure you are talking to the service you tried to reach and that your data is not read or manipulated by man-in-the-middle attackers. HTTPS was introduced over 20 years ago to provide some of those properties for plain web traffic (HTTP) when communicating with a server.
 
Three years ago, Mozilla announced their plan for raising awareness about the insecurity of HTTP by introducing a new visual indicator and a username/password warning message for websites loaded over HTTP (instead of HTTPS). Mozilla's implementation knew nothing about onion services: it simply looked at the scheme in the URL bar, and if that scheme was plain "http", the warning kicked in. As a result, the idea of handling connections to onion services as (inherently) "secure" was proposed, because these new browser security indicators directly harmed the usability of onion sites, like those hosted by SecureDrop and Riseup. At that time, extended validation (EV) TLS certificates containing .onion addresses were available for websites that could afford them, and those websites were already reachable over HTTPS, but the certificates were too costly for the general public. Domain validation (regular) TLS certificates weren't allowed to contain .onion addresses, and that idea wasn't being considered.
 
Faced with this immediate usability problem, we found a solution that was acceptable to Mozilla for inclusion in Firefox as well. The solution elevated connections to .onion addresses to a "potentially trustworthy origin", which the Web specification defines as an origin (website) that the web browser "can generally trust as delivering data securely".
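To make the mechanics concrete, here is a minimal sketch, in Python rather than Firefox's actual C++ code, of the kind of check the Secure Contexts specification describes, with .onion hosts added to the trustworthy set; the URLs and set contents are illustrative assumptions, not the spec's exhaustive list:

```python
# Illustrative sketch only: NOT Firefox's actual implementation.
# The Secure Contexts spec defines "potentially trustworthy" origins;
# the change described above adds self-authenticating .onion hosts.
from urllib.parse import urlparse

SECURE_SCHEMES = {"https", "wss", "file"}
LOOPBACK_HOSTS = {"localhost", "127.0.0.1", "::1"}

def is_potentially_trustworthy(url: str) -> bool:
    parts = urlparse(url)
    if parts.scheme in SECURE_SCHEMES:
        return True
    if parts.hostname in LOOPBACK_HOSTS:
        return True
    # The .onion carve-out: trusted even over plain http, because the
    # address itself authenticates the service.
    if parts.hostname and parts.hostname.endswith(".onion"):
        return True
    return False

print(is_potentially_trustworthy("http://example.com/"))            # False
print(is_potentially_trustworthy("http://expyuzz4wqqyqhjn.onion/")) # True
```

The point of the carve-out is that a .onion hostname is derived from the service's own key, so the transport is authenticated without a certificate authority ever being involved.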
 
As a result of handling a .onion address as a "potentially trustworthy origin", Tor Browser then modified how it dealt with mixed content and "secure cookies" over onion service connections. Over the last two years, building on these changes, onion service usability has become a primary feature of Tor Browser, and the security and improved usability of onion services are the reason we can run an onion service adoption campaign like #MoreOnionsPorfavor.

Challenges

The features mentioned in the previous section required a lot of effort from our small team of engineers and designers. Staying on track and delivering them on time has sometimes been challenging. Additionally, we have not yet had the resources to port all of them to mobile. But we keep iterating so that all Tor users can benefit from the enhanced security provided by onion services when browsing the web.
 
All those engineering efforts depend on good communication between stakeholders to be effective and lasting. That means communication not only between the different teams within Tor that implement the various improvements, but with external stakeholders as well.
 
Support from browser vendors is crucial, which is why we invested effort in making our changes specification-compliant and getting them upstreamed into Firefox. Mozilla is now even proactively taking the .onion use case into account, which is promising for future changes. And other browser vendors have our onion service enhancements in their pipeline as well.
 
We neglected another stakeholder group, though: companies like Facebook, the New York Times, the Guardian, the BBC, and others which started to run onion services in an enterprise environment. The complexity of those environments and the lack of communication in this area led to potential security issues like the one core contributor Alec Muffett recently reported. As a consequence, we have started to reach out to enterprises we know are running onion services and to offer them help with running those onion services securely. We have a group, called "onion-advisors", where all of those efforts are coordinated to avoid repeating this mistake in the future.

What is the future of Tor Browser and Onion Services?

We are going to continue supporting onion services in Tor Browser, both with and without certificates acquired by the onion service owner, and we are continuing the trend of enhancing their usability as well. We hope campaigns like #MoreOnionsPorfavor will help increase awareness of onion services and their adoption by small and large websites. As part of these campaigns we will emphasize the importance of deploying onion services that are secure end-to-end, so that Tor Browser does not make a wrong assumption about which data may be sent over HTTP onion connections. We are also currently improving our documentation for onion service operators and making Tor Browser's expectations of websites clear.
 
The future of TLS support for onion services is very encouraging. In March of this year, the CA/Browser Forum approved an amendment to the domain validation (DV) TLS certificate baseline requirements which now allows certificate authorities (CAs) to issue certificates containing v3 .onion addresses. This means that, in the not-too-distant future, a CA like Let's Encrypt can issue a certificate for an onion service and Tor Browser will "just work." In addition, for onion services that do not want to rely on certificate authorities, we are exploring alternative designs like Same Origin Onion Certificate for inclusion in Tor Browser.
 
While getting TLS support for onion services is important, making sure onion sites without any TLS certificate continue to get a good user experience will remain one of our priorities for the foreseeable future. After all, onion services by themselves already provide, in most cases, the security benefits (and more) that TLS is meant to give to users. Additionally, there remains the idea, worth exploring, that onion routing is actually the successor of TLS security-wise. We should not give up on that easily by solely jumping on the TLS + .onion train.
 
There are many possible scenarios for onion service deployment and for the role browsers in general, and Tor Browser in particular, will play in them. Our experimentation so far seems to indicate that the changes in Tor Browser have been worth it, even though deploying onion services securely has not always been easy in this fast-moving area. Where we will end up we don't know. The only thing we can say with certainty is that this part of Tor and Tor Browser is still evolving, and we are continuing to push the boundaries of a secure, private, safe, and usable web for everyone. Please watch for future improvements in Tor and Tor Browser, and please deploy secure onion services.
Anonymous

October 05, 2020


> Additionally, there remains the idea to explore that onion routing is actually the successor of TLS security-wise.
No! They are at different levels of the OSI model!

Anonymous

October 05, 2020


Alec Muffett's vulnerability report ends with two points which should be stressed in any news story covering this issue, and perhaps should have been the lede in the blog post above:

> Does this mean that "The Dark Web" is broken?
>
> No. This issue is purely a function of TorBrowser and how it chooses to behave and manage cookies for websites which it accesses over HTTP and Onion Networking. There are no impacts upon other browsers nor upon fundamental layer-3 onion networking for (say) SSH-over-Onion.
> Does this impact mobile?
>
> The author does not yet know whether this change impacts either "TorBrowser" for Android or "OnionBrowser" for iOS.
>
> "Is this a NSA Backdoor", etc?
>
> The author does not believe that this is wittingly any form of backdoor, but rather appears to be driven out of a motivation to increase adoption of Onion Services. Unfortunately the path currently chosen towards this goal includes reviving HTTP as a protocol, requiring differing assumption of data protection than the rest of the web, and it puts at risk of interception data that was formerly not at risk.

I am one of the long-time Tor users who has long been lauding onions as the Next Big Thing, and urging TP to seek to grow the adoption of onion sites worldwide. And I have urged TP to do more to help other entities set up and maintain onions. So I have been very happy to see Tor Project doing just that.

But until I read the above (and Alec Muffett's report) I had no idea that TP had made a technical decision which in hindsight appears ill-advised. This is troubling in part because I have for years urged TP subject matter experts to write posts in this blog explaining for non-technical users (without excessively "dumbing things down", as far as possible) such issues as exactly where and how Tor products use various kinds of cryptography. This would cover, I think, such issues as exactly how Tor Browser claims to authenticate onion sites and to use "secure cookies". If TP had explained what it proposed to do before it did it, I suspect some onion enthusiasts would have backed off their calls for rapid adoption (perhaps too rapid adoption) of onions by sites which lacked the right kind of certificates.

In addition to improving explanations of technical design decisions for users, I hope Tor Project will do more in future to work closely with EFF and Debian Project, for example on adopting/expanding EFF's certificate authority innovation to fix the problem described by Muffett.

I also want to stress that IMO, Tor Project has an opportunity to collaborate with EFF, Debian Project, and Tails Project by making EFF's new "YAYA" into a Debian package which can be installed in a Tails USB via the onion mirrors. EFF appears to view YAYA as something of value only to "malware researchers" but I think it could be made into a malware scanning tool which could be used by any at-risk internet user, and whistleblowers, human rights researchers, activists, and others who use Tails clearly belong to the population of at-high-risk users (because of what they do, not because they use Tails to help them do it). See

https://www.eff.org/deeplinks/2020/09/introducing-yaya-new-threat-hunti…
Introducing “YAYA”, a New Threat Hunting Tool From EFF Threat Lab
Cooper Quintin
25 Sep 2020

We have been asking the Let's Encrypt folks to implement the new option for giving out certificates for v3 onion services and while they are amenable to the idea it seems they are understaffed (too). So, here we are.

But, yes, getting CAs to actually issue certificates without the horrendous costs that are currently involved is clearly a step in the right direction.

EFF is indeed understaffed and teleworking, but I think they are so far handling these difficulties pretty well. Some other orgs (not thinking of Tor here) have not been so lucky.

I have urged EFF to work more closely with Tor Project and Debian Project (and Tails Project), and hope they can find time and energy and money to do so. In particular, I hope you will look at the EFF work on YAYA and reach out to Cooper Quintin if you think that could be made into a Debian package. I think that would offer a fine example of the kind of software project which actually helps ordinary citizens (rather than advancing the corporate interests of Google, say, or the "structuralist" or "Deep State" agenda of USG, or worse, the personal agenda of a sociopath). One thing I like is that this hypothetical Debian package would "do one thing, but do it very well".

Keep up the good work!

Hmmm, I always took it as a given that the operator must secure plain HTTP traffic even without there being any cookies. Even without cookies, someone who can manipulate the traffic can easily redirect the user to some other page. Guess this is something that should be documented well, though; it may not be obvious to everyone.

I don't think this is really unique to onion services either. I can't help but wonder how many pages behind a CDN like Cloudflare use HTTPS for clients but silently connect via HTTP to the backend server.

Anonymous

October 05, 2020


> After all, onion services by themselves already provide the security benefits (and more) TLS is meant to give to users in most of the cases.

It seems to me that probably the biggest benefit of TLS is the ability to glance at the address bar and know that you are talking to a server whose name you recognize. .onion names aren't very recognizable at a glance.

Anonymous

October 05, 2020


I don't use onion services as they rarely seem to work. If you could include something like OnionShare with lookup functionality in the browser, then you may be getting somewhere more useful for me personally; this could also help increase their usage in general.

While I am here. Could someone please verify the comments on https://blog.torproject.org/new-release-tor-browser-100 Thank you.

Anonymous

October 05, 2020


If you keep a long list of onion bookmarks, do you make your browser more unique? And if you have JavaScript on, can sites see your bookmarks, and are they encrypted? Pls help

No, though with JavaScript enabled there could be some vulnerability. You would be better off keeping them in a separate document, or not at all. You can also set a master password, which would offer some protection.

Anonymous

October 05, 2020


This is quoted from a Cloudflare post, but is it true?

> Tor uses hashes generated with the weak SHA-1 algorithm to generate .onion addresses and then only uses 80 bits of the 160 bits from the hash to generate the address, making them even weaker. This creates a significant security risk if you automatically generate SSL certificates. An attacker could generate a site that collides with the 80-bit .onion address and get a certificate that could be used to intercept encrypted traffic.

For an http site there should be 3 layers of encryption, for https 4, right? How many for a .onion?

Also, is this true?

> The design of the Tor browser intentionally makes building a reputation for an individual browser very difficult. And that's a good thing. The promise of Tor is anonymity. Tracking a browser's behavior across requests would sacrifice that anonymity. So, while we could probably do things using super cookies or other techniques to try to get around Tor's anonymity protections, we think that would be creepy and choose not to because we believe that anonymity online is important.

Many thanks in advance for the response

> This is quoted from a Cloudflare post, but is it true?
>
> Tor uses hashes generated with the weak SHA-1 algorithm to generate .onion addresses and then only uses 80 bits of the 160 bits from the hash to generate the address, making them even weaker. This creates a significant security risk if you automatically generate SSL certificates. An attacker could generate a site that collides with the 80-bit .onion address and get a certificate that could be used to intercept encrypted traffic.

True for v2, not v3 onions.
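To see why, here is a short sketch of how a v2 address is derived, following Tor's rendezvous spec; the key bytes below are a made-up placeholder, not a real RSA key:

```python
# Sketch of v2 onion address derivation (per Tor's rend-spec):
# address = base32( first 80 bits of SHA-1(DER-encoded RSA public key) ).
# The "key" below is a made-up placeholder blob, not a real RSA key.
import base64
import hashlib

def v2_onion_address(der_public_key: bytes) -> str:
    digest = hashlib.sha1(der_public_key).digest()  # 160-bit hash
    truncated = digest[:10]                         # keep only 80 bits
    return base64.b32encode(truncated).decode("ascii").lower() + ".onion"

fake_key = b"\x30\x82\x01\x0a" + bytes(266)         # placeholder DER bytes
addr = v2_onion_address(fake_key)
print(addr)   # 16 base32 characters followed by ".onion"
```

Truncating the hash to 80 bits is what makes brute-force collisions feasible; v3 addresses embed the full ed25519 public key instead, so no such truncation exists.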

> Also, is this true?
>
> The design of the Tor browser intentionally makes building a reputation for an individual browser very difficult. And that's a good thing. The promise of Tor is anonymity. Tracking a browser's behavior across requests would sacrifice that anonymity. So, while we could probably do things using super cookies or other techniques to try to get around Tor's anonymity protections, we think that would be creepy and choose not to because we believe that anonymity online is important.

Absolutely. Tor Browser aims to protect against supercookies (first-party isolation), but Cloudflare could use regular cookies for this purpose, because it can inject arbitrary content into the sites it proxies.

Example:
User visits www.example.org, fronted by Cloudflare.
Cloudflare injects a redirect to tracking.cloudflare.com/?domain=www.example.org and issues a unique ID cookie, then redirects back to www.example.org.

If Cloudflare did this, it could track you across all domains it proxies until you restart the browser or request a New Identity. No need for a supercookie.
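This claim can be sketched as a toy model; the code below is NOT Tor Browser's real cookie machinery, and the site and tracker names are illustrative. It assumes cookies are keyed by the first-party (top-level) site, which is exactly why a top-level redirect through the tracker defeats that keying: during the redirect, the tracker is its own first party.

```python
# Toy model of the redirect-tracking claim above; NOT Tor Browser's
# real cookie code. Cookies are keyed by (first_party_site, cookie_host).
# A top-level redirect makes the tracker its own first party, so its
# ID cookie survives across all the sites it proxies.
import itertools

cookie_jar = {}            # (first_party_site, cookie_host) -> value
ids = itertools.count(1)

def visit_via_tracker(origin_site: str) -> int:
    # During the injected top-level redirect, tracking.cloudflare.com
    # is itself the first-party site, defeating the isolation key.
    key = ("tracking.cloudflare.com", "tracking.cloudflare.com")
    if key not in cookie_jar:
        cookie_jar[key] = next(ids)   # issue a unique ID cookie once
    return cookie_jar[key]            # tracker can log (id, origin_site)

id_seen_on_org = visit_via_tracker("www.example.org")
id_seen_on_net = visit_via_tracker("www.example.net")
print(id_seen_on_org == id_seen_on_net)  # True: linkable across sites
```

Requesting a New Identity corresponds to clearing `cookie_jar`, which is why it breaks this linkage.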

> Tor Browser aims to protect from supercookies (first-party isolation), but Cloudflare can use regular cookies for this purpose because he can inject arbitrary content into sites he proxies. Example: User visits www.example.org, fronted by Cloudflare; Cloudflare injects a redirect to tracking.cloudflare.com/?domain=www.example.org and issues a unique ID cookie, then redirects back to example.org. If Cloudflare did this, he could track you across all domains he proxies until you restart browser or request new identity. No need of supercookie.

Thanks, I did not know that. We should all use "New Identity" a lot more. Hence the importance of not burying it deep in the triple-bar menu.

I wish Tor Project tried much harder to explain points like this, so that users know how to use Tor more safely.

Expecting users to be disciplined about using New Identity is failure-prone. I think Tor Browser should automatically purge cookies and other state keyed to a first-party domain when all tabs with that first-party domain are closed or navigated away from.

Doing this for all state requires a lot of work, but cookies and localStorage are simple and get you most of the way there. The Cookie AutoDelete extension is an example of this. (Note: DO NOT install this extension yourself in Tor Browser. It will alter your fingerprint.)

They are talking about v2 onion services, which are out of date ( https://blog.torproject.org/v2-deprecation-timeline ).
v3 addresses are much better, and longer.

Yes: Tor (exit) + HTTPS is 4 layers. Tor (.onion) is 6 layers, though this can vary; in particular, the onion service can give up its own anonymity. You can see what this means here: https://support.torproject.org/https/#https-1

There are theoretical attacks to surveil users; I've never seen them in practice. More information here: https://2019.www.torproject.org/projects/torbrowser/design/
I only understood a fraction of it.

> This is quoted from a Cloudflare post, but is it true?
>
> Tor uses hashes generated with the weak SHA-1 algorithm to generate .onion addresses and then only uses 80 bits of the 160 bits from the hash to generate the address, making them even weaker. This creates a significant security risk if you automatically generate SSL certificates. An attacker could generate a site that collides with the 80-bit .onion address and get a certificate that could be used to intercept encrypted traffic.

Sounds about right, but only for v2 addresses. This is one of the reasons why v3 onion services exist.

> For an http site there should be 3 layers of encryption, for https 4, right? How many for a .onion?
>
> Also, is this true?
>
> The design of the Tor browser intentionally makes building a reputation for an individual browser very difficult. And that's a good thing. The promise of Tor is anonymity. Tracking a browser's behavior across requests would sacrifice that anonymity. So, while we could probably do things using super cookies or other techniques to try to get around Tor's anonymity protections, we think that would be creepy and choose not to because we believe that anonymity online is important.

Yes, The Design and Implementation of the Tor Browser puts it this way:

> The privacy requirements are primarily concerned with reducing linkability: the ability for a user's activity on one site to be linked with their activity on another site without their knowledge or explicit consent.

This is true for the old v2 .onion addresses (the short ones), which are deprecated and should not be used anymore.

For the new v3 onions (longer addresses), these issues are fixed.
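For reference, the v3 address format can be sketched in a few lines, following Tor's rend-spec-v3; the key below is a made-up placeholder, not a real ed25519 key:

```python
# Sketch of v3 onion address construction (per Tor's rend-spec-v3):
#   address = base32(pubkey || checksum || version) + ".onion"
# where pubkey is the 32-byte ed25519 master public key, version is 0x03,
# and checksum = SHA3-256(".onion checksum" || pubkey || version)[:2].
# The key below is a made-up placeholder, not a real ed25519 key.
import base64
import hashlib

def v3_onion_address(ed25519_pubkey: bytes) -> str:
    version = b"\x03"
    checksum = hashlib.sha3_256(
        b".onion checksum" + ed25519_pubkey + version
    ).digest()[:2]
    blob = ed25519_pubkey + checksum + version  # 35 bytes in total
    return base64.b32encode(blob).decode("ascii").lower() + ".onion"

fake_pubkey = bytes(range(32))                  # placeholder key bytes
addr = v3_onion_address(fake_pubkey)
print(len(addr.split(".")[0]))                  # 56: the longer v3 form
```

Because the full public key is embedded, there is no truncated hash to collide with, and the checksum means a mistyped address is rejected rather than silently resolving.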

Anonymous

October 06, 2020


The public's image of Tor could be much better without onion services. Why is there no separate network for onion services? (Don't argue with traffic; your argument is invalid.)

> Why is there no separate network for onion services?

If by separate network you mean a distinct set of Tor nodes, maybe even distinct Directory Authorities and distinct software... one hardly knows where to begin, but one key point is that a major challenge going forward will be to handle the increased traffic load, which means growing the existing network. A second key point is that onions appear to be potentially inherently safer than https from the cybersecurity standpoint, quite independently of privacy concerns or desire for anonymity. This is because regular DNS lookup is inherently unsafe, and https cannot avoid such lookups. Using Tor to surf to onion mirrors of news sites (say) at least makes it harder to mess with a particular target that way.

It is known that agencies such as GCHQ and FBI use fake news stories and state-sponsored fake news sites (which look just like the real BBC News site, for example) to punt Regin style malware to individual targets, such as persons "suspected" (wrongly in at least some cases) of being "associated" with Wikileaks (for example). One reason we know this is that some of the Snowden leaks contain the spooks' own presentations (to colleagues) about their methods.

The internet keeps getting more and more dangerous, as more and more governments, corporations, private individuals adopt more and more sophisticated persistent threat malware and more and more devious infection vectors, including "spraying" large populations with infection attempts.

> (don't argue with traffic, your argument is invalid)

Remarks like that cause me to wonder whether a certain public figure's self-delusion is as contagious as COVID-19.

I think your logic is backwards.

It is true that FBI's "Going Dark" anti-encryption disinformation campaign appears to have persuaded many people who don't know any better that "encryption is only used by bad guys" [sic], which is of course utterly absurd. (See for example the books by Bruce Schneier).

Or that "Tor is only used by sexual predators" [sic], a particularly dangerous counterfactual claim in light of the fact that victims rights groups use Tor to protect victims of domestic abuse, etc, that human rights groups use Tor to protect their in-country reporters, etc. The FBI also curiously insists upon ignoring an obviously far more effective response to concerns about alleged on-line sexual exploitation of minor children: high school civics curricula should contain a lot of good information about the kinds of dangers explained in Julia Angwin's excellent book Dragnet Nation. The fact that FBI never advocates for *effective* measures proves (in my view) that they care nothing whatever about real problems facing real children, they care only about crippling all forms of encryption so that their self-appointed "population control" mission becomes just a bit easier, without the need for wearing out their shoe soles pounding the pavement to perform such onerous tasks as appearing in court or serving warrants.

But the correct response is not to advise ordinary people to avoid using onions, but to patiently explain to anyone willing to hear us out how using (for example) the Debian repository onion mirrors can help keep bad guys from messing with your software updates, and how using onion mirrors for human rights sites and news sites can help protect you from some ugly state-sponsored attacks. This is really no different from pointing out that using Tor Browser without trying to surf to onion sites can help protect you from many cross-site scripting attacks (especially if you disable Javascript by using the "Safest" setting), because Tor Browser is bundled with Noscript protections.

The issue discovered by Alec Muffett is serious and I would like to understand better how it affects Secure Drop sites, OnionShare, etc. But I am confident this issue will be addressed and I continue to support TP's attempts to "mainstream" onions. Indeed, I feel that essentially all web sites should be made into onions. Given the anti-privacy user-exploitation model of the internet which became dominant (thanks to companies like Google) in the early years of the beyond-academic World Wide Web, this will require discovering and overcoming many, many thorny issues.

Anonymous

October 06, 2020


HTTP/2 makes browsing onions faster. HTTPS is required for HTTP/2, so onions with an HTTPS certificate can get much better performance in both latency and throughput.

Well, the requirement is a secure context, so in theory you are right, but that should not apply to a browser like Tor Browser, which recognizes that onions *without* an HTTPS certificate are already secure in that sense.

That said, yes, having a certificate for the .onion would not hurt, but it should not be a requirement for getting the same benefits.

Good, secure by design. Would it not be a good idea to drop the http/https prefix from onions? With the (hopefully soon) coming addition of HTTPS-only connections, one would assume having an http prefix becomes counterproductive/insecure?

Anonymous

October 07, 2020


Would it be correct to sum up Tor Project's position on the kind of problem discovered by Alec Muffett by saying that the best available fix for all the inherent shortcomings of http+DNS is not https+whatever, but onions? That this kind of problem arises because major websites use a huge mix of http, https, various certs, multiple servers including reams of third party content, and that if everyone simply adopted onions (and if the Tor network was large enough to handle all the world's internet traffic), everyone would enjoy much better cybersecurity for all on-line tasks, independently of privacy concerns, user-exploitation concerns, and censorship evasion?

The problem is of course how to get from here (awful) to there (sounds wonderful).

Anonymous

October 11, 2020


We help deliver Tor to a wide community.

On each deployment, each user gets a Tor v3 address. We have thought it a good idea for a while, and for a number of reasons.

We also develop tooling that uses Tor v2 and v3 addresses.

That leads to a question: can Tor Project place an onion on each user's machine?
To grow the onion garden, so to speak?

Anonymous

October 14, 2020


Glad to know the Guardian has an onion URL. Why doesn't Tor Browser find it if you browse the Guardian? Or is it just SecureDrop?