
New Tor Browser Bundles with Firefox 6

We've updated the experimental Tor Browser Bundles to Firefox 6 and all users are strongly encouraged to upgrade, as Firefox 6 fixes some serious security issues present in Firefox 5.

The 64bit GNU/Linux version of the bundle is now available again.

Tor Browser Bundle (2.2.31-1) alpha; suite=all

  • Update Tor to
  • Update Firefox to 6.0
  • Update Libevent to 2.0.13-stable
  • Update NoScript to
  • Update HTTPS Everywhere to 1.0.0development.5
  • Remove BetterPrivacy until we can figure out how to make it safe in all bundles (see #3597)

New Firefox 3.6 Tor Browser Bundles

The stable Tor Browser Bundle for Windows and the beta Firefox 3.6 Tor Browser Bundles for Linux and OS X have been updated to Firefox 3.6.20.

We are currently building new versions of the experimental Tor Browser Bundles, this time containing Firefox 6; we will announce them when they are ready. Firefox 5 and the experimental Firefox 5 TBBs are no longer safe to use.

Windows bundle
1.3.27: Released 2011-08-19

  • Update Firefox to 3.6.20
  • Update Libevent to 2.0.13-stable
  • Update HTTPS-Everywhere to 1.0.0development.5

OS X bundle
1.0.23: Released 2011-08-19

  • Update Tor to
  • Update Firefox to 3.6.20
  • Update Libevent to 2.0.13-stable
  • Update NoScript to
  • Update HTTPS-Everywhere to 1.0.0development.5
  • Remove BetterPrivacy until we can figure out how to make it safe in all bundles (see #3597)

Linux bundle
1.1.13: Released 2011-08-19

  • Update Tor to
  • Update Firefox to 3.6.20
  • Update Libevent to 2.0.13-stable
  • Update NoScript to
  • Update HTTPS-Everywhere to 1.0.0development.5
  • Remove BetterPrivacy until we can figure out how to make it safe in all bundles (see #3597)

Improving Private Browsing Modes: "Do-Not-Track" vs Real Privacy by Design

Updated 06/16/2011: Break window.name off into its own linkability issue. While ultimately it should be handled identically to the referer, that was not clear in the original text.
Updated 07/01/2011: Add link to article about 81% consumer polling rate in favor of some form of Do Not Track...

As I said in my previous post, the Tor Project hopes to work on a set of patches that effectively improves the Private Browsing Mode of Firefox. Long term, we'd love to merge these patches upstream, and/or see them obsoleted by better implementations.

To help keep everyone on the same page with respect to this effort, I've decided to take some time to describe what we envision as our ideal private browsing mode.

Hopefully, such a mode would be useful for more than just Tor users. Indeed, there are many ways to obtain varying levels of IP address privacy once you have solid browser support for privacy by design. The average user is quite capable of going to a cafe and enabling private mode, and this ability can be explained to them in a single sentence by the UI. Arguably they can also obtain low-grade IP privacy simply by tethering to their cell phone, whose IP typically changes regularly. I am told that frequent IP rotation is also the norm for residential connections in Germany and much of the EU to deter services and malware. This is not to mention all of the commercial single-hop VPN and proxy privacy services out there that fail to provide actual browser privacy in their tools.

We believe that the attention surrounding the "Do-Not-Track" header also indicates that network privacy is an important feature. However, we believe that it must be provided by design, as opposed to via a humble request to the adversary that is impossible to audit or enforce, especially outside of the United States. In his presentation at W2SP, Balachander Krishnamurthy compared the "Do-Not-Track" request header to the real-world equivalent of leaving your door unlocked with a posted notice that reads "Do-Not-Rob". While many people actually do post "No Trespassing" and similar signs, no one expects these signs to replace actual security measures.
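For concreteness, the header itself is trivial to emit — which is exactly the point: it is a one-line request, not an enforcement mechanism. A minimal sketch using Python's standard library (the URL is a placeholder; no request is actually sent here):

```python
# The "Do-Not-Track" preference is just a single request header.
# Nothing obliges the server to honor it, and the client cannot
# verify compliance -- hence the "Do-Not-Rob" analogy.
import urllib.request

req = urllib.request.Request("https://example.com/")
req.add_header("DNT", "1")  # 1 = "please do not track me"

# urllib normalizes header names to "Dnt" internally; the request now
# carries the header, but honoring it is entirely up to the server.
assert req.get_header("Dnt") == "1"
```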

Unfortunately, right now the only usable and effective web privacy option for the average user is to install an ad-blocker or similar software. Personally, I see the need for an ad-blocker to achieve privacy as a huge failure of the web itself. If web tracking, profiling, and behavioral targeting are so extreme that they cannot be avoided except by blocking all ads, then the prevailing revenue model of the web is unsustainable. We must figure out a way to do non-intrusive, content-relevant advertising while still providing privacy, without relying on regulatory action that is unlikely to be enforceable.

Ok, so enough preaching. What does privacy by design look like? I'm going to describe a list of 7 key properties, some of which the major browser vendors already have or are working towards, but which so far are not uniformly deployed in any browser, including even our own Tor Browser.

  1. Make local privacy optional
  2. Avoid Linkability: Minimize privacy options, plugins and addons
  3. Avoid Linkability: Isolate all non-private mode identifiers and state
  4. Avoid Linkability: Isolate state per top-level domain
  5. Avoid Linkability: Reduce fingerprintable attributes
  6. Avoid Linkability: Reduce default referer information
  7. Avoid Linkability: Restrict window.name to referer policy

Make local privacy optional

The browser vendors got it half right the first time around. There are many users who consider local storage privacy to be the primary feature they want from private browsing mode. However, we believe that local privacy is actually an orthogonal feature to network privacy: some users want both, but some only want one or the other.

Therefore, we believe that users should be given the option in private browsing mode to choose whether or not to record browsing history. Many users will want to use private mode regularly, and will only be concerned about ad network tracking as opposed to local storage (i.e., a use case similar to the "Do-Not-Track" header's). These users will still want history and "awesome bar" functionality to work for them. Almost all users will want to maintain access to their bookmarks and previously stored history from within the mode.

Avoid Linkability: Minimize privacy options, plugins, and addons

Beyond the choice to store history and activity on disk, private browsing modes should not provide numerous global options.

Each option that detectably alters browser behavior can be used as a fingerprinting tool by ad networks. Similarly, all extensions should be disabled in the mode except on an opt-in basis.

Instead of global browser privacy options, privacy decisions should be made per top-level url-bar domain to eliminate the possibility of linkability between domains. For example, when a plugin object (or a JavaScript access of window.plugins) is present in a page, the user should be given the choice of allowing that plugin object for that top-level url-bar domain only. The same goes for exemptions to third party cookie policy, geo-location, and any other privacy permissions.

If the user has indicated they do not care about local history storage, these permissions can be written to disk. Otherwise, they should remain memory-only.
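The permission model described above can be sketched as a simple store keyed by top-level url-bar domain. This is an illustrative sketch, not Firefox's actual permission API; the class and method names are invented for clarity:

```python
class PerDomainPermissions:
    """Privacy permissions keyed by top-level url-bar domain (sketch).

    Each decision (plugins, third-party cookies, geolocation, ...) is
    scoped to the site the user sees in the url bar, so granting a
    plugin on one site reveals nothing about any other site.
    """

    def __init__(self, persist_to_disk=False):
        # If the user opted out of local history storage, permissions
        # live only in memory and vanish with the session.
        self.persist_to_disk = persist_to_disk
        self._grants = {}  # (top_level_domain, permission) -> bool

    def grant(self, top_level_domain, permission):
        self._grants[(top_level_domain, permission)] = True

    def is_allowed(self, top_level_domain, permission):
        # Default-deny: anything not explicitly granted is refused.
        return self._grants.get((top_level_domain, permission), False)

perms = PerDomainPermissions(persist_to_disk=False)
perms.grant("example.com", "plugins")
assert perms.is_allowed("example.com", "plugins")
assert not perms.is_allowed("tracker.net", "plugins")  # no cross-domain leakage
```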

Avoid Linkability: Isolate all non-private identifiers and state

All major browsers already make some effort to isolate explicit identifier state between non-private and private browsing (despite protest that their threat model does not actually require it). Obviously, privacy by design requires that this effort be continued.

The ability to link users between private and non-private browsing modes via explicit identifiers, browser state, or TLS state should be considered a flaw in the mode. After all, the user may have gone to a wifi cafe to obtain IP address privacy, expecting identifier privacy from their browser. It is not fair to the user to abjectly fail to protect them in this case.

Avoid Linkability: Isolate state to top-level domain

Users who want continuous "Do-Not-Track"-style privacy will likely use the mode regularly, possibly even exclusively, to avoid behavioral advertising and associated tracking. These users will also want to reduce the linkability between arbitrary sites they visit.

This is a particular concern for Tor as many activists use web-based email, social networking sites, and other web services for organizing. Their activity in Tor Browser on one site should not trivially de-anonymize their activity on another site to ad networks and exits.

To provide this property, all identifiers and state must be isolated to the top-level url bar domain, starting with cookies, but extending to the cache, DOM Storage, client certificates, and HTTP auth.
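Keying state to the top-level domain (often called "double-keying") can be illustrated with cookies: the same third-party origin gets a separate cookie jar under each first-party site. A minimal sketch with hypothetical names, not an actual browser API:

```python
class DoubleKeyedCookieJar:
    # Cookies are stored under (top-level url-bar domain, cookie origin),
    # so a tracker embedded in site A cannot read the cookie it set
    # while embedded in site B.
    def __init__(self):
        self._jar = {}

    def set_cookie(self, top_level, origin, name, value):
        self._jar[(top_level, origin, name)] = value

    def get_cookie(self, top_level, origin, name):
        return self._jar.get((top_level, origin, name))

jar = DoubleKeyedCookieJar()
jar.set_cookie("sitea.com", "ads.example", "uid", "12345")
# Same tracker, different first party: the identifier does not follow.
assert jar.get_cookie("siteb.com", "ads.example", "uid") is None
assert jar.get_cookie("sitea.com", "ads.example", "uid") == "12345"
```

The same keying extends naturally to the cache, DOM Storage, client certificates, and HTTP auth mentioned above.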

The benefit of this approach comes not only in the form of reduced linkability, but also in terms of simplified privacy UI. If all stored browser state and permissions become associated with the top-level url-bar domain, the six or seven different pieces of privacy UI governing these identifiers and permissions can become just one piece of UI, possibly with a context-menu option to drill down into specific types of state.

We also believe that such an identifier model makes privacy relationships much more clear to the average user. Instead of having various disjoint relationships with and permissions for hundreds of omnipresent third-party domains, users will have one relationship with each of the top-level url-bar domains that they choose to interact with and authenticate to.

Obviously, the downside of this enhanced protection against identifier linkability is that third-party services that rely on third-party cookie transmission may be impeded by this model. Long term, the hope is for standardized, in-browser support for services such as federated login and "Like" buttons. Google Chrome has actually implemented a feature called Web-Send that provides this functionality in a privacy-preserving way. They have even written a legacy HTML5 version that provides the same privacy properties, save for the need to trust a third party for DOM Storage.

We are also trying to introduce the notion of "protected cookies" in the alpha Tor Browser series, to allow users to specify they want to maintain a relationship with certain sites but not with others. To simplify this experience, we've currently entirely disabled Third Party Cookies in Tor Browser, but we believe that this may end up breaking mashup and federated login sites that might still be able to function under the more lenient double-keyed cookie model.

Several other interim steps are possible in the meantime. One could imagine iframe attributes that cause the browser chrome to request that a site be granted permission to set top-level cookies, or even a fully automated client-side mechanism that performs this promotion automatically for selected sites on mouse-click (such a mechanism is actually being prototyped by researchers right now).

Avoid Linkability: Reduce fingerprintable attributes

Once the linkability via explicit identifiers is eliminated, it becomes important to address the linkability that is possible through browser fingerprinting.

Fingerprinting is a difficult issue to address, but that difficulty does not preclude a best-effort from being made at eliminating or mitigating the major culprits.

Luckily, the major culprit is plugin-provided information. Once plugins are restricted to permitted top-level domains only, fingerprinting linkability is effectively reduced to the information available via CSS, JavaScript, and HTTP headers.

The largest culprits in CSS and JavaScript are resolution and media information (especially those properties that reveal details of the device and display rather than just the rendering window itself), the number of fonts that can be loaded per origin, time-based fingerprints, and WebGL device information.

It is likely that we need another Panopticlick-style study that focuses exclusively on CSS and JavaScript to determine the relative importance of these components, but most of them can be addressed without serious breakage of functionality.
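A Panopticlick-style measurement boils down to estimating how many bits of identifying information each attribute contributes. Assuming a sampled distribution of attribute values across a user population, the surprisal of observing value v is -log2 P(v); a sketch with toy, invented data:

```python
import math
from collections import Counter

def surprisal_bits(observed_value, population):
    """Bits of identifying information revealed by one attribute value,
    given a sample of that attribute across a user population."""
    counts = Counter(population)
    p = counts[observed_value] / len(population)
    return -math.log2(p)

# Toy population of screen resolutions (illustrative data, not real).
resolutions = ["1920x1080"] * 50 + ["1366x768"] * 45 + ["2560x1440"] * 5
common = surprisal_bits("1920x1080", resolutions)  # p = 0.5  -> 1 bit
rare = surprisal_bits("2560x1440", resolutions)    # p = 0.05 -> ~4.3 bits
assert rare > common
```

Summing such estimates over independent attributes is what makes a handful of innocuous-looking properties add up to a unique fingerprint.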

Avoid Linkability: Reduce default referer information

So far, the Tor Project has refrained from restricting the referer, primarily because we believe such restriction becomes less necessary once identifiers are isolated and linkability has been reduced.

However, non-Tor users do have one important element of linkability: IP address. Even those who have an alternate Internet connection may still be bound to a single alternate IP. These users probably benefit in a real way from a restricted referer. It turns out a lot of information is already smuggled or leaked via referer and URL parameters to third-party sites, either deliberately or accidentally.

Referer restriction could take multiple forms, but we believe site flexibility is key. Sites currently have no way to restrict the referer for most element types, and conversely, sites will always be able to subvert referer restrictions by smuggling the same data in POST or URL parameters.

Therefore, we believe that referers should be restricted by default in private browsing mode using a same-origin policy where sites from different origins get either no referer, or a referer that is truncated to the top-level domain. However, sites should also be allowed to request an exemption to this rule on a per-site basis using an html attribute, which could trigger a chrome permissions request, or simply be granted automatically (on the assumption that they could just URL smuggle the data).
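The default policy proposed above — full referer within the same origin, truncated (or absent) across origins — can be sketched as a simple decision function. This is an illustrative sketch with invented names; a real implementation must compare full origins (scheme, host, and port), not just hostnames:

```python
from urllib.parse import urlparse

def referer_for(source_url, target_url, cross_origin_mode="truncate"):
    """Compute the Referer to send under a same-origin referer policy.

    Same origin: send the full referer. Cross-origin: send either the
    scheme + host only ("truncate") or nothing ("strip").
    """
    src, dst = urlparse(source_url), urlparse(target_url)
    if src.hostname == dst.hostname:
        return source_url  # same site keeps the full referer
    if cross_origin_mode == "truncate":
        return f"{src.scheme}://{src.hostname}/"
    return None  # "strip": no referer at all

# Path and query survive within a site...
assert referer_for("https://a.com/inbox?id=7", "https://a.com/msg") == \
    "https://a.com/inbox?id=7"
# ...but are truncated to the top-level domain across sites.
assert referer_for("https://a.com/inbox?id=7", "https://b.com/") == \
    "https://a.com/"
```

A per-site exemption attribute, as described above, would simply bypass this function for the granted target domain.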

While this may not seem like much of a protection, at least it allows us to differentiate negligence from deliberate information sharing, and to restrict information leakage in the default scenario. Again, because this data can always be transmitted between elements either directly or via a back-channel, it is better it be visible and apparent than covert.

Avoid Linkability: Restrict window.name to referer policy

window.name poses many of the same conflicts as referer information. It gives sites a way to pass data between pages in the navigation lifespan of a tab. Sites can use window.name to store data, but are given no way to clear it easily. Hence it becomes very hard to differentiate deliberate data exchange from accidental leakage.

Just like referer, it is obvious that window.name should be empty whenever the URL bar is rewritten by the user. There should be no legitimate, functional need for data exchange between two arbitrary, user-typed URL bar domains in any situation. window.name on user-entered URLs should be cleared regardless of any changes to existing referer policy.

Similarly, sites could be given the option to allow transmission of window.name to third-party iframe elements, but the default should be to isolate window.name to the same origin policy. We do not believe this second step is required for Tor usage, but it may be helpful to non-Tor users, for similar reasons as the referer leakage.

As such, our current plan is to bind window.name's lifespan to the Referer header contents in our addon implementations.
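The navigation rules above (clear on user-typed URLs, isolate across origins, persist within a site) amount to a small decision procedure; a sketch, with invented function names and the same hostname-only origin simplification as before:

```python
from urllib.parse import urlparse

def next_window_name(current_name, prev_url, next_url, user_typed):
    """Decide what window.name survives a navigation (sketch).

    - User rewrote the url bar: always clear it.
    - Cross-origin navigation: clear it (isolate to same origin).
    - Same-origin navigation: let it persist.
    """
    if user_typed:
        return ""
    if urlparse(prev_url).hostname != urlparse(next_url).hostname:
        return ""
    return current_name

# Persists within a site, cleared across sites and on user-typed URLs.
assert next_window_name("tok", "https://a.com/x", "https://a.com/y", False) == "tok"
assert next_window_name("tok", "https://a.com/x", "https://b.com/", False) == ""
assert next_window_name("tok", "https://a.com/x", "https://a.com/y", True) == ""
```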


We believe that privacy can be a differentiating feature for browsers. Even early studies revealed that many users immediately began using private browsing modes regularly, either by mistake or deliberately.

We believe that many of these users deliberately use private browsing in cafes and on other alternate Internet connections assuming that they are being protected from ad tracking and behavioral analysis. We welcome user studies to determine what users actually expect and want from private browsing modes for the definitive answer, but obviously we're pretty convinced what the outcome will be.

Privacy by design represents the technical realization of "Do-Not-Track": the ability to actually opt out of (that is, prevent) a complete behavioral profile being built to record and model your specific web viewing habits.

In order for a private browsing mode to succeed in actually providing privacy by design, it must reduce activity linkability in all forms. Six out of the seven items mentioned above are really linkability issues at their core. Reducing the ability of the adversary to link private activity to non-private activity and also to other private activity is what privacy by design is all about. This reduction in linkability is what prevents a behavioral profile from being constructed.

The Tor Project looks forward to a day where privacy by design becomes a key feature of major browsers. We would love to be able to ship a vastly simplified browser extension that contains only a compiled Tor binary and some minimal addon code that simply "upgrades" the user's private browsing mode into a fully functional anonymous mode. The ability to do this would vastly simplify our package offerings, and make it significantly easier to get our software into censored and oppressed regions.

However, until then, we must do our best to attempt to provide software that we believe will provide the privacy and security that users have come to expect from us. For now, this means shipping our own browser.

New Tor Browser Bundles (and other packaging updates)

Tor is out and there are the usual packaging updates. You can go right to the download page to update.

The alpha Vidalia bundles have also been updated with the latest Torbutton 1.3.3-alpha which has itself been updated to work with the latest Firefox 4.0.1 release and has this notable feature:

When used with Firefox 4 or the alpha Tor Browser Bundles, it also features support for YouTube videos in HTML5, but you must currently opt in for YouTube to provide you with HTML5 video as opposed to Flash.

Tor Browser Bundle changelogs follow.

Firefox 3.6 Tor Browser Bundles

Tor Browser Bundle for Windows

1.3.24: Released 2011-04-30

  • Update Firefox to 3.6.17
  • Update Libevent to 2.0.10-stable
  • Update zlib to 1.2.5
  • Update OpenSSL to 1.0.0d

Tor Browser Bundle for Linux
1.1.8: Released 2011-04-30

  • Update Tor to
  • Update Firefox to 3.6.17

Tor Browser Bundle for OS X
1.0.16: Released 2011-04-30

  • Update Tor to
  • Update Firefox to 3.6.17

Firefox 4 Tor Browser Bundles

Tor Browser Bundle (2.2.25-1) alpha; suite=all

  • Update Tor to
  • Update Firefox to 4.0.1
  • Update Torbutton to 1.3.3-alpha
  • Update BetterPrivacy to 1.50
  • Update NoScript to

Temporary direct download links for Firefox 4 bundles:

To Toggle, or not to Toggle: The End of Torbutton

In a random bar about two years ago, a Google Chrome developer asked me why Torbutton didn't just launch a new, clean Firefox profile/instance to deal with the tremendous number of state separation issues. Simply by virtue of him asking me this question, I realized how much better off Chrome was by implementing Incognito Mode this way and how much simpler it must have been for them overall (though they did not/do not deal with anywhere near as many issues as Torbutton does)...

So I took a deep breath, and explained how the original use model of Torbutton and my initial ignorance at the size of the problem had led me through a series of incremental improvements to address the state isolation issue one item at a time. Since the toggle model was present at the beginning of this vision quest, it was present at the end.

I realized at that same instant that in hindsight, this decision was monumentally stupid, and that I had been working harder, not smarter. However, I thought then that since we had the toggle model built, we might as well keep it: it allowed people to use their standard issue Firefoxes easily and painlessly with Tor.

I now no longer believe even this much. I think we should completely do away with the toggle model, as well as the entire idea of Torbutton as a separate piece of user-facing software, and rely solely on the Tor Browser Bundles, except perhaps with the addition of standalone Tor+Vidalia binaries for use by experts and relay operators.

The Tor Browser Bundles will include Torbutton, but we will no longer recommend that people use Torbutton without Tor Browser. Torbutton will be removed from addons.mozilla.org, and the Torbutton download page will clearly state that it is for experts only. If serious unfixed security issues begin to accumulate against the toggle model, we will stop providing Torbutton xpis at all.

I believe this shift must be done for a few reasons: some usability, some technical. Since I feel the usability issues trump the technical ones, I'll discuss them first.

Unfortunately, the Tor Project doesn't really have funding to conduct official usability studies to help us make the best choice, but I think that even without them, it is pretty clear that this migration is what we must do to improve the status quo.

I think the average user is horribly confused by both the toggle model and the need to install additional software into Firefox (or conversely, the need to *also* install Tor software onto their computers after they install Torbutton). I also think that the average user is not likely to use this software safely. They are likely to log in to sites over Tor that they shouldn't, forget which Tor mode they are in, and forget which mode certain tabs were opened under. These are all nightmare situations for anonymity and privacy.

On the technical side, several factors are forcing us in the direction of a short-term fork of Firefox. The over-arching issue is that the set of bugfixes required to maintain the toggle model is a superset of those required to maintain the browser model. Trac report #39 lists the bugs we must fix for the browser model, whereas to maintain the toggle model, we must fix the bugs from trac report #14 in addition to those in report #39.

A similar issue exists with bugs that must be fixed in Firefox. The Firefox API bugs that need to be addressed to properly support the toggle model include rather esoteric and complicated issues that few groups other than Tor will find useful.

This means more resistance from Mozilla to get the toggle mode bugs fixed or even merged, less likelihood the fixes will be used elsewhere, and more danger they will succumb to bitrot. As a result, the lag time between fix and deployment for low-priority Firefox bugs can be as long as 3 years. See Bug 280661 for an example.

The Tor Browser bugs, on the other hand, are more directly usable by Firefox in its own Private Browsing Mode, which makes them more likely to be merged quickly and maintained long-term. Also, because we are releasing our own Firefox-based browser, we will have more control over experimenting with these fixes and deploying them to our users rapidly, as opposed to waiting for the next major Firefox release.

So, we can either invest effort in improving the UI of Torbutton to better educate users to understand our particular rabbit-hole tunnel-vision of design choices, and also solving crazier Firefox bugs; or we can reconsider our user model and try to simplify our software.

We don't have the manpower (ie: enough me) to do both. This means we should go with the simpler, easier option.

We do face a small number of barriers and downsides associated with this plan. We are collecting the issues we need to address ASAP as child tickets of this bug:

Overall, the downsides seem to mostly apply to expert users and how they will adapt the custom Tor setups they have built. We don't anticipate a lot of long term issues with this group, as most of the configuration options of Torbutton will remain available, and users should still be able to install custom addons and configure their Tor Browser profile however they need (even to the point of running it side-by-side to a system tor instance that is used for non-web applications).

Additional discussion about this issue has occurred on the tor-talk mailinglist.

Hopefully this announcement doesn't ruin your day!

Tor Browser Bundle 1.1.9 Released

Tor Browser Bundle 1.1.9 is released.

It includes the following changes:

  • Update Tor to
  • Update Firefox to 3.0.6
  • Update Vidalia to 0.1.11

It's available at
