Tor Browser 5.0 is released

The Tor Browser Team is proud to announce the first stable release in the 5.0 series. This release is available from the Tor Browser Project page and also from our distribution directory.

This release features important security updates to Firefox. Note that the recent PDF.js exploit did not affect 4.5 users, but they should still upgrade to this release immediately, because Mozilla fixed numerous other potential security issues in it. (Incidentally, users on the 5.0-alpha series are vulnerable to the PDF.js exploit unless they were using the 'High' security level; regardless, we are also upgrading 5.0-alpha users to 5.5a1 today to fix the issue.)

This release also brings us up to date with Firefox 38-ESR, which should mean improved support for HTML5 video on YouTube, as well as a host of other improvements. However, controversial and hard-to-audit binary components related to EME DRM were disabled.

The release also features new privacy enhancements. In particular, more identifier sources that appeared in Firefox 38 (or were otherwise disabled previously) are now isolated to the first party (URL bar) domain. This release also contains defenses from the 5.0-alpha series for keystroke (typing) fingerprinting and some instances of performance/timing fingerprinting.
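
To illustrate the general idea of first-party isolation (this is a minimal conceptual sketch, not Tor Browser's actual implementation), a browser-side identifier store can be keyed by the URL bar domain in addition to the resource that set it, so an embedded third party seen from two different sites ends up with two unrelated entries rather than one shared identifier:

```python
# Minimal sketch of first-party ("double-keyed") isolation; the names and the
# in-memory dict below are illustrative only, not Tor Browser internals.
cache = {}

def store(first_party, resource, value):
    # Key every entry by the URL bar domain as well as the resource URL.
    cache[(first_party, resource)] = value

def lookup(first_party, resource):
    return cache.get((first_party, resource))

store("news.example", "https://tracker.example/pixel", "id-123")
print(lookup("news.example", "https://tracker.example/pixel"))  # 'id-123'
print(lookup("shop.example", "https://tracker.example/pixel"))  # None: no cross-site linking
```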

Regrettably, our new defenses for font and keyboard layout fingerprinting did not stabilize in time for this release. Users who are interested in helping us improve them should try out 5.5a1.

This release will also reset the permanent NoScript whitelist to its default for all users, due to an issue where previous NoScript updates had added certain domains to the whitelist during upgrade. Future updates to the whitelist by NoScript have been disabled.

Starting with this release, Tor Browser will also download and apply upgrades in the background, to ensure that users upgrade more quickly and with less interaction. This behavior is governed by the about:config pref app.update.auto, but we do not recommend disabling it unless you really know what you're doing.

Here is the complete changelog since 4.5.3:

  • All Platforms
    • Update Firefox to 38.2.0esr
    • Update OpenSSL to 1.0.1p
    • Update HTTPS-Everywhere to 5.0.7
    • Update NoScript to 2.6.9.34
    • Update meek to 0.20
    • Update Tor to 0.2.6.10 with patches:
      • Bug 16674: Allow FQDNs ending with a single '.' in our SOCKS host name checks.
      • Bug 16430: Allow DNS names with _ characters in them (fixes nytimes.com)
      • Bug 15482: Don't allow circuits to change while a site is in use
    • Update Torbutton to 1.9.3.2
      • Bug 16731: TBB 5.0 a3/a4 fails to download a file on right click
      • Bug 16730: Reset NoScript whitelist on upgrade
      • Bug 16722: Prevent "Tiles" feature from being enabled after upgrade
      • Bug 16488: Remove "Sign in to Sync" from the browser menu (fixup)
      • Bug 16268: Show Tor Browser logo on About page
      • Bug 16639: Check for Updates menu item can cause update download failure
      • Bug 15781: Remove the sessionstore filter
      • Bug 15656: Sync privacy.resistFingerprinting with Torbutton pref
      • Bug 16427: Use internal update URL to block updates (instead of 127.0.0.1)
      • Bug 16200: Update Cache API usage and prefs for FF38
      • Bug 16357: Use Mozilla API to wipe permissions db
      • Bug 14429: Make sure the automatic resizing is disabled
      • Translation updates
    • Update Tor Launcher to 0.2.7.7
      • Bug 16428: Use internal update URL to block updates (instead of 127.0.0.1)
      • Bug 15145: Visually distinguish "proxy" and "bridge" screens.
      • Translation updates
    • Bug 16730: Prevent NoScript from updating the default whitelist
    • Bug 16715: Use ThreadsafeIsCallerChrome() instead of IsCallerChrome()
    • Bug 16572: Verify cache isolation for XMLHttpRequests in Web Workers
    • Bug 16884: Prefer IPv6 when supported by the current Tor exit
    • Bug 16488: Remove "Sign in to Sync" from the browser menu
    • Bug 16662: Enable network.http.spdy.* prefs in meek-http-helper
    • Bug 15703: Isolate mediasource URIs and media streams to first party
    • Bug 16429+16416: Isolate blob URIs to first party
    • Bug 16632: Turn on the background updater and restart prompting
    • Bug 16528: Prevent indexedDB Modernizr site breakage on Twitter and elsewhere
    • Bug 16523: Fix in-browser JavaScript debugger
    • Bug 16236: Windows updater: avoid writing to the registry
    • Bug 16625: Fully disable network connection prediction
    • Bug 16495: Fix SVG crash when security level is set to "High"
    • Bug 13247: Fix meek profile error after browser restarts
    • Bug 16005: Relax WebGL minimal mode
    • Bug 16300: Isolate Broadcast Channels to first party
    • Bug 16439: Remove Roku screencasting code
    • Bug 16285: Disabling EME bits
    • Bug 16206: Enforce certificate pinning
    • Bug 15910: Disable Gecko Media Plugins for now
    • Bug 13670: Isolate OCSP requests by first party domain
    • Bug 16448: Isolate favicon requests by first party
    • Bug 7561: Disable FTP request caching
    • Bug 6503: Fix single-word URL bar searching
    • Bug 15526: ES6 page crashes Tor Browser
    • Bug 16254: Disable GeoIP-based search results.
    • Bug 16222: Disable WebIDE to prevent remote debugging and addon downloads.
    • Bug 13024: Disable DOM Resource Timing API
    • Bug 16340: Disable User Timing API
    • Bug 14952: Disable HTTP/2
    • Bug 1517: Reduce precision of time for Javascript
    • Bug 13670: Ensure OCSP & favicons respect URL bar domain isolation
    • Bug 16311: Fix navigation timing in ESR 38
  • Windows
    • Bug 16014: Staged update fails if meek is enabled
    • Bug 16269: repeated add-on compatibility check after update (meek enabled)
  • Mac OS
    • Use OSX 10.7 SDK
    • Bug 16253: Tor Browser menu on OS X is broken with ESR 38
    • Bug 15773: Enable ICU on OS X
  • Build System
    • Bug 16351: Upgrade our toolchain to use GCC 5.1
    • Bug 15772 and child tickets: Update build system for Firefox 38
    • Bugs 15921+15922: Fix build errors during Mozilla Tryserver builds
    • Bug 15864: Rename sha256sums.txt to sha256sums-unsigned-build.txt

Tails 1.5 is out

Tails, The Amnesic Incognito Live System, version 1.5, is out.

This release fixes numerous security issues and all users must upgrade as soon as possible.

New features

  • Disable access to the local network in the Tor Browser. You should now use the Unsafe Browser to access the local network.

Upgrades and changes

  • Install Tor Browser 5.0 (based on Firefox 38esr).
  • Install a 32-bit GRUB EFI boot loader. Tails should now start on some tablets with Intel Bay Trail processors among others.
  • Let the user know when Tails Installer has rejected a device because it is too small.

There are numerous other changes that may not be apparent to a typical user in daily operation. Technical details of all the changes are listed in the Changelog.

Fixed problems

  • Our AppArmor setup has been audited and improved in various ways which should harden the system.
  • The network should now be properly disabled when MAC address spoofing fails.

Known issues

See the current list of known issues.

Download or upgrade

Go to the download page.

What's coming up?

The next Tails release is scheduled for September 22.

Have a look at our roadmap to see where we are heading.

Do you want to help? There are many ways you can contribute to Tails, for example by donating. If you want to help, come talk to us!

Support and feedback

For support and feedback, visit the Support section on the Tails website.

Tor Weekly News — August 8th, 2015

Welcome to the thirtieth issue in 2015 of Tor Weekly News, the weekly newsletter that covers what’s happening in the Tor community.

Tor 0.2.7.2-alpha is out

Nick Mathewson announced the second alpha release in the Tor 0.2.7.x series. This version includes improvements to the handling of Tor’s identity keys, which now use the Ed25519 elliptic curve signature format. It also allows onion service operators to specify a higher number of introduction points with a special configuration option, if the service is coming under heavy load, “at the cost of making it more visible that the hidden service is facing extra load”.

For full details of the many other developments in this release, please see Nick’s announcement. The source code is available as usual from Tor’s distribution directory.

Tor Browser 5.0a4 is out

The Tor Browser team put out their fourth alpha release in the 5.0 series of the privacy-preserving anonymous browser. “Most notably, this release contains an experimental defense against font fingerprinting by using an identical set of shipped fonts on all supported platforms”, wrote Georg Koppen. This version also fixes some of the issues created by the update to Firefox 38ESR, which “brings us very close to a stable Tor Browser 5.0, which we aim to release next week”.

Get your copy of the new alpha from the project page, or via the incremental updater if you are already using the alpha Tor Browser series.

Random number generation during Tor voting

One of the weaknesses of the current onion service design is that parts of it (such as the relays chosen by a service to upload its descriptor) rely on a list of Tor relays which is generated in a predictable way. This makes it possible for people with malicious intentions to insert their bad relays into the list at points of their choosing, in order to carry out attacks such as denial of service (as some researchers demonstrated earlier this year). A good way of preventing this is to make Tor’s directory authorities jointly come up with a random number as part of their regular voting procedure, which is then used by onion services to choose the directories to which they will upload their descriptor information, and by clients to find those same directories. It could also be used by other systems as a shared source of randomness.

George Kadianakis published a draft proposal describing how this procedure could work. For a period of twelve hours, the directory authorities send each other a “commitment”, consisting of the hash of a 256-bit value. Once all authorities are aware of the others’ commitments, they then reveal to one another the values they committed to, for another twelve-hour period. At the end of that time, the revealed values are checked to see if they correspond to the commitments, and then they are all used to compute that day’s random value. This works because although you can use the commitment hash to verify that the value revealed is the same as the one decided upon twelve hours ago, you cannot derive the value itself from the commitment.
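
As a toy illustration of the commit-and-reveal idea (a sketch only; the draft proposal specifies the exact encodings, timing, and how the values travel inside the directory votes), each participant first publishes a hash of a secret 256-bit value and only later publishes the value itself, after which everyone can verify the reveals and combine them:

```python
import hashlib
import os

def make_commitment():
    # The secret 256-bit value and its hash (the "commitment").
    value = os.urandom(32)
    return value, hashlib.sha256(value).hexdigest()

def verify_reveal(value, commitment):
    # A revealed value must hash back to the earlier commitment.
    return hashlib.sha256(value).hexdigest() == commitment

def shared_random(revealed_values):
    # Combine every revealed value into one output by hashing their
    # concatenation (sorted so all parties compute the same result).
    h = hashlib.sha256()
    for value in sorted(revealed_values):
        h.update(value)
    return h.hexdigest()

# Three toy "authorities": commit phase, then reveal phase, then combination.
values, commitments = zip(*(make_commitment() for _ in range(3)))
assert all(verify_reveal(v, c) for v, c in zip(values, commitments))
print(shared_random(values))
```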

Please see the draft proposal in full for discussion of the finer points of the proposed system, or if you are a fan of ingenious solutions.

CameraV (aka InformaCam) is out

The Guardian Project put out a full release of CameraV (or InformaCam), a nifty smartphone application that lets you “capture and share verifiable photos and video proof on a smartphone or tablet, all the while keeping it entirely secure and private”. It allows you to prove the authenticity of your photos by using “the built-in sensors in modern smartphones for tracking movement, light and other environmental inputs, along with Wi-Fi, Bluetooth, and cellular network information to capture a snapshot of the environment around you” and bundling this information into the picture file.

As you would expect, InformaCam is fully compatible with the Guardian Project’s Tor software offerings for Android, so whether you’re a citizen journalist or a keen phone photographer who values privacy, take a look at the CameraV page and try it out for yourself!

Monthly status reports for July 2015

The wave of regular monthly reports from Tor project members for the month of July has begun. Pearl Crescent released their report first (for work on Tor Browser development), followed by reports from David Goulet (on onion service research and development), Georg Koppen (working on Tor Browser), Isabela Bagueros (for overall project management), Karsten Loesing (working on Tor network tools and organizational tasks), Damian Johnson (on Nyx and stem development), and Juha Nurmi (on ahmia.fi development).

The students in this year’s Tor Summer of Privacy also sent updates about their progress. Donncha O’Cearbhaill gave news of the OnionBalance load-balancing project, while Jesse Victors did the same for the OnioNS DNS-like system, Cristobal Leiva for the relay web status dashboard, and Israel Leiva for continuing development of the GetTor alternative software distributor.

Finally, the Tails team published their June report, bringing updates about outreach, infrastructure, funding, and ongoing discussions relating to the anonymous live operating system.

Miscellaneous news

The participants in the recent onion service hackfest in Washington, DC published a summary of the exciting progress they made during the meeting.

Arturo Filastò announced that an OONI-related hackathon entitled “ADINA15: A Dive Into Network Anomalies” will be held on October 1-2 in the Chamber of Deputies at the Italian Parliament in Rome. “This means that you are all invited…to put your design and data analysis skills to the test!”

David Fifield published the regular summary of costs incurred by the infrastructure for meek.

Nathan Freitas explored possible routes to an Android-compatible version of Ricochet, the exciting new privacy-preserving instant messaging application based on Tor onion services.


This issue of Tor Weekly News has been assembled by BitingBird and Harmony.

Want to continue reading TWN? Please help us create this newsletter. We still need more volunteers to watch the Tor community and report important news. Please see the project page, write down your name and subscribe to the team mailing list if you want to get involved!

A Hidden Service Hackfest: The Arlington Accords

At the beginning of July, a few of us gathered in Washington, DC for the first hidden service hackfest. Our crew was composed of core Tor developers and researchers who were in the area, mostly attendees of PETS. The aim was to push hidden service development forward and swiftly arrive at decisions that were too tiresome and complex to make over e-mail.

Since we were mostly technical folks, we composed technical proposals and prioritized development, and spent less time on organizational or funding tasks. Here is a snapshot of the work that we did during those five days:

  • The first day, we discussed current open topics on hidden services and tasks we should be doing in the short-to-medium-term future.

    Our list of tasks included marketing and fundraising ones like "Re-branding hidden services" and "Launch crowdfunding campaign", but we spent most of the first day discussing Proposal 224 aka the "Next Generation Hidden Services" project.

  • Proposal 224 is our master plan for improving hidden services in fundamental ways: The new system will be faster, use better cryptography, have more secure onion addresses, and offer advanced security properties like improved DoS resistance and keeping identity keys offline. It's heavy engineering work, and we are still fine-tuning the design, so implementation has not started yet.

    While discussing how we would implement the system, we decided that we would need to write most of the code for this new protocol from scratch, instead of hooking into the old and rusty hidden service code. To move this forward, we spent part of the following days splitting the proposal into individual modules and figuring out how to refactor the current data structures so that the new protocol can coexist with the old protocol.

  • One open design discussion on Proposal 224 has been an earlier suggestion to merge the roles of "hidden service directory" and "introduction point" in the hidden service protocol. This change would improve security and performance, simplify the relevant code, and reduce load on the network. Because it changes the protocol a bit, it would be good to have it specified precisely. For this reason, we spent the second and third days writing a proposal that defines how this change works.
  • Another core part of Proposal 224 is the protocol for global randomness calculation. That's a system where the Tor network itself generates a fresh, unpredictable random value every day; basically like the NIST Randomness Beacon, but decentralized.

    Proposal 225 specifies a way that this can be achieved, but there are still various engineering details that need to be ironed out. We spent some time discussing the various ways we can implement the system and the engineering decisions we should take, and produced a draft Tor proposal that specifies the system.

  • We also discussed guard discovery attacks and the various defenses that we could deploy. The fact that many core Tor people were present helped us rapidly decide which parameters and trade-offs we should pick. We sketched a proposal, posted it to the [tor-dev] mailing list, and it has already received very helpful feedback.
  • We also took our old design for "Direct Onion Services" and revised it into a faster and far more elegant protocol. These types of services trade service-side location privacy for improved performance, reliability, and scalability. They will allow sites like Reddit to offer their services faster over hidden services while respecting their clients' anonymity. During the last days of the hackfest, we wrote a draft proposal for this new design.
  • We did more development on OnioNS, the Onion Name System, which allows a hidden service operator to register a human-memorable name (e.g. example.tor) that can be used instead of the regular onion address. In the last days of the hackfest we prepared a proof-of-concept demo in which a domain name was registered and Tor Browser then successfully loaded a hidden service under that name. That was a significant step for the project.
  • We talked about rebranding the "Hidden Services" project to "Onion Services" to reduce "hidden"/"dark"/"evil" name connotations, and improve terminology. In fact, we've been on this for a while, but we are still not sure what the right name is. What do you think?

And that's only part of what we did. We also wrote code for various tickets, reviewed even more code and really learned how to use Ricochet.

All in all, we managed to fit more things than we hoped into those few days and we hope to do even more focused hackfests in the near future. Email us if you are interested in hosting a hackfest!

If you'd like to get involved with hidden service development, you can contact the hackfest team. Our nicks on the OFTC IRC network are armadev, asn, dgoulet, kernelcorn, mrphs, ohmygodel, robgjansen, saint, special, sysrqb, and syverson.

Until next time!

Tor Browser 5.0a4 is released

The Tor Browser Team is proud to announce the second alpha release based on Firefox 38 ESR. This release is also the fourth and final alpha in the 5.0 series. The release is available for download in the 5.0a4 distribution directory and on the alpha download page.

Most notably, this release contains an experimental defense against font fingerprinting by using an identical set of shipped fonts on all supported platforms. We've also updated the versions of several Tor Browser components, including updating Tor to 0.2.7.2-alpha. The 5.0-stable release will be based on Tor 0.2.6-latest, however.

Last but not least, we fixed a lot of important bugs caused by our switch to Firefox 38 ESR, including issues with major websites such as Twitter. This release brings us very close to a stable Tor Browser 5.0, which we aim to release next week. Unless we hear about additional issues, not much will change between 5.0a4 and 5.0-stable, aside from the Tor version and possibly the font defense.

Here is the complete changelog since 5.0a3:

  • All Platforms
    • Update Tor to 0.2.7.2-alpha with patches
      • Bug 15482: Don't allow circuits to change while a site is in use
    • Update OpenSSL to 1.0.1p
    • Update HTTPS-Everywhere to 5.0.7
    • Update NoScript to 2.6.9.31
    • Update Torbutton to 1.9.3.1
      • Bug 16268: Show Tor Browser logo on About page
      • Bug 16639: Check for Updates menu item can cause update download failure
      • Bug 15781: Remove the sessionstore filter
      • Bug 15656: Sync privacy.resistFingerprinting with Torbutton pref
      • Translation updates
    • Bug 16884: Prefer IPv6 when supported by the current Tor exit
    • Bug 16488: Remove "Sign in to Sync" from the browser menu
    • Bug 13313: Bundle a fixed set of fonts to defend against fingerprinting
    • Bug 16662: Enable network.http.spdy.* prefs in meek-http-helper
    • Bug 15646: Prevent keyboard layout fingerprinting in KeyboardEvent (fixup)
    • Bug 15703: Isolate mediasource URIs and media streams to first party
    • Bug 16429+16416: Isolate blob URIs to first party
    • Bug 16632: Turn on the background updater and restart prompting
    • Bug 16528: Prevent IndexedDB Modernizr site breakage on Twitter and elsewhere
    • Bug 16523: Fix in-browser JavaScript debugger
    • Bug 16236: Windows updater: avoid writing to the registry
    • Bug 16005: Restrict WebGL minimal mode a bit (fixup)
    • Bug 16625: Fully disable network connection prediction
    • Bug 16495: Fix SVG crash when security level is set to "High"
  • Build System
    • Bug 15864: Rename sha256sums.txt to sha256sums-unsigned-build.txt

A technical summary of the Usenix fingerprinting paper

Albert Kwon, Mashael AlSabah, and others have a paper entitled Circuit Fingerprinting Attacks: Passive Deanonymization of Tor Hidden Services at the upcoming Usenix Security symposium in a few weeks. Articles describing the paper are making the rounds currently, so I'm posting a technical summary here, along with explanations of the next research questions that would be good to answer. (I originally wrote this summary for Dan Goodin for his article at Ars Technica.) Also for context, remember that this is another research paper in the great set of literature around anonymous communication systems—you can read many more at http://freehaven.net/anonbib/.

"This is a well-written paper. I enjoyed reading it, and I'm glad the researchers are continuing to work in this space.

First, for background, run (don't walk) to Mike Perry's blog post explaining why website fingerprinting papers have historically overestimated the risks for users:
https://blog.torproject.org/blog/critique-website-traffic-fingerprinting...
and then check out Marc Juarez et al's followup paper from last year's ACM CCS that backs up many of Mike's concerns:
http://freehaven.net/anonbib/#ccs2014-critical

To recap, this new paper describes three phases. In the first phase, they hope to get lucky and end up operating the entry guard for the Tor user they're trying to target. In the second phase, the target user loads some web page using Tor, and they use a classifier to guess whether the web page was in onion-space or not. Lastly, if the first classifier said "yes it was", they use a separate classifier to guess which onion site it was.

The first big question comes in phase three: is their website fingerprinting classifier actually accurate in practice? They consider a world of 1000 front pages, but ahmia.fi and other onion-space crawlers have found millions of pages by looking beyond front pages. Their 2.9% false positive rate becomes enormous in the face of this many pages—and the result is that the vast majority of the classification guesses will be mistakes.

For example, if the user loads ten pages, and the classifier outputs a guess for each web page she loads, will it output a stream of "She went to Facebook!" "She went to Riseup!" "She went to Wildleaks!" while actually she was just reading posts in a Bitcoin forum the whole time? Maybe they can design a classifier that works well when faced with many more web pages, but the paper doesn't show one, and Marc Juarez's paper argues convincingly that it's hard to do.
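
To make the base-rate concern concrete, here is a toy calculation; only the 2.9% false positive rate is taken from the paper, while the page-load counts (and the generous assumption that every monitored page is recognized) are made up purely for illustration:

```python
# Hypothetical numbers to illustrate the base-rate problem.
monitored_hits = 50          # assumed: loads of pages the classifier was trained on
other_loads = 10_000         # assumed: loads of pages outside that monitored set
false_positive_rate = 0.029  # from the paper

false_alarms = other_loads * false_positive_rate  # ~290 wrong "she went to X!" guesses
precision = monitored_hits / (monitored_hits + false_alarms)
print(f"{false_alarms:.0f} false alarms vs {monitored_hits} real hits "
      f"-> precision about {precision:.0%}")       # roughly 15%: most guesses are mistakes
```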

The second big question is whether adding a few padding cells would fool their "is this a connection to an onion service" classifier. We haven't tried to hide that in the current Tor protocol, and the paper presents what looks like a great classifier. It's not surprising that their classifier basically stops working in the face of more padding though: classifiers are notoriously brittle when you change the situation on them. So the next research step is to find out if it's easy or hard to design a classifier that isn't fooled by padding.

I look forward to continued attention by the research community to work toward answers to these two questions. I think it would be especially fruitful to look also at true positive rates and false positives of both classifiers together, which might show more clearly (or not) that a small change in the first classifier has a big impact on foiling the second classifier. That is, if we can make it even a little bit more likely that the "is it an onion site" classifier guesses wrong, we could make the job of the website fingerprinting classifier much harder because it has to consider the billions of pages on the rest of the web too."

Tor Exit Nodes in Libraries - Pilot (phase one)

Hello Tor Community!

We first introduced you to the Library Freedom Project back in February after we won the Knight News Challenge on Libraries. Since then, we’ve been hard at work bringing privacy education to libraries across the United States, with stops in the UK and Ireland, virtual trainings in Canada and Australia, and more plans to visit international libraries in the works.

Today, we're excited to announce a new initiative, a collaboration between the Library Freedom Project and Tor Project: Tor exit relays in libraries. Nima Fatemi, the Tor Project member who's already helped Library Freedom Project in a number of ways, is our main partner on this project. This is an idea whose time has come; libraries are our most democratic public spaces, protecting our intellectual freedom, privacy, and unfettered access to information, and Tor Project creates software that allows all people to have these rights on the internet. We're excited to combine our efforts to help libraries protect internet freedom, strengthen the Tor network, and educate the public about how Tor can help protect their right to digital free expression.

Libraries have been committed to intellectual freedom and privacy for decades, outlining these commitments in the ALA Core Values of Librarianship, the Freedom to Read Statement, and the ALA Code of Ethics. They're also centers of education in their local communities, offering free classes on a variety of subjects, including computer instruction. Libraries serve a diverse audience; many of our community members are people who need Tor but don't know that it exists, and require instruction to understand and use it.

Some of these patrons are part of vulnerable groups, like domestic violence survivors, racial and ethnic minorities, student activists, or queer and trans communities. Others belong to local law enforcement or municipal government. All of them could benefit from learning about Tor in a trusted, welcoming environment like the library.

Bringing Tor exit relays into libraries would not only be a powerful symbolic gesture demonstrating our commitment to a free internet, but also a practical way to help the Tor network, and an excellent opportunity to help educate library patrons, staff, boards of trustees, and other stakeholders about the importance of Tor. For libraries that have already installed Tor Browser on library PCs, running a relay is the obvious next step toward supporting free expression in their communities and all over the world.

As public internet service providers, libraries are shielded from some of the legal concerns that an individual exit relay operator might face, such as trying to explain to law enforcement that the traffic leaving her exit is not her own. Furthermore, libraries are protected from DMCA takedowns by safe harbor provisions. Importantly, librarians know their rights and are ready to fight back when those rights are challenged.

In order to begin this new project, we needed a pilot, and we had just the library in mind – Kilton Library in Lebanon, New Hampshire, one of two Lebanon Libraries. Chuck McAndrew is the IT librarian there, and he's done amazing things with the computers on his network, like running them all on GNU/Linux distributions. Why is this significant? Most library environments run Microsoft Windows, and we know that Microsoft participated in the NSA's PRISM surveillance program. By choosing GNU/Linux and also installing some privacy-protecting browser extensions, Chuck is helping his staff and patrons opt out of pervasive government and corporate surveillance. Pretty awesome.

Kilton Library is not only exemplary because of its GNU/Linux computer environment; it's also beautiful and brand-new, LEED Gold-certified, with an inviting and sunny open floor plan and an outdoor community garden. It's an example of the amazing potential inherent in libraries. We drove up to New Hampshire last week to start phase one.

We decided to set our pilot up as a middle relay to start – we want to ensure that it is stable and doesn't interfere in any way with the library's other network traffic. We nicknamed the new relay LebLibraries, and you can check out how our relay is doing here, on Globe.

After the LebLibraries relay is up for a few months, we'll return for phase two of the project and convert it into an exit node. Our goal is to make exit relay configuration a part of the Library Freedom Project's privacy trainings for librarians; we'll meet with library directors and boards of trustees to talk about how Tor fits into the mission of libraries as beacons of intellectual freedom, and how libraries are perfectly positioned not only to help patrons use Tor Browser but also to run Tor exit relays that give back to the Tor community.

We need more libraries to join us in this initiative. Want your local library to be our next exit relay site? Know an awesome librarian who wants to help protect free expression locally and globally? Please have them contact us with the answers to this questionnaire. We're also looking for libraries to host FOSS seedboxes. And as always, we want libraries to install and run the Tor Browser on library computers.

Want to support this project and more like it? You can make a donation to the Library Freedom Project, or donate directly to Tor Project. And stay tuned for phase two of our pilot with Kilton Library.

Alison Macrina and Nima Fatemi

A version of this post also appeared on The Library Freedom Project’s blog

Note: This post was drafted by Alison. (Thank you!)

Tor 0.2.7.2-alpha is released

This, the second alpha in the Tor 0.2.7 series, has a number of new features, including a way to manually pick the number of introduction points for hidden services, and the much stronger Ed25519 signing key algorithm for regular Tor relays (including support for encrypted offline identity keys in the new algorithm).

Support for Ed25519 on relays is currently limited to signing router descriptors; later alphas in this series will extend Ed25519 key support to more parts of the Tor protocol.
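
As a conceptual sketch of the offline-master/online-signing-key split (using the third-party Python cryptography package purely for illustration; Tor's real key and certificate formats live in the tor source and its proposals), the master key only ever certifies the online signing key, which then signs day-to-day material such as descriptors:

```python
# Conceptual sketch only; not Tor's implementation. Requires the third-party
# "cryptography" package.
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

master = Ed25519PrivateKey.generate()    # identity key, meant to stay offline
signing = Ed25519PrivateKey.generate()   # online key, regenerated/rotated as needed

# The master key "certifies" the online key by signing its raw public bytes
# (a stand-in for Tor's real certificate structure).
signing_pub = signing.public_key().public_bytes(
    encoding=serialization.Encoding.Raw,
    format=serialization.PublicFormat.Raw,
)
certificate = master.sign(signing_pub)

# The online key signs day-to-day material, e.g. a router descriptor body.
descriptor_sig = signing.sign(b"router descriptor body")

# Verification raises an exception if either signature is invalid.
master.public_key().verify(certificate, signing_pub)
signing.public_key().verify(descriptor_sig, b"router descriptor body")
print("certificate and descriptor signature verify")
```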

If you typically build Tor from source, you can download the source code from the usual place on the website. Packages should be up in a few days.

Changes in version 0.2.7.2-alpha - 2015-07-27
  • Major features (Ed25519 identity keys, Proposal 220):
    • All relays now maintain a stronger identity key, using the Ed25519 elliptic curve signature format. This master key is designed so that it can be kept offline. Relays also generate an online signing key, and a set of other Ed25519 keys and certificates. These are all automatically regenerated and rotated as needed. Implements part of ticket 12498.
    • Directory authorities now vote on Ed25519 identity keys along with RSA1024 keys. Implements part of ticket 12498.
    • Directory authorities track which Ed25519 identity keys have been used with which RSA1024 identity keys, and do not allow them to vary freely. Implements part of ticket 12498.
    • Microdescriptors now include Ed25519 identity keys. Implements part of ticket 12498.
    • Add support for offline encrypted Ed25519 master keys. To use this feature on your tor relay, run "tor --keygen" to make a new master key (or to make a new signing key if you already have a master key). Closes ticket 13642.
  • Major features (Hidden services):
    • Add the torrc option HiddenServiceNumIntroductionPoints, to specify a fixed number of introduction points. Its maximum value is 10 and default is 3. Using this option can increase a hidden service's reliability under load, at the cost of making it more visible that the hidden service is facing extra load. Closes ticket 4862.
    • Remove the adaptive algorithm for choosing the number of introduction points, which used to change the number of introduction points (poorly) depending on the number of connections the HS sees. Closes ticket 4862.
