Tor at the Heart: PETS and the Privacy Research Community

by arma | December 21, 2016

During the month of December, we're highlighting other organizations and projects that rely on Tor, build on Tor, or are accomplishing their missions better because Tor exists. Check out our blog each day to learn about our fellow travelers. And please support the Tor Project! We're at the heart of Internet freedom. Donate today!

So far in this blog series we've highlighted mainly software and advocacy projects. Today is a little different: I'm going to explain more about Tor's role in the academic world of privacy and security research.

Part one: Tor matters to the research community

Just about every major security conference these days has a paper analyzing, attacking, or improving Tor. While ten years ago the field of anonymous communications was mostly theoretical, with researchers speculating that a given design should or shouldn't work, Tor now provides an actual deployed testbed. Tor has become the gold standard for anonymous communications research for three main reasons:

First, Tor's source code and specifications are open. Beyond its original design document, Tor provides a clear and published set of RFC-style specifications describing exactly how it is built, why we made each design decision, and what security properties it aims to offer. The Tor developers conduct design discussion in the open, on public development mailing lists, and the public development proposal process provides a clear path by which other researchers can participate.

Second, Tor provides open APIs and maintains a set of tools to help researchers and developers interact with the Tor software. The Tor software's "control port" lets controller programs view and change configuration and status information, as well as influence path selection. We provide easy instructions for setting up separate private Tor networks for testing. This modularity makes Tor more accessible to researchers because they can run their own experiments using Tor without needing to modify the Tor program itself.
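To make that concrete, here is a minimal sketch (not from the post itself) of the kind of interaction the control port supports, using the stem Python controller library. It assumes a local tor daemon configured with "ControlPort 9051" and cookie authentication enabled; adjust to your own setup.

    # Minimal sketch: query and adjust a running tor via its control port,
    # using the stem library. Assumes "ControlPort 9051" and
    # "CookieAuthentication 1" in the local torrc.
    from stem import Signal
    from stem.control import Controller

    with Controller.from_port(port=9051) as controller:
        controller.authenticate()  # reads the auth cookie by default

        # View status and configuration information.
        print("Tor version:", controller.get_version())
        print("Bytes read:", controller.get_info("traffic/read"))
        print("SocksPort:", controller.get_conf("SocksPort"))

        # Ask tor to use fresh circuits for future connections.
        controller.signal(Signal.NEWNYM)

Researchers typically point a controller like this at a private test network (a handful of relays running on one machine) rather than at the public network, which is exactly the kind of experiment this modularity makes possible without touching the Tor program itself.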

Third, real users rely on Tor. Every day hundreds of thousands of people connect to the Tor network and depend on it for a broad variety of security goals. In addition to its emphasis on research and design, The Tor Project has developed a reputation as a non-profit that fosters this community and puts its users first. This real-world relevance motivates researchers to help make sure Tor provides provably good security properties.

I wrote the above paragraphs in 2009 for our first National Science Foundation proposal, and they've become even more true over time. A fourth reason has also emerged: Tor attracts researchers precisely because it brings in so many problems that are at the intersection of "hard to solve" and "matter deeply to the world". How to protect communications metadata is one of the key open research questions of the century, and nobody has all the answers. Our best chance at solving it is for researchers and developers all around the world to team up and work in the open, building on each other's progress.

Since starting Tor, I've done probably 100 Tor talks to university research groups all around the world, teaching grad students about these open research problems in the areas of censorship circumvention (which led to the explosion of pluggable transport ideas), privacy-preserving measurement, traffic analysis resistance, scalability and performance, and more.

The result of that effort, and of Tor's success in general, is a flood of research papers, plus a dozen research labs that regularly have students writing their theses on Tor. The original Tor design paper from 2004 now has over 3200 citations, and in 2014 Usenix picked that paper out of all the security papers from 2004 to win its Test of Time award.

Part two: University collaborations

This advocacy and education work has also led to a variety of ongoing collaborations funded by the National Science Foundation, including with Nick Feamster's group at Princeton on measuring censorship, with Nick Hopper's group at the University of Minnesota on privacy-preserving measurement, with Micah Sherr's group at Georgetown University on scalability and security against denial-of-service attacks, and an upcoming one with Matt Wright's group at RIT on defending against website fingerprinting attacks.

All of these collaborations are great, but there are precious few people on the Tor side who are keeping up with them, and those people need to balance their research time with development, advocacy, management, etc. I'm really looking forward to the time when Tor can have an actual research department.

And lastly, I would be remiss in describing our academic collaborations without also including a shout-out to the many universities that are running exit relays to help the network grow. As Professor Leo Reyzin of Boston University once explained when asked why it is appropriate for his research lab to support the Tor network, "If biologists want to study elephants, they get an elephant. I want my elephant." So, special thanks to Boston University, University of Michigan, University of Waterloo, MIT, CMU (their computer science department, that is), University of North Carolina, University of Pennsylvania, Universidad Galileo, and Clarkson University. And if you run an exit relay at a university but you're not on this list, please reach out!

Part three: The Privacy Enhancing Technologies Symposium

Another critical part of the privacy research world is the Privacy Enhancing Technologies Symposium (PETS), the premier venue for technical privacy and anonymity research. This yearly gathering started as a workshop in 2000, graduated to being called a symposium in 2008, and in 2015 it became an open-access journal named Proceedings on Privacy Enhancing Technologies.

The editorial board and chairs for PETS over the years overlap greatly with the Tor community, with a lot of names you'll see at both PETS and the Tor twice-yearly meetings, including Nikita Borisov, George Danezis, Claudia Diaz, Roger Dingledine (me), Ian Goldberg, Rachel Greenstadt, Kat Hanna, Nick Hopper, Steven Murdoch, Paul Syverson, and Matt Wright.

But beyond community overlap, The Tor Project is actually the structure underneath PETS. The group of academics who run the PETS gatherings intentionally did not set up corporate governance and all the other bureaucracy that drags things down, so they can focus on having a useful research meeting each year. Tor stepped in to effectively be the fiscal sponsor: we keep the bank accounts across years, and we act as the "owner" of the journal, since De Gruyter's paperwork assumes that some actual organization has to own it. We're proud that we can help provide stability and longevity for PETS.

Speaking of all these papers: we have tracked the most interesting privacy and anonymity papers over the years on the anonymity bibliography (anonbib). But at this point, anonbib is still mostly a two-man show where Nick Mathewson and I update it when we find some spare time, and it's starting to show its age since its launch in 2003, especially given the huge growth of the field and the rise of other tools like Google Scholar. Probably the best answer is that we need to trim it down so it's more of a "recommended reading list" than a comprehensive archive of all relevant papers. If you want to help, let us know!

Part four: The Tor Research Safety Board

This post is running long, so I will close by pointing to the Tor Research Safety Board, a group of researchers who study Tor and who want to minimize privacy risks while fostering a better understanding of the Tor network and its users. The board's page lists a set of guidelines on what to consider when you're thinking about doing research on Tor users or the Tor network, and a process for getting feedback and suggestions on your plan. We did a soft launch of the safety board this past year in the rump session at PETS, and we've fielded four requests for advice so far. We've been taking it slow in terms of publicity, but if you're a researcher and you can help us refine our process, please take a look!

Comments

Please note that the comment area below has been archived.

December 21, 2016


This series is aaawwwsome!!! Tor Project is tiny but doing so many wonderful things!

I suspect USIC executives reading it must be fuming, asking why they can't get so much work out of their own workforce. I certainly hope you will decline to tell them on your forthcoming visit to DC!

I'd like to nominate stylometry as a suitable topic for privacy research, while stressing the importance of funding untainted by USIC (e.g., we need student protesters screaming "DARPA out!").

You didn't mention a currently urgent problem for privacy and cybersecurity research: the collapse of the Wassenaar Arrangement.

To summarize the sad history of what has just happened:

o Companies like Gamma International and Hacking Team have been selling potent custom malware to some of the most oppressive governments on earth. News organizations such as Bloomberg (whodda thunkit?), Wikileaks and The Guardian published some superb exposes, and in the past few years Citizen Lab has done stellar work in carefully documenting attacks on human rights workers and political dissidents in countries such as Ethiopia, Kenya, Thailand, and many, many others.

o Under pressure from human rights groups in the EU and US, some lawmakers have become interested in reining in the malware-as-a-service industry exemplified by Gamma and HT. As it happened, the Wassenaar Arrangement, which regulates arms trafficking, was coming up for renegotiation.

o Lobbyists for the bad guys (the governments which buy malware they can't make themselves, and the companies which sell it to them) got involved and thoroughly subverted the negotiations, ensuring that changes to the Wassenaar accords would curb the international transfer of essential cybersecurity resources, such as nmap, international conferences on cybersecurity, shared malware samples, and other things needed to perform privacy and cybersecurity research, while encouraging the further growth of the malware-as-a-service industry.

o The talks have just collapsed without fixing the overbroad definition of "cyberweapon".

See:

http://thehill.com/policy/cybersecurity/311080-new-export-control-agree…
Critics pan changes to cyber export rules
Joe Uchill
19 Dec 2016

> A coalition of policymakers and cyber experts say they've failed to agree on changes to an international export pact they worry will hurt cybersecurity. “I am deeply disappointed that Wassenaar member states declined to make needed updates to the intrusion software controls, particularly those related to technologies necessary for their development,” wrote Rep. Jim Langevin (D-R.I.) in a statement.

http://arstechnica.com/tech-policy/2016/12/us-fails-in-bid-to-renegotia…
Congrats, hackers: you’re now a munition (sort of)
Wassenaar rules require export licenses for anything that could be considered “intrusion software”—but not in US, yet.
Sean Gallagher
20 Dec 2016

> If your work involves exploiting vulnerabilities in software, congratulations—you're potentially an arms merchant in the eyes of many governments. Your knowledge about how to hack could be classified as a munition. A United States delegation yesterday failed to convince all of the members of the Wassenaar Arrangement—a 41-country compact that sets guidelines for restricting exports of conventional weapons and "dual use goods"—to modify rules that would place export restrictions on technologies and data related to computer system exploits. And while the US government has so far declined to implement rules based on the existing convention, other countries may soon require export licenses from anyone who shares exploit data across borders—even in the form of security training.

http://www.theregister.co.uk/2016/12/21/wassenar_negotiations_fail/
Wassenaar weapons pact talks collapse leaving software exploit exports in limbo
Some progress, but it's glacial
Iain Thomson
21 Dec 2016

> Security researchers face continued uncertainty after talks broke down between US negotiators and 40 other countries over the state of exploit exports. The negotiations concern the Wassenaar Arrangement, an arms-control pact in which members agree to limit the export of certain types of weaponry and "dual-use products." Usually this just covers conventional weaponry, but in December 2013, new wording was introduced that banned the export of software tools that could be used for online warfare – particularly code to exploit and attack insecure programs and servers.

December 21, 2016


One important issue which you hinted at in the last section but didn't really explain: the question of how to persuade researchers, especially newcomers, to perform their research on the Tor network ethically.

Years ago, as you know and can explain much better than I, Tor Project made a substantial effort to provide a testbed where researchers could test attacks and defenses using a model Tor network in their lab, rather than red-teaming the real data of real Tor users, possibly including greatly endangered people living in very dangerous regions of the world.

Unfortunately, it seems that many (most?) researchers failed to adopt it, possibly because university PR flacks insisted that they could more easily write explosive headlines in their press releases trumpeting their university's research programs (which administrators believe helps to attract more government research grants, smarter graduate students, and ultimately larger alumni gifts) if the research deanonymizes real Tor users.

This phenomenon is part of a broader problem with modern academic research, which has grown into something far more about seeking headlines than about doing good science, IMHO.

December 21, 2016


It's so heartwarming to see that the Tor Project and this community of researchers have been able to produce something that even the billion-dollar-backed NSA/GCHQ have labeled the "champion of low latency" anonymity! Long live the Tor Project! May your funds go high, and your latency go low.