A Statement from The Tor Project on Software Integrity and Apple
The Tor Project exists to provide privacy and anonymity for millions of people, including human rights defenders across the globe whose lives depend on it. The strong encryption built into our software is essential for their safety.
In an age when people have so little control over the information recorded about their lives, we believe that privacy is worth fighting for.
We therefore stand with Apple to defend strong encryption and to oppose government pressure to weaken it. We will never backdoor our software.
Our users face very serious threats. These users include bloggers reporting on drug violence in Latin America; dissidents in China, Russia, and the Middle East; police and military officers who use our software to keep themselves safe on the job; and LGBTI individuals who face persecution nearly everywhere. Even in Western societies, studies demonstrate that intelligence agencies such as the NSA are chilling dissent and silencing political discourse merely through the threat of pervasive surveillance.
For all of our users, their privacy is their security. And for all of them, that privacy depends upon the integrity of our software, and on strong cryptography. Any weakness introduced to help a particular government would inevitably be discovered and could be used against all of our users.
The Tor Project employs several mechanisms to ensure the security and integrity of our software. Our primary product, the Tor Browser, is fully open source. Moreover, anyone can obtain our source code and produce bit-for-bit identical copies of the programs we distribute using Reproducible Builds, eliminating the possibility of single points of compromise or coercion in our software build process. The Tor Browser downloads its software updates anonymously using the Tor network, and update requests contain no identifying information that could be used to deliver targeted malicious updates to specific users. These requests also use HTTPS encryption and pinned HTTPS certificates (a security mechanism that allows HTTPS websites to resist being impersonated by an attacker by specifying exact cryptographic keys for sites). Finally, the updates themselves are also protected by strong cryptography, in the form of package-level cryptographic signatures (the Tor Project signs the update files themselves). This use of multiple independent cryptographic mechanisms and independent keys reduces the risk of single points of failure.
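As a rough sketch of how these independent layers combine, an updater can accept a package only when every check passes. In the sketch below, HMAC stands in for the real public-key signatures actually used, and every name, key, and fingerprint is invented for illustration:

```python
import hashlib
import hmac

# Illustrative placeholders only -- not real Tor Project keys or certs.
# HMAC stands in for the public-key signature scheme used in practice.
PINNED_CERT_FINGERPRINT = hashlib.sha256(b"example-update-server-cert").hexdigest()
UPDATE_SIGNING_KEY = b"example-package-signing-key"

def verify_pinned_cert(server_cert: bytes) -> bool:
    """Check the server's certificate against the pinned fingerprint."""
    return hmac.compare_digest(
        hashlib.sha256(server_cert).hexdigest(), PINNED_CERT_FINGERPRINT
    )

def verify_package_signature(package: bytes, signature: bytes) -> bool:
    """Check the detached signature over the update package itself."""
    expected = hmac.new(UPDATE_SIGNING_KEY, package, hashlib.sha256).digest()
    return hmac.compare_digest(signature, expected)

def accept_update(server_cert: bytes, package: bytes, signature: bytes) -> bool:
    # Both mechanisms use independent keys, so both must pass:
    # compromising one key alone is not enough to push a malicious update.
    return verify_pinned_cert(server_cert) and verify_package_signature(
        package, signature
    )
```

The point of the sketch is the conjunction: an attacker who steals the package-signing key still fails the pinned-certificate check, and one who can impersonate the server still fails the signature check.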
The Tor Project has never received a legal demand to place a backdoor in its programs or source code, nor have we received any requests to hand over cryptographic signing material. This isn't surprising: we've been public about our "no backdoors, ever" stance, we've had clear public support from our friends at EFF and ACLU, and it's well-known that our open source engineering processes and distributed architecture make it hard to add a backdoor quietly.
From an engineering perspective, our code review and open source development processes make it likely that such a backdoor would be quickly discovered. We are also currently accelerating the development of a vulnerability-reporting reward program to encourage external software developers to look for and report any vulnerabilities that affect our primary software products.
The threats that Apple faces to hand over its cryptographic signing keys to the US government (or to sign alternate versions of its software for the US government) are no different than threats of force or compromise that any of our developers or our volunteer network operators may face from any actor, governmental or not. For this reason, regardless of the outcome of the Apple decision, we are exploring further ways to eliminate single points of failure, so that even if a government or a criminal obtains our cryptographic keys, our distributed network and its users would be able to detect this fact and report it to us as a security issue.
Like those at Apple, several of our developers have already stated that they would rather resign than honor any request to introduce a backdoor or vulnerability into our software that could be used to harm our users. We look forward to making an official public statement on this commitment as the situation unfolds. However, since requests for backdoors or cryptographic key material so closely resemble many other forms of security failure, we remain committed to researching and developing engineering solutions to further mitigate these risks, regardless of their origin.
We congratulate Apple on their commitment to the privacy and security of their users, and we admire their efforts to advance the debate over the right to privacy and security for all.
> So, would it currently require just two keys (a TLS key and a single update-signing key) to make a malicious Tor Browser update pass the built-in updater's authenticity checks? If so, are those keys at least hopefully not accessible to the same persons?
Yes, that is correct. Right now, two keys are required, and those keys are not accessible by the same people. They are also secured in different ways.
Again, because updates are fetched anonymously over the Tor network, even holding those keys does not give you the ability to deliver malware to anyone in particular. You would have to scatter-shot the malicious update at random people, or at everyone.
If you choose to target everyone (perhaps by compromising our webserver in addition to the update key), then you run the risk of being detected by someone who reproduces our builds independently. In that case, you would also have to compromise the build engineers' GPG keys, or else the build signatures would not match, either.
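That detection step amounts to comparing hashes from independent builders: with reproducible builds, everyone should produce bit-for-bit identical binaries, so a single divergent hash flags a compromised (or misconfigured) build. A minimal sketch, with all builder names and binaries invented:

```python
import hashlib
from collections import Counter

def build_hash(binary: bytes) -> str:
    """Hash a built binary; reproducible builds should agree exactly."""
    return hashlib.sha256(binary).hexdigest()

def detect_divergence(builder_hashes: dict[str, str]) -> list[str]:
    """Return the builders whose hash differs from the majority result."""
    counts = Counter(builder_hashes.values())
    canonical, _ = counts.most_common(1)[0]
    return sorted(name for name, h in builder_hashes.items() if h != canonical)
```

An attacker who swaps the distributed binary must now either match the honest builders' output exactly or compromise enough builders to shift the majority.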
> It seems like most of the pieces are in place to finally begin requiring K of N independent builders' signatures. If Tor Browser isn't doing that already (perhaps I've misunderstood) I hope it will soon!
What the industry seems to be heading towards instead is adopting Certificate Transparency to create audit logs for programs in addition to HTTPS certificates, so that everybody, with the help of external auditors and verifiers, can confirm that they have the one true canonical copy of a specific release of a program. This project is called Binary Transparency.
However, being pedantic as we are, we are instead likely to list Tor Browser hashes in the Tor network consensus document, which is already signed via a K of N key mechanism. We will then audit the consensus itself with a Certificate Transparency-style log, since doing so would both provide further security for the Tor Browser binaries, and also alert us to the theft of a majority of the directory authority keys (as I alluded to in the post). In this way, we get a great deal of defense in depth, both for the network and for our software.
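The K of N rule reduces to a simple threshold check: the consensus document is accepted only if at least K of the N directory authorities have validly signed it. In the sketch below, HMAC again stands in for the authorities' real signatures, and the key names, threshold, and document are all invented:

```python
import hashlib
import hmac

# Invented keys for N = 9 hypothetical directory authorities.
AUTHORITY_KEYS = {f"auth{i}": f"secret-key-{i}".encode() for i in range(9)}
K = 5  # signature threshold (illustrative)

def sign(name: str, document: bytes) -> bytes:
    """Produce authority `name`'s signature over the consensus document."""
    return hmac.new(AUTHORITY_KEYS[name], document, hashlib.sha256).digest()

def consensus_valid(document: bytes, signatures: dict[str, bytes]) -> bool:
    """Accept the document only if at least K authorities signed it validly."""
    valid = sum(
        1
        for name, sig in signatures.items()
        if name in AUTHORITY_KEYS
        and hmac.compare_digest(sig, sign(name, document))
    )
    return valid >= K
```

Under this rule, an attacker must steal a majority of the authority keys before a forged consensus (and thus a forged list of release hashes) would be accepted.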
As an aside, it turns out that the Session Resumption protocol in TLS 1.3 also allows the creation of "perspectives" systems that can audit for theft of HTTPS certificates themselves.
If someone steals or obtains the HTTPS private key for a site that is using TLS 1.3 and tries to use that to intercept HTTPS connections for some subset of users, those users can use TLS 1.3 session resumption to verify their ephemeral HTTPS forward-secrecy keys via another cryptographic channel (such as via Tor, a VPN, an SSH tunnel, or one or more "notary" systems). If that interception is targeting all users, then sites can simply connect back to themselves via Tor or a VPN, and use session resumption to verify their own ephemeral key material for that connect-back.
In this way, it is actually possible to build distributed systems that verify against server key theft. This mechanism can work to protect both HTTPS websites and Tor relays (since Tor relays could authenticate each other through circuits built over other relays).