The Debian OpenSSL flaw: what does it mean for Tor clients?

by arma | May 14, 2008

There have been a lot of questions today about just what the
recent Debian OpenSSL
flaw means for Tor clients. Here's an attempt to
explain it in a bit more detail. (Go read the Tor security advisory before
reading this post.)

First, let's look at the security/anonymity implications for users who
aren't running on Debian, Ubuntu, or similar. These implications all
stem from the fact that some of the Tor relays and v3 directory authorities
have weak keys, so the Tor network isn't able to provide as much anonymity
as we would like.

The biggest issue is that perhaps 300 Tor relays were running with
weak keys and weak crypto, out of the roughly 1500-2000 total running
relays. What can an attacker do from this? If you happen to pick three
weak relays in a row for your circuit, then somebody watching your local
network connection (or watching the first relay you pick) could break all
the layers of Tor encryption and read the traffic as if they were watching
it at the exit relay. (I don't want to say they could read the plaintext,
because if you used end-to-end encryption like SSL they wouldn't be able
to see inside of that -- unless of course the webserver you contact is
running Debian and affected by this bug!) Because this attacker can read
the traffic inside Tor, they would also break your anonymity: they know
both you and the destination(s) you asked for.
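As a rough back-of-the-envelope illustration only (real Tor path selection weights relays by bandwidth and applies guard and exit constraints, so this is not how circuits are actually chosen), here is the naive estimate of how often a three-hop circuit would consist entirely of weak relays, using the approximate figures above:

```python
# Naive estimate only: assumes hops are picked uniformly at random, which
# Tor does not actually do (it weights by bandwidth and applies guard/exit
# constraints).  The figures are the rough ones quoted in this post.
weak_relays = 300
total_relays = 1750                      # midpoint of the 1500-2000 estimate

p_all_three_weak = (weak_relays / total_relays) ** 3
print(f"roughly {p_all_three_weak:.2%} of circuits")   # on the order of 0.5%
```

That is small for any single circuit, but clients build circuits constantly, so the exposure accumulates over time.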

Worse, this attack works against past traffic too: what if an attacker
logged traffic over the past two years? As long as there's a single
non-weak non-colluding Tor relay in your circuit, you're fine -- that
relay will provide encryption that the attacker can't break, then or
now. But if you ever picked a path that consisted entirely of relays
with broken RNGs, and an attacker logged this traffic, then he can unwrap
the traffic from his logs using the same approach as above.

(Somebody who knows a Tor relay's private key could also impersonate that
relay. So he can do a man-in-the-middle attack, intercepting your traffic
to the "real" Tor relay and handling it himself. But this wouldn't give
him anything that he can't already do just by watching. Another attack would
be to create a fake descriptor and upload that. But this wouldn't give him
anything he can't do by starting his own relay and uploading a descriptor for it.)

This evening we've begun the process of making the directory authorities
reject all uploaded descriptors that are signed using these keys, so
we will effectively cut them out of the network. Peter Palfrader, our
Tor Debian package maintainer, has also put out a new deb package that
discards the old relay keys when affected relays upgrade, so they'll
automatically generate new, safe keys.
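The rejection check itself is conceptually simple. Here is a minimal sketch, not Tor's actual code path: the blacklist file name and fingerprint format are placeholders, in the spirit of Debian's openssl-blacklist package, which ships hashes of the keys the broken PRNG could have produced.

```python
# Sketch: reject any uploaded descriptor whose identity key appears in a
# precomputed blacklist of weak-key fingerprints (one hex digest per line).
# File name and digest format are assumptions for illustration.
import hashlib

def load_blacklist(path="weak_key_fingerprints.txt"):
    with open(path) as f:
        return {line.strip().lower() for line in f if line.strip()}

def descriptor_is_acceptable(identity_key_der, blacklist):
    """True if the relay's identity key is not in the known-weak set."""
    fingerprint = hashlib.sha1(identity_key_der).hexdigest()
    return fingerprint not in blacklist
```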

The next big issue is that three of the six v3 directory authorities
were using weak keys for their directory votes and signatures. This
issue doesn't affect Tor clients running the 0.1.2.x (stable) series,
since those clients use the v2 directory system, none of whose keys
(I think) are weak.

What can three v3 authorities do? If they could forge a new v3
networkstatus consensus, they could trick users into using their own
fabricated Tor network, which would totally ruin their anonymity. Worse,
they could do this in a way that would be very hard to detect, by just
giving out their forged consensus to a few target users and giving out
the "real" consensus the rest of the time. But fortunately, Tor clients
require a majority of signatures before they'll believe the consensus --
and that's four of six. (Whew, that was close!)
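In other words, the client-side acceptance rule looks roughly like this sketch (signature verification itself is elided; `valid_signers` stands in for the set of authorities whose signatures on the consensus actually checked out):

```python
# Sketch of the majority rule described above, not the real parsing code.
def consensus_is_trusted(valid_signers, trusted_authorities):
    needed = len(trusted_authorities) // 2 + 1      # strict majority: 4 of 6
    return len(valid_signers & trusted_authorities) >= needed

auths = {"A", "B", "C", "D", "E", "F"}
print(consensus_is_trusted({"A", "B", "C"}, auths))       # False: 3 of 6 is not enough
print(consensus_is_trusted({"A", "B", "C", "D"}, auths))  # True: 4 of 6
```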

Now, three v3 authorities can still influence the consensus a lot. As
one example, they could pick their favorite relays (say, because they
operate those relays, or because those are the ones that are easiest to
monitor), and put in three votes claiming that all the other relays are
unusable. The resulting consensus will then list only those relays as
"Running", since no other relays got enough votes. Then we're back to
the above scenario. But in this case at least one other v3 authority
would need to participate in building the consensus, so this couldn't be
a selective one-off attack. The whole consensus would appear different
to everybody, and hopefully somebody would notice.
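To make the flag-voting idea concrete, here is a simplified tallying sketch; the real consensus algorithm has more rules, and the majority threshold used here is an assumption for illustration:

```python
# Simplified sketch: a relay is listed as "Running" only if more than half
# of the voting authorities say so.  With three honest votes and three
# malicious votes that omit everyone but the attacker's relays, only the
# attacker's relays clear the threshold.
from collections import Counter

def running_relays(votes):
    """votes: one set per authority, naming the relays it considers Running."""
    tally = Counter()
    for vote in votes:
        tally.update(vote)
    threshold = len(votes) // 2 + 1                 # strict majority, e.g. 4 of 6
    return {relay for relay, count in tally.items() if count >= threshold}

honest = {"relayA", "relayB", "relayC", "evil1", "evil2", "evil3"}
evil   = {"evil1", "evil2", "evil3"}
print(running_relays([honest, honest, honest, evil, evil, evil]))
# -> {'evil1', 'evil2', 'evil3'}: the honest-only relays get just 3 of 6 votes
```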

The 0.2.0.26-rc release resolves these concerns by replacing those
three weak identity keys with new strong ones. Once you upgrade, your
new Tor won't trust any of those old keys -- so if anybody tries the
above attacks on you, your Tor won't buy it.

And the last issue that affects non-Debian users? If you use a hidden
service that generated its ".onion" key on a weak system, then you can
no longer be sure that you're actually talking to the original person
who generated the key. That's because somebody else might have figured
out the private key for that service, and started advertising it himself.
(Ordinarily, hidden services guarantee that nobody can intercept and read
or modify your communications with the service, because only the person
on the other end knows the private key that generated the ".onion" name.)
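The reason the name is bound to the key: a ".onion" address of that era is derived from the service's public key itself, roughly the base32 encoding of the first 80 bits of a SHA-1 hash of the key, so whoever holds a matching private key can advertise the same name. A minimal sketch, assuming `public_key_der` holds the service's DER-encoded RSA public key:

```python
# Sketch of deriving a (then-current) .onion address from a public key.
import base64
import hashlib

def onion_address(public_key_der):
    digest = hashlib.sha1(public_key_der).digest()
    return base64.b32encode(digest[:10]).decode("ascii").lower() + ".onion"
```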

Now, what about the effects on Tor clients that run Debian, Ubuntu,
or the like? Well, first they're affected by all the above attacks. And
on top of that, any encryption they do can be considered to have no real
effect. So for example if an attacker can observe your traffic either
locally or at the first relay, he can see right through it all.

Similarly, if anybody has logs of traffic coming out of a Debian or Ubuntu
Tor client, they can strip it of its encryption, and thus retroactively
break the anonymity.

And lastly, don't forget that this OpenSSL bug causes plenty of other
problems that are unrelated to Tor. As a simple example, if your
bank generated its SSL cert on Debian or Ubuntu, then your SSL connection
to your bank (likely including your password) is readable. Or if you ever
tried to ssh into (or out of) an affected system, you have a problem. I
imagine some broader lists of examples will start appearing soon.

Comments

Please note that the comment area below has been archived.

May 16, 2008


"What can three v3 authorities do? If they could forge a new v3
networkstatus consensus, they could trick users into using their own
fabricated Tor network, which would totally ruin their anonymity. Worse,
they could do this in a way that would be very hard to detect, by just
giving out their forged consensus to a few target users and giving out
the "real" consensus the rest of the time. But fortunately, Tor clients
require a majority of signatures before they'll believe the consensus --
and that's four of six. (Whew, that was close!)"

So if the attacker also had control of one of the other v3 authorities, this attack would definitely work... not really *Whew* then, is it? More of a "we've been scuppered."

This attack was done and was successful.

Let **nobody** tell you different.

Also, the true IP address of **all** "hidden services" is being identified (by USA .edu ops) after less than 24 hours of uptime for a brand-new "hidden service" (non-Debian), with only one access (a single page request) to the hidden service from the Tor network in that time.

"So if te attacker had control of one of the other v3 authorities ALSO this attack would definitely work.. not really *Whew* then is it, more of a - "we've been scuppered"."

Could you expand upon how this is possible with just one? One of seven does not a consensus make.

"This attack was done and was successful."

Proof of this?

"Also the true IP address of **all* "hidden services" is being identified (by USA .edu ops), after less than 24hrs up time for a brand new "hidden service" (non-debian), when only 1 access (single page request) of the hidden service from the tor network in that time"

Feel free to provide some proof of this actually working. I'm happy to set up a hidden service and have it identified by IP address as a test.

May 20, 2008

In reply to phobos


> One of seven does not a consensus make.

They're probably saying that if an attacker had broken into one of the unaffected machines and stolen its keys, and if that attacker had also known about the Debian OpenSSL vulnerability, they'd be able to forge a consensus. That much is possible.

>> This attack was done and was successful.
> Proof of this?

Agreed. Frankly, it doesn't seem very likely that somebody would report an actual authority compromise by saying "An authority has been compromised but I won't tell you when, how, how I know about it, or how you can confirm that I'm telling you the truth." If somebody knew about a compromise and they wanted to help Tor, they would report it so it could be fixed. If they knew about a compromise and wanted to attack Tor users, they would keep it secret so they could exploit it. This looks like random FUD to me too.

As for this hidden-service-identifying attack, could you maybe provide some kind of a link to what you're talking about? When we speak anonymously, it's very important to provide evidence for contentious claims, since we can't very well expect people to accept them based on our reputation.

phobos

May 20, 2008

In reply to phobos


Even better would be to have published research on the topic.

I sent this question to the or-talk list before reading this post carefully.

Could you answer on or-talk?


// Backward decryption of Tor traffic after Debian OpenSSL bug disclosure

Suppose a passive adversary has records of traffic between a user's Debian
GNU/Linux Tor client and the servers of the Tor network (many of which run
Debian too). The records date from 2006 to May 2008.

Now the Debian OpenSSL PRNG bug has been disclosed. All ~250000 "pseudorandom" values are known.

Is it possible for an adversary to use this data to partially decrypt the
recorded and stored user traffic after the fact?

From the predicted states of the broken PRNG, he can compute the Diffie-Hellman
parameters, reconstruct the ephemeral keys, and extract the session AES keys
between nodes in the circuit, if two nodes of the circuit have broken PRNGs.

Is that realistic? Or is the OpenSSL PRNG used in Tor only for generating
authentication keys, and not for session key material?

March 20, 2009


Maybe the directory authorities should switch to OpenBSD or FreeBSD. Both do code audits, which should help prevent this sort of thing from happening, and OpenBSD's audits are more thorough than FreeBSD's.