Tor, NSA, GCHQ, and QUICK ANT Speculation

Many Tor users and various press organizations are asking about one slide in a Brazilian TV broadcast. Jonathan Mayer, a graduate student in law and computer science at Stanford University, then speculated on what this "QUICK ANT" could be. Since then, we've heard all sorts of theories.

We've seen the same slides that you and Jonathan Mayer have seen. It's not clear what the NSA or GCHQ can or cannot do. It's not clear if they are "cracking" the various crypto used in Tor, merely tracking Tor exit relays or Tor relays as a whole, or running their own private Tor network.

What we do know is that if someone can watch the entire Internet all at once, they can watch traffic enter Tor and exit Tor. This likely de-anonymizes the Tor user. We describe the problem as part of our FAQ.

We think the most likely explanation here is that they have some "Tor flow detector" scripts that let them pick Tor flows out of a set of flows they're looking at. This is basically the same problem as the blocking-resistance problem — they could do it by IP address ("that's a known Tor relay"), or by traffic fingerprint ("that looks like TLS but look here and here how it's different"), etc.
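As a concrete illustration, the by-IP-address approach is nearly trivial once you have the public relay list. Here's a minimal sketch; the relay addresses below are made-up placeholders, and a real list would come from the public Tor directory consensus:

```python
# Flag flows whose endpoints match a list of known Tor relays.
# The addresses here are hypothetical stand-ins, not real relays.
KNOWN_RELAYS = {"198.51.100.7", "203.0.113.42"}

def looks_like_tor(flow):
    """flow is a (src_ip, dst_ip, dst_port) tuple from a flow log."""
    src, dst, _port = flow
    return src in KNOWN_RELAYS or dst in KNOWN_RELAYS

flows = [
    ("10.0.0.5", "203.0.113.42", 443),  # endpoint is a listed relay -> flagged
    ("10.0.0.5", "192.0.2.9", 443),     # not listed -> ignored
]
tor_flows = [f for f in flows if looks_like_tor(f)]  # keeps only the first flow
```

The traffic-fingerprint approach is much harder to sketch, since it depends on subtle differences between Tor's use of TLS and ordinary TLS.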

It's unlikely to have anything to do with deanonymizing Tor users, except insofar as they might have traffic flows from both sides of the circuit in their database. However, without concrete details, we can only speculate as well. We'd rather spend our time developing Tor and conducting research to make a better Tor.

Thanks to Roger and Lunar for edits and feedback on this post.


September 11, 2013


What about the fact that Tor uses 1024-bit RSA, is there a defence somewhere of this decision? (A recent Ars Tech article said everybody should move to 2048, so it'd be nice to have some reassurance here.)

Tor 0.2.4.x uses a new stronger circuit handshake and stronger link encryption:…
As soon as I get time to write the release notes, this will become the new Tor stable.

For more technical details, see Section 6 of this research paper:
and you can read more about Curve25519 at

2048-bit asymmetric crypto would be nice, but it's way too expensive CPU-wise -- the relays are maxed out right now handling circuit create requests from the botnet traffic, and the network would be in way worse shape right now if those were all 2048-bit requests.

[Edit: here's a point I made in response to a journalist question, which I'm pasting here too for completeness:

"It's not at all clear that NSA can break 1024-bit keys easily, or even at all currently. The main risk is that there will come a time in the future when it is easy -- and we don't know when that time will arrive -- and if they've logged Tor traffic flows from today, they'll be able to break those flows at that future point.

That's the biggest reason why upgrading to Tor 0.2.4.x is a good idea security-wise."]

Arma, thank you for your analysis and perspective.

I'm puzzled by your writing:

"if they've logged Tor traffic flows from today, they'll be able to break those flows at that future point."

due to Tor's use of forward secrecy techniques.

Is this because you are distinguishing between "decrypt" and "break flows" -- are you referring to correlation techniques to identify who is connected where rather than decryption of content?

Forward secrecy means that if you break into the relays later, nothing you learn from them will help you decrypt stuff from the past.

It doesn't mean that the traffic flows from the past are magically undecryptable even by an adversary with enough computing power (or some other break on the crypto).

The Tor handshake provides this forward secrecy property, such that after the relay rotates its onion key, there's no point breaking into the relay to learn it. But if the attacker can just straight-up break the encryption, the forward secrecy doesn't help.

Am I correct that you are explaining that breaking the PK asymmetric crypto leads to breaking the final symmetric crypto even when forward secrecy is used ?

To a non-expert it seems that employing forward secrecy using Diffie-Hellman means that the symmetric crypto's key is never exchanged between the parties. Therefore that key cannot be discovered by breaking the asymmetric crypto of old recorded traffic flows; the symmetric key used is simply not there.

Where do I have it wrong?

Thank you.

Unfortunately, what you describe doesn't exist as far as I know.

DH with forward secrecy means you use a new asymmetric key for each transaction. So they have to break each transaction separately (and breaking one doesn't help them break any others).

Edit 1: Ah, but I realize I was imprecise in my earlier explanation. There are actually two steps to breaking an old-style Tor handshake. First you have to break the relay's (1024-bit) onion key, which is rotated once a week. Once you've broken that, you can see the client side of the DH handshake (that is, g^x). If you can see g^x and g^y and also you're great at breaking 1024-bit DH, you can learn the session key for that circuit. So step one, they have to break a new key every week. And if they do that, they *also* have to break a new key for every circuit.

Edit 2: Oh, and you have to break TLS before you can see any of this. Or be the relay.

Ah, some people might be great at breaking the 1024-bit key, which isn't that hard by the way. It's almost 2015 and, statistically, you may need only a little over 50% of the effort.
And, oh, I had no idea such a huge period of time passes before the relay key rotates. I thought it was a matter of minutes, hours at most, especially for a cheap 1024-bit key.
Well, it might be that in the near future you'll end up with users improving the Tor code on their own.

I encourage you to learn about the old design -- it is breakable by an adversary who finds breaking 1024 bit keys easy, but I think its main problems are not the one you describe.

In any case, you should realize that I keep saying "old-style" and "old" design. The new design, NTor, is believed to be much stronger. See the links from…

As for users improving the Tor code, that sounds great! Everybody who thinks that Tor was made in a closed room by two brilliant people and then it sprang forth fully-formed into the world is, well, misunderstanding how open source and research works.

Arma, I'm back, the one who was asking before the smart aleck stepped in front with his "Ah" and "oh" and blah.

You pointed out "what you describe doesn't exist as far as I know".

Where would my interpretation of Wikipedia be wrong? (I'm not relying on W as my only source). This seems critical to true forward secrecy.

Article "Diffie-Hellman key exchange" says that DH

"allows two parties that have no prior knowledge of each other to jointly establish a shared secret key over an insecure communications channel".

In other words, suppose in a belt-AND-suspenders approach the channel used to determine a symmetric key for both parties is encrypted anyway (with asymmetric crypto), even though DH works fine over plaintext against a passive attack. Breaking the asymmetric crypto (by obtaining the private key) doesn't reveal the symmetric crypto's key, because with DH the symmetric key was never transferred through the channel in the first place --- exactly as I was suggesting above.

In Diffie-Hellman, Alice sends g^x, Bob sends g^y, and they each compute their shared secret key g^{xy}. Anybody watching can't compute the secret key, because you need to know either x or y in order to compute it.

But if I can compute the discrete log of g^x to learn x, I win. Or if I learn y from g^y. In either of those cases, I can compute the secret key.
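To make that concrete, here is a toy sketch with numbers small enough to brute-force; all values are illustrative, and real DH uses 1024-bit or larger primes, where taking the discrete log is the hard part:

```python
# Toy Diffie-Hellman; parameters chosen tiny on purpose so the "attack" runs.
p, g = 2579, 2          # public prime and generator
x, y = 765, 1299        # Alice's and Bob's secret exponents

gx = pow(g, x, p)       # Alice sends g^x
gy = pow(g, y, p)       # Bob sends g^y
shared = pow(gy, x, p)  # both sides arrive at g^(xy)
assert shared == pow(gx, y, p)

# An attacker who can take discrete logs recovers x from the public g^x...
x_recovered = next(i for i in range(p - 1) if pow(g, i, p) == gx)
# ...and then computes the same session key from the public g^y:
assert pow(gy, x_recovered, p) == shared
```

At 1024 bits the brute-force loop above becomes utterly infeasible for ordinary attackers -- the whole debate here is about who might manage the equivalent anyway.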

... which has nothing to do with the asymmetric key, unless the old Tor handshake is broken. I'm supposing it is not, as it's based on TLS. So you have to break the symmetric key for each session.

It is becoming clear that an extended blog thread is the wrong forum for this discussion. :)

The old Tor handshake is not based on TLS. But that is probably not the root of your confusion. I'm not sure what is. I suggest irc, or the mailing list, or our shiny new

Arma, I've got a question concerning this issue: assuming that they have recorded Tor traffic in the past and are able to break all that encryption at a future point -- what will they "see"? Afaik, the content of the transmitted data as well as the client's IP, so they would be able to break the anonymity, right?

To say it clearly: if this is correct, it would be a major issue for all Tor users who care about their anonymity, which was the initial goal of Tor. In the worst case, all Tor traffic from the last few years could have been recorded and then, when 1024-bit RSA/DH is broken (which is believed to happen soon, or to have already happened), be decrypted and associated with the client's as well as the server's identity?

Would it help if the client's ISP doesn't save any IP logs at all, or if they are deleted after a few days/weeks? Then the attacker wouldn't be able to do the last step and link the IP to the end user?

The real threat happens if the attacker is logging traffic at the client side. In that case they know the user's IP, and they're trying to decrypt the traffic in order to learn the destination websites that the user has visited.

So in this case it doesn't really matter what the ISP keeps or doesn't keep -- they've already let the nice man put the big black box in the secret room, or given them a realtime feed of all the data that goes over their backbone, or however it's done.

If the attacker isn't logging at the client side, then it's going to be a lot harder for them to trace any traffic back to a particular client, even if they're good at breaking crypto.


September 11, 2013


I am currently investigating the source code of "Mevade.A". It has anti-debugging methods and reports back if someone is debugging the code. I'm done with that; they don't catch me debugging because I am debugging from outside the VM. Now this could be a breakthrough, because I am part of the botnet outside of Tor. They think I am a Tor node but I am not, and I am communicating with some other bots within the Tor network. They unhide their identity by contacting me directly, not through Tor, because I have an exit node and Tor relay routed through my VPN. They can't see that their traffic is routed otherwise. Now it's time for my node to get attention from the C&C servers so I can discover their hosts. This takes some time... :-(

But I am going on the offensive to kick their lame cypher asses!

best regards!


September 11, 2013


Staying within .onion is secure, and yes, timing can be used if the input and output are unencrypted.
If one or both of the sides is encrypted, it's much harder.


September 11, 2013


1) Why does Tor project continue to accept money from the US Department of Defense and the Broadcasting Board of Governors? Is this a conflict of interest?

2) Why would the Tor project allow the Broadcasting Board of Governors to run major Tor relays and exit nodes when it is widely known that they are part of the CIA/DoD's psywar operations?

3) Is there any reasonable expectation that the feds haven't just been running relays/exit nodes and saving the traffic for later decryption?

4) What does Dingledine have to say about his NSA internship? Or the fact that Paul Syverson still works at the Naval Research Lab? Or that Dingledine and Matheson were private contractors for the Naval Research Lab?

1) Because we do great things with the funding, and everything we do is open and you can look at it. I would rather have more diverse funding (anybody know other funders we can talk to?), but so long as they only ask us to do things we wanted to do anyway, I think it's better than not.

You might like… for more details (if you can get over the inflammatory headline).

2) BBG doesn't run any relays or exit nodes. Citation please? (Also, you are mistaken to think they're part of CIA/DoD, but whatever, that's irrelevant one way or the other here.)

3) They might be running exit relays and saving the traffic. That said, you are totally right to be worried, but you're worried about the wrong thing. You should be worried that they're *monitoring* existing honest exit relays and saving their traffic. See also…

4) I'm Roger. I'm glad I worked there for a summer -- I wanted to learn if it was the sort of place I wanted to work at more, and I learned that it sure wasn't. Happy to explain more in person sometime. Almost all the people there are typical government employees (not very motivated and not very competent). That perspective also helps me to figure out how best to educate other groups, like law enforcement, about Tor:

As for Paul still working at NRL, see point '1' above, and also see the research papers that he produces, as listed e.g. on

Why are you concerned about Paul but not concerned about university professors, who get research funding (which in turn pays their students) from this same system?

For more discussion on sustainability, see also…

As for the original funding from NRL through DARPA... same answers as above.

I understand that it's easy to see conspiracies everywhere these days (and there clearly *are* many conspiracies out there these days), but if you think there's a conspiracy and you don't look at all our code, design documents, research papers, and so on, you're doing it wrong.


September 12, 2013


If HTTPS security relies completely on trusting a central authority (CA's), and the CA's are under the control of the NSA, then can we assume that HTTPS is broken?

Using HTTPS is way better than not using it, but you are right to be concerned about how safe it is. The real trouble isn't that all the CAs are under the control of NSA. The trouble is that there are 200 or so of them, and if *any* of them is run by, compromised by, or otherwise working with your adversary, you can lose.

For a close-to-home example, it is silly that Firefox will believe an https certificate signed by the Chinese government's certificate authority. One fix there involves certificate pinning (ask your favorite search engine about it), but that's also not a complete solution.

This issue with https is part of why we're so fervent at trying to get people to check PGP signatures on their Tor downloads:
I am extra sad here that it's so difficult for Windows users to check signatures smoothly and correctly.

Quoting from

"Some software sites list sha1 hashes alongside the software on their website, so users can verify that they downloaded the file without any errors. These "checksums" help you answer the question "Did I download this file correctly from whoever sent it to me?" They do a good job at making sure you didn't have any random errors in your download, but they don't help you figure out whether you were downloading it from the attacker. The better question to answer is: "Is this file that I just downloaded the file that Tor intended me to get?""

If we provide MD5 hashes along with the downloads, you have the same problem you had before: how do you know the MD5 hashes are actually the ones we meant for you to get?

CRC32 hashes have a much worse problem: I can easily generate a Tor Browser Bundle whose CRC32 matches any CRC32 you give me. It's way too short to have any security for this scenario.
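The size difference alone tells most of the story. A quick sketch of the comparison (the input bytes are arbitrary):

```python
import hashlib
import zlib

data = b"arbitrary download contents"

sha = hashlib.sha256(data).hexdigest()  # 64 hex chars -> a 256-bit digest
crc = zlib.crc32(data)                  # always fits in 32 bits

# CRC32 has only 2^32 possible values and isn't collision-resistant, so
# crafting a malicious file with a matching CRC32 is cheap; finding a
# SHA-256 collision, by contrast, is believed to be infeasible.
print(len(sha) * 4)   # 256
print(crc < 2**32)    # True
```

And of course no hash, however strong, answers the real question above -- whether the published checksum itself came from the right party -- which is what the signatures are for.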

But how do I know that the key I fetch via "gpg --recv 0x...." is the right one?
I can check the fingerprint, but HOW? How can I know if the key fingerprint is the original one or not? The only way to be sure would be for me to fly to the US so that you or some other Tor developer can give me the original fingerprint for that key.
It's unrealistic. We live in a totally insecure world.

The same applies to Debian/Fedora/Slackware/etc. key fingerprints. I can't trust the Net, because routing makes my traffic pass through U(/N)SA territory or through ISPs that collaborate with the NSA.

I know there's something called the web of trust for GPG keys, but how can I use it? To do that I would need to personally know/meet some developer of the software (Tor/Debian/Fedora/etc.) I'm downloading. Not so practical/realistic.

Btw, is the fingerprint a guarantee that the key is collision-free?

Ah, do you know that using the TBB it is IMPOSSIBLE to download the signatures and checksums for Debian Live?
With ALL versions of the Tor Bundle, it freezes. Only with those files. Try! It always freezes if you try, and then the only way to get the files remains the "plaintext Internet" (aka "nsa-net").

Thanks for your time reading this and the other post about the insecure Tor version for Android (the one from, DATED 2012!! That leaks the device model name in the user agent)

Re: the web of trust, see also the last paragraph of

It's actually not that crazy to find a Debian person in your city who is connected to the pgp web of trust. They're everywhere. And I bet the signature chain between them and me is surprisingly short.

Re: debian live, what URLs specifically? Sounds like you should file a ticket on trac if it actually breaks your Tor Browser Bundle. It might be a bug in https-everywhere, or in our Tor Browser patches, or in something else.

hi, ( and thanks for the reply! )

GPG web of trust:
I've managed to get safe-enough GPG keys for Debian from an old (checksummed with sha256) system backup, and now I'm confident that my installation is OK (original).
I notice that there are debian-keyring and torproject-keyring packages available via apt-get. Is there a way to use them to verify & trust Tor keys? Does torproject-keyring contain the normal Tor keys? Can I use it to verify the legitimacy of the Tor apt-key file and the Tor Bundle keys?

Debian live checksums:
I tried to disable HTTPS-Everywhere but nothing changed.
the urls:

1)… it freezes(***) the browser for like 5 seconds and then the browser is usable again, but the page in that tab is still empty (though it loaded the page icon for the tab)
(***) freeze = unusable, and I see the "loading circle", which is frozen too.

2) It managed to load the file, but *only* 4 lines and almost half of the 5th line.

3) The bug remains even if I disable HTTPS-Everywhere, Javascript and images.

I can download them if instead of TBB I use the "apt Tor" from the torproject repository + privoxy + iceweasel OR links OR elinks OR firefox.
And yes, I think the bug is in the modified browser itself, not in Tor.

OT: is there a way to learn to code for & understand Tor & the cryptography involved? I mean a "developer guide". I tried looking at the code some time ago and found it not so easy to understand.
Any advice about the needed background? Is there something about it on this website?
I'm reading Applied Cryptography (Schneier); is it a good starting point?

thanks for your help.
You are all doing very good work, and I think it's useful not only to Chinese/Russian dissidents but especially to US and European ones (we have FB and Google etc.! :( )

(Still me)

hi arma,
Does that bug only happen to me?
Have you tested whether the TBB you use has the same problem?

Was my connection hijacked by some US agency in order to give me a modified version (aka non-original, aka NSA-friendly version) of TBB?

I agree that it's disappointing how difficult it is for Windows users to check signatures.

Would it not be possible to code a simple Firefox addon that could be incorporated into the Tor Browser Bundle (and used in non-Tor Firefox of course) that allows the user to select the .asc and the .exe (after downloading them of course) and have it check and confirm the signature automatically? Perhaps also with an option to paste in a sha1sum instead of selecting the .asc file as an alternative check.

The real challenge is that first download -- how do they know they got the right thing? That's the same problem Windows users have with GPG right now: our instructions start with "first, fetch this exe using this http URL..." It doesn't matter how amazing the program would be if you didn't actually get it in the first place.

That said, you're absolutely right to point out that things can be made a lot easier assuming the user bootstrapped correctly the first time.

Ask google about 'tor thandy' for details on our secure updating design, which is stuck in part because there's no such thing as a package manager or package format for Windows.

One of the upcoming steps we're going to try is using the Firefox updater to auto fetch updates for your TBB. Then it can auto check the signatures on them too. (Though last I checked, Firefox doesn't do this -- it just relies on https, which in turn relies on every single one of the CAs. See above thread. :/ )

And finally, check out the Tails instructions for verifying the download:
I like them, but then, they're targeting users who are planning to run a Linux LiveCD, so they probably won't work so well for the broader Windows crowd.

As you say, there's no real way to be sure that the OpenPGP software I download, to verify that GPG isn't compromised, is itself uncompromised!

Tor Thandy sounds very promising though and I look forward to it.

I agree...
that it is disappointing that people still use Windows and then have the nerve to bitch to anyone about problems with their system. Same for the TBB, when Tails is head and 'tails' above it, not to mention it provides a Linux experience for all those who think Linux is a difficult OS, which could not be further from the truth. May have been true years ago but not anymore in any way. Both Mac OS whatever & Windoz are in fact much more difficult OS's to work with. The people who complain and think that PGP/GnuPG is difficult are ALWAYS windoz, crApple, TBB users. The same users that DID NOT UPGRADE and then complained and BLAMED the Tor devs for their own incompetence and apathy after the TBB exploit in late June. That was a pathetic thing to watch, the blame game by TBB users who couldn't be bothered to upgrade or use Tails. arma and crew can't hold your hands every second, and IMO they already have to spend too much time coddling these types of users. The same users who tore them to bits in every comments section of every article dealing with the FBI's pathetic attempt in June. How about you ingrates say thank you to these guys for the opportunity and then go and teach yourselves. Thank you arma, and everyone else who does their best. And Roger, you're a better man than I; your patience is amazing.

I have an idea to help users check PGP signatures.
I constantly use the extension DownThemAll for any and all of my downloads; the reason I use this extension on Firefox is its built-in ability to check MD5, SHA1, SHA256, SHA384, and SHA512.

If we could talk to the developers about programming it to give the user the option to check PGP signatures as well, then this would be all around a lot easier on the end user.
If they do make DTA capable of checking PGP, then it would also be a great idea to have it bundled with the Tor Browser Bundle.

I personally have very little programming experience, so I'm not entirely sure on how feasible this would be, but it would certainly help make many users that much more secure when downloading (should they choose to use it).

Here is the website for DTA:


Bundling it with TBB would be the wrong way around. Then you'd have to use the thing you fetched in order to check if it was the thing you should have fetched.

I think we can solve the "how to check if your update is legit" problem (though we haven't chosen a good way to do that yet, so it remains an issue in the short and mid term). The really hard problem is the initial bootstrapping of trust.

On Debian, you get gpg in the operating system, so you have a trust anchor. On Windows you don't. Oops.

Sure, on one hand it seems illogical to "trust a downloader to verify a downloader", but most people aren't going to compile the initial downloader and inspect it for themselves (and never will), so, on the other hand, the flow has to incorporate a trusted package that was, at some point, downloaded. If you're going to bother to design verification well enough to be trustworthy, why not incorporate the "ideal" verification method for downloading the initial bundle into DTA or another plugin? It's silly to think that just because someone has downloaded gpg4win from some server other than the one hosting the package I need to verify, it is somehow substantially more trustworthy. Anyone who attacks your fancy anonymizer encryption thingy will almost certainly prioritize attacking the other websites hosting the tools used to verify your thingy. Demanding that something other than the thing you provide verify the thing you provide just makes it either another host's or a different program's responsibility to ensure. At best, you are forcing the attacker to compromise multiple targets, none of which apparently has any guarantee against compromise.

On another note, it could work to have one verification method for the TBB, but still incorporate the above poster's DTA suggestion for other downloads that TBB will handle in the course of its daily use. Downloading is a part of any browser experience; secure mechanisms absolutely should be built in.

One of the main ways to make sure a checksum or hash, or the file they pertain to, haven't been intercepted is to obtain them using different computers on different connections at widely different times. This doesn't ensure the absence of tampering, but it hedges the danger. The attacker would have to have compromised a broad array of servers over a long period of time (depending on the distance in time the downloads took place) in order to make sure that the downloader got and installed or executed a bum file after trusting a tampered checksum or hash.

In this way, it is almost more important for the creators of the software to make the hashes secure and available than the distributed software itself. Get Tor off Tucows or Softpedia, why not? (ok, gross... but anyway...) The hashes will let you know what you do or don't have. You can try getting the hashes weeks apart if you need to be extra sure. Just don't forget to make sure the version is the same.

Word to those newish folks who are wondering about all this.

But regarding the post above, perhaps a method could be made that employs this specifically? Hashes not only distributed, but aged? Or some such?

It would be assumed that if the user is a target, their personal email is a target too, so they'd have to make sure to get either the hash or the bundle or both via separate throwaway email accounts.

Essentially, in the case of getting the bundle mailed, your adversary at that point would have to own every large-attachment-allowing free email provider and sensitize it to incoming communication from Tor's servers; not an absurd prospect considering the aforementioned dearth of providers allowing such sizes. Better to use one of a thousand random small providers and just get the hash, taking your chances to find its corresponding bundle in whatever wilds you happen to be in. Even then I'd try to check it against Tor's website and any other official mirror - I don't really understand why Tor maintainers do not print the hash next to each download in addition to making it available as a download. Seems like the best idea.

At some point, if not already, the resources of nation states will be as such that subtly owning all relevant assets of a target will be as easy as owning one or two. I'm not sure if we're at that point yet, so spreading out the trust sources (hashes) of Tor across as many allied domains as possible seems to be a really good tack.

It'd be pretty cool (though of course a little tedious) to cross-reference the latest bundle's hash on the downloads page with one posted on, say, The Colbert Report's website and that of Senator Wyden. Making no assumptions here about their sympathies to the project - I just said it would be cool.

Actually, posting the hash would be a better and more direct political statement than some banner ad. Not to mention spreading out the damages able to be claimed and the litigation vectors when one of those hosts finds out they've been tampered with and potentially gets pissed about it.

Very interesting post.
Only one thing: the hash changes with each release/update.
A better solution could be to spread (via the Colbert Report website and others...) the hash (better: the sha512sum) of the GPG *key* for TBB, so that everybody can get the key and check whether it's the original one.
For example, a user in possession of the hash could download the key from a large list of servers and then check the sha256sum of the key.
If the key is ok ---> problem solved

They could use one of the many available programs to check the sha256sum (or sha512sum) of the key, and then they could use GPG (Windows versions do exist) to verify the downloaded file with the key.

pretty simple
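The flow described above could be scripted roughly like this -- a sketch only, where the filenames and the expected hash are hypothetical placeholders you'd obtain out of band:

```python
# Sketch: verify a signing key against an out-of-band hash, then use gpg
# to check the bundle's detached signature. Filenames are hypothetical.
import hashlib
import subprocess

def sha256_of(path):
    """Hex SHA-256 of a file, read in chunks so large bundles are fine."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_bundle(key_path, expected_key_hash, sig_path, bundle_path):
    # Step 1: confirm the key file matches the hash you got out of band.
    if sha256_of(key_path) != expected_key_hash:
        raise ValueError("key file does not match the out-of-band hash")
    # Step 2: import the key and let gpg check the detached signature.
    subprocess.run(["gpg", "--import", key_path], check=True)
    subprocess.run(["gpg", "--verify", sig_path, bundle_path], check=True)
```

Of course this just moves the trust question to wherever `expected_key_hash` came from, which is the whole point of spreading that value widely.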

Gpg4win is a mess visually. I've run it before and it annoyed the crap out of me. The workflow is all over the place... "how many applications am I running at once for this again?" "why does that thing have to camp out in the tray all day when I only needed it once?" 9_9 ugh

Version numbers changing isn't relevant; presumably the posted banner or sidebar or page or whatever would state the relevant version and platform alongside the CH (cryptographic hash -- Keccak yet, anyone?). If someone is clueless enough to use the CH for a version it was clearly posted not to pertain to, that would have to fall outside the sphere of the project's concern.

Anyway, screw the keys - distributing software packages isn't chatting. Just broadly distribute CHs.

CA technically means 'Certificate Authority', not 'Central Authority'. There are a bunch of them, and they sign public keys. One popular CA is Verisign. You have to trust the CA in order to trust that a public key belongs to who it says it belongs to. It's a weak point in public key crypto, surely. The NSA could corrupt CAs by a variety of means, such as legal threats or spy-type operations (the Iranians did just this a while back). They don't need to 'control them'; they just need to get them to sign a public key that says they are, say, Google, and/or modify a certificate revocation list.

Right. And remember that they just have to compromise any one of the hundreds that are out there. Even if you totally trust Verisign, that doesn't matter because they could go ask Turkish Telecom, or whoever, for help.

As a little tidbit to add: I hear Chrome is giving up on the certificate revocation check -- it's a big hassle for usability and speed, but most importantly a local attacker could just prevent that connection on the network, preventing your browser from learning whether any revocation has happened.


September 12, 2013


Earlier, I remember reading that the recent traffic increase was solely about directory information exchange and that it didn't affect the Tor network itself. Ever since it started, I've been thinking that, given sufficient computing resources, they (could be more than one entity) are actually trying to map the whole Tor network -- which isn't that big -- by doing these directory requests, and to de-anonymize, even decrypt in real time (1024-bit RSA keys are jokes ATM! 4096 might last another 5-10 years), most of the streams they focus on.

Some suggestions for the next "full" version of Tor:
- advertise/randomize/change peer connections faster, along with the peer/relay key renewals - it might increase the administrative traffic, but it's worth "obfuscating" the internal functionality
- go for 4096-bit keys - commercial sites are using 2048, and Tor is supposed to provide a higher level of security. The end users, the ones who need the service, won't be affected, and the relays/bridges will maybe handle fewer (but better-secured) connections, that is, unless they are running on some new CPU that has AES capabilities embedded

Best regards and thank you for your hard work and this beautiful piece of software. It truly helped me a lot.

P.S. I also strongly suggest that all Tor users/relays/bridges disable (if possible - in Linux it's very simple) TCP timestamping.
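For reference, this is the usual way to do that on Linux (a sketch; it needs root, and the runtime setting reverts at reboot unless persisted):

```shell
# Turn off TCP timestamps immediately for this host
sysctl -w net.ipv4.tcp_timestamps=0

# Persist the setting across reboots
echo "net.ipv4.tcp_timestamps = 0" >> /etc/sysctl.conf
```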

Unfortunately, the spike is not solely about directory fetches. Those were the first thing we noticed, but soon after we noticed a huge spike in circuit create requests:
Latest theory is that each of these new clients is hitting a hidden service (thus generating many circuits) pretty often.

Mapping the whole Tor network, if by that you mean relays, is not hard. It's exactly what we already tell you in the directory information. It's not a secret. See also

So making a bunch of new circuits as a client won't help deanonymize other Tor users like that. There are some other potential attacks where it could help, e.g.
But I think for an adversary of this size there are much simpler attacks than this one.

I think changing peer connections faster could actually introduce other anonymity attacks -- see e.g.
So it is not this simple.

As for 4096 bit keys, sounds great except the cpu load on relays would be unbearable. See comments elsewhere on this blog post for why the ECC-based handshake is the only practical way forward at this time.

And as for disabling TCP timestamps... go ahead I guess, but I think there are many many things on the list before this one.