Tor, NSA, GCHQ, and QUICK ANT Speculation

Many Tor users and various press organizations are asking about one slide in a Brazilian TV broadcast. Jonathan Mayer, a graduate student in law and computer science at Stanford University, then speculated about what this "QUICK ANT" could be. Since then, we've heard all sorts of theories.

We've seen the same slides as you and Jonathan Mayer have seen. It's not clear what the NSA or GCHQ can or cannot do. It's not clear if they are "cracking" the various crypto used in Tor, merely tracking Tor exit relays or Tor relays as a whole, or running their own private Tor network.

What we do know is that if someone can watch the entire Internet all at once, they can watch traffic enter Tor and exit Tor. This likely de-anonymizes the Tor user. We describe the problem as part of our FAQ.

We think the most likely explanation here is that they have some "Tor flow detector" scripts that let them pick Tor flows out of a set of flows they're looking at. This is basically the same problem as the blocking-resistance problem — they could do it by IP address ("that's a known Tor relay"), or by traffic fingerprint ("that looks like TLS but look here and here how it's different"), etc.
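To make the IP-address variant concrete, here is a minimal sketch (illustrative only, nothing from the slides; the filename and flow format are invented for the example) of such a flow detector: match each flow's endpoints against a published list of Tor relay addresses.

relay_ips = set()
with open("tor-relay-ips.txt") as f:      # hypothetical dump of relay IPs from the Tor consensus
    for line in f:
        line = line.strip()
        if line:
            relay_ips.add(line)

def looks_like_tor(flow):
    # flow is assumed to be a dict like {"src": "1.2.3.4", "dst": "5.6.7.8"}
    return flow["src"] in relay_ips or flow["dst"] in relay_ips

The traffic-fingerprint variant would instead look at how the TLS handshake and cell sizes differ from ordinary TLS.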

It's unlikely to have anything to do with deanonymizing Tor users, except insofar as they might have traffic flows from both sides of the circuit in their database. However, without concrete details, we can only speculate as well. We'd rather spend our time developing Tor and conducting research to make a better Tor.

Thanks to Roger and Lunar for edits and feedback on this post.

Anonymous

September 12, 2013

Permalink

And if my previous comment was approved, then add to the suggestions list:
- a minimum key size for the encryption algorithm, like 256 bits for AES (that's if you don't already consider anything other than AES). Key sizes should be doubled every 5 years or so (128-bit is already dangerous), just to keep pace with technological developments; see Moore's law.

I think you have your math wrong. If you stick to Moore's law of computing power doubling every 18 months, then the gap between 128-bit and 256-bit is 192 years, not 5 years.
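To spell out that arithmetic (a sketch, assuming the pure brute-force model): going from a 128-bit key to a 256-bit key adds 128 bits of work, i.e. the attacker needs 128 more doublings of computing power, and

128 doublings x 1.5 years per doubling = 192 years.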

In any case, both 128-bit AES and 256-bit AES are doing great against brute force attacks. The real worry, well before the 192 years elapse, is that they'll have some new analysis to make the attack significantly quicker than brute force.

Anonymous

September 13, 2013

Permalink

The main problem this article describes is at the exit points, where NSA and others can monitor traffic.
But what about hidden services? AFAIK, if someone visits a hidden service, his/her traffic will never go through an exit point, since all communication stays inside Tor.

So I guess visiting (and/or hosting) a hidden service is safe from this kind of eavesdropping?

In short, unfortunately, the answer is "not at all and maybe it's even worse".

It sounds like you're conflating two issues.

The first is that if you make a connection through Tor to some external destination, and you don't use encryption, then somebody sniffing at various points on the Internet (but not inside the Tor network) can see the plaintext of your traffic. This is a big deal, but it isn't the issue here. (It also has, at least in theory, a very simple fix: use encryption.)

The second is that an attacker who can watch a lot of places on the Internet can examine traffic flows and realize that they're correlated. And this has nothing to do with whether traffic "exits" the Tor network. One variant of this attack would be for the attacker to visit the hidden service, and then see if the traffic flow on his Tor client is correlated with any traffic flows he sees elsewhere on the Internet, e.g. connecting from a Tor relay to a Tor client. In this case hidden services may actually be weaker than normal Tor connections, since the attacker can induce them to talk at a timing and volume of his choice, potentially making the correlation attack easier.
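To make "correlated" concrete, here is a toy sketch (purely illustrative; real analysis uses much better statistics and far more data): compare the traffic-volume profile the attacker induced at the hidden service with a candidate flow observed elsewhere on the network. The flow values below are made up.

def pearson(a, b):
    # plain Pearson correlation of two equal-length series
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a) ** 0.5
    vb = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (va * vb) if va and vb else 0.0

probe = [0, 0, 512, 4096, 0, 2048, 0, 0, 1024, 0]        # bytes/sec the attacker induced at the service
candidate = [10, 0, 500, 4000, 30, 2100, 0, 0, 990, 5]   # a flow observed near some relay
if pearson(probe, candidate) > 0.9:
    print("candidate flow probably belongs to the hidden service")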

By dynamically changing routes and delaying packets inside Tor, no correlation would be possible. Hidden services should by default act as relays, or better, as exit nodes; then they'll have plenty of traffic going through even when nobody is accessing the hidden service itself.
Throughput should be the least important factor in routing decisions; focus on uniformity instead. Tor should not prioritize streaming data like video, VoIP, or file sharing.

As for the first paragraph, I'd also love for these claims to be true, but currently they appear not to be true. If you think it results in "no correlation possible", it means you haven't studied the statistics enough. Now, I am optimistic (though some others aren't) that some sort of changes in padding could improve our resistance to correlation attacks. But I'll try being more concrete: give us a specific proposal, because every previous specific proposal that anybody has ever come up with has been broken.

As for the second paragraph, the main reason Tor is slow right now is from queueing inside the relays. That is, it's because there are many more bytes trying to go through the Tor network than can fit. If clients choose their paths by anything other than weighted by relay capacity, we would make these hotspots even hotter, making performance way way worse. For example, web browsing would be unusable. And then all the users would disappear, and all the relays would disappear, and then what.

I think I do know enough about statistics and their limitations.
Briefly, I'll cover some proposals:

1. I haven't been through the Tor path allocation algorithm, but my suggestion focuses on the very end of it.
Say you have 3999 available paths, which you already sort according to several factors like throughput, link stability, etc., and you end up with some candidates.
You take these candidates, just the current top-ranking ones, and put them in a matrix with their corresponding priorities, like:
ABCD
1234
Then, the simplest way that comes to mind: you generate a random number, on Linux by using urandom (plus some environmental salt), and on Windows hmmm... the same, by using the number of running processes, CPU cores, and free RAM, XORed (or maybe a succession of binary operations, not too many since you don't want to kill the CPU) with a randomly generated number. Say you get 6598 as the random number, which you XOR with the 1234 from above; then the new matrix will look like:
ABCD
7444
By sorting this, you'll get BCD as the prioritized candidates. The statistics would end up trying to guess the random number, awesome.
So you'd satisfy a little of both worlds: keep prioritizing by throughput/link stability and add some entropy to the path allocation process, all achieved with little overhead and through an irreversible function. (See the sketch after this list.)

2. Queues, I forgot about these, sorry, my bad. I guess the algorithm is FIFO; therefore the same entropy generation from above would shuffle the queue a little bit. It would also help with the timing/delay pattern-recognition issue.

3. Timers, they are very important in both the path allocation and circuit building processes. They should be put in place and generated randomly, such that you won't use a circuit for more than several minutes; after that you should go through another round of points 1 and 2 above.
I just noticed that as a client I ended up with the same 2-3 peers for several hours, although I played with the already available CircuitPriorityHalflife, NewCircuitPeriod, MaxCircuitDirtiness and MaxClientCircuitsPending. I must admit that I observed some unsuccessful new circuit creations every now and then, but that was all. Not sure now if these config variables have any impact at all.

4. Packet size and shape (header and payload). At this level, although it will be identified as Tor traffic, a little uniformization will make any circuit recognition/backtracking impossible.
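For what it's worth, here is a small Python sketch of the XOR-reordering idea from point 1 (illustration only; as the replies here note, this kind of shuffling has not been shown to defeat correlation attacks, and it is not how Tor actually selects paths).

import os

def xor_reorder(candidates):
    # candidates: list of (name, priority digit) pairs, best-ranked first,
    # e.g. [("A", 1), ("B", 2), ("C", 3), ("D", 4)]
    digits = "".join(str(p) for _, p in candidates)                     # "1234"
    rand = int.from_bytes(os.urandom(2), "big") % 10000                 # e.g. 6598
    mixed = str(int(digits) ^ rand).zfill(len(digits))[-len(digits):]   # e.g. "7444"
    # re-sort the candidates by their new (randomized) digit
    names = [name for name, _ in candidates]
    return [name for name, _ in sorted(zip(names, mixed), key=lambda pair: pair[1])]

print(xor_reorder([("A", 1), ("B", 2), ("C", 3), ("D", 4)]))   # e.g. ['B', 'C', 'D', 'A']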

You are right, unfortunately the current Tor performance is pretty bad, but changing the rules might discourage the ones that are testing their science and algorithms right now.

I'm just suggesting some small architectural changes, diverting the potential adversary's statistical efforts toward an entropy generator, all while following the KISS principle. I guess it would address more than what is covered in the freehaven papers, which are all based on assumptions and on trying to make the math consistent with the idealistic, deterministic environment in which it is applied.

Anonymous

September 13, 2013

Permalink

Hi, can you please link me here to a URL for the 2.4.17 (or must it be .16?) variant of Tor for Windows? Some months ago I also had problems with NoNameScript; they attacked Tor and it was no longer safe.

Anonymous

September 13, 2013

Permalink

QUICK ANT is most likely a correlation between exit and entrance nodes. Unfortunately, the absence of such correlation also leaks information about hidden services.

Regardless of the efficacy, the Tor project, and those who support it, are modern-day heroes.

"Most likely"? I don't see that from the screenshot at all.

See the last two paragraphs of the blog post for my theory.

I mean, I don't want to declare that NSA has no tools for correlation of flows across the Internet. We've long assumed they do, and the remaining game is to wonder how much of the Internet they can see vs where the Tor flows go. But in any case, it's not at all clear to me that this 'quick ant' thing is that.

Anonymous

September 15, 2013

Permalink

"Tor flow detector" -- old question but I'm confused - so if I run my own Entry and Exit nodes can NSA and others get my data?

Running your own entry and exit points doesn't really change much unfortunately.

There's still the chance that a large attacker has done deals with your upstream to be able to tap the traffic. Where will you run them such that their ISP, or their ISP's ISP, or so on up the chain, is out of reach of large attackers?

Plus if you run your own relays and you use them preferentially, and an attacker knows this, and the attacker sees some anonymous person using them, he can guess that it's more likely you than the average Tor user. See
http://freehaven.net/anonbib/#ccs2011-trust
for more discussion of this second issue.

Anonymous

September 15, 2013

Permalink

We are taking tor off all our pc's. It isn't any good anymore! If it ever was!

Anonymous

September 15, 2013

Permalink

All you geeks can argue about technical details. TOR doesn't work and it probably never did! Moderate that!

(I assume this comment and the one above it are from the same person.)

I guess the answer is "it depends on your threat model". There are many people out there for whom the Internet as a whole is just too scary for their security requirements, and it is a reasonable choice for them to stop using the Internet.

Or said another way, I think Tor is still better than the alternatives, but in this world you're right that that may not be enough.

Geeks and tech details.... Dude, you are on the Tor blog with heavy participation from a lead dev... what did you expect, and do you need to post here to tell everyone that you're done with Tor????? lol. Sounds like you've got bigger issues than not trusting Tor.
***And, "moderate that!" ?????
Real helpful addition to your worthless post.
This (my) post may be equally worthless, but I cannot stand to see all the haters. Do you see this man responding politely to all of the insulting/ignorant commenters?
He's too good of a man to call you fools out, but I am not.

Anonymous

September 15, 2013

Permalink

- Authentication-wise (which is the most important step in the Tor game): for the proposal to increase the 1024-bit RSA key size, look at the "theoretical" TWIRL device. Personally, I guess it's already been built and maybe even parallelized; a very cheap solution.
- It's naive to believe that somebody will use supercomputers (clusters) to compute software overhead (just changing the value of a variable takes some processing cycles) instead of building a specialized ASIC or FPGA system. I guess the Moore's law suggestion was misinterpreted; there are a lot of "groundbreaking" technologies appearing (see quantum computing and the application of Grover's algorithm), and by looking at the timeline of changing ciphers you would notice that every 5-10 years things change fundamentally. Besides, AES-256, if implemented correctly (a very important aspect), has more rounds and greater complexity than AES-128.
- About obfuscation: it was 2010 or 2011 when I first saw some articles about using many instances (over 10) of Tor, load-balanced through a proxy server (Squid). Some even developed and published scripts for doing that; use the "don't be evil" guys' search engine for more details. That's all for achieving increased anonymity via the principle "if I don't know what I'm doing, nobody else will". A pretty healthy principle, but abusive and unethical towards the Tor network and its usage principles. This might also be a cause of the recent traffic increase. I was tempted to do this too, but I am now using only two Tor instances, which I alternate (kill and restart) in a random fashion, since sending a NEWNYM through the Tor control interface is a little more complicated to automate (see the sketch after this list). You should consider a variable in the conf file for this purpose; I guess a lot of people would like to randomly change their identity, even for the purpose of escaping an IP ban, and maybe re-authenticate too, that's with the very helpful 1024-bit key.
- I'm using Tor (together with a lot of browser plugins) solely for clearing my way through the more and more invasive algorithms that tailor content for you. It's already psychiatric what these people are doing by trying to distract you from your "carefully planned browsing or interests/study".
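On the NEWNYM automation mentioned two points up: a small sketch using the stem library (assuming tor is running with ControlPort 9051 enabled and cookie or password authentication configured) looks roughly like this.

from stem import Signal
from stem.control import Controller

with Controller.from_port(port=9051) as controller:
    controller.authenticate()            # cookie auth, or pass password="..." here
    controller.signal(Signal.NEWNYM)     # ask tor to use fresh circuits for new connections

Note that tor rate-limits how often NEWNYM actually takes effect, so firing it in a tight loop doesn't buy you anything.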

Keep it upbeat and again, thank you for this beautiful piece of software that keeps me sane and focused on my tasks while browsing the internet and ... poisoning the "evil" algorithms with false identities. It's so nice when I get an advertisement in a language I don't understand - my mind rejects it instantly.

Anonymous

September 15, 2013

Permalink

So pretty much the answer to any suggestion made is "Just be cautious, we can't/won't do anything".

No, we're continuing to do a huge amount on both research and development of how to build a scalable good anonymity / blocking-resistance system. And by 'we' I mean a broad community of people, not just the Tor people.

The lesson to learn from some of these responses is that building a good anonymity system is *hard*, and most of the first ideas you'll have are wrong in counterintuitive and interesting ways.

If you want to contribute, step one is to get some more background on previous designs, previous attacks, and some of the defenses against those attacks. Otherwise you're very likely to produce an approach or design that falls to those same attacks.

To quote https://research.torproject.org/ :
"To get up to speed on anonymity research, read these papers (especially the ones in boxes):
http://freehaven.net/anonbib/
"

I guess the other lesson is that a several-sentence suggestion in a blog comment is the wrong way to contribute usefully to the design. See
https://www.torproject.org/docs/documentation#UpToSpeed

The vision/mission of Tor was (I hope it still is) to help its users (some of them in really bad sociopolitical situations) achieve a certain degree of anonymity, not so much security. However, in the current context, anonymity implies strong security.

The successful attacks on Tor were predominantly statistical (pattern-seeking, correlative); here, a little entropy would make the math/graphs irrelevant.
Focusing only on the Tor system, and disregarding issues on the user side (leaking DNS calls, an unsecured application/browser, the unsecured SSL infrastructure, etc., which are all the user's own fault unless they are using Tails or the Tor Browser Bundle), I see two generic weak points and would like to expand on them in the following lines:

a) User-side authentication & encryption: if the authentication/encryption is weak (or broken), it renders the rest of the effort useless. Actually, you might need to consider all the available encryption broken, estimate the time in which it's possible to obtain the key, and change the key accordingly (randomly, or even better per connection; an adversary cannot do much with chunks and bits of data, except maybe get angrier). ECC, although very efficient, apparently more efficient than AES, and which was also adopted mainly because of efficiency, seems to be an unhappy choice, both because it has not had that much academic/scientific scrutiny and because not even the NSA is using it; they only recommended using it in the future, more like: it's promising, but please do some decent research on it first. Here, Bruce Schneier gives more details on ECC:
http://www.theguardian.com/world/2013/sep/05/nsa-how-to-remain-secure-s…
http://www.wired.com/opinion/2013/09/black-budget-what-exactly-are-the-…

b) Entropy, noise, call it obfuscation: if the data streams are slightly (deliberately) delayed, packet sizes are constant, packet structure is uniform, and packets follow different routes on a random basis (randomness is a strong word, since it's hard to achieve, but you might be able to do some simple math on the non-transparent Tor router/bridge hardware serial numbers or environment: amount of RAM, storage, CPU type, etc.), then pattern-seeking and correlation become very difficult. Moreover, on the client side, the peers should change dynamically (diversity/noise) and re-authenticate too (see the point above). If the client's communication is filtered, the first bridge they successfully connect to should help.
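As a toy illustration of the "constant packet size plus small random delay" part of (b) (my sketch, not how Tor works; Tor already uses fixed-size 512-byte cells, and simple random delays have not been shown to defeat correlation attacks):

import os, time

CELL = 512  # fixed cell size in bytes

def send_padded(sock, data):
    for i in range(0, len(data), CELL):
        chunk = data[i:i + CELL]
        chunk += b"\x00" * (CELL - len(chunk))                      # pad every cell to a constant size
        time.sleep(int.from_bytes(os.urandom(1), "big") / 1000.0)   # add 0-255 ms of random delay
        sock.sendall(chunk)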

The above points tackle the capabilities of the potential adversary and suggest making Tor a little more dynamic. It's all economics in the end: if the effort to capture a stream is too high (encryption + entropy), then it will be dropped and more traditional, intrusive (hard to keep private and stealthy) approaches will take its place.

Tor is collaborative on the infrastructure side, and the source code is open on the developers' side. Collaborators should be encouraged to grow by providing them with quality, serious approaches. Open source itself cannot hide dirty tricks, which is why randomness (a little salt and pepper) can only be achieved through the non-transparent environment of the nodes; any other algorithm in the source code can be understood and reverse-engineered.

When making decisions about trade-offs, please put security and anonymity at the top of the list. If they require more processing power, I'm confident that the collaborators will understand and upgrade their equipment; it's all so cheap now.

Have a wonderful day! & Keep doing what you do!

arma,

I was using this blog to reply to your speculation and stay anonymous. I would go on a Tor mailing list and provide some suggestions and opinions there, although I don't understand your problem with the way I'm trying to help.

You shouldn't approve this post and maybe not even the previous one. I believe I did my job, shared my thoughts and maybe made some contribution.

Best regards,
anonymous

Maybe you're doing a lot of research, but you sure as hell haven't done the most obvious things to protect your users, such as disabling JavaScript by default. You're STILL distributing Tor with JavaScript enabled, and not even warning new users or telling them what happened.
And you also haven't stopped the botnet, because you're still accepting connections from it. Anyone who is a real user can upgrade; the bots won't, so all you have to do is add code to stop accepting connections from the version the botnet uses.
Since you guys haven't done any of these things, it really makes me wonder about your focus and about the reliability of Tor.

We're certainly being overloaded these days with the wide variety of technical and social things that need to happen.

As for the Javascript question, help us make
https://trac.torproject.org/projects/tor/ticket/9387
happen.

I think that teaching our users about what security properties Tor can provide is really important. Once upon a time we did that pretty well, but then we got the other 800000 users, and most people were suddenly learning about Tor from some newspaper story rather than from me in a talk. Help us sort out the best way to communicate these complex topics -- "here, click this thing you don't understand" isn't going to teach them to fish.

I actually think the botnet is lower priority than some of the other issues. It's reasonably stable. I would like to explore whether we can come up with better ways to handle the entire topic before we kick this one off the Tor network. To be clearer, we can kick it off now (along with a lot of normal users, yes), but if it upgrades we're in a much worse position without a future roadmap.

Anonymous

September 16, 2013

Permalink

When I originally commented I clicked the -Notify me when new comments are added- checkbox and now each time a comment is added I get four emails with the same comment. Is there any way you can remove me from that service? Thanks!

Anonymous

September 17, 2013

Permalink

I can't believe you guys are still distributing Tor Browser with JavaScript enabled by default, after half your userbase just got totally screwed and had their MAC addresses and IPs revealed to the FBI. Seriously, what is wrong with you people?

Half our user base? There's no telling how many it was, but I'd be shocked if it's anywhere near half. Most Tor users don't know what a hidden service is.

As for the usability tradeoff, see
https://www.torproject.org/docs/faq#TBBJavaScriptEnabled

While we're making broad claims, I'll go with "more than half our users would think the browser is broken, and go use something even worse, if it didn't support javascript".

FireGloves sounds great except
a) Their page says they're not working on it anymore, and doesn't say when they stopped; and
b) If you're the only Tor Browser Bundle user who has installed Firegloves, that right there could be your unique identifier. Anonymity is hard.

Anonymous

September 17, 2013

Permalink

Well then, why don't you have a warning that clearly tells people the risk in huge red letters and lets them decide before downloading? You are misleading your users. Many of the people who want to access hidden sites will not realize this is a problem.

The Tor Browser has failed in a really bad way. I obviously don't have numbers but probably 10,000+ of your users were compromised by the FBI. You could do a bit more to warn new users now.

The other thing is, Tor is unusable now; you have to stop this botnet. It's been a month now, and the only way it's going to stop is if you build in code to reject requests from the older versions that the botnet uses. You guys really need to get on top of the situation, because the secret services and the botnet are whipping you. I'm switching to I2P and hope Tor will still exist when I check back in a couple of weeks.

I unfortunately have to agree with this person. As it stands at the moment, TOR is broken and it needs a fix immediately.

It is nearly impossible to get to hidden services at the moment (for some reason, regular public websites that anyone can go to are unaffected), or it takes 3 or 4 refreshes of the browser.

I believe hidden services are suffering more than normal circuits because they involve on-the-fly circuit extends, and many circuit extend attempts are failing in the Tor network right now. Normal Tor circuits are built preemptively and before you need them, so it's fine if it takes a few tries before it works.

The circuit-extend success rate should improve as more relays upgrade to Tor 0.2.4.x (and the hidden service, as well as you the client, need to be on it too). When I get some more time I plan to switch 0.2.4 to be the new stable release, which should help move that trend forward faster.

Anonymous

September 18, 2013

Permalink

the word now is that TOR is a trap ... not sure if you will be able to clean that reputation any time soon.

If people haven't figured out that basically the whole Internet community is under massive attack by large well-funded organizations, who are especially targeting the systems that frustrate their large-scale surveillance... they should learn more.

If you know of some good solution that isn't under attack, that sure would be nice. The world is a bit short on options these days.

I got a suggestion, start shooting all federal agents... Period. They have declared war on us but we are not fighting back but I know how much 1st amendment advocates hate the 2nd...????? Not sure how they justify that one but they'll try until confronted with facts

Again, nothing but respect for the Tor devs, and it is sickening to see the attacks against arma, who responds politely. You guys attacking the devs need to contribute, shut up, grow up, or go find or build a better system.
I mean, if idiots like Roger can do it, how hard can it be, and what's stopping y'all? /s

I know a really good solution: stop screwing around and fix Tor. You should've fixed it immediately once you saw there was a botnet. You should've just deprecated the entire Tor network, released a new version, and started fresh with that. All you have to do is stop accepting connections from the previous version; this would probably require you to insert one line of code. Have the browser check page tell users they need to upgrade because the bots have taken over the network. Within a couple of days all real users would upgrade. There, problem solved. It's called common sense. Instead, it seems to me like you guys are trying to write a PhD thesis on this problem instead of solving it.

Just one more thing (I'm the OP of the post above): don't misunderstand, I really respect you guys and appreciate what you do, I just do not appreciate how you are handling these attacks. You should have disabled JavaScript immediately when the FBI attack started, and warned users. You didn't do this; you were obviously aware of it and let it go on for days without warning users. You should have had an enormous warning on the browser check page that Tor is under attack and to disable JavaScript. And now you're still distributing Tor without a proper warning that tells new users what happened. You're also handling this botnet like an academic case study. So wtf, guys?

Anonymous

September 18, 2013

Permalink

I REALLY NEED A WAY TO VERIFY TAILS.
I have not figured out how PGP works. I need an app for that! I have no wish to make my daily boring life known to the NSA! That means that it is of no huge consequence for it to get out. However, I want to be ABLE to have conversations in private. It seems like NSA FUD (Falsehoods, Untruths and Deception) to say there is no secure way to get the process done, therefore we won't try. What I desperately need is a cut-and-drag way to check hashes. How, for instance, do I get the hash for Tails 0.20 to check against the real one? Tails 0.20 seems to be working well, BTW, unchecked and unverified. Will Tails 0.20 check the program against the hash for me? Also, I tried the version that is supposed to work as a VM. It has a different hash.

I really have no idea how a VM works and would not recognize one if it bit me.

The reason I am interested in the VM is that EFF.org makes heavy use of .PDF files and tells me that these are not secure unless opened in a VM.

Finally, I would gladly do more processing to use 4096-bit keys, since my $600 8-core Ubuntu machine has most of its processors unused. I would also gladly wait longer for salted connections or split-route connections or whatever. Please give us the opportunity and the tools we need!!!!!! Not all connections have to be 1024-bit. Let those running servers decide. I am really bad at Linux!!!!!!!! BUT I WON'T GIVE UP!!!

Finally, this is in the clear because Tor (on my Win 8 machine!) for the first time in a long time says: "The proxy is refusing connections." I have no recollection of what causes it, and it is not searchable on the Tor site.

Anyway, TOR IS AWESOME!!!!! Don't let the adversary get you bent around the axle.

Cheers!!

If you have Ubuntu, you should be able to use gpg (an open source tool functionally equivalent to pgp) to verify signatures. It should already be installed.

In a "line command shell" (technically, probably konsole), try typing

man gpg

To check the signature of a file with a detached signature you need to download the signing key, the detached signature, and the signed file itself.

You will need to import the appropriate signing key into your gpg keyring. The command you use will look something like this:

gpg --import keyfile.asc

Next you verify the downloaded file against its detached signature. The command you use will look something like this:

gpg --verify file.tar.gz.asc file.tar.gz
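If the check succeeds, gpg prints a line starting with "Good signature from"; it may also warn that the key is not certified with a trusted signature, which just means you haven't marked that key as trusted in your own keyring.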

Tails' .iso is NOT being served currently from httpS!!
No security on it.

"You have chosen to open:
tails-i386-0.20.1.iso
which is: iso File (883 MB)
from http://dl.amnesia.boum.org"

One's guard is down starting from Tails' download page because that intro IS using httpS: "https://tails.boum.org/download/"

.......................

(btw I'm not the OP and regarding his VM-with-Tor question I thought that was someone else's project, Whonix or such)

Yeah, it would be nice if they could solve that. The trouble is that the Tails download is multi-homed across a bunch of volunteer mirrors, and they can't (shouldn't) give out an SSL cert to all of them.

The somewhat safer way, that I do it, is to fetch the tails torrent file over their https site. Then bittorrent verifies integrity (assuming you started with the right torrent file). One of the files the torrent gives me is the signature file, which I check manually.

Thank you for suggesting this alternative method for downloading Tails (given their difficulty in providing an httpS download capability).

1.) This seems to be an improvement for all users, including those not yet using GPG for whatever reasons. Correct?

2.) What approach to using torrents safely do you recommend?

(Unfair q. -- boum.org escaping responsibility by not providing a private communication channel for their own version of a private communication product ...feel free to quote me :-)

1) Yes, but it's mostly an improvement for people who don't check signatures. I hope there aren't any of those.

2) Safely, like, without the RIAA busting down my door? They usually leave me alone when I seed or fetch Linux software. It's just a protocol, after all.

Anonymous

September 18, 2013

Permalink

Roger raised an important point above concerning the enemy:

The agency can draw upon the technical skills of thousands of full-time employees who work inside the agency itself, often as federal employees, but often as contract employees who can provide specific essential skills. These people include black-hat hackers who work in TAO to develop malware for intruding into specific target computers (for example, key switches in the Internet backbone owned by a provider that resists legal intimidation, or journalists' laptops).

But the agency also uses a large number of people with full-time jobs in academia (math/CS/physics/engineering professors) for irregular contract work (sometimes at a government site, but often from their usual location while also doing their usual job). The enemy long ago learned that many of the best ideas are out in academia, and that it can best keep tabs on what researchers are currently thinking about by co-opting the academic community.

Specifically, since so many posters have expressed (not unreasonable) concern about what our enemy might know that we do not about good/bad curves to use with ECC: many academics who work in number theory on topics that underlie ECC have ties with the enemy in some way. In some cases, the enemy even paid for their graduate education.

Universities love federal handouts, and persuading faculties to cut their ties to the enemy will be difficult, particularly at a time of rising costs and decreasing ability of students to pay their tuition fees. But some academics have stepped forward to urge such measures, because over the next ten years it could considerably downgrade the enemy's R&D vis-a-vis our own efforts.

I share the concerns of others here about the ties between Tor Project, Tails Project, and the US federal government, but I don't agree with the posters who have concluded that Tor is so "broken" as to be worse than useless. The post-Snowden evidence is still consistent with the conclusion that when used wisely, Tor can still make things difficult for the enemy, possibly even forcing it to engage in risky illegal practices such as attempting to intrude into Tor nodes. I would just point out that the Tor community can fight back by trying hard to capture and reverse engineer any malware used to infect Tor nodes.

Even the knowledge that the Tor community is doing this will provide some deterrence to the enemy's schemes, because the enemy will be reluctant to unleash its most sophisticated tools once it sees that we can detect and counter less sophisticated ones. In other words, I urge the Tor Project to think about countering the TAO as well as improving handshakes, moving towards using strong ECC, etc. (which are clearly also necessary in the current threat environment).

Roger suggested that concerned Tor users study the Tor-related papers in the freehaven archive, but that is hardly practical for most users, who lack the background needed to follow along. I DO have the background and I HAVE read those papers, and I'd like to try to help educate other Tor users about what they suggest about using Tor less unsafely. But currently the only way to do that appears to be to sign onto a mailing list, and many users are quite rightly reluctant to do that (either to ask questions or to offer answers). Indeed, someone at the Tor Project (I can't remember who) even admitted recently that no one seems to know any currently effective means of using email with anything approaching reasonable anonymity vis-a-vis our enemy.

So I hope that the Tor Project will work harder to provide a forum allowing anonymous posting via Tor connections, which is far from perfect (some of the reasons why have been mentioned above), but which appears to be less unsafe than alternative methods of two-way communication with the user base. I appreciate that project members would rather work on other issues where it is more clear what is needed right away, but I hope they will consider the view that improving communications with the user base now could improve the Project's ability to become aware of, and to tackle, other issues in the future. (I would remind them that for many years they resisted urgent warnings from the user base to update their threat model, warnings which have been thoroughly validated by the Snowden disclosures; some of the reasons why are discussed above.)

I have a question about the Tor Browser Bundle: some Linux users have pointed out that when they unpack the Linux tarballs, they find Windows-style end-of-line characters in the torrc and other files, suggesting that TBB is developed under Windows. Is that true? Strange if so, because someone at the Tor Project (right now I forget who) recently warned against using Tor under Windows, owing to various intractable problems in obtaining reasonable security while using a proprietary OS developed by a PRISM partner company.