Facebook, hidden services, and https certs

Today Facebook unveiled its hidden service that lets users access their website more safely. Users and journalists have been asking for our response; here are some points to help you understand our thinking.

Part one: yes, visiting Facebook over Tor is not a contradiction

I didn't even realize I should include this section, until I heard from a journalist today who hoped to get a quote from me about why Tor users wouldn't ever use Facebook. Putting aside the (still very important) questions of Facebook's privacy habits, their harmful real-name policies, and whether you should or shouldn't tell them anything about you, the key point here is that anonymity isn't just about hiding from your destination.

There's no reason to let your ISP know when or whether you're visiting Facebook. There's no reason for Facebook's upstream ISP, or some agency that surveils the Internet, to learn when and whether you use Facebook. And if you do choose to tell Facebook something about you, there's still no reason to let them automatically discover what city you're in today while you do it.

Also, we should remember that there are some places in the world that can't reach Facebook. Long ago I talked to a Facebook security person who told me a fun story. When he first learned about Tor, he hated and feared it because it "clearly" intended to undermine their business model of learning everything about all their users. Then suddenly Iran blocked Facebook, a good chunk of the Persian Facebook population switched over to reaching Facebook via Tor, and he became a huge Tor fan because otherwise those users would have been cut off. Other countries like China followed a similar pattern after that. This switch in his mind between "Tor as a privacy tool to let users control their own data" to "Tor as a communications tool to give users freedom to choose what sites they visit" is a great example of the diversity of uses for Tor: whatever it is you think Tor is for, I guarantee there's a person out there who uses it for something you haven't considered.

Part two: we're happy to see broader adoption of hidden services

I think it is great for Tor that Facebook has added a .onion address. There are some compelling use cases for hidden services: see for example the ones described at using Tor hidden services for good, as well as upcoming decentralized chat tools like Ricochet where every user is a hidden service, so there's no central point to tap or lean on to retain data. But we haven't really publicized these examples much, especially compared to the publicity that the "I have a website that the man wants to shut down" examples have gotten in recent years.

Hidden services provide a variety of useful security properties. First — and the one that most people think of — because the design uses Tor circuits, it's hard to discover where the service is located in the world. But second, because the address of the service is the hash of its key, they are self-authenticating: if you type in a given .onion address, your Tor client guarantees that it really is talking to the service that knows the private key that corresponds to the address. A third nice feature is that the rendezvous process provides end-to-end encryption, even when the application-level traffic is unencrypted.
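To make the self-authentication property concrete, here is a rough sketch (Python, standard library only) of how a current-generation .onion address is derived from a service's public key; the function name and the detail of passing in DER-encoded key bytes are just for illustration.

```python
# Sketch: a .onion address (as of 2014) is the base32 encoding of the first
# 80 bits (10 bytes) of the SHA-1 hash of the service's DER-encoded RSA
# public key, so a client holding the address can check it is talking to
# whoever holds the matching private key.
import base64
import hashlib

def onion_address(public_key_der: bytes) -> str:
    digest = hashlib.sha1(public_key_der).digest()
    return base64.b32encode(digest[:10]).decode("ascii").lower() + ".onion"
```

The result should match the hostname file Tor writes for the service; checking that the key behind a connection hashes to the address you typed is exactly the guarantee described above.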

So I am excited that this move by Facebook will help to continue opening people's minds about why they might want to offer a hidden service, and help other people think of further novel uses for hidden services.

Another really nice implication here is that Facebook is committing to taking its Tor users seriously. Hundreds of thousands of people have been successfully using Facebook over Tor for years, but in today's era of services like Wikipedia choosing not to accept contributions from users who care about privacy, it is refreshing and heartening to see a large website decide that it's ok for their users to want more safety.

As an addendum to that optimism, I would be really sad if Facebook added a hidden service, had a few problems with trolls, and decided that they should prevent Tor users from using their old https://www.facebook.com/ address. So we should be vigilant in helping Facebook continue to allow Tor users to reach them through either address.

Part three: their vanity address doesn't mean the world has ended

Their hidden service name is "facebookcorewwwi.onion". For a hash of a public key, that sure doesn't look random. Many people have been wondering how they brute forced the entire name.

The short answer is that for the first half of it ("facebook"), which is only 40 bits, they generated keys over and over until they got some keys whose first 40 bits of the hash matched the string they wanted.

Then they had some keys whose name started with "facebook", and they looked at the second half of each of them to pick out the ones with pronounceable and thus memorable syllables. The "corewwwi" one looked best to them — meaning they could come up with a story about why that's a reasonable name for Facebook to use — so they went with it.

So to be clear, they would not be able to produce exactly this name again if they wanted to. They could produce other hashes that start with "facebook" and end with pronounceable syllables, but that's not brute forcing all of the hidden service name (all 80 bits).
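As a rough illustration of that mining process (not how Facebook actually ran it; real vanity-address generators use much faster key-tweaking searches, and this naive loop would take absurdly long for an 8-character prefix), here is a sketch using the third-party `cryptography` package:

```python
# Hedged sketch of vanity-prefix mining: keep generating 1024-bit RSA keys
# until the base32 SHA-1 hash of the public key starts with the wanted
# string. Matching "facebook" pins down only the first 40 of the 80 address
# bits; the remaining 8 characters ("corewwwi" in Facebook's case) come out
# essentially at random and can only be cherry-picked from candidates.
import base64
import hashlib

from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric import rsa

def mine_prefix(prefix: str = "facebook"):
    while True:
        key = rsa.generate_private_key(public_exponent=65537, key_size=1024)
        der = key.public_key().public_bytes(
            serialization.Encoding.DER, serialization.PublicFormat.PKCS1)
        name = base64.b32encode(hashlib.sha1(der).digest()[:10]).decode().lower()
        if name.startswith(prefix):
            return name + ".onion", key
```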

For those who want to explore the math more, read about the "birthday attack". And for those who want to learn more (please help!) about the improvements we'd like to make for hidden services, including stronger keys and stronger names, see hidden services need some love and Tor proposal 224.

Part four: what do we think about an https cert for a .onion address?

Facebook didn't just set up a hidden service. They also got an https certificate for their hidden service, and it's signed by Digicert so your browser will accept it. This choice has produced some feisty discussions in the CA/Browser community, which decides what kinds of names can get official certificates. That discussion is still ongoing, but here are my early thoughts on it.

In favor: we, the Internet security community, have taught people that https is necessary and http is scary. So it makes sense that users want to see the string "https" in front of them.

Against: Tor's .onion handshake basically gives you all of that for free, so by encouraging people to pay Digicert we're reinforcing the CA business model when maybe we should be continuing to demonstrate an alternative.

In favor: Actually https does give you a little bit more, in the case where the service (Facebook's webserver farm) isn't in the same location as the Tor program. Remember that there's no requirement for the webserver and the Tor process to be on the same machine, and in a complicated set-up like Facebook's they probably shouldn't be. One could argue that this last mile is inside their corporate network, so who cares if it's unencrypted, but I think the simple phrase "ssl added and removed here" will kill that argument.

Against: if one site gets a cert, it will further reinforce to users that it's "needed", and then the users will start asking other sites why they don't have one. I worry about starting a trend where you need to pay Digicert money to have a hidden service or your users think it's sketchy — especially since hidden services that value their anonymity could have a hard time getting a certificate.

One alternative would be to teach Tor Browser that https .onion addresses don't deserve a scary pop-up warning. A more thorough approach in that direction is to have a way for a hidden service to generate its own signed https cert using its onion private key, and teach Tor Browser how to verify them — basically a decentralized CA for .onion addresses, since they are self-authenticating anyway. Then you don't have to go through the nonsense of pretending to see if they could read email at the domain, and generally furthering the current CA model.
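For concreteness, here is one hypothetical shape such a verification could take; none of this is an implemented or proposed design, and the inputs and signature format are invented purely to show the idea of chaining trust from the self-authenticating address rather than from a CA.

```python
# Hypothetical sketch: accept a hidden service's self-issued https cert by
# checking it against the onion key instead of a CA. Assumes the service
# supplies (a) its onion RSA public key in DER form and (b) a signature,
# made with the onion private key, over the DER-encoded TLS certificate it
# serves. Both assumptions are illustrative, not part of any real protocol.
import base64
import hashlib

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding
from cryptography.hazmat.primitives.serialization import load_der_public_key

def cert_matches_onion(hostname: str, onion_pub_der: bytes,
                       tls_cert_der: bytes, signature: bytes) -> bool:
    # 1. The onion key must hash to the .onion hostname (self-authentication).
    expected = base64.b32encode(
        hashlib.sha1(onion_pub_der).digest()[:10]).decode().lower() + ".onion"
    if hostname != expected:
        return False
    # 2. The onion key must have signed the certificate actually being served.
    onion_pub = load_der_public_key(onion_pub_der)
    try:
        onion_pub.verify(signature, tls_cert_der,
                         padding.PKCS1v15(), hashes.SHA256())
        return True
    except InvalidSignature:
        return False
```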

We could also imagine a pet name model where the user can tell her Tor Browser that this .onion address "is" Facebook. Or the more direct approach would be to ship a bookmark list of "known" hidden services in Tor Browser — like being our own CA, using the old-fashioned /etc/hosts model. That approach would raise the political question though of which sites we should endorse in this way.
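The petname table itself could be as small as the sketch below (entries purely illustrative); deciding which entries, if any, to ship by default is that political question.

```python
# Sketch of a local petname table, /etc/hosts style: user-meaningful names
# mapped to self-authenticating .onion addresses.
PETNAMES = {
    "facebook": "facebookcorewwwi.onion",
}

def resolve_petname(name: str) -> str:
    # Fall back to the name itself if the user has no petname for it.
    return PETNAMES.get(name.lower(), name)
```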

So I haven't made up my mind yet about which direction I think this discussion should go. I'm sympathetic to "we've taught the users to check for https, so let's not confuse them", but I also worry about the slippery slope where getting a cert becomes a required step to having a reputable service. Let us know if you have other compelling arguments for or against.

Part five: what remains to be done?

In terms of both design and security, hidden services still need some love. We have plans for improved designs (see Tor proposal 224) but we don't have enough funding and developers to make it happen. We've been talking to some Facebook engineers this week about hidden service reliability and scalability, and we're excited that Facebook is thinking of putting development effort into helping improve hidden services.

And finally, speaking of teaching people about the security features of .onion sites, I wonder if "hidden services" is no longer the best phrase here. Originally we called them "location-hidden services", which was quickly shortened in practice to just "hidden services". But protecting the location of the service is just one of the security features you get. Maybe we should hold a contest to come up with a new name for these protected services? Even something like "onion services" might be better if it forces people to learn what it is.

Anonymous

November 01, 2014

Permalink

Comments on part four:

There's another reason for wanting https to an onion address: to guarantee that no other .onion site is proxying or MITMing the service's data stream, by showing that the .onion address uses a key actually possessed (or at least authorized) by whoever owns the site.

The entire reason for third-party certification in TLS is to guarantee that you're talking with who you think you're talking to. Sure, Tor is self-authenticating, in that you can guarantee you're talking with "someone who has access to the key necessary to claim the .onion address". This is useful in a lot of circumstances. But, when your adversary can carry out low-cost attacks (such as low-cost ad campaigns which claim that access to your service can be gained at a particular .onion address that your adversary holds the private key to), it becomes much more important to have a third-party attestation of just who is providing the service, and who actually possesses the private key in question.

It is possible to use TLS without a server certificate. However, the web browser threat model is basically "the secure website must not be impersonated or transparently proxied". This is why web browsers specifically do not permit uncertified TLS. (Of course, there's a lot of legitimate anger that this works as an extortion racket; I personally support having multiple user interfaces for unauthenticated versus unauthenticated keyed versus third-party certified versus third-party extended validation, so that site operators can choose for their own sites what threat level they're willing to impose on their users, and turn it into an uncoerced market. Ultimately, .onion addresses provide unauthenticated-keyed connections, even though the browser doesn't understand how to provide any kind of useful UI with it.)

Right now, the CA/Browser Forum is debating how, or even whether, to put .onion addresses in TLS certificates. The debate appears to hinge on the idea of "alternate DNS roots": alternate entry points into alternate directories that map names to IP addresses. These have already existed, and have already had name-collision problems when ICANN chose to authorize new root names under its system that alternate DNS root providers had already allocated. A short-term fix for this could be for Tor to approach ICANN to ensure that it will not allocate a new .onion TLD, thus reserving it for the Tor network for some (renewable) number of years. I don't know how likely this might be. It would also cause some problems down the road for other onion-routing topologies and software.

There's a couple of alternatives for how to handle certification of the .onion address. Certifying the .onion key at the Tor layer is not useful, because Tor does not have a certification field. This means that the certification of a Tor key would have to assign that key to the webserver as well; there's no communication between the web browser and the Tor software to verify that the key used for the .onion address and the certificate presented by the webserver even match. (Cryptographic digests inherently suffer a vulnerability called the "birthday attack": multiple messages can exist that compute to the same digest value. It is always important to check if the key itself matches, without simply checking that the digest itself matches.)

And please, get out of the thought that only the Tor browser is going to be used with Tor. Other protocols can use it, and other browsers can be used with it after setup; you can bet that as Tor catches on with service providers, other browsers are going to configure themselves to use Tor.

.onion name certification alternative 1: Certify the .onion key, and require the TLS server to have access to that specific certified key as the certificate for the TLS endpoint. I don't recommend this option, because it would increase the risk of that specific key being compromised and the "brand" of the .onion name being rendered worthless. It also flies in the face of good webserver key hygiene, which suggests that after a period of time you should always rekey your webservers.

.onion name certification alternative 2: Rely on the fact that the .onion name is already hashed and cryptographically bound to that key, and use that hash as the "proof of authority to use the name". Then, include the .onion name in a standard certificate over a separate keypair. I believe that this would be more secure and would lessen the potential risk to the brand, by reducing the attack surface against the name key. However, there are potential additional caveats to this option, as well. One is that CABF has deprecated SHA1 as providing less of a security strength than it is comfortable with. Another is that .onion addresses only encode half of the SHA1 in the first place, thereby halving the security strength that CABF is already uncomfortable with.

CABF is risk-averse and caveat-averse. Remember that Facebook could mine 40 bits of the SHA1. Computers are only becoming faster and more capable, and mining hashes efficiently is a known problem with documented approaches. This means that before too long 80 bits are going to be able to be mined cost-effectively, and Facebook's .onion name is going to suffer a collision. What will it do when that happens? (A bit of back-of-the-envelope arithmetic suggests that if every person on Earth [7.125 billion in 2013, according to Google] had a computer that could calculate 20-bit digest collisions in 8 seconds, like @ErrataRob on Twitter claimed that his did, it would take 20.58 minutes to try 2**40 possibilities. This doesn't mean that a collision would be found in 20.58 minutes, but the numerical possibility doesn't bode well in light of botnets and zero-day vulnerabilities.)
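For what it's worth, the commenter's arithmetic does reproduce under their own stated assumptions (whether those assumptions are realistic is a separate question):

```python
# Reproducing the back-of-the-envelope estimate above, using the commenter's
# own assumptions: 7.125 billion machines, 8 seconds per attempt, and 2**40
# possibilities to cover.
population = 7.125e9           # assumed number of machines
seconds_per_attempt = 8.0      # claimed time per attempt
attempts = 2 ** 40             # the 40-bit prefix space

per_machine = attempts / population                      # ~154 attempts each
total_minutes = per_machine * seconds_per_attempt / 60
print(round(per_machine, 1), round(total_minutes, 2))    # ~154.3, ~20.58
```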

Ultimately, Tor is going to need to switch to a stronger digest algorithm, and encode more bits into the name. This is for the security of everyone, including the services who operate .onion servers. Since Tor is the de facto manager of the .onion namespace, determining how it's going to approach this problem should be a priority, and it's vital that it have a plan to do so. This will become even more important if/when ICANN officially allocates the management of the .onion TLD to Tor.

Ultimately, I'd like to see .onion be deprecated, for a .tor TLD that addresses the issues. The amount of work that would need to go into this, though, may be prohibitive.

Anonymous

November 01, 2014

Permalink

Not entirely on topic but while Facebook is taking actions to improve their PR with this move, Ello (a 'more private' alternative to Facebook) has decided to block Tor saying: "Access To Website Blocked".

Anonymous

November 01, 2014

Permalink

Very interesting!

So, why would https://xyz.onion be better than an HTTP onion address that redirects to the website that's running HTTPS with its own cert? (For a website that offers clearnet and onion addresses.)

And how would Tor2web figure into this discussion? I'm sure many Facebook users don't have Tor Browser but may want to use the new onion address.

Why should the onion be https instead of http that is then forwarded to the https site over the clearnet?

1. because the http onion doesn't have to redirect to the website that's running https with its own cert.

2. because the http onion could serve drive-by malware to the browser. (remember that Tor Browser is not the only browser that can be used with Tor.)

3. because the exit node could block connections to the website that's running https with its own cert.

4. because the exit node could perform a POODLE or BEAST or other attack against the connection.

5. because the http onion could proxy the https site with full functionality, relying on people who know how Tor works to think that there isn't a man-in-the-middle attack happening at the boundary between security protocols.

6. because the http onion doesn't have to be owned by the owner of the target site, and could thus collect statistics that belong legitimately to the owner of the target site and potentially the requesting user.

7. because the http onion could collect browser fingerprinting data (such as canvas fingerprints) before forwarding the connection... and fail the forwarding if the fingerprint data isn't provided or is blank.

I'm certain there are other reasons.

Right. The reason not to use https directly to www.facebook.com is that any of the 300+ certificate authorities around the world can produce an https cert for www.facebook.com that your browser will trust. Those include Turkish Telekom, a CA run from China, etc. Go read about DigiNotar for an example of how this can go wrong:
https://en.wikipedia.org/wiki/DigiNotar
The "CA infrastructure" is not particularly robust.

As for using tor2web to reach Facebook's new onion site, that is also a poor idea. It means you'll be using encryption between you and the tor2web site, and the tor2web site will be using encryption between it and Facebook, but the tor2web site will get to watch everything you do, as well as knowing both your location and your destination. In short, you'll get the sort of protections you get from using a single-hop proxy or a VPN, which is much weaker than the protections you get from going to a .onion site (whether it uses http or https) with Tor Browser.

See also
https://svn.torproject.org/svn/projects/articles/circumvention-features…

Anonymous

November 01, 2014

Permalink

Regardless of whether it's secure or not, if you have a personal Facebook account you've got to ask yourself this question:

Would you want a company that is known for complete disregard for privacy, and that readily hands data over to the NSA/GCHQ, to know that you are a user of Tor?

It's highly likely that the account you are using will store the fact that you are using Tor. NSA/GCHQ are known to target people who have searched for TOR on Google for example.

> NSA/GCHQ are known to target people who have searched for TOR on Google for example.

Then I was fucked when I was sixteen and started thinking this stuff was cool. May as well go all the way.

Also, I think people are deeply misinterpreting the "NSA are known to target people who search for Tor" thing. The reality seems to be that NSA are known to target all sorts of people for all sorts of things. It's easy to write new rules to target whatever they think of, so they've probably written a huge number of rules and generated a huge number of lists of people from them.

So would you rather they a) have a list of all the websites you've visited, or b) know that you're a Tor user?

The assumption that "if I don't go near Tor, they won't target me" is similar to the "I'm not interesting, so why would they go after me?" reasoning. It assumes we're in the old world where investigations take human time and energy so they have to focus them, rather than the new world where you might as well gather it all because that's cheaper than making decisions about what not to gather.

Anonymous

November 01, 2014

Permalink

Even with their onion service you still cannot get into facebook without using javascript. Suspicious??

Anonymous

November 01, 2014

Permalink

I am not convinced that Facebook accidentally came up with that onion address, the same way I am not convinced that any even slightly sane person would opt to use FB, onioned or not.

While it's a good thing for Tor to be reaching the masses, it certainly isn't a good thing for anonymity and privacy.

What FB really wants is to reach people in countries banning FB so that they can organize "orange revolutions".

Anonymous

November 01, 2014

Permalink

About Part four:
A CA is a choke-point. Choke-points are used to acquire and retain control: lock-in. A dictatorship that decides what is and is not a reputable service.

A choke-point is a weak point in the system that will be exploited.

You would effectively weaken the security, trust, and validity of Tor: the assurance that Tor really is an unbiased and uncensored system.

There are also legal and political issues to consider, if the above is not enough motivation.

Anonymous

November 01, 2014

Permalink

A cert is a bad idea. A security system that trusts something is inherently broken.
When it comes to security you should not trust anything. Always assume lies and deception.
Even the so-called transparency reports are untrustworthy, and even a warning sign of deception. It is called trying too hard: "I will give you this to distract you from something else."

How can you really verify that nothing has been excluded from the "transparency" reports?
Openness, trust, and transparency are a farce. Manipulation and deception.

You would need unfiltered access to the developers' private financials, including those of friends and family members, and so on. Spending habits.

Excluding the legal tax reason, this means nothing:
https://blog.torproject.org/blog/transparency-openness-and-our-2013-fin…
https://www.torproject.org/about/financials

Should we not be concerned that this author has political motivations?
Is Tor getting corrupted from the inside out, like a cancer?
Why are the improvements and security fixes taking so long to implement? Why are they so hesitant?
After all, American means untrustworthy.

At least be funny and original if you are going to call me a conspiracy nut.

- A concerned tor user, living in fear of her life, the only thing left to take.

Anonymous

November 01, 2014

Permalink

I vote for these two approaches:

"One alternative would be to teach Tor Browser that https .onion addresses don't deserve a scary pop-up warning. A more thorough approach in that direction is to have a way for a hidden service to generate its own signed https cert using its onion private key, and teach Tor Browser how to verify them — basically a decentralized CA for .onion addresses, since they are self-authenticating anyway. Then you don't have to go through the nonsense of pretending to see if they could read email at the domain, and generally furthering the current CA model."

Anonymous

November 01, 2014

Permalink

> Maybe we should hold a contest to come up with a new name for these protected services?

"TOR Protected Services"?

Anonymous

November 02, 2014

Permalink

Thanks for clearing up the question I had in mind about FB apparently forcing/forging its onion selector!

Let me add just as a note that I (and I'm sure not just I) am very much annoyed at every suggestion and proposal which relies on the assumption that Tor users must be using the "tor browser". First, I (and many people I know) /hate/ Firefox and would never ever use it or a derivative thereof for browsing, with or without Tor. And second, even if you happen to /like/ FF (one wonders what is to be loved there, but hey! love is blind as we all know), trying to impose "the" Tor browser - be it Firefox in disguise or anything else - is dangerous as it goes against diversity, hence against security.

Not to mention the self-evident point that Tor is not JUST for browsing the (private?/public?) webz!

Please, Tor folks, stop supporting Firefox exclusively; it is a disservice to the community, in my very humble opinion.

Yes, I'd also love to live in a world where you could use any browser you like with Tor. The problem is that all browsers have huge privacy problems. Firefox is the one we've spent the most time and energy fixing, and that set of fixes (available in a fork called Tor Browser) is the only one we can recommend. To be clearer, there are known fingerprinting and/or deanonymization bugs in all the others.
https://www.torproject.org/docs/faq#TBBOtherBrowser

For example, you might like the "why not Chrome" section of
https://blog.torproject.org/blog/isec-partners-conducts-tor-browser-har…
for further reading.

And finally, you're right, there are other things to do on the Internet besides web browsing. But if you're using any complex program that hasn't been audited specifically for Tor, the odds are good that it has some privacy bugs :( since not enough developers think about privacy when writing their programs. This is a huge area that needs more help in many ways.

I have also often wondered about a "lighter" TBB, perhaps one with Midori, but I understand your point of view and how much hard work maintaining a secure bundle is.

Which brings me to my question: would a VM-based model like Whonix allow for more versatility? For example, instead of using the Whonix-provided browser VM, using your own VM with another browser of your choice?

An isolating proxy solution (for instance Whonix) does reduce some concerns but doesn't make you any harder to fingerprint. That's why the Whonix devs still suggest you only run Tor Browser - albeit slightly altered to eliminate Tor over Tor.

See:
www.whonix.org/wiki/Tor%20Browser#Anonymity_vs_Pseudonymity
www.whonix.org/wiki/DoNot#Do_not_confuse_Anonymity_with_Pseudonymity.

Potential issue: Now we're hitting an HTTPS URL the first (maybe every?) time we hit the .onion URL. So any way to detect a visit to that site under Tor+HTTPS is now also detecting you even though you did the "safe" thing and used the .onion address.

I don't see the issue. Are you worried about a website fingerprinting attack, where they look at your encrypted traffic flow and guess what site you're connecting to? Seems to me that using https doesn't make these attacks any worse.

Anonymous

November 02, 2014

Permalink

hi nice!

Anonymous

November 02, 2014

Permalink

It should be kept in mind that Facebook is just another surveillance project. It has all the properties of one, and it is most intrusive: the users hand over data about every aspect of their lives. The NSA does the same, but mostly without people knowing about it. Facebook is severe data retention. The NSA is larger, but both are doing the same thing. Think about that. Most websites have Like buttons which tell Facebook on page load who opens which websites, even without being logged in to Facebook. This isn't funny anymore.

The certificate isn't just to stop MitM between Facebook's hidden service and Facebook's core, but to give users confidence that the hidden service is run by Facebook. The certificate is not for facebookcorewwwi.onion, but for *.facebook.com with a subject alt name of facebookcorewwwi.onion. This is not something the browser UI makes obvious, but it at least gives visitors to the hidden service who care to check a good degree of assurance that it belongs to the same owner as facebook.com. Before issuing a certificate for facebook.com, the CA would have performed at least some checks that the request really came from Facebook.
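If you would like to check that claim yourself, a sketch along these lines (third-party `cryptography` package; the filename is a placeholder for a copy of the served certificate saved in PEM form, e.g. via your browser's certificate viewer) will print the names the certificate covers:

```python
# Sketch: inspect the subject and subject-alternative names of a saved
# certificate; per the comment above, the facebook.com names and the
# facebookcorewwwi.onion name should appear in the same certificate.
from cryptography import x509

with open("facebook_onion_cert.pem", "rb") as f:   # placeholder path
    cert = x509.load_pem_x509_certificate(f.read())

print("subject:", cert.subject.rfc4514_string())
print("issuer:", cert.issuer.rfc4514_string())
san = cert.extensions.get_extension_for_class(x509.SubjectAlternativeName).value
print("subject alt names:", san.get_values_for_type(x509.DNSName))
```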

Anonymous

November 02, 2014

Permalink

Hi arma!

I have noticed that Facebook now blocks Tor users who visit the normal https web site. They call it a "bad ISP".

I guess they only let you use Facebook over Tor if you come via the onion-address.

This is true. It has become increasingly common over time; it's not a bug. I've recently been having a hard time using FB over Tor.
One more thing: you haven't mentioned anything about FB's mandatory JavaScript requirement!

Hi.

This (blocking) should not be the case - or, if blocks have been put in place for a given exit node then it is not because of its status as a Tor exit node, and in time the blocks will likely expire or be removed.

The goal of the experimental Facebook onion address is to provide a more accessible and secure Tor-based means of access to Facebook in addition to what is already available.

We currently see no benefit in intentionally blocking legitimate access via Tor exit nodes, not least because Tor exit nodes are publicly listed and easily identified and there appears to be little value in prioritising one form of Tor-sourced traffic over another. It is possible that this stance may change in future but I find it difficult to comprehend what benefit would come from doing so.

In the meantime I would also like to take this opportunity to remind everyone that the Facebook onion address is an experiment and that there will be implementation and user-interface changes as time progresses. As mentioned in the original blog post, one of these will be work related to possibly making the Facebook mobile website available, which might also provide features that have been requested by many people who access Facebook over Tor.

In the process of deployment there may be occasional brief service outages, but we shall endeavour to minimise such surprises.

Again, to quote the blogpost: the onion service is "of an evolutionary and slightly flaky nature".

Best wishes :-)

Alec Muffett
Security Infrastructure
Facebook Engineering
London

Anonymous

November 02, 2014

Permalink

Pardon for going slightly offtopic but I think it still fits in here somehow.
How do you plan to address services that routinely block Tor, e.g. CloudFlare? In the past months more and more websites have become inaccessible via Tor and the number keeps growing. It's only a matter of time until this will become a major issue. We now have awesome tools to circumvent blockage on ISP level but it's still very easy for destination servers to block Tor users via exit node list.

Concerning Facebook, I think it is just a matter of how you use it. No one forces you to reveal personal information; I solely use it to obtain such information from others. Also keep in mind that quite a lot of political activists use the site; for them the ability to remain anonymous may be crucial.

Anonymous

November 02, 2014

Permalink

This could also allow for a clear distinction between Tor and non-Tor users. As we already know, anonymity systems are a breeding ground for unscrupulous behavior that undermines legitimate use. Normal Facebook users would previously see this behavior and dismiss it as random trolling or contamination. If FB now chooses to clearly distinguish Tor users, such behavior will be directly associated, by many who were previously unaware, with Tor, its users, and the anonymity world altogether. This could reverse growth by slowly conditioning users of the social giant to detest Tor and anonymity users because of the acts of the immature and inconsiderate. I mean, wow! The crap I've seen submitted to Facebook by likely untraceable users is without a doubt the worst thing a person will see in their lifetime. A deep scar experienced by a huge user base, with a clear link to Tor. Time to take mitigating steps.

P.S. I wonder about the costs incurred by FB for this undertaking. How profitable will this be? Hidden services are currently very inefficient, and only a fraction of Tor users access Facebook using Tor. Censoring governments could (if Tor users are now labeled) then more effectively target those users. If FB helps to improve the HS protocol, great! Otherwise we could likely walk away from this with a darker cloud over the anonymity scene, with nothing to show for it other than a foul experience caused by some impulsive, small-minded closet monsters. This is, of course, predicated on Tor-user labeling.

Exactly. The Facebook administrators are aware of the difference, but average users could (if Tor users are clearly labeled when using this new service) now see the distinction. Your everyday clearnet user will then be able to associate abuse and tomfuckery with Tor users. This could have a slow, degenerative effect on public opinion regarding Tor and anonymity.

Example:

A troll befriends a large number of unsuspecting clearnet users, then drops a picture bomb like cp. Clearnet users are scarred, panic, and report the atrocity. Facebook admins respond in several ways and reassure their users. These users move on, scarred but unaware that Tor was used to drop the image that caused so much turmoil in their psyche. Things go on as they have been.

Now, if Facebook admins choose to clearly label a Tor user as such, any inappropriate behavior on the part of the Tor user will be easily associated with Tor. Clearnet users will now know about, and attribute, the misuse of Facebook to the perpetrating Tor users.

With Facebook's far-reaching social structure, it's feasible that a small group of attackers could ruin Tor's public image by gathering a large number of Facebook "friends" over a period of, say, 3 to 6 months, to then finally drop a cp bomb viewed by millions. If each of those viewers could easily link the image to Tor, then game over. Public outcry would be enormous, and arguing the primary goal of Tor and its benefits could fall on deaf ears. Most people can't think rationally when angry, and as we all know cp is endless fuel.

Facebook is massive, with a great degree of exposure. Tor users and the Tor dev team need to tread lightly, else a massive outcry for its ban could occur. If not an outright ban, then at least a gradual degradation of its image would occur. I don't have a FB account, so there are likely some flaws in my understanding of the capabilities of its users. But if the above holds true, then the Tor dev team must urge FB not to label its Tor users. I can imagine some three-letter agencies drooling at this potential attack vector. Sway public opinion and you don't have to break Tor, just wipe it away like a smear on a windshield.

Anonymous

November 02, 2014

Permalink

Another way to fingerprint your surfing habits and link you to your real identity... Tor devs might as well come out and say "We're working closely to fool you into thinking you are safe, but in reality we want a nice trail of your surfing habits to give to the NSA." Forgetting Tor is majorly funded by DARPA? WAKE UP!!

Anonymous

November 02, 2014

Permalink

Tor isn't limited to only routing the Hypertext Transfer Protocol (HTTP) of the World Wide Web. Tor is capable of routing many other application layer protocols, such as Ricochet's custom protocol. I consider 'net' instead of 'web' to more accurately reflect Tor's capabilities.

Anonymous

November 03, 2014

Permalink

I don't understand this.

What's the point of setting this up if:

1. Facebook continues to block Tor traffic with checkpoints.

2. Facebook filters out Tor traffic when trying to register, and forces phone verification.

3. Facebook checkpoints and auto-locks out users with pre-existing accounts that connect over Tor.

OK. So I bite the bullet and connect my phone to my Facebook account since I'm locked out for connecting over Tor.

That solves the problem for 15 minutes, but then I'm asked to upload a form of ID or driver's license to unlock my account. That's just too far in my opinion.

None of this makes any sense whatsoever. Why even waste the time and resources brute-forcing the onion address and setting up a hidden service that is borderline useless?

Anonymous

November 03, 2014

Permalink

Almost feels like an April Fool's joke. Best thing fb ever did, now if they could just reform the rest of their terrible privacy policy...

Anonymous

November 03, 2014

Permalink

Exit node not working for Facebook onion site.

I configured an exit node in France to use with the Facebook onion website, but the onion website always picks a random exit node. Another thing: the IP address logged by Facebook (which we can view in the security options' last login info) is always in the UK.

Is there a problem with my browser, or is it that all onion sites use exit nodes randomly?

Anonymous

November 04, 2014

Permalink

I wouldn't trust Facebook further than I could throw Zuckerberg. Zuckerberg just looks, well... just ;) Look what happened to torrent sites when they moved onto the pages of the BBC website (always a bad sign) and became more "mainstream": ISPs started blocking them (now we have to use Tor to access them). Saying that, it does solve the "bad IP" nonsense when trying to access Facebook through Tor... although the site still "works better with JavaScript enabled" ;) We will have to wait and see how this pans out...

Another thing: that "one more step... you appear to be using an anonymising network" page, with a stupid captcha to fill in, that more and more websites are adding.

Anonymous

November 04, 2014

Permalink

If a single huge network had the majority of exit routers all working for it at the same time? I have to ask: doesn't this have implications for Tor security?

If Tor is massive it doesn't matter, but if you get a proportionally massive amount of use from a single network, then trouble?

Also, the use of certs, and of just any browser, both sound like really bad ideas to me.

?

Anonymous

November 04, 2014

Permalink

It is very nice to see HTTPS together with onion. Not for TLS itself (if it is run on the same machine it only adds extra round trips and thus increases latency when connecting), but there are some nice benefits to this setup. See:
facebookcorewwwi.onion:443 direct:// spdy/3.1
fbcdn23dssr3jqnq.onion:443 direct:// spdy/3.1
This means SPDY through the darknet increases your connection throughput a lot. You can compare or benchmark it on your own by connecting to http://facebookcorewwwi.onion:80 and actually feel the difference. For load-balancing reasons they may be offloading via an HTTPS proxy, which means in this setup your connection is encrypted even over the last mile. See Facebook's statement, and the papers about the NSA eavesdropping on unencrypted traffic inside Google's internal network.
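A rough way to put numbers on that suggestion, assuming a local Tor SOCKS proxy on 127.0.0.1:9050 (9150 for Tor Browser) and the `requests` package installed with SOCKS support (requests[socks]); circuits vary wildly, so treat any single run as an anecdote rather than a benchmark:

```python
# Hedged sketch: time a fetch of the onion site over Tor's SOCKS proxy,
# once over plain HTTP and once over HTTPS, to compare for yourself.
# Assumes a Tor client on 127.0.0.1:9050 and requests[socks] installed.
import time
import requests

PROXIES = {"http": "socks5h://127.0.0.1:9050",
           "https": "socks5h://127.0.0.1:9050"}

for url in ("http://facebookcorewwwi.onion/",
            "https://facebookcorewwwi.onion/"):
    start = time.time()
    # allow_redirects=False so the plain-HTTP request is measured on its
    # own, even if the site immediately redirects it to HTTPS.
    r = requests.get(url, proxies=PROXIES, timeout=60, allow_redirects=False)
    print(url, r.status_code, "%.2fs" % (time.time() - start))
```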

With HTTPS, a site can authenticate itself to users. If the site gets MITMed or proxied you would notice a different cert, likewise if the onion key gets stolen or lost.

SPDY by default requires TLS!