Hidden Services need some love

Hidden Services are in a peculiar situation. While they see a loyal fan-base, there are no dedicated Tor developers to take care of them. This results in a big pile of features that need to be researched, implemented and deployed to make Hidden Services more secure and effective.

The purpose of this blog post is threefold:

  1. Introduce Hidden Service operators to various shortcomings of the Hidden Service architecture.
  2. Introduce researchers to various research questions regarding Hidden Services.
  3. Introduce developers to the plethora of coding tasks left to be done in the Hidden Service ecosystem.


Note that not every idea listed in the blog post is going to turn out to be a great idea. This post is more of a brain-dump than a solid fully-analyzed agenda.

In any case, let's get down to the issues:




Hidden Service Scaling


The current Hidden Services architecture does not scale well. Ideally, big websites should have the option to completely migrate to Tor Hidden Services, but this is not possible with their current architecture.

One of the main problems with a busy Hidden Service is that its Introduction Points will get hammered by clients. Since Introduction Points are regular Tor relays, they are not intended to handle such load.

Therefore, one of the first steps for improving Hidden Services scalability is increasing the durability of its Introduction Points. Currently, a Hidden Service selects the number of its Introduction Points (between one and ten) based on a self-estimation of its own popularity. Whether the formula currently used is the best such formula is an open research question.
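The shape of such a self-estimation formula can be sketched as follows — the per-Introduction-Point capacity threshold and constants below are invented for illustration, not Tor's actual values:

```python
# Hypothetical sketch of a popularity-based Introduction Point count.
# REQUESTS_PER_INTRO_POINT is an assumed capacity, not Tor's real constant.

MIN_INTRO_POINTS = 1
MAX_INTRO_POINTS = 10
REQUESTS_PER_INTRO_POINT = 50  # assumed introductions one relay handles per period

def choose_intro_point_count(requests_last_period: int) -> int:
    """Clamp a load-derived estimate into the allowed [1, 10] range."""
    estimate = -(-requests_last_period // REQUESTS_PER_INTRO_POINT)  # ceiling division
    return max(MIN_INTRO_POINTS, min(MAX_INTRO_POINTS, estimate))
```

Whatever the exact formula, the open question is whether self-observed load is even the right input, since a service only sees introductions that already got through.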

Another problem with Hidden Services is the lack of load balancing options. While you can load-balance a Hidden Service using TCP/HTTP load balancers (like HAProxy), there is no load-balancing option similar to DNS round-robin, where load balancing happens by sending clients to different server IP addresses. Such load-balancing could be achieved by allowing a Hidden Service to have multiple "subservices". Such an architecture, although appealing, introduces multiple problems, like the intercommunication between subservices, where the long-term keypair is stored, how introduction points are assigned, etc.


Defense against Denial of Service of Introduction Points


The adversarial version of the previous section involves attackers intentionally hammering the Introduction Points of a Hidden Service to make it unreachable by honest clients. This means that an attacker can temporarily bring down a Hidden Service by DoSing a small number of Tor relays.

To defend against such attacks, Syverson and Øverlier introduced Valet nodes in their PETS 2006 paper: "Valet Services: Improving Hidden Servers with a Personal Touch". Valet nodes stand in front of Introduction Points and act as a protection layer. This allows Hidden Services to maintain a limited number of Introduction Points, but many more contact points, without clients learning the actual addresses of the Introduction Points.

Valet nodes are not implemented yet, mainly because of the big implementation and deployment effort they require.


Key Length


The long-term keypair of a Hidden Service is an RSA-1024 keypair, which nowadays is considered weak. This means that in the future, Hidden Services will need to migrate to a different key size and/or asymmetric cryptographic algorithm.

A side effect of such migration is that Hidden Services will get a different onion address, which might be troublesome for Hidden Services that have a well-established onion address. To make the transition smoother, Hidden Services should be able to use both old and new keypairs for a while to be able to point their clients to the new address.

Unfortunately, while design work has started on strengthening some parts of Tor's cryptography, there are no proposals on improving the cryptography of Hidden Services yet.


Attacks by Hidden Service Directory Servers


Hidden Services upload their descriptor to Tor nodes called Hidden Service Directory Servers (HSDirs). Clients then fetch that descriptor and use it to connect to the Hidden Service.

In the current system, HSDirs are in an interesting position which allows them to perform the following actions:

  • Learn the .onion address of a Hidden Service and connect to it
  • Evaluate the popularity of a Hidden Service by tracking the number of clients who do a lookup for that Hidden Service
  • Refuse to answer a client, and if enough HSDirs do this then the Hidden Service is temporarily unreachable

These scenarios are explored in the upcoming IEEE S&P paper titled "Trawling for Tor Hidden Services: Detection, Measurement, Deanonymization" from Alex Biryukov, Ivan Pustogarov and Ralf-Philipp Weinmann. Be sure to check it out (once they publish it)!

Let's look at some suggested fixes for the attacks that Hidden Service Directory Servers can perform:


Defenses against enumeration of onion addresses

Hidden Services use a hash ring to choose which HSDirs will host their descriptor; this means that HSDirs can just wait to get picked by Hidden Services and then collect their descriptors and onion addresses. Also, since the hash ring is rotating, HSDirs get new Hidden Service descriptors in every rotation period.
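A simplified sketch of the placement logic — the field layout hashed into the descriptor ID below is invented for illustration (the real computation is specified in rend-spec), but it shows both why HSDirs passively collect descriptors and why rotation hands them fresh ones:

```python
import hashlib
from bisect import bisect_right

def descriptor_id(service_id: bytes, period: int, replica: int) -> int:
    # Simplified: the real computation hashes a permanent identifier with a
    # time-period value and replica index (see rend-spec); the exact field
    # layout here is invented.
    h = hashlib.sha1(service_id + period.to_bytes(4, "big") + bytes([replica]))
    return int.from_bytes(h.digest(), "big")

def responsible_hsdirs(desc_id: int, hsdir_ring: list[int], n: int = 3) -> list[int]:
    """The n relays whose fingerprints follow desc_id clockwise on the ring."""
    ring = sorted(hsdir_ring)
    start = bisect_right(ring, desc_id)          # first fingerprint after desc_id
    return [ring[(start + i) % len(ring)] for i in range(n)]
```

Since `descriptor_id` changes with `period`, a patient relay parked on the ring sees a different slice of Hidden Services every rotation, for free.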

One possible solution to this issue would be to append a symmetric key to the onion address and use it to encrypt the descriptor before sending it to HSDirs (similar to how descriptor-cookie authentication works currently). A client that knows the onion address can decrypt the descriptor, but an HSDir who doesn't know the onion address can't derive the Hidden Service name. The drawback of this scheme is that the size of onion addresses will increase without increasing the security of their self-authentication property. Furthermore, HSDirs will still be able to extract the Hidden Service public key from the descriptor, which allows HSDirs to track the descriptors of specific Hidden Services.
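A minimal sketch of that first scheme, assuming a hypothetical extended address format that carries the descriptor-encryption key; the SHA-256 counter-mode keystream is purely illustrative, not production-grade crypto:

```python
import hashlib
import os

def keystream(key: bytes, length: int) -> bytes:
    """Illustrative SHA-256 counter-mode keystream (NOT production crypto)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

# Hypothetical extended address: the usual onion identifier plus a
# descriptor key that clients learn but HSDirs never see.
onion_id = b"exampleonionaddr"   # placeholder identifier
desc_key = os.urandom(32)        # extra key appended to the onion address

descriptor = b"introduction-points: ..."
uploaded  = xor(descriptor, keystream(desc_key, len(descriptor)))  # what the HSDir stores
recovered = xor(uploaded,  keystream(desc_key, len(descriptor)))   # what a client decrypts
assert recovered == descriptor
```

The HSDir only ever holds `uploaded`, which is opaque without the key — but as noted above, the signing public key in the descriptor would still have to be hidden separately to prevent cross-period tracking.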

A different solution was proposed by Robert Ransom:

Robert's scheme uses the long-term keypair of a Hidden Service to derive (in a one-way fashion) a second keypair, which is used to encrypt and sign the descriptor that is uploaded to the HSDirs. This construction allows the HSDir, without knowing the long-term keypair of the Hidden Service or the contents of its descriptor, to validate that the entity who uploaded the descriptor had possession of the long-term private key of the Hidden Service. A client who knows the long-term public key of the Hidden Service can fetch the descriptor from the HSDir and verify that it was created by the Hidden Service itself. See the relevant trac ticket for a more robust analysis of the idea.
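A toy discrete-log sketch of the derivation, with illustrative parameters (a small Mersenne prime standing in for a real group — this is not a usable cryptosystem, only the algebraic shape of the idea):

```python
import hashlib

# Toy group, for illustration only; a real design would use a proper
# large prime group or elliptic curve.
p = 2**127 - 1   # Mersenne prime
g = 3

def derive(pk: int, period: int) -> int:
    """One-way blinding factor, computable from the public key alone."""
    data = pk.to_bytes(16, "big") + period.to_bytes(4, "big")
    return int.from_bytes(hashlib.sha256(data).digest(), "big") % (p - 1)

sk = 123456789
pk = pow(g, sk, p)

h = derive(pk, period=7)
sk2 = (sk * h) % (p - 1)   # only the service, holding sk, can compute this
pk2 = pow(pk, h, p)        # a client (or HSDir) can compute this from pk alone

assert pow(g, sk2, p) == pk2   # derived values form a consistent keypair
```

The service signs the uploaded descriptor with `sk2`; the HSDir can check the signature against `pk2` without learning `pk`, while a client who knows `pk` can recompute `pk2` itself and verify authorship — which is why the scheme needs a discrete-log long-term keypair in the first place.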

Robert's idea increases the size of onion addresses, but also makes them more resistant to impersonation attacks (the current 80-bit security of onion addresses does not inspire confidence against impersonation attacks). Furthermore, his idea does not allow HSDirs to track Hidden Service descriptors across time.

While Robert's scheme is fairly straightforward, a proper security evaluation is in order and a Tor proposal needs to be written. For extra fun, his idea requires the long-term keypair of the Hidden Service to use a discrete-log cryptosystem, which means that a keypair migration will be needed if we want to proceed with this plan.


Block tracking of popularity of Hidden Services

HSDirs can count the number of users who look up a given Hidden Service, thereby learning how popular it is. We can make such tracking harder by utilizing a Private Information Retrieval (PIR) protocol for Hidden Service descriptor fetches. Of course, this won't stop the Introduction Points of a Hidden Service from doing the tracking, but since the Introduction Points were picked by the Hidden Service itself, the threat is smaller.
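To illustrate the PIR idea in isolation, here is the classic two-server XOR-based PIR toy. This is not a proposal for Tor's actual design (a deployable scheme would need to handle single or colluding directories); it just demonstrates that a directory can answer a fetch without learning which record was fetched, since each server only ever sees a uniformly random subset of indices:

```python
import secrets

def server_answer(db: list[bytes], subset: set[int]) -> bytes:
    """Each server XORs together the records at the requested indices."""
    ans = bytes(len(db[0]))
    for i in subset:
        ans = bytes(a ^ b for a, b in zip(ans, db[i]))
    return ans

def pir_fetch(db: list[bytes], i: int) -> bytes:
    n = len(db)
    s1 = {j for j in range(n) if secrets.randbelow(2)}  # random subset -> server 1
    s2 = s1 ^ {i}                                       # differs only at i -> server 2
    a1 = server_answer(db, s1)
    a2 = server_answer(db, s2)
    # Every record in both subsets cancels out, leaving exactly db[i].
    return bytes(x ^ y for x, y in zip(a1, a2))

db = [b"desc-aaaa", b"desc-bbbb", b"desc-cccc", b"desc-dddd"]
assert pir_fetch(db, 2) == b"desc-cccc"
```

The obvious cost is that each query touches the whole database, which is exactly the kind of overhead a real evaluation for Tor would have to weigh.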

If we wanted to block Introduction Points from tracking the popularity of Hidden Services, we could attempt hiding the identity of the Hidden Service from its Introduction Points by using a cookie scheme, similar to how the Rendezvous is currently done, or by using Robert's keypair derivation trick and signing the introduction establishment with the new keypair. A careful security evaluation of these ideas is required.


Make it harder to become an adversarial HSDir

Because of the security implications that HSDirs have for a Hidden Service, we started working on making it harder for a Tor relay to become an HSDir node.

Also, currently, an adversary can predict the identity keys it will need in the future to target a specific Hidden Service. We started thinking of ways to avoid this attack.


Performance improvements


Hidden services are slooooowwww and we don't even understand why. They might be slow because of the expensive setup process of creating a Hidden Service circuit, or because Hidden Service circuits have 6 hops, or because of something else. Many suggestions have been proposed to reduce the latency of Hidden Services, ranging from Hidden Service protocol hacks and JavaScript hacks to radically changing how the Hidden Service circuit is formed.

Let's investigate some of these proposals:


Reducing Hidden Service Circuit Setup complexity

During PETS 2007, Syverson and Øverlier presented "Improving Efficiency and Simplicity of Tor circuit establishment and hidden services", which simplifies Hidden Service circuit establishment by eliminating the need for a separate rendezvous connection.

They noticed that by using Valet nodes, the concept of Rendezvous Points is redundant and that a Hidden Service circuit can be formed by just using Valet nodes and Introduction Points. Karsten Loesing wrote a Tor proposal for a variant of this idea.

The reason this scheme is not implemented is that the security trade-offs introduced are not well understood, and there are also some technical obstacles (like the fact that sharing of circuits between multiple clients is not currently supported).


Analyze Hidden Service Circuit Establishment Timing With Torperf

Establishing a connection to a hidden service currently involves two Tor relays, the introduction and rendezvous point, and 10 more relays distributed over four circuits to connect to them. No one has really researched how much time Tor spends in each step of that complicated process. It wouldn't be surprising if a large amount of time is spent in an unexpected part of the process.

To investigate this properly, one should use Torperf to analyze the timing delta between the steps of the process. Unfortunately, Torperf uses controller events to distinguish between Tor protocol phases but not all steps of the Hidden Service circuit setup have controller events assigned to them. Implementing this involves adding the control port triggers to the Tor codebase, running Torperf and then collecting and analyzing the results.
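Once per-step timestamps exist, the analysis itself is simple: sort the events for a single connection attempt and report the gap between consecutive phases. The event names below are hypothetical placeholders, not real Tor controller events:

```python
# Sketch of the analysis step: given timestamped controller events for one
# hidden-service connection attempt, report time spent between phases.
# Event names are invented placeholders, not real Tor controller events.

def phase_deltas(events: list[tuple[float, str]]) -> dict[str, float]:
    events = sorted(events)  # order by timestamp
    return {
        f"{a_name} -> {b_name}": b_t - a_t
        for (a_t, a_name), (b_t, b_name) in zip(events, events[1:])
    }

trace = [
    (0.00, "DESC_FETCH_STARTED"),
    (0.80, "DESC_RECEIVED"),
    (1.10, "INTRO_CIRC_BUILT"),
    (2.90, "RENDEZVOUS_ESTABLISHED"),
]
deltas = phase_deltas(trace)
```

Aggregated over many Torperf runs, a table like this is exactly what would reveal whether the time sink is descriptor fetching, circuit building, or something unexpected.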


Hidden Services should reuse old Introduction Points

Currently, Hidden Services stop establishing circuits to old Introduction Points after they break. While this behavior makes sense, it means that clients who have old hidden service descriptors will keep introducing themselves to the wrong introduction points. This is especially painful in roaming situations where users frequently change networks (and lose existing circuits).

A solution to this would be for Hidden Services to reestablish failed circuits to old Introduction Points (if the circuits were destroyed because of network failures). We should explore the security consequences of such a move, and also determine for how long an Introduction Point should be considered "old" but still worth reestablishing circuits to.


Encrypted Services


Encrypted Services is the correct way of implementing the now-defunct Exit Enclaves.

Encrypted Services allow you to run a non-anonymous Hidden Service where the server-side rendezvous circuit is only one hop. This makes sense in scenarios where the Hidden Service doesn't care about its anonymity, but still wants to allow its clients to access it anonymously (and with all the other features that self-authenticating names provide). See Roger's original proposal for more use cases and information.

On this topic, Robert Ransom proposed to implement Encrypted Services as a program separate from Tor, since it serves a quite different threat model. Furthermore, if done this way, its users won't overload the Tor network and it will also allow greater versatility and easier deployment.


Human Memorable onion addresses


Zooko's triangle characterizes onion addresses as secure and global, but not human memorable. By now a couple of schemes have been proposed to make hidden services addresses memorable, but for various reasons none of them has been particularly successful.




These were just some of the things that must be done in the Hidden Services realm. If you are interested in helping around, please read the links and trac tickets, and hit us back with proposals, patches and suggestions. Use the [tor-dev] mailing list, or our IRC channels for development-related communication.

Finally, note that this blog post only touched issues that involve Tor's codebase or the Hidden Service protocol and its cryptography. However, if we want Hidden Services to be truly successful and influential, it's also important to build a vibrant ecosystem around them. For example, we need privacy-preserving archiving systems and search engines (and technologies and rules on how they should work), we need easy-to-use publishing platforms, Internet service daemons and protocols optimized for high-latency connections, anonymous file sharing, chat systems and social networks.

Thanks go to Roger, Robert and other people for the helpful comments and suggestions on this blog post.

PS: Don't forget to use anonbib to find and download any research papers mentioned in this blog post.

Are we able to donate to Tor Project and make sure said donations are only used for use X, which in this case would be to research/implement some of the points above?

I'd send in at least $100 a month to support such a project.

"I'd send in at least $100 a month to support such a project." (Hidden services)

Would you mind discussing what motivates this readiness to donate on your part?

How you benefit from or why, particularly, you see special value in hidden services?

I hope the Tor folks don't miss this opportunity.

I want to run a forum on clearnet while offering Encrypted Services for Tor users, I had long planned on using Exit Enclaves but they're broken.

I think Encrypted Services could be very useful in many use cases.

I would support a bounty for Encrypted Services . . .

Oh yea,

I'm also really interested in Human Memorable onion addresses, as well as increasing performance of HS respective to Encrypted Services for my use case.

Regarding memorability: apparently it's now computationally feasible to find keys corresponding to such names as deeproadworksbwj.onion (it's 13 characters / 65 bits (!)). However, I have no idea how much time and money the search took.
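The arithmetic behind that figure: each base32 character of an onion address pins down 5 bits of the hashed key, so matching a chosen k-character prefix takes about 2^(5k) key generations on average. A tiny sketch:

```python
# Expected brute-force work to find a vanity .onion prefix:
# one base32 character = 5 bits, so a k-character prefix costs ~2**(5*k) tries.

def expected_attempts(prefix_len: int) -> int:
    return 2 ** (5 * prefix_len)

assert expected_attempts(13) == 2 ** 65   # the "13 characters / 65 bits" case
assert expected_attempts(16) == 2 ** 80   # a full onion address: impersonation cost
```

Which also shows why the 80-bit figure for full-address impersonation mentioned earlier in the post is uncomfortably close to what vanity-address searches already demonstrate is reachable.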

Wow, only one comment?!

(where are the crickets?)

Also I would like to re-use the private key of the hidden service for authentication purposes. Currently it is not easy to obtain this key in any way.

Why not focus on outproxies and leave the darknet features to projects like I2P, which are already vastly more robust in those regards?

First, why do you think that Tor hidden services are a darknet? Second, why do you think that I2P or other projects are somehow more "robust"?

Is I2P still alive?

Why should we have to rely on only one or two darknets? Hidden services are essential to the anticensorship agenda. If local authorities can simply take down your site and arrest you in your home then your software has failed. Hosting offshore is not a solution if your government can find out who you are.

"Why should we have to rely on only one or two darknets?"

Has anyone actually said such a thing?

As much as more may be merrier, don't we have to start by improving what we already have?

Because he doesn't use hidden services, because he doesn't have opinions or beliefs that are anathema to whatever community or country he resides in. He doesn't care, and you shouldn't either.

He's a totalitarian probably, all about consensus and how you must obey the will of that consensus.

What definition are you using for darknet? Do you think that Tor Hidden Services are a darknet? What is "dark" about it?

I2P is in no way more robust than Tor. There are a lot of researchers (and papers) studying Tor, but virtually none for I2P. I2P may well have obvious vulnerabilities that no one noticed because no one studied it deeply enough yet.
I would be wary of running something like SR in the I2P network.

The Silk Road has been DoS'd to oblivion this week, the founder says the only way forward may require improvements to Tor Hidden Service protocols.

The PSYC project seems to have some interesting ideas, and they are voicing against federated social networks (http://about.psyc.eu/Federated_social_network); but I think running personal federated social network servers over Tor hidden services might be a viable way to bridge the gap there. All one has to do is patch one of the existing popular servers (Diaspora, Friendika) to run as a hidden service and package it as an installer like the Tor Browser Bundle.

1.) Is there a reason why blog posts as long as this are posted IN THEIR ENTIRETY on the home page of the blog? This makes scrolling past to see the next post on the page rather tedious. And for what?

Why not post only a preview with a "read more" link that would open the full post in a separate page?

2.) Please be aware that the text/links in the right-hand column runs into the comment box. (When the post and/or comments are short enough that said box appears along side said text/links)

We used to post only snippets, and then everybody told us it was too hard to click on them to read the rest. Can't win, I guess.

Can't see what could be so difficult about having to click to read the rest of a post. (Barring cases where only an EXTREMELY limited amount of RAM is available).

Thanks for replying.

Any data on what percentage of .onions are for "CP" and the like? Just wondering.

People can create hidden services for the sole purpose of manipulating those statistics. They are worthless.

To what extent can merely visiting an .onion fsck-up one's system?

Same risks as visiting any other website. (Web browsers are enormous complicated programs with a large surface area for attack.)

There's no need for fear. If you experience anything, tell the community to get the word out. The World Wide Web is much more dangerous than Tor hidden services in my opinion.

"If you experience anything,"

By then, it would likely be too late...

If you're concerned about it, use something like Whonix.

Right. Or Tails.

1.) What about BIOS/firmware/hardware threats?

Running a live environment will not protect against these, will it?

2.) Regarding Tor-usage in a live environment:

TBB is updated more frequently than Tails.

Thus, I think that running TBB from an updated live environment (such as, for example, whatever the most recently-released distro that runs live happens to be at any given moment) would be safer than Tails. (Providing, of course, that one's network activity will be limited to browsing within TBB).

I wonder how many people do this (run TBB from a *general* distro run live)

I'm curious about tracking of popularity of Hidden Services. Do the directory servers see the client's IP when the client requests information on a hidden service? Is there a good resource to get a better understanding of the client and directory server interaction?

https://www.torproject.org/docs/hidden-services.html.en
https://www.torproject.org/docs/tor-hidden-service.html.en
https://gitweb.torproject.org/torspec.git?a=blob_plain;hb=HEAD;f=rend-spec.txt

your third link was exactly what i was looking for. much appreciated!

To answer the question: no, the user and the website are both anonymous, and the connection is end-to-end encrypted.

Can someone please explain the benefits of using the I2P darknet in contrast with Tor Hidden Services? Does I2P suffer from similar attacks detailed in this blog post?

Recently IEEE published a scientific article about how hidden services can be deanonymized to reveal their real IP addresses. It was published here: http://www.ieee-security.org/TC/SP2013/papers/4977a080.pdf

I would like to see a response from authorities within the Tor-project to this publication as soon as possible.

Thanks

There are discussions on the Tor mailing lists, e.g.: https://lists.torproject.org/pipermail/tor-dev/2013-May/004909.html

Me too.

Ok guys, for now let's stick with dice, pencil and paper ...

Diceware?

At the rate things are going, the only choice may be to go back to the old-time Sherlock Holmes style methods of private communication/ cryptography/ steganography.

Semen purportedly makes good invisible ink...

"Don't forget to use anonbib to find and download any research papers mentioned in this blog post."

Page is not HTTPS or even .onion! Why?!
