Changes to the Tor Exit List Service

by irl | March 9, 2020

We've made some changes to the way we present Tor exit lists. The Tor Exit List service maintains lists of IP addresses used by all exit relays in the Tor network. Service providers may find it useful to know whether users are coming from the Tor network, for example because they may wish to offer those users an onion service. The Tor Project also uses this information to help maintain a healthy network and to perform troubleshooting.

Exit lists are provided through three interfaces: raw measurement results are archived by CollecTor, a text file containing all exit addresses is available for download to query locally, and finally, we provide a DNS exit list service to allow services to perform real-time lookups. The DNS system is described below.

Changes to the DNS-based system

The DNS-based system for looking up whether a client is connecting through Tor has been replaced with a brand-new service featuring a simplified lookup mechanism. The new system is up and running and behaves more like a typical DNS-based list service, so it may be easier to integrate without writing custom code. Operators currently using the old system need to switch before it is retired on the 1st of April.

If a client IP address is a Tor exit relay, the new service will return an A record of 127.0.0.2. You'll also be able to look up a TXT record with the fingerprint of the relay to learn more about the individual relay.
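
A minimal sketch of such a lookup in Python is shown below. It assumes the dnsel.torproject.org zone and the usual DNSBL-style reversed-octet query format used by typical DNS-based list services; the exact scheme should be confirmed against the announcement on the tor-project mailing list.

    import socket

    def is_tor_exit(client_ip):
        """Check a client IPv4 address against the Tor DNS exit list.

        Assumes the dnsel.torproject.org zone and a reversed-octet
        query format, as used by typical DNS-based list services.
        """
        # Reverse the octets, DNSBL-style:
        # 203.0.113.5 -> 5.113.0.203.dnsel.torproject.org
        reversed_ip = ".".join(reversed(client_ip.split(".")))
        query = "{}.dnsel.torproject.org".format(reversed_ip)
        try:
            answer = socket.gethostbyname(query)
        except socket.gaierror:
            # NXDOMAIN or lookup failure: not listed as an exit
            return False
        return answer == "127.0.0.2"

    # Example (203.0.113.5 is a documentation address, so this prints False):
    print(is_tor_exit("203.0.113.5"))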

Changes to the bulk exit list

As was the case previously, https://check.torproject.org/torbulkexitlist still provides a bulk list of IP addresses, but with a simplified interface: queries filtered by exit policy are no longer supported. If our active measurements have observed an exit relay using an IP address, the new service lists that address as an exit relay regardless of its exit policy, and it will be returned in the bulk list regardless of the query made.
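
For querying locally, a minimal sketch of downloading and checking the bulk list follows, assuming the published file is a plain list of one IP address per line.

    import urllib.request

    BULK_LIST_URL = "https://check.torproject.org/torbulkexitlist"

    def fetch_exit_addresses():
        """Download the bulk exit list and return it as a set of addresses."""
        with urllib.request.urlopen(BULK_LIST_URL) as response:
            text = response.read().decode("utf-8")
        return {line.strip() for line in text.splitlines() if line.strip()}

    # Example (203.0.113.5 is a documentation address, so this prints False):
    exits = fetch_exit_addresses()
    print("203.0.113.5" in exits)

Caching the downloaded list and refreshing it periodically, rather than fetching it on every request, keeps the load on check.torproject.org low.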

The full details about the changes can be found in this post to the tor-project mailing list.

Comments

Please note that the comment area below has been archived.

March 11, 2020

SO BAD!

Listing Tor exit nodes only allows more and more websites to blacklist Tor nodes, and so is a real pain for us users!

I doubt people working on the Tor project are so dumb as not to realize this. On the contrary, it is yet another sad proof that the Tor project is undermined by government agencies acting as a Trojan horse...

It is not a goal of Tor to provide access to websites that don't want to be accessed, but for websites that do want to be accessed, the Tor network will do its best to ensure users can connect to them.

Some websites will block Tor no matter what we do. In part, this service exists to help websites that want to block Tor do it with minimal damage to the Tor network. For example, some websites naïvely scrape the directory consensus for all relay IP addresses. This is bad for the website, because they will miss IP addresses used by exits not listed in the consensus, and it's bad for Tor because it adds load to the directory mirrors and causes problems for non-exit operators who are needlessly blocked.

Others use this service to enhance Tor user experience, for example by offering onion services to users connecting via Tor. It is also possible to monitor how many users may be connecting to a website via Tor, and use this to support an argument that you should provide an onion service.

We're not offering anything to websites that want to block Tor that they couldn't do themselves (and from experience, we've seen them do it badly), but we are providing a way to do it without harming the Tor network more than necessary.

Actually, I think it's more good than bad.

What you have to understand is that most of the access-denied pages you encounter actually have nothing to do with Tor. It's not that the site is blocking Tor, it's that the site is ignorant of Tor. They don't even realize it's a Tor exit node. As a result, traffic coming from that exit node appears to be malicious, due to the volume and traffic patterns and such, so the exit node gets blocked.

By publishing a list of exit nodes, or allowing sites to query if an IP is an exit node, the site can distinguish Tor exit traffic from unusual or malicious traffic. Then, they can whitelist the node, or treat the node differently (e.g. require sign-in before letting you post to a forum) instead of blocking the node outright. In other words, that way they know it's a shared IP. I believe Cloudflare does this, as one example.

As previously mentioned, the exit list doesn't really make it any easier for sites to block Tor. Even without any list or any query mechanism, sites would still be able to detect the use of Tor, for example by analyzing traffic patterns (e.g. packet size, circuit lifetime) or by fingerprinting the browser.

March 13, 2020

"We're not offering anything to websites that want to block Tor that they couldn't do themselves (and from experience, we've seen them do it badly) but we are providing a way to do it without harming the Tor network more than necessary."

I have to agree with the original commenter. I'm aware that there are other ways that websites could block Tor, but the more convenient you make it, the more websites will do so.

In my experience it's very rare for websites to actually 'enhance' things for Tor users; it's much more common for them to degrade or deny access.

I can see there's a balance in terms of how this affects relay operators/load on the network, but usability has to be a priority! In short, if loads of websites block Tor, the whole project becomes significantly less worthwhile anyway.

"It is not a goal of Tor to provide access to websites that don't want to be accessed"

This bit is downright weird. If websites are going to discriminate against Tor users, are you saying you wouldn't want to do anything to help users bypass that even if you could?

If a website wishes to block Tor users, then we should allow them to do that. However, they are going to learn how to do this from our documentation, which gives us the opportunity to explain why they might not want to block Tor users.

The new network health team will also be reaching out to website operators to ask them to reconsider their blocks, but we must do this through conversation and education. Using technical measures to circumvent blocks by website operators is not going to win us friends.

March 13, 2020

If a client IP address is a Tor exit relay, the new service will return an A record of 127.0.0.2

Genius! But I'm curious: what was the old way of detecting exit nodes over DNS?

March 16, 2020

I use Tails almost exclusively to surf the web, which means I use the latest Tor Browser. During the past few weeks I noticed (thanks to OnionCircuits) that almost all my Tor circuits (trying to reach popular news sites and NGO websites) were going through exit nodes belonging to a single family. Naturally I was alarmed and tried to report this as an obvious problem, but my attempts were unsuccessful.

I am happy to say that with the introduction of Tails 4.4 I immediately noticed that this problem appears (so far) to have been banished.

No idea what the issue was or how/why/who/what was done to fix it (or how it fixed itself), but I just wanted to try again to make sure that Tor devs keep an eye on large families grabbing too many circuits, even and perhaps especially in cases which may represent individual targeting.

Could I also suggest that TP consider

  • using a SecureDrop portal so that Tor users can report potential security problems based upon observations such as the one described above (which I think obviously represents a very serious problem in the Tor network, hopefully at least temporarily fixed),
  • alternatively, using Whisperback as Tails Project does,
  • having a dev post in this blog a description of how users can report issues, with detailed and up-to-date links, bearing in mind that many users will want to report anonymously,
  • assigning someone to liaise closely with Debian Project to keep the Debian repo onions healthy (I guess the servers might be stressed so perhaps a special purpose funding drive?)
  • assigning someone to contact cran.r-project.org to seek collaboration in setting up an onion mirror for CRAN (R, the open-source statistical platform, is essential for data journalism, COVID-19 response, and much more)

TIA!

P.S. Preview didn't work, so I hope the HTML tags are error-free.