The value of Tor and anonymous contributions to Wikipedia

by antonela | June 25, 2020

Tor users are conscientious about the tools they pick to do what they do online. Discussions of controversial topics often require a different level of privacy depending on a user's threat model. An activist in the Middle East can offer a different perspective on an article about politics in their own country than a collaborator in northern Europe, and they deserve to add their voices to the conversation safely.
 
There are many reasons a person might want to be anonymous when they write, edit, or share information. But some web services, including Wikipedia, ban (or have banned) Tor users from participating, effectively banning anonymous contributors.
 
According to a recently published research paper co-authored by researchers from Drexel University, NYU, and the University of Washington, Tor users make high-quality contributions to Wikipedia. And when they are blocked, as doctoral candidate and lead author Chau Tran describes, "the collateral damage in the form of unrealized valuable contributions from anonymity seekers is invisible." The paper's authors are Chau Tran (NYU), Kaylea Champion (UW & CDSC), Andrea Forte (Drexel), Benjamin Mako Hill (UW & CDSC), and Rachel Greenstadt (NYU). It was published at the 2020 IEEE Symposium on Security & Privacy, held May 18-20.
 
By examining more than 11,000 Wikipedia edits made by Tor users who were able to bypass Wikipedia's Tor ban between 2007 and 2018, the research team found that Tor users' edits were of similar quality to those of IP editors (non-logged-in users identified by their IP addresses) and first-time editors. The paper notes that, on average, Tor users contributed higher-quality changes to articles than non-logged-in IP editors.
 
The study also finds that Tor-based editors are more likely than other users to focus on topics that may be considered controversial, such as politics, technology, and religion.
 
Related research suggests that Tor users are quite similar to other internet users and that they frequently visit websites in the Alexa top one million.
 
The new study's findings make clear that anonymous users raise the bar in community discussions and that anonymity is valuable for avoiding self-censorship. Anonymity and privacy can help protect users from consequences that might otherwise prevent them from interacting with the Wikipedia community.
 
Wikipedia has tried to block users coming from the Tor network since 2007, alleging vandalism, spam, and abuse. This research tells a different story: people use Tor to make meaningful contributions to Wikipedia, and Tor may allow some users to add their voices to conversations in which they could not otherwise safely participate.
 
Freedom on the internet is diminishing globally, and surveillance and censorship are on the rise. Now is the time to finally allow private users to safely participate in building collective knowledge for all humanity.
 

Comments

Please note that the comment area below has been archived.

June 26, 2020


But what is the point of anonymous contributions, when those contributions are removed or held up in moderation for an unlimited time?

The above is what is happening now at the Tor Blog. Users are able to contribute here anonymously as well, but those contributions are very often suppressed, deleted, or simply ignored and never published. Days, often weeks, go by before a contribution is 'approved', if it ever is.

It's good to focus on the value of Tor and its impact on Wikipedia contributions; that's an important discussion. But I think Tor needs to get its own house in order before it passes judgment on someone else's.

I share your frustration with the less-than-satisfactory moderation process in this blog, but we should all bear in mind that the Tor Project has a small (and shrinking) staff who must grapple with a daunting (and growing) set of challenges. This means that TP has to prioritize, which is presumably how user-feedback concerns get pushed onto the back burner. Clearly that is self-defeating in the long run, but it seems the problem can only be solved if users collectively contribute enough money for TP to hire enough people to cope with a problem (moderation) that has defeated much bigger and richer entities (such as Facebook).

June 26, 2020


I strongly believe that media drives society, and in the importance of progressive moral and civic values. I use Tor.

But my comments are being rejected, with reasons like:
"Your computer or network may be sending automated queries. To protect our users, we can't process your request right now. For more details visit our help page[https://developers.google.com/recaptcha/docs/faq#my-computer-or-network… ]

Can that be avoided? Thank you for your attention, and please be safe.

Guessing this is for Google-operated websites, e.g. YouTube. You can click the "New Circuit for this Site" button when you're on the site and that message should go away eventually.

Word of advice: this is something I've noticed on YouTube. Sometimes you'll be redirected to a captcha page on www.google.com instead of www.youtube.com, and if you click the "New Circuit for this Site" button, it'll make a new circuit for www.google.com and not www.youtube.com, so any requests to www.youtube.com will still be redirected to www.google.com. If that happens and you have no other www.youtube.com tabs open, all you can do is restart Tor (e.g. pkill -HUP tor on Mac and Linux) or use the "New Identity" button in the browser.
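If you'd rather not kill the tor process, a gentler option is to ask the running daemon for fresh circuits over its control port. Here's a minimal sketch using the Tor Project's stem library; it assumes a control port on 9051 (the default for a system tor), whereas Tor Browser's bundled tor usually listens on 9151 and manages its own authentication:

    # Sketch: request fresh circuits instead of restarting tor
    from stem import Signal
    from stem.control import Controller

    with Controller.from_port(port=9051) as controller:
        controller.authenticate()         # cookie auth, or pass password="..."
        controller.signal(Signal.NEWNYM)  # mark current circuits dirty so new
                                          # connections get fresh circuits

Keep in mind that tor rate-limits NEWNYM requests (roughly one every ten seconds), and the "New Identity" button does essentially this plus clearing browser state, so inside Tor Browser the button is still the simplest route.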

Good luck.

June 26, 2020


TP urgently needs to address the existential threat to Tor and Tails posed by the LAEDA bill just introduced by a trio of notably brutal senators (Graham, Cotton, and Blackburn). Salon.com has published an excellent analysis which quotes people like Brian Krebs (the security journalist behind Krebs on Security) who all agree that the bill would be a disaster. See the link to the PDF with the text of the bill, which Tor users can download (we are of course blocked from the Senate website), and note that the definitions of the "services" and "operating systems" which would be required to break their own encryption absolutely include both Tor and Tails. If any version of this bill became law, Tails would become illegal in the US and the Tor Project would have to either move overseas or shut down. IMO it is impossible to regard this bill as anything other than an existential threat to Tor Browser, Tails, OnionShare, SecureDrop, Signal, and Telegram.

So, PLEASE, Isabela, address the LAEDA threat in a post in this blog, and PLEASE reach out to Washington State legislators because Tor Project is among their constituents.

I am perhaps uniquely qualified to speak to the WP issue, and if the moderators permit, perhaps I will do so. This topic is so complex and fraught with so many dangers that I would prefer to share my thoughts with Isabela over strong encryption before deciding what, if anything, to post here.

June 26, 2020


I think it'd be interesting to have a system where Tor users who want to edit Wikipedia have their first edits require admin approval. I think this would deter at least some of the purported abuse. After a few edits, those users would be allowed to edit freely.
These initial edits should contribute a substantial amount of content, or else vandals will simply fix a few typos and get admin approval.
That being said, the Tor network is one of many ways one could abuse Wikipedia editing. Tor gets the spotlight because it's popular and requires minimal technical knowledge.

I agree that Wikipedia needs a more lenient policy on anonymous proxies and this system seems like a good idea, but I wonder how much work it'd create for the admins who'd have to approve every edit.

June 27, 2020


Tor access to Wikipedia would be very nice! Why do I have to log in to the wiki just to fix a simple typo?

June 27, 2020


[Moderator: the first part of this submission may appear tangential to the issue raised in antonela's post, but if you keep reading you will find the second part is "on-topic"!]

Wikipedia is an (unconventional) encyclopedia, but the Wikipedia user community is a (virtually) Utopian society. Conventional wisdom says that Utopian societies always fracture and die out after a few years, but Wikipedia has defied convention by surviving and even, to some extent, thriving. Even more astonishing, Wikipedia has come to be generally regarded as a resource so essential to anyone seeking information that it seems unthinkable that it might someday simply vanish.

(But it might. All earthly things, including all websites, are vulnerable to extinction during the current mass extinction event.)

One aspect common to many Utopian societies--- including Wikipedia--- is that horrified authoritarian onlookers stridently denounce their "anarchic" nature. Or, not very consistently, denounce them as a "cult". Wikipedia has been called both those things, but in truth, Wikipedia is neither chaotic nor a cult.

In fact it is only to be expected that any novel society will initially lack a clearly defined form, but will quickly begin to organize itself, with "cliques" and "social rules" developing, possibly in the absence of any conventional type of "leadership". In the early stages, some charismatic personality may play the role of "benign dictator", a term which has been applied to Jimmy Wales (but only in jest, at least on the part of informed commentators).

Ignorant onlookers are prone to regard this decentralized, unplanned, and often messy process of self-organization as a sign that "this cannot last", but the process is better seen as a kind of natural selection in action: some cliques persist while others die out, some rules persist while others are discarded, some leaders enjoy the respect of the community over a long period, while others come to be generally reviled. And it is true that in human society as in nature, most "experiments" lead to extinction. But Wikipedia is one of the few which seems not only to have proven viable, but which is even becoming, dare we say it, dominant.

Could it be that the corporations by which we have been too long plundered and the governments by which we have been too long oppressed are doomed to die out, and that our heirs will live in a society which looks much less like Main Street or Wall Street or Silicon Valley, and much more like Wikipedia?

Just in the past five years, we who have the misfortune--- or opportunity?--- of living in turbulent times have experienced the early stages of what will likely prove to be the most rapid and profound social, economic, political, and environmental evolution in human history. The nearest approach in the past few millennia might be the first century AD and the years roughly 1775-1795. (For the latter, see the already classic book by Jay Winik, The Great Upheaval). But there seems to be a growing consensus that even the profoundly radical nature of those times of change will not compare to our own.

Of these changes, the one which perhaps most personally affects everyone reading this blog is the fact that human societies (all over the globe) are rapidly transforming from societies in which the "default condition" is that adults are officially "employed" to societies in which very few people are employed in any sense which would be easily understood by a medieval serf or a twentieth century American. This change is so utterly confounding that even the best and brightest intellectuals, much less political leaders, have been unable to grapple with the implications, let alone make reliable predictions about what kind of society will replace the ones in which all previous generations of humans have existed.

From this perspective, one of the most striking aspects of Wikipedia is that "wikipedians" are not only willing but eager to contribute volunteer labor, in pursuance of the idealistic goal of preserving from harm and incrementally improving the encyclopedia, and moreover are not only willing but eager to work as hard as or harder than they ever would if they were reaping the kind of inordinate financial rewards which are heaped upon the CEOs of multinational corporations. Which would seem to be consistent with the increasingly common view that capitalism is fundamentally inhuman. Actual people, it seems, do not want to work for "money" (whatever that might be) but for something equally intangible but (it seems) far more natural and far more "humane", something which we might hypothesize can be called something like "trying to make the world a better place". Optimists who reject the current general trend towards degradation, dissolution, destruction, and despair may perhaps regard Wikipedia as an intriguing example of what kind of society might replace pre-twenty-first-century societies (from hunter-gatherer economies to capitalistic economies).

For these and other reasons, the social evolution (to date) of Wikipedia holds extraordinary interest.

One aspect of the early evolution of Wikipedia which is most directly relevant to antonela's post is this: fairly early, one of the most pressing problems confronted by the Wikipedia community was that "wikivandals" often made hard-to-notice but unhelpful edits (such as mischievously inserting a word which changed a factually true sentence into a factually false one), or even easy-to-spot and easily-reversed but nonetheless annoying deletions of paragraphs or even entire articles. This led to a nonhuman ecosystem of "wikibots" which monitor the vast wikipedia.org website for such changes and automatically "revert" them.

A second aspect of the early evolution of Wikipedia which I think is relevant to understanding the interaction of the Wikipedia and Tor user communities was the appearance of fairly well defined subpopulations ("cliques" might be too strong a word), which could be roughly placed according to a multidimensional system of antipodalities such as these:

o serious <-> whimsical
o conventional <-> unconventional
o established <-> provocative
o scientific <-> anti-scientific
o truthful <-> fictional
o inclusive <-> divisive
o complex <-> simple

For example, some users "specialized" in producing exhaustive lists (often but not always lists of "trivia items"), and a social norm emerged according to which such "articles" were given titles of the form "List of [noun phrase]". Some users specialized in producing articles explaining technical subjects too new or too specialized to have found a place in any conventionally printed encyclopedia. These topics often included major technical developments, often computer or science related, and the authors were often genuine subject matter experts. Because these articles offered up to date information on very new and newsworthy developments which were often quite important to many people, but concerning which accurate information was hard to find elsewhere, Wikipedia gradually became recognized as a critical resource, not only by journalists but even by jurists struggling to write a credible summary of some technical issue which had come before their court. And in jury trials, panelists are increasingly unlikely to heed stern warnings from the bench not to consult Wikipedia from the sacred confines of the jury room.

And so Wikipedia became impressively influential. Which is to say, Wikipedia became important. Suddenly, controversies involving Wikipedia became seen as consequential. Suddenly, well-informed citizens all over the world began to recognize that Wikipedia is one of a handful of websites which--- love it or loathe it--- quite simply *matter*.

On the dark side, Wikipedia's open format was quickly exploited by previously marginalized groups ranging from far right hate groups to earnest but self-deluded people desperately anxious to promote some "conspiracy theory" or pseudoscientific conception. But the majority of Wikipedians wanted to ensure that the encyclopedia was better described as "a good source of up to date and generally reliable information" than as "a cesspool of disinformation, misinformation, and hate speech", and they did not take kindly to "wikivandalism" or to more determined attempts to hijack the encyclopedia to serve some unpopular and unregarded special interest.

The earliest attempts to "keep Wikipedia clean" involved the formation of an extensive and increasingly convoluted ruleset, leading to the emergence of a new specialty: wikilawyering. Working within this system soon proved to be far too exhausting for most "content producers", but computer scientists saved the day by contributing increasingly sophisticated "bots" which took some of the pressure off those wikipedians who wanted the encyclopedia to remain mostly reliable and mostly free of hate speech.

Another early development was that articles on large and medium-sized companies and corporations appeared, and the executive suites began to take notice of the implications. This led to a new business: "consultants" who boasted (mostly without any real factual basis) of being "influencers" within the Wikipedia community and who sold their services as "wikishills" who would supposedly "ensure" that the public image of their corporate clients, as expressed in Wikipedia, would be favorable to shareholders.

Politicians were initially a bit slower to pay close attention to Wikipedia articles on the topic of themselves--- called "wikibiographies" in wikispeak--- and early attempts to suppress often "damaging but true" information about some unpopular or controversial political figure led to such gaffes as "anonymous IP edits" of the wikibiography of some US Congressman, which technically adept wikipedians were quick to expose as originating in the IP range assigned to the US Congress. Since this type of "crying foul" quickly became too tedious for human oversight, it too was taken over by specialized wikibots.

At more or less the same time, there emerged a class of "academics" who quite shamelessly promoted themselves as "leading intellectuals" [sic], an often laughably excessive view held by essentially no one else. This kind of activity was easy for humans to spot but proved much harder for wikibots to combat, because the self-promoting small-potato academics (and later, more or less meritless "media personalities") often proved more ingenious at "gaming" the Wikipedia system than traditional political operatives.

In another notable development (which the Wikipedia "leadership" is understandably not eager to discuss), mysterious actors began to exploit the website in ways which seemingly had little if anything to do with the goal of providing information (true or false, misleading or not) for mass consumption. Mysterious edits in very obscure locations in wikipedia.org (seemingly "obscure by design") began to appear, resulting in yet another new specialty: wikicryptanalyst. Some of these turned out to be arrangements for personal assignations in countries where unusual (or even "mainstream") sexual practices are strictly proscribed. (See the classic book by David Kahn, The Codebreakers, for historical precedents which attracted the attention of previous generations of amateur and professional cryptanalysts.) Other well-hidden edits seemed to have some kind of criminal nexus, which led to some rather alarming requests for information from some truly scary sources.

Given this, it should not be surprising that well-funded intelligence agencies with global ambitions (not all of them American) were quick to try to exploit Wikipedia in various ways, such as by encouraging operatives to worm their way into the upper levels of the developing hierarchy of Wikipedia "influencers", with the immediate goal of tracking what was happening behind the scenes of this extraordinarily important virtual society, and in the longer term, with a view toward systematically "slanting" content in ways favored by certain governments. Hidden from the view of the public (and from almost all wikipedians), "out of band" discussions using increasingly sophisticated cryptoprecautions began to occur with increasing frequency. One development which to some extent benefited from "spooky" advice was that Wikipedia hired cybersecurity professionals who greatly improved the cybersecurity of the website. Less happily, genuinely influential wikipedians began to find themselves subject to increasingly sophisticated state-sponsored "targeted" cyberattacks. But perhaps the most striking aspect of this generally unwanted, dangerous, and still growing spook infestation is that Wikipedia appears to have become the unique example of a utopian society which became so important to "the mainstream" that (seemingly) every intelligence agency in the world with even the most modest pretensions to global omniscience feels it needs an "on-site presence". In contrast, the Pinkertons apparently never saw any need to infiltrate New Hope or the Amish community.

We can see here parallels with the history of the Tor community. We too have found that as we become more consequential in the world at large, we become subject to ever more determined targeted attacks by an ever more diverse set of corporate and state-sponsored adversaries. Which are the very same adversaries which are targeting the Wikipedia community. And one activity which all our adversaries energetically pursue is the systematic attempt to promote division, not only inside targeted societies, but between them. Thus, one of the things with which both Wikipedia editors and Tor developers must contend is the systematic attempt by sophisticated and well-funded adversaries to prevent us from recognizing and building upon common ground.

Two books which may aid in understanding how our most dangerous adversaries operate are:

o Mike German: Disrupt, Discredit, and Divide. New Press, 2019.
o Tim Weiner: Legacy of Ashes. Anchor Books, 2008.

These books cover USG efforts to influence, disrupt, and corrupt everything everywhere, but of course the USG is not the only government which has us--- Wikipedia and Tor--- in its gunsights.

But if our communities can somehow survive these increasingly vigorous--- and increasingly desperate--- attempts at indiscriminate destruction, then in the longer term everything which our adversaries believe is "vital" for "national security" (whatever a "nation" and its "security" might be) or "economic survival" (whatever that means in an age when information is increasingly more "tangible" than traditional notions of "money") or "geopolitical power" (whatever that might mean in a world in which misinformation appears to be proving more powerful than missiles)--- all these things which our befuddled enemies think so essential--- may vanish. That would surely be a truly happy and liberating development, of incalculable benefit to all peoples everywhere.

But that is the future, which is unknowable, until it happens. We live in the present. Which is always dangerous but which has perhaps never been more dangerous than in the polyterroristic times in which we live.

We are all now engaged in an epic struggle to shape the future. The world to come will be--- if our adversaries prevail--- unbearably awful. Unfit for civilization. Incompatible with even the most basic notions of freedom. Incompatible not only with the pursuit of happiness, but far worse, incompatible with any reasonable notion of a life worth living. The world to come may very well not contain humans at all.

But there is a chance that if we continue to fight all those forces which oppose all things bright and beautiful, we might yet prevail. The world to come just might turn out to be more of a utopia than a dystopia.

Let us work for it, fight for it, even die for it. Because a better world would be well worth any sacrifice.

We need it, we deserve it, we can have it--- if we are willing to fight for it--- and we must do everything we can to ensure that our survivors actually get to live in it.

June 28, 2020


Wikipedia has this lame policy on open proxies where you're allowed to use them only if you need to (i.e. Wikipedia is blocked) and you already have an account in good standing.

The first condition sucks because you should be able to edit Wikipedia anonymously. The second condition sucks because, as is said here:

> However, it may be difficult to establish good standing and remain completely anonymous, as the former requires editing without using Tor.

Any alternative policy would be better.

June 29, 2020


So that's what the researchers found... but has anyone from Wikipedia actually acknowledged or commented on these findings? Is there any realistic expectation that their policies, or their "Tor is bad, m'kay?" attitude, will change anytime soon?

Also,
> By examining ... edits made by Tor users able to bypass Wikipedia's Tor ban
I'm trying to be optimistic, but I can't help but think that the time and effort required to bypass the ban serves as an important barrier to abuse. If Wikipedia just opened up its doors to Tor, I have a feeling we'd see a lot more drive-by vandalism from people just wanting to blow off steam or get off to it in the bushes, after their real IP has been banned. (However, I'm not saying they shouldn't at least try it and see what happens.)

Wikipedia has missed out on some quality edits from me over the years, simply because I didn't feel like taking the time to bypass their stupid ban just to upload a picture or add a reference or something simple (for which I receive nothing material in return, mind you).

Also, I'm wondering, how does Wikipedia handle edits from mobile networks, which often have very ephemeral IP addresses and sometimes carrier-grade NAT?

June 29, 2020


> These initial edits should contribute a substantial amount of content, or else vandals will simply fix a few typos and get admin approval.

That would still be an excessive barrier to participation. I use Wikipedia every day, and see typos, missing citations, bad links, etc. on almost every page. But it's been more than a decade since I've had the ability to edit. If I've got to come up with some pseudonyms, track them and their passwords, establish reputations, and throw each away before it becomes too personally traceable... what, I'm going to do all that out of the evilness of my heart?

The history of my edits would closely reflect my viewing history, which would be much too revealing about me, possibly even over short terms. So, I just leave the errors there. How motivated should I be to help out a project that expresses a deep antipathy toward my participation? (Nevermind the other bullshit like edit wars and power-tripping administrators.)