NoScript Temporarily Disabled in Tor Browser

Due to a mistake in Mozilla's signing infrastructure, NoScript and all other Firefox extensions signed by Mozilla have been disabled in Tor Browser. Because the higher security levels rely on NoScript, they are currently broken for Tor Browser users.

Mozilla is working on a fix, and we'll start building a new Tor Browser version as soon as their fix is available.

Meanwhile, anyone who is dependent on the security provided by the higher security levels can apply the following workaround:

  1. Open the address about:config in the Tor Browser address bar
  2. At the top of the page, search for xpinstall.signatures.required
  3. Set the xpinstall.signatures.required entry to false by double-clicking it

Note: This workaround should only be used temporarily, as it disables a security feature. Please remember to set the xpinstall.signatures.required entry back to true once the Tor Browser security update is applied.
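For those who prefer editing files over clicking through about:config, the same preference can be persisted via a user.js file in the Tor Browser profile directory (the pref name is the one from the workaround above; the profile location varies by platform and install):

```js
// user.js — read at browser startup; values here override prefs.js.
// TEMPORARY ONLY: revert to true once the fixed Tor Browser ships.
user_pref("xpinstall.signatures.required", false);
```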

Sorry for the inconvenience.

>The suggested workaround is an UNEQUIVOCALLY BAD IDEA. WTF, disable signature checks? Never, never, never!

The bad idea is mandatory signature verification for add-ons. If you can't install add-ons without Mozilla's permission, it's not your browser. Mozilla's add-on signatures give a false sense of security.

Signed does NOT mean verified by Mozilla.

Mozilla today (August 16, 2018) removed 23 signed Firefox add-ons that snooped on users and sent data to remote servers, a Mozilla engineer told Bleeping Computer.

The list of blocked add-ons includes "Web Security," a security-centric Firefox add-on with over 220,000 users, which was at the center of a controversy this week after it was caught sending users' browsing histories to a server located in Germany.

"I did the investigation voluntarily last weekend after spotting Raymond Hill's (gorhill) comment on Reddit, https://www.reddit.com/r/firefox/comments/96715s/make_your_firefox_brow… ," Wu told us. "I audited the source code of the extension, using tools including my extension source viewer."

"After getting a good view of the extension's functionality, I used webextaware to retrieve all publicly available Firefox add-ons from addons.mozilla.org (AMO) and looked for similar patterns. Through this method, I found twenty add-ons that I subjected to an additional review, which can be put in two evenly sized groups based on their characteristics.

"The first group is similar to the Web Security add-on. At installation time, a request is sent to a remote server to fetch the URL of another server. Whenever a user navigates to a different location, the URL of the tab is sent to this remote server. This is not just a fire-and-forget request; responses in a specific format can activate remote code execution (RCE) functionality," Wu said. "Fortunately, the extension authors made an implementation mistake in 7 out of 10 extensions (including Web Security), which prevents RCE from working."

https://www.bleepingcomputer.com/news/security/mozilla-removes-23-firef…

I consider signature checks to be a security vulnerability in themselves if they disable security features like this. I don't see how they help anyway; being signed by Mozilla means almost nothing in the context of Tor Browser.

> What exactly is the risk by setting it to false?

If you forget to also disable autoupdates of addons, potentially a malicious attacker might be able to trick your browser into installing an (unsigned!) piece of malware masquerading as a legitimate update. Or if you forget and install an (unsigned!) add-on, you will have... installed unverified software, which again might be malware. It's hard to guess how likely these scenarios really are, but the fact that they must be taken seriously because a certificate expired is really shocking and outrageous.

As I understand it, this emergency temporary mitigation is not a *fix* and it involves a security tradeoff.

o An intermediate cert needed to verify NoScript autoupdates expired owing to a goof at Mozilla.

o Ensuring that NoScript is working correctly is critical to Tor Browser.

o So you should disable signature verification until Mozilla fixes the cert.

o When that happens, Tor Project will release an emergency new version of TB.

o Be careful to avoid installing any extension or allowing any autoupdates until the new TB is released. I think in most cases you should get a dialog box asking if you want to install something, and of course you should say "No", because that something will not have been authenticated.

> If you forget to also disable autoupdates of addons, potentially a malicious attacker might be able to trick your browser into installing an (unsigned!) piece of malware masquerading as a legitimate update.

The bundled add-ons (HTTPS Everywhere and NoScript) update from mozilla.org via the preferences extensions.update.background.url and extensions.update.url in about:config[1]. So, if you trust neither Mozilla's review process for add-ons nor the developers of your add-ons, then you may want to temporarily disable automatic updates of add-ons. It's a greater concern if you installed add-ons after installing Tor Browser. Mozilla is probably being extra cautious about reviewing add-ons, or not reviewing at all, until most people install the patch.

How to disable automatic updates of add-ons:
Add-ons tab -> gear icon -> uncheck Update Add-ons Automatically. Or set extensions.update.autoUpdateDefault to false in about:config. This prevents HTTPS Everywhere and NoScript from receiving security updates, so remember to re-enable it (or set it back to true) after you install the patch to be released for this bug.
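The about:config route can likewise be persisted in a user.js file in the profile directory, using the pref named above:

```js
// user.js — disable automatic add-on updates.
// TEMPORARY ONLY: set back to true once the patched release is installed,
// or HTTPS Everywhere and NoScript will stop receiving security updates.
user_pref("extensions.update.autoUpdateDefault", false);
```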

We actually get HTTPS-Everywhere from the EFF, as we did not want to use the Mozilla version. That way, if Mozilla messes something up with their extensions, we'd at least have HTTPS-Everywhere working, as in this case.

People must understand that Signed != Verified

It's exactly the same bullshit as signed programs in M$ Windows. It takes control from you and gives it to a corporation (Mozilla Corp in this case), while giving you a false sense of security.

You have a point. But verifying a sig is much better than nothing, and TP always needs to bear in mind the risk of overwhelming newcomers with esoteric concerns.

It's a challenge, but we need to grow the Tor user base by leaps and bounds.

I did this in regular Firefox. The setting is there, but the add-ons in that browser remained disabled, and even after removing them I couldn't redownload and install them with the workaround. I guess the same will be true for NoScript: now that it's disabled, the Firefox part of Tor Browser will somehow know it was disabled as unverified and prevent re-enabling it. Can't reinstall. Can't re-enable. What's the point?

Supposedly using Chrome means getting fewer CAPTCHA challenges, so maybe try changing the user agent. Unless it's some other behind-the-scenes data Chrome is telling Google about you to convince reCAPTCHA you're real. Which I wouldn't rule out.

Don't forget that Google is not just a company or an executive suite but also a large workforce of highly skilled employees, many of whom rebelled against Dragonfly (Censorbrowser) and Project Maven (the killbot death listing AI project for the Pentagon). Unfortunately those employees have already experienced retaliation. So we should direct our ire at the executive suite, not necessarily at the people who work at Project Zero (for example).

Mateus

May 04, 2019

Permalink

Is openly publishing exit nodes still a good thing? They become too easy to block; Tor is being attacked more and more heavily these days.

I think it's unavoidable. Exits go to the clearnet, so the clearnet can always see whether traffic from an IP looks like a proxy. If they weren't published, third-party monitors would detect and list them as high-traffic proxies anyway.
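And indeed, the published exit list makes this kind of check trivial for anyone. A toy sketch (the sample addresses below are fictional; in practice a site would periodically fetch the current bulk exit list from check.torproject.org):

```python
# Toy sketch: deciding whether a connecting IP is a known Tor exit.
# A real deployment would download the published exit list on a schedule
# instead of hard-coding these illustrative RFC 5737 test addresses.

def load_exit_list(lines):
    """Parse one-IP-per-line exit list text into a set for O(1) lookups."""
    return {line.strip() for line in lines
            if line.strip() and not line.startswith("#")}

exit_list = load_exit_list([
    "# sample exit list (fictional addresses)",
    "203.0.113.7",
    "198.51.100.42",
])

def is_tor_exit(ip, exits):
    """Membership test against the published list."""
    return ip in exits

print(is_tor_exit("203.0.113.7", exit_list))  # listed exit -> True
print(is_tor_exit("192.0.2.1", exit_list))    # not listed  -> False
```

This is exactly why keeping exit IPs secret is hopeless: the lookup is a one-line set membership test once the list is public.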

> Is openly publishing exit nodes still a good thing?

I don't think there is any way of keeping the IPs of exit nodes secret. It is hard to even keep the IPs of bridges (unpublished "stealth" entry nodes needed for anti-censorship) secret.

Mateus

May 04, 2019

Permalink

This is crazy dangerous and could have put people's lives at risk. Why can add-ons be disabled remotely anyway? This was not some kind of update where it stopped working after the user applied an update; it just happened all by itself in the background. WTF! Tor devs, you should not trust Mozilla this much, leaving this open channel; this proves why.

True, this was a serious blunder. But from what I understand, this happened because a certificate *expired*. If so, it wasn't disabled remotely; it was disabled because a certain predetermined time had elapsed. It wasn't a deliberate action from Mozilla or anyone else, and it wasn't something that a malicious actor could have triggered if they had wanted to.

I hope that Tor Browser devs (and Mozilla too) will learn from this and make the system more robust in the future. Tor Browser should always trust the extensions that are bundled with it, and that trust shouldn't be time-dependent. Ideally, it should also "fail closed" so that if NoScript is unavailable for any reason, the browser should default to javascript.enabled=false.
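A fail-closed policy could look something like the following sketch. The function and the shape of the prefs dictionary are hypothetical illustrations, not actual Tor Browser internals; only the pref name javascript.enabled comes from about:config:

```python
# Hypothetical fail-closed policy: if NoScript cannot be confirmed as
# active, disable JavaScript globally rather than silently running with
# no script blocking at all.

def effective_prefs(noscript_active, prefs):
    """Return a copy of the prefs adjusted for a fail-closed policy."""
    adjusted = dict(prefs)
    if not noscript_active:
        # Fail closed: no NoScript means no JavaScript.
        adjusted["javascript.enabled"] = False
    return adjusted

print(effective_prefs(True,  {"javascript.enabled": True}))
print(effective_prefs(False, {"javascript.enabled": True}))
```

The point of the design is that the dangerous state (scripts enabled, NoScript dead) is never reachable by accident; an expired certificate would then degrade usability, not security.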

> It wasn't a deliberate action from Mozilla or anyone else, and it wasn't something that a malicious actor could have triggered if they had wanted to.

Expiration can be deliberate, and actors can disrupt attempts to extend the time. Think of its relation to revocation too. But there aren't indications at this point that this particular situation was deliberate.

Nothing is expired; they have a timer that checks signatures every 24 hours, like in corporate gaming consoles. Look for yourself here:

app.update.lastUpdateTime.addon-background-update-timer
app.update.lastUpdateTime.recipe-client-addon-run
services.blocklist.addons.checked

> Why can addons be disabled remotely anyway?

To verify the cryptographic signatures of code before installing an add-on, we need to verify the certs in a chain. In this case, one of those certs expired because Mozilla goofed. That silently disabled NoScript, putting us all at risk. Outrageous? Yes. Incredible? Unfortunately not, if you have followed decades of criticism of the many weaknesses of current PKI.
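The failure mode is easy to see in a stripped-down sketch: a chain is only as valid as its shortest-lived link, so one expired intermediate takes down everything signed beneath it. The names and dates below are made up for illustration:

```python
from datetime import datetime

# Minimal sketch of validity checking in a certificate chain: every
# link must be inside its own notBefore/notAfter window at the time of
# verification, or the whole chain fails.

def cert(name, not_before, not_after):
    return {"name": name, "not_before": not_before, "not_after": not_after}

def chain_valid(chain, now):
    """A chain is valid only if *every* certificate in it is unexpired."""
    return all(c["not_before"] <= now <= c["not_after"] for c in chain)

chain = [
    cert("root",          datetime(2010, 1, 1), datetime(2030, 1, 1)),
    cert("intermediate",  datetime(2017, 5, 4), datetime(2019, 5, 4)),  # oops
    cert("addon-signing", datetime(2018, 1, 1), datetime(2020, 1, 1)),
]

print(chain_valid(chain, datetime(2019, 5, 3)))  # day before expiry: True
print(chain_valid(chain, datetime(2019, 5, 5)))  # day after expiry: False
```

Note that the leaf signing cert is still well within its window; the one forgotten intermediate is enough to invalidate every add-on signature under it overnight.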

Security is hard. Very hard. This incident reminds us that human error remains as much a threat as malicious attacks exploiting some unrecognized technical flaw in software incorporated into the Tor ecosystem.

Another perspective:
The sig files on the Tor Browser download webpage are a different type of cryptographic signature, which we use to verify Tor Browser before installing it. They are created from Tor Project's cryptographic keys, which are also capable of having expiration dates. If those keys expired and we tried to verify the Tor Browser installer, we would either be unable to verify it or be presented with a warning message from our verifier program, i.e., gpg. But we wouldn't be prevented from installing it regardless, or suddenly find it was uninstalled.

Good point. But I would never install unsigned code. Which unfortunately means I cannot help test alpha versions of some good stuff like upcoming Tails because (incredibly) these are unsigned.

Mateus

May 04, 2019

Permalink

There is more to this, and a real solution will be for Tor to decouple from Mozilla as much as possible. But really we need a new browser decoupled from all the states/govts, and if Tor doesn't change, it becomes suspect of govt. chicanery.

In an ideal world, clearly yes, Tor Browser should be independently developed with security in mind from the ground up, and regularly audited. That would take resources far beyond what TP will be able to muster in the foreseeable future.

I think the only long term solution, which satisfies among other desiderata the principle that "if you want it done right, do it yourself", is to evolve Tor Project from a tiny NGO dependent upon the "largesse" of untrustworthy governments and corporations to a user supported human rights NGO with a stupendous endowment enabling it to decline firmly offers of "help" :-p from the likes of Google, Amazon, Facebook, US State Dpt, DARPA, etc. That will take hard work, dedication, and a long-term commitment from community organizers, as well as a "no-strings" multimillion dollar gift to the Tor Foundation from some repentant tech billionaire.

An ideal world isn't required, I'd say. Look at git. One guy started it (yes, a high-visibility guy) and the ball started rolling. Why? Because it was clear to thoughtful people that it was the right thing to do. There are many thoughtful people who can see quite clearly that the current web trajectory is bad. We really have no choice but to create a new browser. What it's based on? Don't know.
Git is based on sound crypto science and the motivation of source freedom. And freedom from hindrances that alternatives had/have, technical/legal/etc.
So the right seed gets planted and right minded people will feel motivated to contribute. Otherwise it becomes yet another shitshow of which there are so many now (in all aspects of life).

Mateus

May 04, 2019

Permalink

After setting it to false, restart Tor Browser and then change it back to true so you won't forget to later on. As long as you don't restart again, the add-ons still work fine for me, though there is a warning about NoScript not being signed.

There is also another way to do this on Firefox, though not Tor Browser: Tools -> Options -> Privacy & Security, go to Browser Data Collection and allow Firefox to install and run studies. You can undo it right afterwards and it will still work.

Mateus

May 04, 2019

Permalink

@Calbillie What browser would you use as the base of Tor Browser? Better yet, would you build an entirely new one? Because that is an insane amount of work that will take a long time. This doesn't even get into all the security checks that would need to be run by the community for quite a while before it could even be offered for download.

Then there would be porting the add-ons over to the new browser, troubleshooting that until each one works, making sure no security risks are added, etc.

Chromium would be the obvious choice. It's a much more secure and stable base than Firefox, with a lot more resources behind it. Not sure how easy it would be to adapt it to the Tor Project's needs though and to keep porting these changes to newer Chromium versions.

Something would have to be done about the integration.
Ungoogled-Chromium is not really a different/separate browser, it is just vanilla-Chromium gutted/patched to the extreme & updated per-release of vanilla-Chromium.
Other than being updated slightly faster, I fail to see how vanilla-Chromium is better in any way than Ungoogled-Chromium.

The problem with Chromium is that in effect you have to trust Google, a company which is reorganizing as part of the US military-surveillance complex (cf. their AI kill-list for US drone strikes, to mention just the most notorious example). Or maybe even the CN military-surveillance complex (cf. censorbrowser).

Yes, Chromium is open source, but is it really adequately audited on a continuing basis by reliable (non Google) coders? If not, Tor Project cannot possibly take on that job.

I have to agree with those who say that the only thing we can do is to urge Mozilla to try harder to avoid such dangerous (and embarrassing) goofs in future.

Another advantage of Mozilla is that Tails is based on Debian which uses Firefox as the default browser. So there is a huge user base which has a stake in making Firefox safer.

Mozilla needs to comply with government requests just as Google does. Firefox also has Google integration, as well as its own trackers. You don't have any privacy advantage if you compare stock Firefox to stock Chromium. All you have is a much weaker security model on Firefox.

Years ago I remember the precursor to Project Zero freaking out because they discovered a new and very dangerous APT attributed to CN military. In those days Google did not publicize USG state-sponsored malware, but reacted strongly to CN state-sponsored malware. They had cause to regret trusting USG when they read the Snowden leaks and saw that infamous smiley.

These days it appears possible, even likely, that Google is betting it can make more money by kowtowing to the CN dragnet surveillance machine than to the US dragnet surveillance machine. If so, in future you may see Google publicizing NSA APT malware but saying nothing about CN APTs.

Something to think about when you think about the meaning of the phrase "government requests".

But Firefox is not safer. In fact, it doesn't even begin to compete with Chromium. There also isn't a "huge user base" in Tails. The user base of Tor and Tails is so tiny, it would barely even show up on regular website analytics. Just because Firefox's code is out there does not mean even one person just goes and audits it in their free time and keeps doing so for updated code. It's just not what happens. It's a fallacy to believe that just because something is open source, there will be people who audit its code. Google has a much better track record than Mozilla, when it comes to security. Security is an essential piece to even begin working on privacy. Firefox doesn't offer it.

If you want something audited, you can pay a team of professionals to do so, and then keep doing so as the code is being changed. Nobody does it for free, for obvious reasons.

> The user base of Tor and Tails is so tiny, it would barely even show up on regular website analytics.

And we need to change that. Spread the word! Teach your friends to use Tor and Tails!

Chromium might be a bad choice for what the Tor Project wants to accomplish, but Firefox is a horrible foundation. It's just objectively much worse in terms of security.

And besides all those reasons, Chromium's main developer is a company whose business model is built on ads and therefore on "sharing" user data with other parties, and any decision they make will be a compromise between real security and their bottom line, which is essentially built on breaking privacy.