Mission Improbable: Hardening Android for Security And Privacy

Updates: See the Changes section for a list of changes since initial posting.

After a long wait, the Tor project is happy to announce a refresh of our Tor-enabled Android phone prototype.

This prototype is meant to show a possible direction for Tor on mobile. While I use it myself for my personal communications, it has some rough edges, and installation and update will require familiarity with Linux.

The prototype is also meant to show that it is still possible to replace and modify your mobile phone's operating system while retaining verified boot security - though only just barely. The Android ecosystem is moving very fast, and in this rapid development, we are concerned that the freedom of users to use, study, share, and improve the operating system software on their phones is being threatened. If we lose these freedoms on mobile, we may never get them back. This is especially troubling as mobile access to the Internet becomes the primary form of Internet usage worldwide.

Quick Recap

We are trying to demonstrate that it is possible to build a phone that respects user choice and freedom, vastly reduces vulnerability surface, and sets a direction for the ecosystem with respect to how to meet the needs of high-security users. Obviously this is a large task. Just as with our earlier prototype, we are relying on suggestions and support from the wider community.

Help from the Community

When we released our first prototype, the Android community exceeded our wildest expectations with respect to their excitement and contributions. The comments on our initial blog post were filled with helpful suggestions.

Soon after that post went up, Cédric Jeanneret took my Droidwall scripts and adapted them into the very nice OrWall, which is exactly how we think a Tor-enabled phone should work in general. Users should have full control over what information applications can access on their phones, including Internet access, and have control over how that Internet access happens. OrWall provides the networking component of this access control. It allows the user to choose which apps route through Tor, which route through non-Tor, and which can't access the Internet at all. It also has an option to let a specific Voice over IP app, like Signal, bypass Tor for the UDP voice data channel, while still sending call setup information over Tor.
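
Under the hood, this kind of per-app control comes down to iptables rules keyed on each app's Linux UID. The following is a minimal sketch of the general transparent-proxying pattern such a firewall uses, not OrWall's actual rule set; it assumes Orbot's TransPort listens on 9040 and its DNSPort on 5400, and the app UID is made up.

    # Illustrative sketch only -- not OrWall's actual rules.
    APP_UID=10123      # hypothetical UID of the app to route through Tor
    TRANS_PORT=9040    # assumed Orbot transparent proxy port
    DNS_PORT=5400      # assumed Orbot DNSPort

    # Rewrite the app's DNS and TCP traffic to Orbot's transparent proxy ports.
    iptables -t nat -A OUTPUT -p udp --dport 53 -m owner --uid-owner $APP_UID \
        -j REDIRECT --to-ports $DNS_PORT
    iptables -t nat -A OUTPUT -p tcp --syn -m owner --uid-owner $APP_UID \
        -j REDIRECT --to-ports $TRANS_PORT

    # Accept the redirected traffic; reject anything else this UID tries to send.
    iptables -A OUTPUT -p tcp --dport $TRANS_PORT -m owner --uid-owner $APP_UID -j ACCEPT
    iptables -A OUTPUT -p udp --dport $DNS_PORT -m owner --uid-owner $APP_UID -j ACCEPT
    iptables -A OUTPUT -m owner --uid-owner $APP_UID -j REJECT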

At around the time that our blog post went up, the Copperhead project began producing hardened builds of Android. The hardening features make it more difficult to exploit Android vulnerabilities, and also provide WiFi MAC address randomization, so that it is no longer trivial to track devices using that identifier.

Copperhead is also the only Android ROM that supports verified boot, which prevents exploits from modifying the boot, system, recovery, and vendor device partitions. Copperhead has also extended this protection by preventing system applications from being overridden by Google Play Store apps, or from writing bytecode to writable partitions (where it could be modified and infected). This makes Copperhead an excellent choice for our base system.

The Copperhead Tor Phone Prototype

Upon the foundation of Copperhead, Orbot, OrWall, F-Droid, and other community contributions, we have built an installation process that provisions a new Copperhead phone with Orbot, OrWall, SuperUser, Google Play, and MyAppList, preloaded with a list of recommended apps from F-Droid.

We require SuperUser and OrWall instead of using the VPN APIs because the Android VPN APIs are still not as reliable as a firewall in terms of preventing leaks. Without a firewall-based solution, the VPN can leak at boot, or if Orbot is killed or crashes. Additionally, DNS leaks outside of Tor still occur with the VPN APIs on some systems.
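
The key property of the firewall approach is that it fails closed: if Orbot is not yet running, or has crashed, nothing escapes. A rough sketch of that posture, with a hypothetical Tor UID (OrWall's boot-time init script is the real implementation):

    # Default-deny all outbound traffic, so a crashed or not-yet-started Orbot
    # cannot leak anything onto the clearnet.
    iptables -P OUTPUT DROP
    iptables -A OUTPUT -o lo -j ACCEPT            # loopback stays open
    TOR_UID=1014                                  # hypothetical UID of the Orbot/Tor process
    iptables -A OUTPUT -m owner --uid-owner $TOR_UID -j ACCEPT   # only Tor itself reaches the network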

We provide Google Play primarily because Signal still requires it, but also because some users will probably want apps from the Play Store. You do not need a Google account to use Signal, but without one you will need to download the Signal Android package and sideload it manually (via adb install).

The need to install these components to the system partition means that we must re-sign the Copperhead image and updates if we want to retain the system integrity guarantees of Verified Boot.

Thankfully, the Nexus devices supported by Copperhead allow the use of user-generated signing keys. The installation process simply takes a Copperhead image, installs our additional apps, and signs it with the new keys.
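
The installation scripts automate this, but for the curious, the flow rests on standard AOSP signing tools, roughly along these lines (key names, certificate subject, and file names here are illustrative, and the verity key needs an extra conversion step that the scripts handle):

    # Generate a fresh set of signing keys (make_key ships with AOSP).
    SUBJECT='/C=US/ST=CA/L=SomeCity/O=Example/OU=Example/CN=Example/emailAddress=user@example.com'
    mkdir -p keys
    for key in releasekey platform shared media verity; do
        development/tools/make_key keys/$key "$SUBJECT"   # produces keys/$key.pk8 and keys/$key.x509.pem
    done

    # Re-sign the target-files package with the new keys so the resulting image
    # still passes Verified Boot, but under keys that you control.
    build/tools/releasetools/sign_target_files_apks \
        -o -d keys copperhead-target_files.zip signed-target_files.zip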

Systemic Threats to Software Freedom

Unfortunately, not only is Copperhead the only Android rebuild that supports Verified Boot, but the Google Nexus/Pixel hardware is the only Android hardware that allows the user to install their own keys, retaining both the ability to modify the device and the filesystem security provided by Verified Boot.

This, combined with Google's increasing hostility towards Android as a fully Open Source platform, as well as the difficulty for external entities to keep up with Android's surprise releases and opaque development process, means that the freedoms of end users to use, study, share, and improve the Android system are all in great jeopardy.

All of this means that the Android platform is effectively moving to a "look but don't touch" Shared Source model, much like the one Microsoft tried in the early 2000s. However, instead of being explicit about this, Google appears to be doing it surreptitiously. It is a deeply disturbing trend.

It is unfortunate that Google seems to see locking down Android as the only solution to the fragmentation and resulting insecurity of the Android platform. We believe that more transparent development and release processes, along with deals for longer device firmware support from SoC vendors, would go a long way to ensuring that it is easier for good OEM players to stay up to date. Simply moving more components to Google Play, even though it will keep those components up to date, does not solve the systemic problem that there are still no OEM incentives to update the base system. Users of old AOSP base systems will always be vulnerable to library, daemon, and operating system issues. Simply giving them slightly more up to date apps is a bandaid that both reduces freedom and does not solve the root security problems. Moreover, as more components and apps are moved to closed source versions, Google is reducing its ability to resist the demand that backdoors be introduced. It is much harder to backdoor an open source component (especially with reproducible builds and binary transparency) than a closed source one.

If Google Play is to be used as a source of leverage to solve this problem, a far better approach would be to use it as a pressure point to mandate that OEMs keep their base system updated. If they fail to do so, their users will begin to lose Google Play functionality, with proper warning that notifies them that their vendor is not honoring their support agreement. In a more extreme version, the Android SDK itself could have compiled code that degrades app functionality or disables apps entirely when the base system becomes outdated.

Another option would be to change the license of AOSP itself to require that any parties that distribute binaries of the base system must provide updates to all devices for some minimum period of time. That would create a legal avenue for class-action lawsuits or other legal action against OEMs that make "fire and forget" devices that leave their users vulnerable, and endanger the Internet itself.

While extreme, both of these options would be preferable to completely giving up on free and open computing for the future of the Internet. Google should be competing on overall Google account integration experience, security, app selection, and media store features. They should use their competitive position to encourage/enforce good OEM behavior, not to create barriers and bandaids that end up enabling yet more fragmentation due to out of date (and insecure) devices.

It is for this reason that we believe that projects like Copperhead are incredibly important to support. Once we lose these freedoms on mobile, we may never get them back. It is especially troubling to imagine a future where mobile access to the Internet is the primary form of Internet usage, and for that usage, all users are forced to choose between having either security or freedom.

Hardware Choice

The hardware for this prototype is the Google Nexus 6P. While we would prefer to support lower-end models for lower-income users, only the Nexus and Pixel lines support Verified Boot with user-controlled keys. We are not aware of any other models that allow this, but we would love to hear of any that do.

In theory, installation should work for any of the devices supported by Copperhead, but updating the device will require the addition of an updater-script and an adaptation of the releasetools.py for that device, to convert the radio and bootloader images to the OTA update format.

If you are not allergic to buying hardware online, we highly recommend ordering the device from the Copperhead store. The devices are shipped with tamper-evident security tape, for what it's worth. Otherwise, if you're lucky, you might still be able to find a 6P at your local electronics retail store. Please consider donating to Copperhead anyway. The project is doing everything right, and could use your support.

Hopefully, we can add support for the newer Pixel devices as soon as AOSP (and Copperhead) supports them.

Installation

Before you dive in, remember that this is a prototype, and you will need to be familiar with Linux.

With the proper prerequisites, installation should be as simple as checking out the Mission Improbable git repository, and downloading a Copperhead factory image for your device.

The run_all.sh script should walk you through a series of steps, printing out instructions for unlocking the phone and flashing the system. Please read the instructions in the repository for full installation details.
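
In other words, the happy path looks roughly like the following, assuming the repository location below and using a Nexus 6P ("angler") build name as an example; the README is authoritative for the exact arguments.

    git clone https://github.com/mikeperry-tor/mission-improbable.git
    cd mission-improbable
    # Download the matching Copperhead factory image for your device into this
    # directory, then let the script unpack, modify, re-sign, and flash it:
    ./run_all.sh angler-nmf26q    # build name is illustrative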

The very first device boot after installation will take a while, so be patient. During this boot, you should note the fingerprint of your key on the yellow boot splash screen. That fingerprint is what authenticates the use of your key and the rest of the boot process.

Once the system is booted, after you have given Google Play Services the Location and Storage permissions (as per the instructions printed by the script), make sure you set the Date and Time accurately, or Orbot will not be able to connect to the Tor Network.

Then, you can start Orbot, and allow F-Droid, Download Manager, the Copperhead updater, Google Play Services (if you want to use Signal), and any other apps you want to access the network.

NOTE: To keep Orbot up to date, you will have to go into F-Droid's Repositories settings and enable the Guardian Project Official Releases repository.

Installation: F-Droid apps

Once you have networking and F-Droid working, you can use MyAppList to install apps from F-Droid. Our installation provides a list of useful apps for MyAppList. The app will let you select the subset you want, and will install them in succession by invoking F-Droid. Start this process by clicking on the upward arrow at the bottom right of the screen.

Alternately, you can add links to additional F-Droid packages in the apk url list prior to running the installation, and they will be downloaded and installed during run_all.sh.

NOTE: Do not update OrWall past 1.1.0 via F-Droid until issue 121 is fixed, or networking will break.

Installation: Signal

Signal is one of the most useful communications applications to have on your phone. Unfortunately, despite being open source itself, Signal is not included in F-Droid, for historical reasons. As near as we can tell, most of the issues behind that decision have since been resolved. Now that Signal is reproducible, we see no reason why it can't be included in some F-Droid repo, if not the official F-Droid repo, so long as it is the same Signal with the same key. It is unfortunate to see so much disagreement over this point, though. Even if Signal won't meet the criteria for the official F-Droid repo (or wherever that tire fire of a flamewar stands right now), we wish that it could at least meet the criteria for an alternate "non-free" repo, much like the Debian project provides. Nothing prevents redistribution of the official Signal apk.

For now, if you do not wish to use a Google account with Google Play, it is possible to download the Signal apks from one of the apk mirror sites (such as APK4fun, apkdot.com, or apkplz.com). To ensure that you have the official Signal apk, perform the following:

  1. Download the apk.
  2. Unzip the apk with unzip org.thoughtcrime.securesms.apk
  3. Verify that the signing key is the official key with keytool -printcert -file META-INF/CERT.RSA
  4. You should see a line with SHA256: 29:F3:4E:5F:27:F2:11:B4:24:BC:5B:F9:D6:71:62:C0 EA:FB:A2:DA:35:AF:35:C1:64:16:FC:44:62:76:BA:26
  5. Make sure that fingerprint matches (the space was added for formatting).
  6. Verify that the contents of that APK are properly signed by that cert with: jarsigner -verify org.thoughtcrime.securesms.apk. You should see jar verified printed out.

Then, you can install the Signal APK via adb with adb install org.thoughtcrime.securesms.apk. You can verify you're up to date with the version in the app store with ApkTrack.
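
For convenience, the verification steps above condense to roughly the following; the fingerprint to compare against is the one shown in step 4.

    unzip -o org.thoughtcrime.securesms.apk META-INF/CERT.RSA   # extract just the signing certificate
    keytool -printcert -file META-INF/CERT.RSA | grep SHA256    # compare against the fingerprint in step 4
    jarsigner -verify org.thoughtcrime.securesms.apk            # should print "jar verified."
    adb install org.thoughtcrime.securesms.apk                  # sideload onto the phone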

For voice calls to work, select Signal as the SIP application in OrWall, and allow SIP access.


Updates

Because Verified Boot ensures filesystem integrity at the device block level, and because we modify the root and system filesystems, normal over-the-air updates will not work. The fact that we use different device keys will prevent the official updates from installing at all, but even if they did install, they would remove Google Play, SuperUser, and the OrWall initial firewall script.

When the phone notifies you of an update, you should instead download the latest Copperhead factory image to the mission-improbable working directory, and use update.sh to convert it into a signed update zip that will get sideloaded and installed by the recovery. You need to have the same keys from the installation in the keys subdirectory.

The update.sh script should walk you through this process.
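
A rough sketch of one update cycle, with illustrative script arguments and file names (update.sh prints the authoritative instructions):

    cd mission-improbable
    # Download the new Copperhead factory image here, keeping the same keys/
    # directory used at install time, then build a signed update zip from it:
    ./update.sh angler-nmf26q
    # Reboot into recovery, choose "Apply update from ADB", and sideload the result:
    adb reboot recovery
    adb sideload <signed-update>.zip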

Updates may also reset the system clock, which must be accurate for Orbot to connect to the Tor network. If this happens, you may need to reset the clock manually under Date and Time Settings.

Usage

I use this prototype for all of my personal communications - Email, Signal, XMPP+OTR, Mumble, offline maps and directions in OSMAnd, taking pictures, and reading news and books. I use Intent Intercept to avoid accidentally clicking on links, and to avoid surprising cross-app launching behavior.

For Internet access, I personally use a secondary phone that acts as a router for this phone while it is in airplane mode. That phone has an app store and I use it for less trusted, non-private applications, and for emergency situations should a bug with the device prevent it from functioning properly. However, it is also possible to use a cheap wifi cell router, or simply use the actual cell capabilities on the phone itself. In that case, you may want to look into CSipSimple, and a VoIP provider, but see the Future Work section about potential snags with using SIP and Signal at the same time.

I also often use Google Voice or SIP numbers instead of the number of my actual SIM card, as a general protection measure. I give people one of these numbers instead of the phone number of my actual cell device, to prevent remote baseband exploits and other location-tracking attacks from being trivial to pull off from a distance. This is a trade-off, though, as you are trusting the VoIP provider with your voice data, and on top of this, many of them do not support encryption for call signaling or voice data, and fewer still support SMS.

For situations where using the cell network at all is either undesirable or impossible (perhaps because it is disabled due to civil unrest), the mesh network messaging app Rumble shows a lot of promise. It supports both public and encrypted groups in a Twitter-like interface run over either a wifi or bluetooth ad-hoc mesh network. It could use some attention.

Future Work

As with the last post on this topic, this prototype obviously has a lot of unfinished pieces and unpolished corners. We've made a lot of progress as a community on many of the future work items from that last post, but many still remain.

Future work: More Device Support

As mentioned above, installation should work on all devices that Copperhead supports out of the box. However, updates require the addition of an updater-script and an adaptation of the releasetools.py for that device, to convert the radio and bootloader images to the OTA update format.
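
Concretely, the conversion is done by AOSP's OTA tooling, which calls into a device-specific releasetools.py for anything beyond the standard partitions. A hedged sketch of the invocation, using standard AOSP flags and an illustrative Nexus 6P device path:

    # Build a signed OTA zip; -s points at the device-specific releasetools
    # extension that knows how to package the radio and bootloader images.
    build/tools/releasetools/ota_from_target_files \
        -k keys/releasekey \
        -s device/huawei/angler/releasetools.py \
        signed-target_files.zip signed-ota-update.zip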

Future Work: MicroG support

Instead of Google Play Services, it might be nice to provide the open source MicroG replacements. This requires some hackery to spoof the Google Play Services signature field, though. Unfortunately, that method creates a permission that any app can request in order to spoof signatures for any service. We'd be much happier about this if we could find a way for MicroG to be the only app able to spoof signatures, and only for the Google services it replaces. This may be as simple as hardcoding those app IDs in an updated version of one of these patches.

Future Work: Netfilter API (or better VPN APIs)

Back in the WhisperCore days, Moxie wrote a Netfilter module using libiptc that enabled apps to edit iptables rules if they had permission to do so. This would eliminate the need for iptables shell callouts in OrWall, would be more stable and less leaky than the current VPN APIs, and would eliminate the need to have root access on the device (which is additional vulnerability surface). That API needs to be dusted off and updated for Copperhead compatibility, and then OrWall would need to be updated to use it, if present.

Alternatively, the VPN API could be used, if there were ways to prevent leaks at boot, DNS leaks, and leaks if the app is killed or crashes. We'd also want the ability to control specific app network access, and allow bypass of UDP for VoIP apps.

Future Work: Fewer Binary Blobs

There are unfortunately quite a few binary blobs extracted from the Copperhead build tree in the repository. They are enumerated in the README. This was done for expedience, as building some of those components outside of the Android build tree is fairly difficult. We would happily accept patches for this, or for replacement tools.

Future Work: F-Droid auto-updates, crash reporting, and install count analytics

These requests come from Moxie. Having these would make him much happier about F-Droid Signal installs.

It turns out that F-Droid supports full auto-updates with the Privileged Extension, which Copperhead is working on including.

Future Work: Build Reproducibility

Copperhead itself is not yet built reproducibly. In our opinion, this is really AOSP's responsibility; if the core team at Google won't do it, they should at least fund Copperhead or some other entity to work on it for them. Reproducible builds should be an organizational priority for all software companies. Moreover, in combination with free software, they are an excellent deterrent against backdoors.

In this brave new world, even if we can trust that the NSA won't be ordered to attack American companies to insert backdoors, deteriorating relationships with China and other state actors may mean that their incentives to hold back on such attacks will be greatly reduced. Closed source components can also benefit from reproducible builds, since compromising multiple build systems/build teams is inherently harder than compromising just one.

Future Work: Orbot Stability

Unfortunately, the stability of Orbot itself still leaves a lot to be desired. It is fairly fragile in the face of network disconnects, and often becomes stuck in states that require you to go into the Android Settings for Apps and Force Stop Orbot before it can reconnect properly. The startup UI is also sensitive to network connectivity problems.

Worse: if you tap the start button either too hard or multiple times while the network is disconnected or while the phone's clock is out of sync, Orbot can become confused and claim that it is connected when it is not. Luckily, because Tor network access is enforced by OrWall (and the Android kernel), instabilities in Orbot do not risk Tor leaks.

Future Work: Backups and Remote Wipe

Unfortunately, backups are an unsolved problem. In theory, adb backup -all should work, but even the latest adb version from the official Android SDK appears to back up and restore only partial data. Apparently this is because adb obeys manifest restrictions on apps that request not to be backed up. For the purposes of full device backup, it would be nice to have an adb version that really backed up everything.

Instead, I use the export feature of K-9 Mail, Contacts, and the Calendar Import-Export app to export that data to /sdcard, and then adb pull /sdcard. It would be nice to have an end-to-end encrypted remote backup app, though. Flock had promise, but was unfortunately discontinued.
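
For reference, the two approaches look like this (paths and file names are illustrative):

    # The in-theory full backup, incomplete in practice as noted above:
    adb backup -all -apk -shared -f phone-backup.ab
    adb restore phone-backup.ab     # restore it later

    # The manual route: export from each app to /sdcard, then pull everything off:
    adb pull /sdcard ./sdcard-backup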

Similarly, if a phone is lost, it would be nice to have a cryptographically secure remote wipe feature.

Future Work: Baseband Analysis (and Isolation)

Until phones with auditable baseband isolation are available (the Neo900 looks like a promising candidate), the baseband remains a problem on all of these phones. It is unknown if vulnerabilities or backdoors in the baseband can turn on the mic, make silent calls, or access device memory. Using a portable hotspot or secondary insecure phone is one option for now, but it is still unknown if the baseband is fully disabled in airplane mode. In the previous post, commenters recommended wiping the baseband, but on most phones, this seems to also disable GPS.

It would be useful to audit whether airplane mode fully disables the baseband using OpenBTS, OsmocomBB, or a custom hardware monitoring device.

Future Work: Wifi AP Scanning Prevention

Copperhead may randomize the MAC address, but it is quite likely that it still tries to connect to configured APs, even if they are not there (see these two XDA threads). This can reveal information about your home and work networks, and any other networks you have configured.

There is a Wifi Privacy Police app in F-Droid, and Smarter WiFi may be another option, but we have not yet had time to audit or test either. Any reports would be useful here.

Future Work: Port Tor Browser to Android

The Guardian Project is undertaking a port of Tor Browser to Android as part of their OrFox project. This port is still incomplete, however. The Tor Project is working on obtaining funding to bring it on par with the desktop Tor Browser.

Future Work: Better SIP Support

Right now, it is difficult to use two or more SIP clients in OrWall. You basically have to switch between them in the settings, which is also fragile and error prone. It would be ideal if OrWall allowed multiple SIP apps to be selected.

Additionally, SIP providers and SIP clients have very poor support for TLS and SRTP encryption for call setup and voice data. I could find only two such providers that advertised this support, but I was unable to actually get TLS and SRTP working with CSipSimple or LinPhone for either of them.

Future Work: Installation and full OTA updates without Linux

In order for this to become a real end-user phone, we need to remove the requirement to use Linux in order to install and update it. Unfortunately, this is tricky. Technically, Google Play can't be distributed in a full Android firmware, so we'd have to get special approval for that. Alternatively, we could make the default install use MicroG, as above. In either case, it should just be a matter of taking the official Copperhead builds, modifying them, changing the update URL, and shipping those devices with Google Play/MicroG and the new OTA location. Copperhead or Tor could easily support multiple device install configurations this way without needing to rebuild everything for each one. So legal issues aside, users could easily have their choice of MicroG, Google Play, or neither.

Personally, I think the demand is higher for some level of Google account integration functionality than what MicroG provides, so it would be nice to find some way to make that work. But there are solid reasons for avoiding the use of a Google account (such as Google's mistreatment of Tor users, the unavailability of Google in certain areas of the world due to censorship of Google, and the technical capability of Google Play to send targeted backdoored versions of apps to specific accounts).

Future Work: Better Boot Key Representation/Authentication

The truncated fingerprint is not the best way to present a key to the user. It is both too short for security, and too hard to read. It would be better to use something like the SSH Randomart representation, or some other visual representation that encodes a cryptographically strong version of the key fingerprint, and asks the user to click through it to boot. Though obviously, if this boot process can also be modified, this may be insufficient.

Future Work: Faster GPS Lock

The GPS on these devices is device-only by default, which can mean it is very slow. It would be useful to find out if µg UnifiedNlp can help, and which of its backends are privacy preserving enough to recommend/enable by default.

Future Work: Sensor Management/Removal

As pointed out in great detail in one of the comments below, these devices have a large number of sensors that can be used to create side channels, gather information about the environment, and send it back. The original Mission Impossible post went into quite a bit of detail about how to remove the microphone from the device. This time around, I focused on software security. But as the commenter suggested, you can still go down the hardware-modding rabbit hole if you like. Just search YouTube for "Nexus 6P teardown" or similar.


Changes Since Initial Posting

Like the last post, this post will likely be updated for a while based on community feedback. Here is the list of those changes so far.

  1. Added information about secondary SIP/VoIP usage in the Usage section and the Future Work sections.
  2. Added a warning not to upgrade OrWall until Issue 121 is fixed.
  3. Describe how we could remove the Linux requirement and have OTA updates, as a Future Work item.
  4. Remind users to check their key fingerprint at installation and boot, and point out in the Future Work section that this UI could be better.
  5. Mention the Neo900 in the Future Work: Baseband Isolation section
  6. Wow, the Signal vs F-Droid issue is a stupid hot mess. Can't we all just get along and share the software? Don't make me sing the RMS song, people... I'll do it...
  7. Added a note that you need the Guardian Project F-Droid repo to update Orbot.
  8. Add a thought to the Systemic Threats to Software Freedom section about using licensing to enforce the update requirement in order to use the AOSP.
  9. Mention ApkTrack for monitoring for Signal updates, and Intent Intercept for avoiding risky clicks.
  10. Mention alternate location providers as Future Work, and that we need to pick a decent backend.
  11. Link to Conversations and some other apps in the usage section. Also add some other links here and there.
  12. Mention that Date and Time must be set correctly for Orbot to connect to the network.
  13. Added a link to Moxie's netfilter code to the Future Work section, should anyone want to try to dust it off and get it working with Orwall.
  14. Use keytool instead of sha256sum to verify the Signal key's fingerprint. The CERT.RSA file is not stable across versions.
  15. The latest Orbot 15.2.0-rc8 still has issues claiming that it is connected when it is not. This is easiest to observe if the system clock is wrong, but it can also happen on network disconnects.
  16. Add a Future Work section for sensor management/removal

Future Work: Disk Encryption via TPM or Clever Hacks

Unfortunately, even disk encryption and a secure recovery firmware are not enough to fully defend against an adversary with an extended period of physical access to your device.

Cold Boot Attacks are still very much a reality against any form of disk encryption, and the best way to eliminate them is through hardware-assisted secure key storage, such as through a TPM chip on the device itself.

It may also be possible to mitigate these attacks by placing key material in SRAM memory locations that will be overwritten as part of the ARM boot process. If these physical memory locations are stable (and for ARM systems that use the SoC SRAM to boot, they will be), rebooting the device to extract key material will always end up overwriting it. Similar ARM CPU-based encryption defenses have also been explored in the research literature.

> Torifiable mobile voice communication!

Bingo! Suddenly I understand why that fine fellow driving the Landis+Gyr surveillance vehicle was so worried about "walkie-talkies". He wasn't using a military-grade spectral analyzer costing hundreds of thousands of dollars because he is worried about walkie-talkies interfering with smart meters. He is worried that BLM supporters might start using Torified push-to-talk at the next big demonstration. And he probably isn't a Landis+Gyr employee at all, but a fed.

Whenever I see hard evidence that the Man really really does not want myself specifically to possess some specific technical tool, I always know that specific tool is just what I need to have, in time for the next big demonstration.

Let's make it so!

Anon

November 28, 2016


So many FOSS supporters here and in other forums are just treating this blog post as shit. Threats are not binary, and not everyone has as high-level a threat model as these extreme moralists.

All this talk from the rather one-sided forum posts and articles about a second processor that controls everything on phones, laptops, and desktops borders on FUD. That is more of an unknown than a certainty (due to lack of proper evidence). People who are mostly concerned about mass surveillance don't really need to care about such things.

If threat models are placed on a rather arbitrary, one-dimensional scale, I can see this phone setup being very useful for people who fall somewhere around the middle of that scale. One example of why: the combination of secure communication apps with Tor on a more secure custom ROM really does add to the security.

I always interpret irate posts criticizing Tor Project's latest innovation as a sign that someone is seriously ticked off by no longer being able to easily eavesdrop. That means Mike is on the track of Something Good!

Definitely something good!! Although I always regard privacy tools and our own precautions as mostly better than nothing, regardless of their effectiveness.

Anon

November 30, 2016


Of course, the changes to Rule 41 are irrelevant if prosecutors feel free to simply forge judges' signatures. Repeatedly, every 30 days. In order to spy on the other parties in a love triangle.

https://www.techdirt.com
Brooklyn Prosecutor Forged Judges' Signatures On Wiretap Warrants To Eavesdrop On A 'Love Interest'
from the rules-are-for-other-people dept
30 Nov 2016

>> A high-ranking prosecutor in the Brooklyn district attorney’s office was arrested this week on charges that she used an illegal wiretap to spy on a police detective and one of her colleagues in what a law-enforcement official described as a love triangle gone wrong.

Meanwhile, in distant Manhattan island, Cy Vance Jr. keeps screaming that he needs backdoors into all civilian encryption, so his own prosecutors can read everyone's mail.

Anon

November 30, 2016


So our efforts were all for naught:

http://thehill.com/policy/cybersecurity/308088-last-ditch-effort-to-pre…
Last-ditch effort to prevent changes to law enforcement hacking rule fails
Joe Uchill
30 Nov 2016

> A last-ditch effort in the Senate to prevent changes to a rule that will ease the process for law enforcement to use hacking in investigations failed Wednesday morning, allowing the controversial updates to Rule 41 to take effect at midnight.

http://arstechnica.com/security/2016/11/firefox-0day-used-against-tor-u…
Firefox 0-day in the wild is being used to attack Tor users
Publicly released exploit works reliably against a wide range of Firefox versions.
Dan Goodin
30 Nov 2016

> There's a zero-day exploit in the wild that's being used to execute malicious code on the computers of people using Tor and possibly other users of the Firefox browser, officials of the anonymity service confirmed Tuesday. Word of the previously unknown Firefox vulnerability first surfaced in this post on the official Tor website. It included several hundred lines of JavaScript and an introduction that warned: "This is an [sic] JavaScript exploit actively used against TorBrowser NOW." Tor cofounder Roger Dingledine quickly confirmed the previously unknown vulnerability and said engineers from Mozilla were in the process of developing a patch. ... "It's basically almost EXACTLY the same as the payload used [by FBI] in 2013," TheWack0lian told Ars. "It exploits some vuln that executes code very similar to that used in the 2013 Tor browser exploit. Most of the code is identical, just small parts have changed."

I second this question. Mike?

Google shows Nexus 7 2013 is out of support. No new Android versions guaranteed, no Google security updates and no official Copperhead support. Is this a stopper?

What about continuing to use the Nexus 7 with CyanogenMod - risky as well?

re: Future Work: Wifi AP Scanning Prevention

In the blog post, mikeperry asked for feedback comparing the Wifi Privacy Police app in F-Droid to the Smarter WiFi app. That second app, I'll assume, is by Kismet and called Smarter Wifi Manager in the Silent Circle store available to Blackphone 1 users (maybe also in the Silent Circle store for the Blackphone 2, I don't know). I think I read that the developer of Smarter Wifi Manager (which I'll call Smarter Wifi here) was involved early on and for some time with the creation of the Blackphone, perhaps as a founder and/or manager of some sort, but that he has since left. Don't quote me on that, but it would make sense to me, because when considering the function of the Smarter Wifi app, in my opinion it was/is dangerously inferior to the function of the Wifi Privacy Police app (though I have not used or checked either app for changes in months and months).

Wifi Privacy Police prevents connecting to a previously used Wifi access point unless the name and password of that access point are as expected. If there is no match, you are so informed and given the opportunity to create a connection. In contrast, Smarter Wifi doesn't check for a password match and instead only checks whether the name of the access point matches its expected geographical location, which seems to me quite inferior when trying to avoid an Evil Twin.

For example, along with your expected access point, to which you have previously assigned a name and password and to which you have previously successfully connected, what if there is a hidden second open access point (an Evil Twin) with the same name and a very strong signal (but no password) near your expected access point? Unless a Wifi security app compares the Wifi passwords, you will unknowingly connect to the Evil Twin.

I'm thinking that in the early days of the Blackphone, it was crucial to be able to trust the app developer, which I'm guessing is maybe why Silent Circle offered the Smarter Wifi app rather than the Wifi Privacy Police app in the Silent Store (i.e., Silent Circle's founders knew the Smarter Wifi developer, but didn't know the Wifi Privacy Police developer) (I'm guessing). I don't know either developer, and I'm not implying that the Smarter Wifi developer is malicious. I know nothing about these apps except their intended function. Wifi Privacy Police worked for me as claimed.

What is the difference between OrWall and Orbot's VPN feature?

OrWall sounds like a firewall, and the other sounds like a VPN, which are two entirely different security measures that are generally not mutually exclusive. A firewall protects your computing device from many different types of security breaches and threats, including privacy invasions, even when your device is idle. A VPN is basically a way to hide your Internet activity (i.e., it mostly helps preserve your privacy while you are on the web). Something like that. The two security measures are so different that there is no easy answer to your question, and trying to answer it might derail the topic at hand, I don't know. Did you mean with regard to security from state actors such as the NSA of the USA? Perhaps study up on the difference between firewalls and VPNs, and then if you have a more specific question about the difference, post an additional comment here?

Boycott Google Pixel phones re: leaky modem

It's time to deal with the devil, and this 2016 winter holiday is the prime time. I suggest a two-pronged approach:

1) negotiate with Google to create a smartphone without baseband vulnerabilities (and verifiably so).

I'm not sure what to suggest here, as it's not my expertise - perhaps open-source modem firmware and hardware, perhaps redesigning the smartphone so that the modem processor does not control the application processor. Perhaps it will be necessary to create a new, secure, but backward-compatible subset of communication protocols and infrastructure. Whatever is necessary - Google now has the leverage to get this done.

2) Boycott the purchase of Google Pixel smartphones internationally until the above is accomplished.

-----

It's time to play hardball. Google Android is the most popular OS worldwide. Google is now manufacturing its own smartphones (i.e., the Pixel phones). Google is now a wireless network service provider.

Google may not (yet) own the physical wireless infrastructure (e.g., cell towers) or manufacture the modem chips inside its smartphones (yet), but Google has the leverage to plug the pervasive vulnerability of smartphone modems, and we have the leverage (via a boycott) to encourage Google to do so.

Power yields nothing without a demand.

I'm not that impressed with the Pixel smartphones anyway - there doesn't seem to be much new about them. This is the season of giving. Google has just launched its first made-only-by-Google smartphones. Google's wireless network service is still an infant. Google's bottom line (i.e., profits) is now significantly dependent on its reputation. This is a relatively new development. Google is vulnerable to a boycott now like never before, and perhaps like never again.

Let's face it, all our smartphones are hacked via their modem. Can we verify that this is not so? Then in this post-Snowden world, we can be assured that it is true. This includes the Blackphone, the Tor Phone, the iPhones, feature phones, etc.

What can we do about it? Cover the cameras, sure. Remove the microphones, check. Better disconnect the gyro sensor also (it can also hear you). Airplane mode? Ha, how can we know it works? Anyway, snooped data can be recorded, saved, and transferred later when connectivity is re-enabled. Can we even verifiably disable GPS on modern smartphones? Hardware switches for mobile data, and Wifi, and Bluetooth... can we see where this is going? We are spending tons of time, energy, and limited resources patching sinking ships that we buy brand new. I'm not saying that the effort is useless. I'm saying let's be more aggressive.

Where should this be posted? It's a bit off-topic here, I think, and probably deserves its own blog post somewhere for discussion. Alternatively, copy and paste it to wherever - you have my blessing.

Re: Google Pixel smartphones boycott

Instead of competing with Apple for the highest-priced, best looking pocket-sized spying device, we need Google to not be evil.

To isolate the smartphone modem, Google may only need to switch from using Qualcomm to Mediatek, or threaten to do so, as Mediatek has previously gotten it right:

2-year-old post from Hacker News
re: Reverse engineering a Qualcomm baseband processor (PDF)
at https://news.ycombinator.com/item?id=8813098

Reply to "baseband processor is usually the master,
and the app processor is a slave"

Posted by userbinator 707 days ago:

For the Mediatek platforms I don't think this is true - the AP is the one that boots up first and loads firmware into the baseband, and at least for the MT6589/6582 the AP can enable protection so that the baseband processor(s) can't access anything outside of the configured ranges. You can look at  

https://github.com/varunchitre15/MT6589_kernel_source/blob/master/media…

which is the code that initialises the baseband modems by loading their firmware (there are two CPUs in the baseband since this is a dual-SIM SoC), and see the enable_mem_access_protection function at line 863. The table there also shows that properly set up, MD0 and MD1 can only access their respective areas and the small amount of shared memory they use to communicate with the AP.

Confirmed by kefka 708 days ago:

You are very correct. I'm also running a MT6589 on my own modded android install....My phone is a HaiPai Noble N7889...I have complete control over my phone (baseband and userspace)...

Wikipedia Google Pixel (smartphones)#History:

Sales were brisk following the initial release, and it seemed like Google might finally surpass Apple in the high-end smartphone market. But then a small group of privacy advocates at the Tor Phone Project claimed that the Pixel smartphones had a severe security vulnerability that could not easily be fixed by an over-the-air update. The problem was Google's choice of internal hardware - specifically, the wireless modem, which basically connects the phone's internal antenna to the Android operating system. That modem had a separate operating system (name unknown) which was not publicized and which controlled the Android operating system. Malware sent wirelessly to the phone could allow an attacker to control the phone remotely and secretly. The modem chip was manufactured by Qualcomm (headquartered in the USA), which also owned the proprietary operating system of the modem. Google recalled all the Pixel smartphones and replaced them with new hardware including a modem from Mediatek of Taiwan. The replacement phones (the Pixel 2 series) soon became extremely popular and were further modified by the privacy advocates at the Tor Phone Project to be even more secure. The Tor Phone set the standard for the modern mobile phone we have today.

Google is no stranger to the ongoing conflict regarding separation of church and state. I mean userspace vs. baseband software. What's the difference?

Google managed the sale of (a few million?) "Android One" smartphones to the developing world in the last two years, and Google's original reference design for that smartphone was basically the 3G Moto E (E = economy) with the privacy-friendly Taiwanese Mediatek MT6582 System-on-Chip replacing the System-on-Chip from Qualcomm of the USA.

Google wasn't going into India, Indonesia, and Pakistan with a smartphone that obviously enabled Uncle Sam to spy on their masses. That would be very bad public relations (i.e., dangerous for Google). But after the natives accepted Google's smartphones and accessory trinkets, Google had a foothold there and in mid-2015 began switching back to using a Qualcomm System-on-Chip, which is what we are stuck with here in the USA.

We aren't going to slow Google down by crying foul and forcing Google to recall its Pixel smartphones due to their modem vulnerabilities. I mean, let's make some noise about it now, but the subsequent recall of Pixel smartphones won't harm Google much. Google can absorb the cost, and Google doesn't even depend on the smartphone modem vulnerability to spy on people - Google can do that via the smartphone userspace. It's Big Oil and their three-letter USA spy agencies that wish to preserve that smartphone modem backdoor, which is how Big Oil remains competitive in these days of irreversible climate change.

Qualcomm might even welcome the chance to open-source their baseband software stacks (depending on how evil they are - I don't know). Qualcomm is probably tired of never updating their ancient, bug-ridden modem operating systems just to please Big Oil. That has to make the application of new code difficult and unreliable, no?

What I'm saying here is: have no fear. We are the professionals who are expected to speak up about this baseband vulnerability which, with modern tools and modern armies of state-sponsored hackers, is now THE most glaring hole in smartphone security. Sure, we've been saying it all along, but now we have a shining example to point to (i.e., the Google Pixel smartphones). What about the Nexus 5X and 6P? What about Apple phones? Let Google decide how they want to handle our public accusation of the Pixel smartphones.

Will we be discredited? I doubt it. Just assign a CVE number to this issue and give it the priority it truly deserves. Call a press conference to announce the vulnerability like you usually do. You're just doing your job. Ho hum. Yes, Google's first phones have a fatal flaw, but look at the Samsung Galaxy Note 7 - these things happen, it's terribly inconvenient for some, but we all benefit in the long run, blah blah, tweet tweet. Will Google need to recall their Pixel phones? It seems likely. End of story.

There is no reason that plugging this security hole should cause our phones to slow down or reduce your driving range or raise the price of stinking oil. This is a minor battle we have to fight before we can even reach the front lines of evolutionary change. We have to be able to communicate in private! We can't use our phones for that if we refuse to acknowledge the underlying technical problem staring us in the face.

I'm not Mike, but unlike the Nexus 6P (and 5X and newer Pixel phones), the 2013 Nexus 7 doesn't have a 64-bit application processor and thus doesn't have verified boot - something like that, I think - which yes, is a showstopper because of the resulting security vulnerability. However, as mentioned in the post regarding future work, there are other, remaining vulnerabilities in the Tor Phone, such as the baseband processor and its associated OS being the master of the application processor and its associated Android OS. The idea is to plug the security holes that we can, when we can, and not throw up our hands and backtrack just because there is so much more work to be done to secure a smartphone. And there is a ton more work to do.

A lot of work indeed. The thing is that we're trying to add security to something that wasn't created with security in mind. AME 2000 supposedly is very secure but isn't available for the layman.

There is a new Signal fork called Noise, which works without GCM and is an alternative to LibreSignal.

https://copperhead.co/android/docs/usage_guide#messaging

Anyway, non-GCM support for Signal seems to be on its way:

https://github.com/WhisperSystems/Signal-Android/pull/5962

but might still take a while.

Is it possible to verify the Signal apk without using the Java JRE/JDK? I just want to install it on my phone without using a Google account, and I need an alternative means of verifying the file.

>Signal

Do you really advertise a messenger which requires a user to prove his identity? I'm very disappointed.

Sounds like good news. Hopefully there's a secure OS that is supported by phone hardware that doesn't make my wallet cry.

First of all, sincere apologies if this is not the right place to ask for help, I don't know the 'rules' nor where else to go. Thank you for your understanding. Or/and for your help, or directions to another place, please...?

Hanging out in China regularly, and, equally non-private as far as I am concerned, the USA and the UK, I bought a Nexus 6P especially to install this prototype project when I read about it, but I got stuck at ./run_all.sh and can't figure out how to solve it or finish properly by myself. I use Arch GNU/Linux, but I am not a developer at all. Following the instructions on GitHub and issuing ./run_all.sh angler-nmf26q (using angler-factory-2016.12.25.22.27.39.tar.xz, which I put and extracted in ~/mission-improbable/) halts. These lines are the problematic ones, I believe, before I am thrown back to the command prompt:

./make_keys.sh: line 6: ~/mission-improbable/helper-repos/android-simg2img/generate_verity_key: No such file or directory
...
...
...
Copperhead successfully installed!
~/mission-improbable/angler-nmf26q ~/mission-improbable
~/mission-improbable
cp: cannot stat 'keys/verity_key.pub': No such file or directory

I tried editing 'generate_verity_key' in that line 6 to 'generate_verity_key.c', as that is the name of the file in helper-repos/android-simg2img/ and I renamed that file to 'generate_verity_key'. But to no avail, and that is how far my creativity and logic reach - regarding this one at least...

Thank you all very much for this wonderful project, both Tor and Copperhead. I will happily continue to donate to Tor - and to Copperhead, once I get my 6P working like Mike Perry's! :-)

If you have not thought about crowdfunding yet, maybe it's time?

CopperheadOS has a Reddit - https://reddit.com/r/copperheados