Dropbox – a complete breakdown in trust (and what you can do about it)

A while ago I wrote about data security on my academic site. I believe data security to be of huge significance in academia and beyond: protecting significant quantities of sensitive data about ourselves and others is – or should be – an important part of what we all do now.

Collaborative tools like Dropbox can be very helpful in our work with others. Dropbox appears to offer reasonably secure ways of sharing specific folders or files with different people, and provided good passwords are used, most people will assume their data is pretty secure, somewhere “up there in the cloud” (actually, very much down on earth, on Dropbox’s computers…). Of course, if you use any kind of online service these days you may expect there to be data leaks. Dropbox, for example, “lost” 68 million user details in 2012, and recently asked users to change passwords as a result. So if you changed your password, all should be well and you can carry on using Dropbox, yes…?

No. Looking into it in more detail, I see that not only does Dropbox fail to encrypt data with keys that you create before sending your files to their computers (I gather this is why Edward Snowden advised against using Dropbox); if you have their desktop version installed on your Apple Mac, you are also opening your computer to all kinds of vulnerabilities. This is because Dropbox installs itself as a permanent rootkit on your computer without telling you it is doing so. I was alerted to this a couple of days ago by a couple of tweets (1, 2) and then began to see many more (perhaps Dropbox does this on Windows and other systems too, but nobody has found out about it yet – who knows?)

Perhaps unsurprisingly, any number of searches on the Dropbox help pages failed to give more information on all this.

Even if you trust Dropbox not to take control of your computer (and I don’t see why you should, given they tricked you into giving them that possibility!), anyone who discovers or creates a vulnerability in Dropbox’s software now appears to have an open door to your computer – and if Dropbox can lose 68 million user details, why would you assume they’re particularly good at security? Anything and everything you do on your computer could be at risk. For details on this problem, I recommend the following two postings (and many of the comments are worth reading too):

  1. 28 July 2016: revealing Dropbox’s dirty little security hack
  2. 29 August 2016: discovering how Dropbox hacks your mac

Even if talk of hashes and algorithms sends you to sleep, the key thing to note is Dropbox’s “explanation” for their actions; it is also highlighted on the second of these two links. Dropbox claim they:

need to request all the permissions we need or even may need in the future.

The problem is, they never asked their users whether they could have all these permissions, now and in perpetuity. Instead, Dropbox appear to have tricked users into granting this access with misleading dialogue boxes, making it look as if users were giving their permission for something else.

In my book, this is an unforgivable breach of trust. I find myself asking why I should trust Dropbox with anything, if they deceive me into giving them control over my system?

What to do?

I’d suggest uninstalling Dropbox as soon as possible, and if you must still use it (for sharing with colleagues, for example), then just do so via the web interface. It is very simple to remove it from your Mac:

  1. move any files you want to take off Dropbox to somewhere on your computer
  2. follow the instructions on the Dropbox website to uninstall their desktop interface
  3. if your level of trust in Dropbox is the same as mine after reading all this, you might also want to remove permission for links to your Dropbox data, and then also delete the Dropbox apps on your tablet/mobile.
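Step 1 of the list above can be sketched in a few lines of Python. This is only a sketch: `evacuate_dropbox` is a hypothetical helper name, and you should adjust the source and destination paths to match your own setup before running anything like it.

```python
import shutil
from pathlib import Path

def evacuate_dropbox(dropbox_dir: str, dest_dir: str) -> list:
    """Move everything out of the Dropbox folder into a local folder,
    returning the names of the items that were moved."""
    src = Path(dropbox_dir).expanduser()
    dest = Path(dest_dir).expanduser()
    dest.mkdir(parents=True, exist_ok=True)
    moved = []
    for item in sorted(src.iterdir()):
        # shutil.move handles both files and whole sub-folders
        shutil.move(str(item), str(dest / item.name))
        moved.append(item.name)
    return moved

# e.g. evacuate_dropbox("~/Dropbox", "~/Documents/ex-dropbox")
```

Once the folder is empty, follow Dropbox’s own uninstall instructions for the desktop client (step 2).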

Then you might want to start looking for secure alternatives to Dropbox.


The government wants to deprive you of secure encryption – but why does this matter?

In early October I wrote a blog posting on my academic blog about the need for strong encryption for academics; when tweeting this posting, I noted that it actually applied to all areas of life.

Now, according to an article in the Daily Telegraph, our dreadful Conservative government at Westminster wants to prevent anyone in the UK from having proper encryption. That is not how they describe it, but it is very clearly what they intend: they want to ensure that nobody can use encryption services that the government cannot access. In other words, the government is going to ban you from securely encrypting your data – because if your data can be read by anyone other than the intended recipient (another person or yourself), then it is not securely encrypted.

You may think this does not matter to you, because you have no deep dark secrets that you communicate on Facetime or iMessage (the Apple examples given in the Telegraph).  I would dispute that view (see below), but even if that is not a concern to you, bear in mind that all your financial transactions are put at risk by idiocy such as this: if secure encryption is no longer allowed in this country, it is certain that not only will the government be able to access your data as it is transmitted, but before long others will be able to do so as well – others with aims at least as nefarious as those of the government.  Think about the implications of this next time you enter your credit card details to buy a book online, or transfer money using your online banking, or give someone (your employer?) your bank details so they can give you money.

The Telegraph article says:

Companies such as Apple, Google and others will no longer be able to offer encryption so advanced that even they cannot decipher it when asked to, the Daily Telegraph can disclose.

Measures in the Investigatory Powers Bill will place in law a requirement on tech firms and service providers to be able to provide unencrypted communications to the police or spy agencies if requested through a warrant.

But data is either securely encrypted or it is not.  Semi-secured encryption is an oxymoron.  To be sure, there are different levels of encryption (see the note below), but that is not the same as allowing a backdoor that can bypass the encryption altogether – that’s simply insecure.
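A toy sketch makes the point concrete. The `xor_cipher` below is deliberately NOT real cryptography – it is a stand-in for any cipher – but it shows that a mandated backdoor is simply a copy of the key, and whoever holds that copy (the government, or anyone who steals it from them) can read everything the intended recipient can:

```python
def xor_cipher(data: bytes, key: bytes) -> bytes:
    # Toy XOR "cipher" for illustration only - NOT secure encryption.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

key = b"secret-key"
escrow_copy = key                 # a backdoor is just a copy of the key
ciphertext = xor_cipher(b"my bank details", key)

# The intended recipient can decrypt...
assert xor_cipher(ciphertext, key) == b"my bank details"
# ...but so can anyone holding the escrowed copy:
assert xor_cipher(ciphertext, escrow_copy) == b"my bank details"
```

There is no way to build a key that only works for "good" parties; once the copy exists, the encryption is insecure against everyone who obtains it.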

If the government is allowed to pursue this madness, no communications in the UK can possibly be trusted, nor will it be possible to trust devices or software bought here.  The economic and computing argument is lucidly laid out in this blog posting.

More generally, as the leaks from Edward Snowden and others have shown, governments are absolutely not to be trusted with personal data. I believe we have a duty as members of the public not to be a totally predictable or transparent population. As soon as we are transparent in our lives – whether this be our shopping patterns or our political views – we are tearing at the fabric of democracy, and governments of all kinds will appreciate us doing that work for them! As responsible individuals, we must use encryption for ourselves in order to maintain the appropriate balance between our personal lives and the public good.

The government seeks to make the personal public by removing our right to privacy, whilst at the same time taking what is public away from us as persons. This proposed removal of the right to secure encryption, coupled with the current attacks on civil liberties – whether cuts to legal aid, restrictions on trade union activity, or the ever-diminishing space for public protest – is a profoundly worrying situation. Responsible citizens should use secure encryption, and we should resist the government’s attempts to take that away from us. The Electronic Frontier Foundation has a very good guide to all kinds of security issues on its Surveillance Self-Defense pages – I highly recommend going through these and seeing what might apply to you.

I also strongly encourage you to write to your MP about this issue; you can find out who this is by clicking here.


Note: Regarding the different levels of encryption: this is about the ease with which an attacker could break the keys, perhaps by reverse-engineering or brute-force guessing of the password/passphrase. Different levels of activity warrant different key lengths: clicking on the padlock in my browser bar shows that the WordPress site I am typing this text on uses 128-bit keys, which is fine for a simple blog:

Screenshot of Firefox’s Security tab in the Page Info screen – click to see a larger version

My online banking uses 256-bit keys, which is better; my GPG encryption uses 4096-bit keys which, as far as anyone knows, means it is practically unbreakable at the present time. (Strictly speaking, the 128- and 256-bit figures are symmetric cipher keys, while GPG’s 4096-bit keys are asymmetric RSA keys, so the numbers are not directly comparable – but in each case, longer keys of the same type are harder to break.) The only way to read something I have encrypted with GPG is by asking (or torturing…!) me or the recipient to provide our respective passphrases. Of course, even GPG (which is based on Phil Zimmermann’s Pretty Good Privacy, or PGP) is not going to be secure forever, but at the moment it is the best there is for email (even Edward Snowden uses it), particularly because the code is open source, meaning that countless experts have studied and reviewed it and found it to be secure.
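The arithmetic behind “practically unbreakable” is easy to check for symmetric keys. The guess rate below (10^12 keys per second) is an arbitrary assumption standing in for a very well-resourced attacker, and `years_to_search` is a hypothetical helper:

```python
GUESSES_PER_SECOND = 1e12          # assumed attacker speed: 10^12 keys/s
SECONDS_PER_YEAR = 60 * 60 * 24 * 365

def years_to_search(bits: int) -> float:
    # Expected brute-force time: on average, half the 2**bits
    # keyspace must be tried before the key is found.
    return (2 ** (bits - 1)) / GUESSES_PER_SECOND / SECONDS_PER_YEAR

for bits in (56, 128, 256):
    print(bits, "bits:", years_to_search(bits), "years")
```

A 56-bit key (old DES) falls in well under a day at this rate, while a 128-bit key already needs on the order of 10^18 years – which is why each extra bit, doubling the search space, matters so much.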

The EFF link above makes useful suggestions for other forms of communication, including text messaging (eg Signal/TextSecure), chat (eg ChatSecure), and more.