As mentioned in the comments, plaintext keys aren’t inherently bad; they’re necessary. You always need at least one plaintext key in order to use encryption at all.

87 points

This is a, “it’s turtles all the way down!” problem. An application has to be able to store its encryption keys somewhere. You can encrypt your encryption keys but then where do you store that key? Ultimately any application will need access to the plaintext key in order to function.
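The "turtles all the way down" chain can be sketched in a few lines. This is a deliberately toy illustration (XOR wrapping, not real cryptography) just to show that wrapping a data key under a key-encryption key (KEK) only moves the problem: the KEK itself must be readable in plaintext by the application.

```python
import secrets

def xor_wrap(key: bytes, kek: bytes) -> bytes:
    """Toy key wrapping via XOR -- illustration only, not real cryptography."""
    return bytes(a ^ b for a, b in zip(key, kek))

# The key that actually encrypts your data...
data_key = secrets.token_bytes(32)

# ...can itself be encrypted ("wrapped") under a key-encryption key...
kek = secrets.token_bytes(32)
wrapped = xor_wrap(data_key, kek)

# ...but unwrapping requires the KEK in plaintext, so the chain has to
# bottom out in *some* key the application can read directly.
assert xor_wrap(wrapped, kek) == data_key
```

However many layers you add, the outermost key is always sitting somewhere in the clear.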

On servers the best practice is to store the encryption keys somewhere that isn’t on the server itself, such as a networked Hardware Security Module (HSM); but literally any location that isn’t physically on/in the server is good enough. Some Raspberry Pi attached to the network in the corner of the data center would be nearly as good, because the attack you’re protecting against with this kind of encryption is someone walking out of the data center with your server (and then decrypting the data).

With a device like a phone you can’t use a networked HSM since your phone will be carried around with you everywhere. You could store your encryption keys out on the Internet somewhere but that actually increases the attack surface. As such, the encryption keys get stored on the phone itself.

Phone OSes include tools like encrypted storage locations for things like encryption keys but realistically they’re no more secure than storing the keys as plaintext in the application’s app-specific store (which is encrypted on Android by default; not sure about iOS). Only that app and the OS itself have access to that storage location so it’s basically exactly the same as the special “secure” storage features… Except easier to use and less likely to be targeted, exploited, and ultimately compromised because again, it’s a smaller attack surface.

If an attacker gets physical access to your device, you must assume they’ll have access to everything on it unless the data is encrypted and the key for that isn’t stored on the phone itself (e.g. it uses a hash generated from your thumbprint or your PIN). In that case your effective encryption key is your thumb(s) and/or PIN, and the Signal app’s keys already get that protection: they live on a filesystem the phone encrypts at rest.
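The PIN-as-key idea above can be sketched with a standard key-derivation function. This is a simplified sketch (the PIN `"4821"` is a made-up example): a real phone also mixes in a hardware-bound secret and rate-limits guesses, but the core point is that only a non-secret salt lives on disk, not the key.

```python
import hashlib
import os

def key_from_pin(pin: str, salt: bytes) -> bytes:
    """Derive an encryption key from a PIN so the key itself never
    lives on disk -- only the salt does."""
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), salt, 600_000)

salt = os.urandom(16)             # stored on the device; not secret
key = key_from_pin("4821", salt)  # hypothetical example PIN

# Same PIN + salt -> same key; a wrong PIN yields a different key.
assert key == key_from_pin("4821", salt)
assert key != key_from_pin("0000", salt)
```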

Going full circle: You can always further encrypt something or add an extra step to accessing encrypted data but that just adds inconvenience and doesn’t really buy you any more security (realistically). It’s turtles all the way down.

13 points

This reminds me of the apparent gnome-keyring security hole. It’s mentioned in the first section of the arch wiki entry: https://wiki.archlinux.org/title/GNOME/Keyring

Any application can read keyring entries of the other apps. So it’s pretty trivial to make a targeted attack on someone’s account if you can get them to run an executable on their machine.

10 points

Exactly. Discord token stealers have been around for ages, but no one gives Discord flak for leaving that secret unencrypted.

4 points

Discord aren’t marketing themselves on secure or encrypted messaging. Signal is.

13 points

Very good “Explain Like I’m Five” there!

8 points

Signal seems to be the least compromising messenger app out there, given its privacy policy and open-source code base. It’s only natural that it’s a frequent target of FUD.


Phones come with pretty easy encryption APIs that use hardware-backed key stores to do the encryption work. You can copy the entire data folder back onto the same phone after a factory reset, but the messages won’t decrypt. It’s a useful extra encryption layer that’s pretty tough to crack (as in, governments will struggle) but trivial to implement.

Desktop operating systems lack this. I believe Windows can do it through Windows Hello, but that requires user interaction (and Windows isn’t sandboxed anyway, so it doesn’t protect you much if you’re running malware at the same time). I don’t know about macOS, but I assume it’s the same story. Linux lacks support for security hardware entirely and doesn’t even try (see: the useless Keychain API copy).

What desktop operating systems do protect you from, though, is offline attacks. Someone needs to know your password to log in and grab the keys, even if they know your disk’s encryption key. Not even your Bitlocker recovery key will suffice to get your keys out of a locked Windows Hello key store. Linux can implement this as well, in theory, but nothing seems to actually implement any of it.

Leveraging modern key store mechanisms would protect Signal on macOS and Windows. On Linux you’d still be in the same shitty situation, though if they were to implement the key store API, someone could at least eventually make something secure in the future.

2 points

Don’t phones have something comparable to a TPM?

2 points

The article referenced is about their Desktop application

56 points

I kind of agree that this may be a little overblown. Exploiting this requires device and filesystem access, so if you can get the keys you can already get a lot more stuff.

12 points

A secure enclave can already be accessed by the time someone can access the Signal encryption keys, so there’s no extra security in putting the encryption keys in the secure enclave.

2 points
Deleted by creator
10 points

It’s eventually going to have to be stored in plaintext somewhere. Where are you then putting the encryption key for the encryption keys and how do you start the chain of decryption without the first key?

4 points

Sorry, I don’t think I understand what you’re suggesting. Are you saying encryption keys should themselves be encrypted?

FYI this story isn’t about plaintext passwords, it’s about plaintext encryption keys to chat history.

3 points

I deleted my reply, because I don’t know what I’m talking about.

17 points

Also not a surprise, because as the article notes it’s been known and discussed since at least 2018.

15 points

How else should the keys be stored?

8 points

There are system specific encryption methods like keychain services on iOS to store exactly this kind of sensitive information.

10 points

How would that provide additional security in the particular circumstance of someone having access to the Signal encryption keys on someone’s phone?

5 points

This particular scenario involves the MacOS desktop app, not the phone app. The link is showing just an image for me - I think it’s supposed to be to https://stackdiary.com/signal-under-fire-for-storing-encryption-keys-in-plaintext/

That said, let’s compare how it works on the phone to how it could work on MacOS and how it actually works on MacOS. In each scenario, we’ll suppose you installed an app that has hidden malware - we’ll call it X (just as a placeholder name) - and compare how much data that app has access to. Access to session data allows the app to spoof your client and send+receive messages

On the phone, your data is sandboxed. X cannot access your Signal messages or session data. ✅ Signal may also encrypt the data and store an encryption key in the database, but this wouldn’t improve security except in very specific circumstances (basically it would mean that if exploits were being used to access your data, you’d need more exploits if the key were in the keychain). Downside: On iOS at least, you also don’t have access to this data.

On MacOS, it could be implemented using sandboxed data. Then, X would not be able to access your Signal messages or spoof your session unless you explicitly allowed it to (it could request access to it and you would be shown a modal). ✅ Downside: the UX to upload attachments is worse.

It could also be implemented by storing the encryption key in the keychain instead of in plaintext on disk. Then, X would not be able to access your Signal messages and session data. It might be able to request access - I’m not sure. As a user, you can access the keychain but you have to re-authenticate. ✅ Downside: None.

It’s actually implemented by storing the encryption key in plaintext, collocated with the encrypted database file. X can access your messages and session data. ❌

Is it foolproof? No, of course not. But it’s an easy step that would probably take an hour of dev time to refactor. They’re even already storing a key, just not one that’s used for this. And this has been a known issue that they’ve refused to fix for several years. Because of their hostile behavior towards forks, the FOSS community also cannot distribute a hardened version that fixes this issue.
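The shape of that refactor can be sketched in a few lines. This is a hedged illustration, not Signal’s actual code: a plain dict stands in for the OS keychain (macOS Keychain, Windows Credential Manager, etc.), and the service name is hypothetical.

```python
import json
import secrets

# Stand-in for an OS keychain; in reality the OS gates access to this
# behind the user's login credentials. All names here are illustrative.
fake_keychain: dict[str, bytes] = {}

def store_key_insecure(path: str, db_key: bytes) -> None:
    # What the article describes: the database key in a plaintext
    # config file sitting right next to the encrypted database.
    with open(path, "w") as f:
        json.dump({"key": db_key.hex()}, f)

def store_key_in_keychain(service: str, db_key: bytes) -> None:
    # The proposed fix: hand the key to the OS keychain instead,
    # so reading the app's files alone no longer yields the key.
    fake_keychain[service] = db_key

db_key = secrets.token_bytes(32)
store_key_in_keychain("org.example.signal-desktop", db_key)

# The key is retrievable only through the keychain interface,
# not by scanning the application's data directory.
assert fake_keychain["org.example.signal-desktop"] == db_key
```

The point of the sketch: the encrypted database stays exactly where it is; only the location of the key changes, which is why the refactor is small.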

4 points

In the device’s secure enclave (e.g. TPM).

13 points

How does that help when somebody has access to the phone via your PIN or password?

4 points

If I’m not mistaken, you can save keys in these chips so that they cannot be extracted. You can only use the key to encrypt/decrypt/sign/verify by asking the chip to perform those operations with your key.
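That contract — the key never leaves the chip, only operations on it are exposed — can be modeled like this. This is purely illustrative (Python can’t truly hide memory, and a real TPM does far more), but it shows the interface shape:

```python
import hashlib
import hmac
import secrets

class ToySecureElement:
    """Models the TPM/secure-enclave contract: the key is generated
    inside the 'chip' and never exported; callers can only ask for
    operations that use it. Illustration only."""

    def __init__(self) -> None:
        self._key = secrets.token_bytes(32)  # never leaves the "chip"

    def sign(self, message: bytes) -> bytes:
        # The caller gets a signature, never the key itself.
        return hmac.new(self._key, message, hashlib.sha256).digest()

    def verify(self, message: bytes, tag: bytes) -> bool:
        return hmac.compare_digest(self.sign(message), tag)

chip = ToySecureElement()
tag = chip.sign(b"hello")
assert chip.verify(b"hello", tag)
assert not chip.verify(b"tampered", tag)
```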

12 points

After your edit, the post points to an image only, no longer to the source link. Please edit the link back in, or at least include it in the body.

5 points

Fixed it! Thank you.


Free and Open Source Software

!foss@beehaw.org
