Another article, much better, presents in more detail that Olvid was audited on an older version and was chosen because it is French and because it applied (article in French): https://www.numerama.com/tech/1575168-pourquoi-les-ministres-vont-devoir-renoncer-a-whatsapp-signal-et-telegram.html

Google Translate link to the original post: https://www-lepoint-fr.translate.goog/high-tech-internet/les-ministres-francais-invites-a-desinstaller-whatsapp-signal-et-telegram-29-11-2023-2545099_47.php?_x_tr_sl=fr&_x_tr_tl=en&_x_tr_hl=fr&_x_tr_pto=wapp

The translation has some mistakes but is good enough to understand the context.

Here is a short summary:

Olvid passed a 35-day intrusion test, including code examination, by ANSSI (the French state cybersecurity agency) experts or designated experts, and no security breach was found. This is not the case for the other three messaging apps (either because they did not undergo such a test, or because they did not pass).

This makes WhatsApp, Signal and Telegram unreliable for state security.

So government members and ministerial offices will have to use Olvid or Tchap (the French state's in-house messaging app).

More detail in the article.

49 points

Well, that was the dumbest explanation ever; it's basically just a political pretext to give the government contract to some French company. There has potentially been some lobbying going on.

Signal doesn't store its encryption/decryption keys in the cloud, so you would need the devices, and even then you would still have to decrypt the content if the user doesn't give you access manually.

To crack a 128-bit AES key by brute force would take on the order of a billion billion years with a current supercomputer. A 256-bit AES key needs about 2^255 guesses on average, so it would take astronomically longer still.
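For a sense of scale, here's a quick back-of-the-envelope sketch; the keys-per-second figure is an assumption picked for illustration, not a benchmark of any real machine:

```python
# Rough brute-force timing for AES key sizes.
# keys_per_second is an assumed, illustrative figure.

SECONDS_PER_YEAR = 60 * 60 * 24 * 365

def average_bruteforce_years(key_bits: int, keys_per_second: float) -> float:
    # On average you find the key after searching half the keyspace.
    average_guesses = 2 ** (key_bits - 1)
    return average_guesses / keys_per_second / SECONDS_PER_YEAR

for bits in (128, 256):
    years = average_bruteforce_years(bits, keys_per_second=1e13)
    print(f"AES-{bits}: ~{years:.1e} years on average")
```

Even if you assume an attacker many orders of magnitude faster, the 128-bit figure stays far beyond any practical horizon, and the 256-bit case is 2^128 times worse.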

So until some amazing quantum computer comes along, this is pretty safe. Fuck Olvid.

21 points

I’m sure there are more attack vectors than that though

17 points

Exactly. “Security assuming nobody fucked up” isn’t enough

15 points

Signal does store the decryption keys in the cloud, using their SGX enclaves, which have their own issues. Signal SVR (Secure Value Recovery), I believe they call it.

You can turn off Signal PINs, which still stores the decryption keys in the cloud, but then they're protected with a very long, randomly generated PIN, which is good enough.

From a government perspective, Signal's a no-go: the SGX enclaves are completely exploitable at the state-actor level. You just have to look at all of the security vulnerabilities to date for SGX enclaves.

2 points

Do you have a reference for Signal using SGX for keys?

Everything I could find was about metadata and private data, e.g. contact lists (which is what the SVR thing that you mention is), but nothing about keys.

7 points

https://signal.miraheze.org/wiki/Secure_Value_Recovery

https://github.com/signalapp/SecureValueRecovery2

If you want to do an empirical test: create a Signal account and set a PIN. Send a message to someone. Then delete Signal. Recreate the account using the same phone number, recover it using the PIN, and send a message. The receiver of that message will not get a warning that the signing key has changed.

The only way that's possible is if the key, or a derived key, is recoverable from the network. That is de facto proof that the keys, or a key-generation mechanism, are in the cloud, which is probably fine for personal communication.
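To illustrate the kind of construction SVR describes (the links above have the real details), here is a minimal, hypothetical sketch of PIN-based key recovery: a short PIN is stretched into a key, and a secret held by the server-side enclave is mixed in, so the final key can only be rebuilt with the enclave's cooperation. PBKDF2 stands in for the Argon2 derivation mentioned in Signal's write-ups, and every parameter here is illustrative, not Signal's.

```python
import hashlib, os

def stretch_pin(pin: str, salt: bytes) -> bytes:
    # Deliberately slow derivation to make offline PIN guessing expensive.
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), salt, 200_000)

def recovery_key(pin: str, salt: bytes, enclave_secret: bytes) -> bytes:
    # Mix the stretched PIN with a secret that is supposed never to leave the enclave.
    return hashlib.sha256(stretch_pin(pin, salt) + enclave_secret).digest()

salt = os.urandom(16)            # stored alongside the account
enclave_secret = os.urandom(32)  # held, and rate-limited, inside the enclave
print(recovery_key("123456", salt, enclave_secret).hex())
```

A scheme like this hinges entirely on the enclave keeping its secret and enforcing the guess limit, which is exactly what the rest of this comment is questioning.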

But if I'm a nation state, this represents a significant security risk, especially when you're worried about another nation state peeking at your communication. E.g. France is buddy-buddy with the US, but they probably don't want the US to read all of their internal communication.

SGX https://en.m.wikipedia.org/wiki/Software_Guard_Extensions

https://sslab-gatech.github.io/sgx101/pages/attestation.html

SGX is an in-chip secure enclave technology created by Intel, a company headquartered in the United States, and it relies on key management and signing keys from Intel. Intel could be compelled by domestic intelligence to provide their keys, or to sign certain enclave software. I'm not saying it's happened, but I am saying this is part of the risk assessment a nation state would use to evaluate a messaging platform.
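To make the trust-root point concrete, here is a toy sketch (not SGX's actual attestation protocol, which involves quotes, certificate chains and Intel's attestation services): a verifier that only checks a signature back to the vendor's root key will accept whatever the holder of that root key chooses to sign. The key names and measurements are invented for illustration.

```python
import hashlib
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives import hashes
from cryptography.exceptions import InvalidSignature

# Stand-in for the vendor's attestation root signing key.
root_key = ec.generate_private_key(ec.SECP256R1())

def sign_measurement(measurement: bytes) -> bytes:
    return root_key.sign(measurement, ec.ECDSA(hashes.SHA256()))

def client_trusts(measurement: bytes, signature: bytes) -> bool:
    # The client's only check is the signature chained back to the root key.
    try:
        root_key.public_key().verify(signature, measurement, ec.ECDSA(hashes.SHA256()))
        return True
    except InvalidSignature:
        return False

genuine = hashlib.sha256(b"genuine enclave build").digest()
modified = hashlib.sha256(b"enclave build with the rate limit removed").digest()

print(client_trusts(genuine, sign_measurement(genuine)))    # True
print(client_trusts(modified, sign_measurement(modified)))  # also True, if the key holder signs it
```

Whether that ever happens is a legal and policy question, not a cryptographic one, which is the risk-assessment framing here.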

So a nation-state attack might look something like this: Intel is compelled to release their signing keys, the Signal user enclave is copied, and using the combination of both, a special SGX environment is set up to brute-force the PINs with the rate limit removed. Brute-forcing a six-digit PIN is trivial if you're not rate limited. This is just one possibility; SGX has at least nine known side-channel attacks at the moment, and I'm sure there are more that haven't been published.
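For scale, a small sketch of why an unthrottled six-digit PIN search is trivial: there are only 10^6 candidates, so you can time a few guesses and extrapolate. The PBKDF2 derivation and iteration count are hypothetical stand-ins, not Signal's parameters.

```python
import hashlib, os, time

def stretch_pin(pin: str, salt: bytes) -> bytes:
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), salt, 10_000)

salt = os.urandom(16)

# Time a small sample of guesses, then extrapolate to the full 6-digit space.
sample = 200
start = time.time()
for guess in range(sample):
    stretch_pin(f"{guess:06d}", salt)
per_guess = (time.time() - start) / sample

print(f"~{per_guess * 1_000_000 / 3600:.1f} hours to try every 6-digit PIN on one core")
```

Even with a deliberately slow derivation that is a matter of hours on a single core, and minutes across a handful of machines, which is why the enclave's rate limiting is doing the real work.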

