Another article, much better, which explains in more detail that Olvid was audited on an older version and was chosen because it is French and applied for it: https://www.numerama.com/tech/1575168-pourquoi-les-ministres-vont-devoir-renoncer-a-whatsapp-signal-et-telegram.html

Google Translate link to the original post: https://www-lepoint-fr.translate.goog/high-tech-internet/les-ministres-francais-invites-a-desinstaller-whatsapp-signal-et-telegram-29-11-2023-2545099_47.php?_x_tr_sl=fr&_x_tr_tl=en&_x_tr_hl=fr&_x_tr_pto=wapp

The translation has some mistakes, but it's good enough to understand the context.

Here is a short summary:

Olvid passed a 35-day intrusion test by experts from ANSSI (the French state cybersecurity agency), or experts it designated, including a code examination, without any security breach being found. That is not the case for the other three messaging apps (either because they did no test at all, or because they didn't pass).

This makes WhatsApp, Signal, and Telegram unreliable for state security.

And so government members and ministerial offices will have to use Olvid or Tchap (the French state's in-house messaging app).

More detail in the article.

49 points

Well, that was the dumbest explanation ever; it's basically just a political pretext to give the government contract to some French company. Potentially there has been some lobbying going on.

Signal doesn't store its encryption/decryption keys in the cloud, so you would need the devices, and then you would still have to decrypt the content if the user doesn't give you access manually.

Brute-forcing a 128-bit AES key would take on the order of a billion billion years with a current supercomputer. A 256-bit AES key is 2^128 times harder still: on average you would need 2^255 trials, a number astronomically beyond any conceivable computing power.
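The timescales above can be sanity-checked with a back-of-the-envelope calculation. The 10^12 keys-per-second rate is an assumption (a generous figure for a classical machine), and the average search covers half the keyspace:

```python
# Back-of-the-envelope brute-force estimate for AES key sizes.
RATE = 10**12                         # keys tested per second (assumption)
SECONDS_PER_YEAR = 365.25 * 24 * 3600

def years_to_crack(key_bits: int) -> float:
    """Average time to find a key by exhaustive search (half the keyspace)."""
    trials = 2 ** (key_bits - 1)
    return trials / RATE / SECONDS_PER_YEAR

print(f"AES-128: {years_to_crack(128):.2e} years")
print(f"AES-256: {years_to_crack(256):.2e} years")
```

Even with these optimistic assumptions, AES-128 comes out in the billions of billions of years, and AES-256 is 2^128 times worse.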

So until some amazing quantum computer comes along, this is pretty safe. Fuck Olvid.

21 points

I'm sure there are more attack vectors than that, though.

17 points

Exactly. "Security assuming nobody fucked up" isn't enough.

15 points

Signal does store decryption keys in the cloud, using their SGX enclaves, which have their own issues. They call it Signal SVR (Secure Value Recovery), I believe.

You can turn off Signal PINs, in which case the decryption keys are still stored in the cloud, but they are then protected by a very long, randomly generated PIN, which is good enough.

From a government perspective, Signal is a no-go: SGX enclaves are completely exploitable at the state-actor level. You just have to look at all of the security vulnerabilities published for SGX enclaves to date.

2 points

Do you have a reference for Signal using SGX for keys?

Everything I could find was about metadata and private data, e.g. contact lists (which is what the SVR thing that you mention is), but nothing about keys.

7 points

https://signal.miraheze.org/wiki/Secure_Value_Recovery

https://github.com/signalapp/SecureValueRecovery2

If you want to do an empirical test: create a Signal account and set a PIN. Send a message to someone. Then delete Signal, recreate the account using the same phone number, recover using the PIN, and send a message. The receiver of that message will not get a warning that the signing key has changed.

The only way that's possible is if the key, or a derived key, is recoverable from the network. That is de facto proof that the keys, or a key-generation mechanism, live in the cloud. Which is probably fine for personal communication.
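The core of the argument can be sketched in a few lines. This is not Signal's actual scheme; PBKDF2, the salt value, and the iteration count here are illustrative assumptions. The point is only that if key material wrapped under a PIN-derived key sits server-side, the same PIN deterministically reproduces the same key on a fresh install:

```python
import hashlib

def derive_wrapping_key(pin: str, salt: bytes) -> bytes:
    # PBKDF2 stands in for whatever KDF the real system uses (assumption).
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), salt, 100_000)

salt = b"server-stored-salt"                              # hypothetical server-side value
k_first_install = derive_wrapping_key("123456", salt)
k_after_reinstall = derive_wrapping_key("123456", salt)   # fresh install, same PIN
print("same PIN reproduces the same key:",
      k_first_install == k_after_reinstall)
```

Deterministic recovery from PIN plus server-held data is exactly why the recipient sees no key-change warning in the experiment above.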

But if I'm a nation state, this represents a significant security risk, especially when you're worried about another nation state peeking at your communication. E.g., France is buddy-buddy with the US, but they probably don't want the US to read all of their internal communication.

SGX https://en.m.wikipedia.org/wiki/Software_Guard_Extensions

https://sslab-gatech.github.io/sgx101/pages/attestation.html

SGX is an on-chip secure enclave technology created by Intel, a company headquartered in the United States, with key management and signing keys controlled by Intel. Intel could be compelled by domestic intelligence to provide their keys, or to sign certain enclave software. I'm not saying it's happened, but I am saying this is part of the risk assessment a nation state would use to evaluate a messaging platform.

So a nation-state attack might look something like this: Intel is compelled to release its signing keys, the Signal user enclave is copied, and with the combination of the two a special SGX environment is set up to brute-force the PINs with the rate limit removed. Brute-forcing a six-digit PIN is trivial if you're not rate-limited. This is just one possibility; SGX has at least nine known side-channel attacks at the moment, and I'm sure there are more that haven't been published.
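To see how trivial an unthrottled six-digit PIN is, here is a toy search. The real system would use a memory-hard KDF inside the enclave rather than a bare SHA-256 hash; that's an assumption made here purely to keep the illustration self-contained, and it only changes the constant factor, not the conclusion that 10^6 candidates is nothing:

```python
import hashlib
import secrets
import time

# Hypothetical target: a random six-digit PIN, stored as a plain SHA-256 hash.
target_pin = f"{secrets.randbelow(10**6):06d}"
target_hash = hashlib.sha256(target_pin.encode()).hexdigest()

# Exhaustive search over all one million candidates, no rate limit.
start = time.time()
for i in range(10**6):
    candidate = f"{i:06d}"
    if hashlib.sha256(candidate.encode()).hexdigest() == target_hash:
        break
elapsed = time.time() - start
print(f"Recovered PIN {candidate} in {elapsed:.2f} seconds")
```

Even in interpreted Python this finishes in seconds; the enclave's rate limit is the only thing standing between a six-digit PIN and the keys.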

36 points

I don’t know much but what I do know is when a government endorses a secure messaging service, it’s definitely not secure.

4 points

They’re using it themselves, not forcing citizens to use it. It’s when they force citizens to use an app they claim is secure that I am distrustful. I would assume their intentions are more pure when it’s their own state security rather than their citizens’ privacy.

24 points

Not being able to inspect their code and not passing are two different things.

8 points

Are they? If you want to know whether something is secure enough to use, then not being able to examine the code should obviously disqualify it.

-1 points

Sure it does, but that doesn’t make it bad.

Open source code is not the only solution to secure communication.

You can be extremely secure on closed source tools as well.

If they had found specific issues with Signal aside from not being allowed to freely inspect its code base, I suspect we would be hearing about it. Instead I don't see specific security failings, just that it didn't meet the bar for their security software audit.

As an example of something that is closed source and trusted:

The software used to load data and debug the F-35 fighter jet.

Pretty big problem for 16 countries if that isn't secure… and it's closed source. So much so that you can't even run tests against the device that loads data onto the jet live. It's a problem to sort out, but it's an example of where highly important communication protocols are closed source and trusted by the governments of many countries.

If their particular standard here was open source, OK, but they didn't do anything to ensure the version they inspected would be the only version used. In fact, every release from that basement pair of programmers could inadvertently contain a flaw, which this committee would not be reviewing in the code base used by its members of parliament.

9 points

Lol at military stuff being secure. Most often it's not, it's just hidden. There was an Ars Technica article about the "secure" devices used at military bases being full of holes, for example: https://arstechnica.com/security/2023/08/next-gen-osdp-was-supposed-to-make-it-harder-to-break-in-to-secure-facilities-it-failed/

When code is hidden all you know for sure is that you don’t know anything about it. You certainly can’t say it’s secure.

If a piece of code or a system is really secure, then it does not matter if the code is open, because the security lies in the quality of its algorithms and the strength of the keys.

2 points

Well, let's give some counter-examples from the apps I mentioned:

  • WhatsApp (closed): owned by Facebook. Facebook has had multiple data leaks and privacy violations, and nothing substantial was done about it. Definitely not trustable (also, zero-days for hacking WhatsApp are being sold on the black market for millions: https://techcrunch.com/2023/10/05/zero-days-for-hacking-whatsapp-are-now-worth-millions-of-dollars/).

  • Telegram (closed): not end-to-end encrypted by default. Russian-founded app. Not trustable.

  • Signal (open): this one is end-to-end encrypted and open source, so maybe it could be trusted. It seems to have passed some security audits (https://community.signalusers.org/t/overview-of-third-party-security-audits/13243), though it's based in the US and uses US servers, and maybe the US has supercomputers capable of decrypting such communications. However, if Signal has switched its encryption to be quantum-resistant, it may be too hard even for a state actor. They also "debunked"/ignored zero-day reports that were not submitted through their own tool, by asking the US government for confirmation. I am not sure the US can be trusted to confirm the existence or non-existence of vulnerabilities it is very likely to use itself (https://thehackernews.com/2023/10/signal-debunks-zero-day-vulnerability.html?m=1).

  • Olvid (open clients, closed servers): French, end-to-end encrypted, and backed by a cryptography PhD. So why not use a local messaging app that is also very secure and open source?

Notice how the closed-source options are the untrusted ones here. The economic model behind a tool changes how trustworthy it is. Military equipment has a huge and strict budget; it has to be secure.

Communication apps are consumer products first, so they do what they can get away with, and that is very true for Facebook.

9 points

They had Tchap, which may not be perfect but is open source (based on Matrix/Element), hosted in France, and already used by 400,000 people in the public services… Why pay for a new app? I don't get it…

8 points

On one hand they are trying to outlaw encrypted messaging, and on the other they're saying it's not secure? 😅

2 points

That is a European proposal, not a French one at all. France can stand for or against it.

