Comment on Google Messages just flicked the switch on end-to-end encryption for all RCS and group chats

echo64@lemmy.world 1 year ago
It’s E2E, E2E isn’t really something you can be sneaky about unless you roll your own encryption and then make claims about it totally being safe bro

Rooki@lemmy.world 1 year ago
They can… everything is closed there. It can just be “encrypted” for your eyes.
They, however, run the app you are using to type everything, the keyboard you are using to type everything, and the OS you are using to type everything. If they want something, they don’t need to look at your in-flight messages.
GigglyBobble@kbin.social 1 year ago
> It’s E2E, E2E isn’t really something you can be sneaky about unless you roll your own encryption and then make claims about it totally being safe bro
With a closed-source app? Of course you can. How is anyone supposed to know what keys you use for encryption? It doesn’t even need to be remote: the key generation just has to be reproducible by the developer.
Not_Alec_Baldwin@lemmy.world 1 year ago
I don’t know if you’re understanding that that’s his point.
If Google can reproduce the key it’s not fully “end to end” unless one of the "end"s is Google.
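The “reproducible key generation” concern can be sketched concretely: if a closed-source client derives its “private” key deterministically from values the vendor also knows, the vendor can re-derive that key at will, and the messages are effectively readable by a third “end”. A minimal illustration (all names hypothetical; real E2E schemes use proper asymmetric key exchange, not raw hashes):

```python
import hashlib

def derive_key(vendor_secret: bytes, user_id: str) -> bytes:
    # Looks like a per-user "private" key generated on the device,
    # but it is a pure function of inputs the vendor controls.
    return hashlib.sha256(vendor_secret + user_id.encode()).digest()

# Client side: key generated "locally" inside the closed-source app.
client_key = derive_key(b"secret-baked-into-the-binary", "alice@example.com")

# Vendor side: same inputs, same key, no interception needed.
vendor_key = derive_key(b"secret-baked-into-the-binary", "alice@example.com")

print(client_key == vendor_key)  # True: the "end" quietly includes the vendor
```

Nothing on the wire would look unusual here; the ciphertext is real, it is just decryptable by someone who was never supposed to be an endpoint.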
art@lemmy.world 1 year ago
I know they have unencrypted versions of my messages, because the tablet and desktop versions of Messages seamlessly connect to the same chat. So it’s probably E2E in transit only.
TheHobbyist@lemmy.zip 1 year ago
The trust doesn’t even have to be in the encryption; they could very well use the same Signal protocol. They would only need a copy of the keys you are using, and you wouldn’t even know. That’s the problem with closed-source programs: there is no certainty that it’s not happening. (I’m not saying it is, I can’t prove it, obviously, but the doubt remains. We have to trust these companies not to screw us over, and they don’t really have the best track record there…)
Carighan@lemmy.world 1 year ago
As if you’re any more comfortable with open source software, actively vetting the code, building it yourself, running your own server.
For all you know, Signal keeps a copy of your keys, too. And happily decrypts everything you send and sells it to Russian data brokers for re-sale to advertisers.
TheHobbyist@lemmy.zip 1 year ago
There is a post gathering all security audits performed on Signal messenger:
community.signalusers.org/t/…/13243
And anybody can double-check it, because it’s open source. Not only is it open source, but it has reproducible builds, which means you can verify that the APK you download is the same version as the one hosted on GitHub. They also publish their server code, which is pretty rare. Additionally, experts in the field themselves endorse Signal.
Your point is valid for many projects, as open source is not a guarantee of security. But Signal is a pretty bad example for that.
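The reproducible-builds verification mentioned above boils down to a mechanical check: build the app yourself from the published source and byte-compare (or hash-compare) the result against the binary you were given. A simplified sketch of that comparison (file names hypothetical, and using stand-in files so it runs; Signal’s actual process uses its apkdiff tooling, since APK signing blocks differ between builds):

```python
import hashlib
import os
import tempfile

def sha256_of(path: str) -> str:
    """Stream a file and return its SHA-256 hex digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def builds_match(downloaded_apk: str, locally_built_apk: str) -> bool:
    # Byte-identical builds hash identically; a swapped binary shows up here.
    return sha256_of(downloaded_apk) == sha256_of(locally_built_apk)

# Stand-in files so the sketch is runnable; in practice these would be
# the APK from the store and the one you compiled from the GitHub source.
with tempfile.TemporaryDirectory() as d:
    a = os.path.join(d, "downloaded.apk")
    b = os.path.join(d, "rebuilt.apk")
    for p in (a, b):
        with open(p, "wb") as f:
            f.write(b"identical build output")
    print(builds_match(a, b))  # True when the builds are byte-identical
```

The point of the thread stands either way: the check only protects people who actually run it.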
Carighan@lemmy.world 1 year ago
But that’s kinda my point, you rely inherently on someone else doing what open source allows you to do. So in the end you can be tricked just the same.
I mean, of course, Signal is a pretty clear-cut case, but even with that one you probably (and I’m guessing here, but tell me it ain’t true 😅) do not actively verify things. You did not check the source code. You did not build your own APK to install it. (I don’t think you can build the desktop version yourself, but I ain’t entirely sure, granted.) You probably did not probe the network traffic to see whether the built APK actually does what the source code promises, or has been swapped out for one that lets the server they’re running log all messages sent.
And so on.
My point was entirely that even in the easiest of cases, where we could do all of that, we do not actually do it. Hence the ability to do it is usually extremely moot.
And I say this as someone who checks external libraries at work. It’s an insanely time-consuming job, which entirely explains why no one in their right mind does it without being paid for it, that is, in their spare time for private use.
Mubelotix@jlai.lu 1 year ago
Bollocks