Comment on Proton’s Lumo AI chatbot: not end-to-end encrypted, not open source

hansolo@lemmy.today 1 day ago

So then you reject the premise that any LLM setup that isn’t local can ever be “secure,” and you can’t seem to articulate that.

What exactly is dishonest here? The language on their site is factually accurate; I’ve had to read it seven times today because of you all. You just object to non-local LLMs on principle and are, IMO, disingenuously framing that as a “brand issue” because…why? It reads as a very emotional argument, since it isn’t backed by any technical discussion beyond “local only secure, nothing else.”

Beyond the fact that:

> They are not supposed to be able to and well designed e2ee services can’t be.

So then you already trust that their system is well designed? What is this cognitive dissonance where they can secure the relatively insecure format of email, but can’t figure out TLS and log flushing for an LLM on their own servers? If anything, it’s not even a complicated setup: TLS to the context window, don’t keep logs, flush the data when the session ends. How do you think no-log VPNs work? This isn’t far off from that.
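If you want the shape of it, here’s a back-of-the-napkin sketch of that “no-log” request path. To be clear, this is not Proton’s actual code, just an illustration of what I mean; the run_inference() call and the cert paths are made-up placeholders.

```python
# Minimal "no-log" LLM request path: TLS in, prompt handled in memory,
# nothing written to disk, default access logging disabled.
import ssl
from http.server import BaseHTTPRequestHandler, HTTPServer


def run_inference(prompt: str) -> str:
    # Placeholder for the actual model call (hypothetical, not Proton's API).
    return f"echo: {prompt}"


class NoLogHandler(BaseHTTPRequestHandler):
    def log_message(self, fmt, *args):
        # Override per-request logging so nothing hits stdout or an access log.
        pass

    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        prompt = self.rfile.read(length).decode("utf-8")
        reply = run_inference(prompt)
        self.send_response(200)
        self.send_header("Content-Type", "text/plain; charset=utf-8")
        self.end_headers()
        self.wfile.write(reply.encode("utf-8"))
        # "Flush the data": drop references so nothing outlives the request.
        del prompt, reply


if __name__ == "__main__":
    server = HTTPServer(("0.0.0.0", 8443), NoLogHandler)
    # TLS terminated in-process; cert/key paths are placeholders.
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
    ctx.load_cert_chain("server.crt", "server.key")
    server.socket = ctx.wrap_socket(server.socket, server_side=True)
    server.serve_forever()
```

Obviously the real thing has more moving parts (the actual inference backend, auth, scaling), but the no-log piece of the argument really is this simple: don’t write the prompt anywhere durable, and drop it when you’re done.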
