
Lumo: the least open 'open' AI assistant

183 likes

Submitted 2 days ago by XLE@piefed.social to technology@lemmy.world

https://osai-index.eu/news/lumo-proton-least-open


Comments

  • philpo@feddit.org 2 days ago

    Proton claiming shit that they don’t actually do or can’t do?

    Consider me shocked!

  • UnfortunateShort@lemmy.world 2 days ago

    TL;DR: The apps are open, the models are open-weight but you don’t know which one you get, and they have not yet disclosed any tweaks they may have made (though to me it never sounded like they actually tweak the models much)

    • XLE@piefed.social 2 days ago

      Your tl;dr appears to be missing some important data. You can have an opinion but please don't represent it as an accurate summary.

      Things you crucially missed:

      • Less open than every other service available
      • Bills itself as the most open
      • Server-side source code is MIA
      • No model card available. Evaluations, risks, biases, guardrails and safety measures unclear.
      • fmstrat@lemmy.nowsci.com 1 day ago

        It gets worse, and the open-weights point is a bit inaccurate as of the September update:

        The only open source code we have found is for the Lumo mobile and web apps. Proton calling the Lumo AI assistant open source based on that is a bit like Microsoft calling Windows open source just because there’s a GitHub repository for Windows Terminal.

        The models listed on Lumo’s privacy policy page are “Nemo, OpenHands 32B, OLMO 2 32B, and Mistral Small 3”. OpenHands is a Qwen fine-tune, and Nemo and Mistral Small are both Mistral models. Since Proton has open-sourced neither the Lumo system prompt nor the mysterious routing methods that decide which model will handle your query, you never know what you are going to get.

        So if the server isn’t open source, and the server does all the work, this system is simply not Open Source.
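        To make the routing complaint concrete: a server-side dispatcher like the toy sketch below decides which backend model answers each query, and without the source you cannot know which rules apply. This is purely hypothetical illustration code, not Proton’s actual (unpublished) logic; only the model names are real, taken from Lumo’s privacy policy as quoted above.

        ```python
        # Hypothetical model router: NOT Proton's code, which is unpublished.
        # Model names come from Lumo's privacy policy; the rules are invented.

        def route_query(query: str) -> str:
            """Pick a backend model for a query using made-up keyword rules."""
            if "code" in query.lower():
                return "OpenHands 32B"    # imagined rule: coding queries go to the code-tuned model
            if len(query) > 2000:
                return "Mistral Small 3"  # imagined rule: long inputs go to a different model
            return "Nemo"                 # imagined default

        print(route_query("Write some code for me"))  # -> OpenHands 32B
        ```

        The point being: even three lines of routing logic change which weights you are actually talking to, so keeping this layer closed makes the “open” label hard to verify.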

  • masterofn001@lemmy.ca 2 days ago

    They didn’t try very hard to find the source code.

    github.com/ProtonMail/WebClients/tree/…/lumo

    • Ghoelian@lemmy.dbzer0.com 2 days ago

      In the 1 September update they state that they found the web client and a mobile client as well, but not the API (I guess), which contains the system prompt and the actual routing to the models.

      • masterofn001@lemmy.ca 1 day ago

        github.com/ProtonMail/…/lumo-api-client

  • avidamoeba@lemmy.ca 2 days ago

    This article does a poor job of supporting its thesis.

    • AliasAKA@lemmy.world 2 days ago

      It supports the thesis that Lumo is not open source in many of the common-sense ways most people would expect when a model claims to be open source. So in that sense, it does.

    • XLE@piefed.social 2 days ago

      Can you be more specific?

    • Alphane_Moon@lemmy.world 2 days ago

      How so? It’s pretty clear that Proton’s use of “open source” in the context of their LLM service has been shown to be false.
