lotide

Talking to dead people through AI: the business of ‘digital resurrection’ might not be helpful, ethical… or even legal.

87 likes

Submitted 6 months ago by Dot@feddit.org to technology@lemmy.world

https://theconversation.com/talking-to-dead-people-through-ai-the-business-of-digital-resurrection-might-not-be-helpful-ethical-or-even-legal-242404


Comments

  • illi@lemm.ee 6 months ago

    Hey, I’ve seen this one

    • VintageGenious@sh.itjust.works 6 months ago

      Black Mirror was never a sci-fi series, rather a warning

      • Badeendje@lemmy.world 6 months ago

        It seems a lot of sci-fi is a warning.

      • illi@lemm.ee 6 months ago

        And not a “don’t do this” kind of warning. More of a “this will happen, get ready” kind of warning

      • aeronmelon@lemmy.world 6 months ago

        So was The Twilight Zone; no one listened to those parables either.

    • RaoulDook@lemmy.world 6 months ago

      Yeah, the Max Headroom show covered this topic back in 1987

      tiny-voice.com/max-headroom-30-years-into-the-fut…

    • Ganbat@lemmy.dbzer0.com 6 months ago

      Also Doctor Who.

  • nyan@lemmy.cafe 6 months ago

    It’s one of those things that needs careful handling and is unlikely to get it. I can see it having some value in therapy, but only if there is, y’know, an actual therapist involved who can make an informed call as to whether their patient will be helped or harmed by talking to a digital fake of a loved one. Instead, we’re likely to see a ham-fisted “allow all” or “forbid all” call by regulators.

  • tee9000@lemmy.world 6 months ago

    How the fuck could this be illegal?

    • shneancy@lemmy.world 6 months ago

      wow, so many reasons

      • to create a mimic of a person you must first destroy their privacy
      • after an AI has devoured everything they’ve ever written or said on video, it will mimic that person very well, but will most likely still be the legal property of the company that made it
      • in a situation like that you’d then have to pay a subscription to interact with the mimic (because god forbid you ever get actually sold something nowadays)

      now imagine having to pay to talk with a ghost of your loved one, a chatbot that sometimes allows you to forget that the actual person is gone, and makes all the moments where that illusion is broken all the more painful. A chatbot that denies you grief, and traps you in hell where you can talk with the person you lost, but never touch them, never feel them, never see them grow (or you could pay extra for the chatbot to attend new skill classes you could talk about :)).

      It would make grieving impossible and take constant advantage of those who “just want to say goodbye”. Grief is already hard as it is; widespread mimicry of our dead would make it psychological torture.

      For more information, watch a prediction of our future: the fun sci-fi show Black Mirror, specifically the episode titled Be Right Back (the series is fully episodic; you don’t need to watch from the start)

      • tee9000@lemmy.world 6 months ago

        If someone came to a service provider and wanted it, provided media to train on, and agreed to whatever costs are involved, isn’t that entirely their business?

  • rottingleaf@lemmy.world 6 months ago

    No different from talking to dead people via a Markov chain fed their quotes.

    I mean, Star Wars holocrons sometimes have UIs like that - an avatar of their maker that one can talk to - but, first, those are closer to AGI, and second, there’s no “model”, just the maker’s data (mostly texts and images) inside.
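
    For what it’s worth, a “Markov chain fed their quotes” really is just a lookup table of which word follows which in the source text. A minimal sketch in Python (the quotes string here is an invented stand-in for someone’s collected writing):

    ```python
    import random
    from collections import defaultdict

    def build_markov_chain(text):
        """Map each word to the list of words that follow it in the corpus."""
        chain = defaultdict(list)
        words = text.split()
        for current, nxt in zip(words, words[1:]):
            chain[current].append(nxt)
        return chain

    def generate(chain, start, length=10, seed=0):
        """Walk the chain from `start`, picking a random successor each step."""
        rng = random.Random(seed)
        word = start
        out = [word]
        for _ in range(length - 1):
            successors = chain.get(word)
            if not successors:  # dead end: word never had a successor
                break
            word = rng.choice(successors)
            out.append(word)
        return " ".join(out)

    # Hypothetical corpus standing in for a person's collected quotes.
    quotes = "the ship sailed at dawn and the crew sang as the ship left port"
    chain = build_markov_chain(quotes)
    print(generate(chain, "the"))
    ```

    The difference from a modern chatbot is scale: the chain can only replay word transitions it has literally seen, so the illusion breaks almost immediately.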
