• frog 🐸@beehaw.org · 6 months ago

    Sure, you should be free to make one. But when you die and an AI company contacts all your grieving friends and family to offer them access to an AI based on you (for a low, low fee!), there are valid questions about whether that will harm them rather than help - and grieving people do not always make the most rational decisions. They can very easily be convinced that interacting with AI-you would be good for them, when in reality it prolongs their grief and makes them feel worse. Grieving people are vulnerable, and I don’t think AI companies should be free to prey on the vulnerable, which is a very, very realistic outcome of this technology. Because that is what companies do.

    So I think you need to ask yourself not whether you should have the right to make an AI version of yourself for those who survive your death… but whether you’re comfortable with the very likely outcome that an abusive company will use their memories of you to exploit their grief and prolong their suffering. Do you want to do that to people you care about?

    • Zaktor@sopuli.xyz · 6 months ago (edited)

      This is speculation about corporate action completely divorced from the specifics of this technology and the particulars of this story. The result of this could be a simple purchase, either of hardware or of software, to be used as chosen by the person who owns it. And the person commissioning it can specify exactly who such a simulacrum is presented to. None of this has to be under the power of the company that builds the simulacra, and if it is structured that way, then that structure is the problem to be rejected or disallowed - not the fact that this particular form of memento exists.

      • intensely_human@lemm.ee · 6 months ago

        It could still be a bad idea even if the profit motive isn’t involved.

        Someone might be trying to help by leaving a big surprise stash of heroin to their widow, and she might embrace it fully, but that doesn’t make it a good idea or good for her.

        • Zaktor@sopuli.xyz · 6 months ago

          Sure, and that point is being made in multiple other places in these comments. I find it patronizing, but that’s neither here nor there, as it’s not what this comment thread is about.

    • FaceDeer@fedia.io · 6 months ago

      But when you die and an AI company contacts all your grieving friends and family to offer them access to an AI based on you (for a low, low fee!)

      You can stop right there: you’re just imagining a scenario that suits your prejudices. Of all the AI applications I can imagine, this one would top the list of those better served by a model entirely under my control.

      With that out of the way, the rest of your rhetorical questions are moot.