• Aggravationstation@feddit.uk · ↑96 ↓1 · 11 days ago

    The year is 1997. A young boy is about to watch a porn video for the first time on a grainy VHS tape. An older version of himself from the far-off year of 2025 appears.

    Me: “You know, in the future, you’ll make your own porn videos.”

    90s me: “Wow, you mean I’ll get to have lots of hot sex!?!?”

    Me: “Ha! No. So Nvidia will release this system called CUDA…”

    • turnip@lemm.ee · ↑6 ↓1 · 10 days ago

      Then another company called Deepseek will release a system called low-level programming that replaces CUDA.

  • MoonlightFox@lemmy.world · ↑61 ↓1 · 11 days ago

    First off, I am sex positive, pro porn, and pro sex work; I don’t believe sex work should be shameful, and I see nothing wrong with buying intimacy from a willing seller.

    That said, the current state of the industry and the conditions for many professionals raise serious ethical issues, coercion being the biggest.

    I am torn about AI porn. On one hand it can produce porn without suffering; on the other hand it might be trained on other people’s work and take people’s jobs.

    I think another major point to consider going forward is whether it is problematic if people can generate all sorts of illegal material. If it is AI generated, it is a victimless crime, so should it be illegal? I personally feel uncomfortable with the thought of several such things being legal, but I can’t logically argue for making them illegal without a victim.

    • TheGrandNagus@lemmy.world · ↑29 · 10 days ago

      I think another major point to consider going forward is whether it is problematic if people can generate all sorts of illegal material. If it is AI generated, it is a victimless crime, so should it be illegal? I personally feel uncomfortable with the thought of several such things being legal, but I can’t logically argue for making them illegal without a victim.

      I’ve been thinking about this recently too, and I have similar feelings.

      I’m just gonna come out and say it without beating around the bush: what is the law’s position on AI-generated child porn?

      More importantly, what should it be?

      It goes without saying that the training data absolutely should not contain CP, for reasons that should be obvious to anybody. But what if it didn’t?

      If we’re basing the law on pragmatism rather than emotional reaction, I guess it comes down to whether creating this material would embolden paedophiles and lead to more predatory behaviour (i.e. increasing demand), or whether it would satisfy their desires enough to cause a substantial drop in predatory behaviour (i.e. lowering demand).

      And to know that, we’d need extensive and extremely controversial studies. Beyond that, even if allowing this stuff to be generated were an overall positive (and I don’t know whether it would or wouldn’t be), will many politicians actually call for it to be allowed? Seems like the kind of thing that could ruin a political career. Nobody’s touching that with a ten-foot pole.

      • michaelmrose@lemmy.world · ↑3 ↓1 · 11 days ago

        Let’s play devil’s advocate. You find Bob the pedophile with pictures depicting horrible things. Two things are true.

        1. Although you can’t necessarily help Bob, you can lock him up, preventing him from doing harm, and permanently brand him as a dangerous person, making it less likely for actual children to be harmed.

        2. Bob can’t claim actual depictions of abuse are AI generated and force you to find the unknown victim before you can lock him and his confederates up. If the law doesn’t distinguish between simulated and actual abuse, then in both cases Bob just goes to jail.

        A third factor is that this technology, combined with the inherent lack of privacy on the internet, could potentially pinpoint numerous unknown pedophiles who could, even if they haven’t done any harm yet, be profitably prosecuted, to society’s ultimate profit, so long as you value innocent kids more than perverts.

        • shalafi@lemmy.world · ↑6 · 11 days ago

          Am I reading this right? You’re for prosecuting people who have broken no laws?

          I’ll add this: I have sexual fantasies (not involving children) that would be repugnant to me IRL. Should I be in jail for having those fantasies, even though I would never act on them?

          This sounds like some Minority Report hellscape society.

          • michaelmrose@lemmy.world · ↑1 ↓4 · 10 days ago

            Am I reading this right? You’re for prosecuting people who have broken no laws?

            No, I’m for making it against the law to simulate pedophile shit, as the net effect is fewer abused kids than if such images were legal. Notably, you are free to fantasize about whatever you like; it’s the actual creation and sharing of images that would be illegal. Far from being a Minority Report hellscape, it’s literally the present way things already are in many places.

            • Petter1@lemm.ee · ↑4 · 10 days ago

              Lol, how can you say that so confidently? How would you know that with less AI CP you get fewer abused kids? And what is the logic behind it?

              Demand doesn’t really drop when something is made illegal (the same goes for drugs). The only thing you reduce is supply, which just makes the now-illegal thing more valuable. That attracts shady money-grabbers who hate regulation, don’t give a shit about law enforcement, and will therefore do illegal things for money. And you have to pay a shitton of government money to maintain all the prisons.

              • michaelmrose@lemmy.world · ↑0 ↓1 · 10 days ago

                Basically every pedo in prison is one who isn’t abusing kids. Every pedo on a list is one who won’t be left alone with a young family member. Reducing AI CP doesn’t by itself do anything.

                • AwesomeLowlander@sh.itjust.works · ↑1 · 10 days ago

                  Wrong. Every pedo in prison is one WHO HAS ALREADY ABUSED A CHILD, whether directly or indirectly. There is an argument to be made, and some studies that show, that dealing with Minor Attracted People before they cross the line can be effective. Unfortunately, to do this we need to be able to have a logical and civil conversation about the topic, and the current political climate does not allow for that conversation to be had. The consequence is that preventable crimes are not being prevented, and more children are suffering for it in the long run.

        • MoonlightFox@lemmy.world · ↑1 ↓2 · 11 days ago

          Good arguments. I think I am convinced that both cases should be illegal.

          If the pictures are real, they probably increase demand, which is harmful. If the person knew they were real, the act should therefore result in jail and forced therapy.

          If the pictures are not real, forced therapy is probably the best option.

          So I guess making it illegal in most cases, simply to force therapy, is the way to go, even if in one case it is “victimless”. If professionals don’t find them plausibly rehabilitated, then jail time for them.

          I would assume (but don’t really know) most pedophiles don’t truly want to act on it, and don’t want to have those urges. And would voluntarily go to therapy.

          Which is why I am convinced prevention is the way to go, not sacrificing privacy. In Norway we have anonymous ways for pedophiles to seek help; there were posters and ads for it in a lot of places a year or so back. I have not researched how it works in practice though.

      • MoonlightFox@lemmy.world · ↑9 · 11 days ago

        It can generate combinations of things that it is not trained on, so there is not necessarily a victim. But of course there might be something in there; I won’t deny that.

        However, the act of generating something does not create a new victim, unless it uses someone’s likeness and is shared? Or is there something ethical here that I am missing?

        (Yes, all current AI is basically collective piracy of everyone’s IP, but besides that.)

        • surewhynotlem@lemmy.world · ↑2 ↓1 · 11 days ago

          Watching videos of rape doesn’t create a new victim. But we consider it additional abuse of an existing victim.

          So take that video and modify it a bit. Color correct or something. That’s still abuse, right?

          So the question is, at what point in modifying the video does it become not abuse? When you can’t recognize the person? But I think simply blurring the face wouldn’t suffice. So when?

          That’s the gray area. AI is trained on images of abuse (we know it’s in there somewhere). So at what point can we say the modified images are okay because the abused person has been removed enough from the data?

          I can’t make that call. And because I can’t make that call, I can’t support the concept.

          • Petter1@lemm.ee · ↑2 · 10 days ago

            With this logic, any output of any pic-gen AI is abuse… I mean, we can be 100% sure that there is CP in the training data (it would be a very big surprise if not), and all output is a result of all the training data, as far as I understand the statistical behaviour of image-gen AI.

              • Petter1@lemm.ee · ↑1 · 7 days ago

                Well, AI is by design not able to curate its own training data, but the companies training the models in theory could. It is just not feasible to sanitise such a huge stack of data.

          • dubyakay@lemmy.ca · ↑1 · 11 days ago

            It’s not just AI that can create content like that, though. 3D artists have been making victimless rape slop of your vidya waifu for well over a decade now.

            • surewhynotlem@lemmy.world · ↑1 · 10 days ago

              Yeah, I’m ok with that.

              AI doesn’t create, it modifies. You might argue that humans are the same, but I think that’d be a dismal view of human creativity. But then we’re getting weirdly philosophical.

          • MoonlightFox@lemmy.world · ↑1 · 11 days ago

            I see the issue of how much crime in the training data is enough for the output to not be okay, and the gray area. I can’t make that call either, but I kinda disagree with the black-and-white conclusion. I don’t need something to be perfectly ethical; few things are. I do however want to act in an ethical manner, and strive to be better.

            Where do you draw the line? It sounds like you mean no AI can be used in any cases, unless all the material has been carefully vetted?

            I highly doubt there isn’t illegal content in most AI models of any size by big tech.

            I am not sure where I draw the line. I do want to use AI services, just not for porn.

  • Dr. Moose@lemmy.world · ↑39 ↓1 · 11 days ago

    Good.

    Hot take, but I think AI porn will be revolutionary, and mostly in a good way. The sex industry is an extremely wasteful and inefficient use of our collective time, and it often involves a lot of abuse and dark business practices. It’s just somehow taboo to even mention this.

    • JackFrostNCola@lemmy.world · ↑24 · 11 days ago

      Sometimes you come across a video and you are like ‘oh this. I need more of THIS.’
      And then you start tailoring searches to try to find more of the same, but you keep getting generic or repeated results because of the lack of well-described/defined content and the overuse of video tags (obviously casting a wide net to get more views rather than being specific).
      But would I watch AI content if I could feed it a description of what I want? Hell yeah!

      I mean, there are only so many videos of girls giving a blowjob while the guy eats tacos and watches old black-and-white documentaries about the advancements of mechanical production processes.

      • dumbass@leminal.space · ↑13 · 10 days ago

        I hate it when you find a good video and it does the job really well, so a few days later you think, yeah, let’s watch that one again. You type in every single description you can about the video, find everything but the video you want, and the results are barely anywhere near as good.

        Hell, I’d take an AI porn search engine for now: let me describe in detail the video I’m looking for, so I can finally see it again.

        • Scrollone@feddit.it · ↑2 · 10 days ago

          Always, always download your favourite videos. Today they’re available; tomorrow, you don’t know. (Remember the great Pornhub purge? Pepperidge Farm remembers.)

  • TachyonTele@lemm.ee · ↑38 ↓2 · 11 days ago

    Who are the girls in the picture? We can do this, team. Left to right, starting at the top.

    1. ??

    2. ??

    3. Bayonetta

    4. Little Mermaid

    5. ??

    6. ??

    7. Jinx

    8. ??

    9. Rei

    10. Rei

    11. lol Rei

    12. Aerith

  • SabinStargem@lemmings.world · ↑7 · 10 days ago

    I am looking forward to this becoming common and effective. Being able to generate animated hentai in assorted styles would be neat. Lina Inverse getting boinked by Donald Duck could be a thing.

  • The Pantser@lemmy.world · ↑3 · 11 days ago

    OK, I checked it out. Why are there so many people with boobs and 3-foot cocks? Talk about unrealistic body expectations.

    • Regrettable_incident@lemmy.world · ↑0 · 10 days ago

      Yeah, but I’m sure it’ll improve in time. I can’t really see the point of AI porn though, unless it’s to make images of someone who doesn’t consent, which is shit. For everything else, regular porn has it covered and looks better too.

      • rabber@lemmy.ca · ↑0 · 10 days ago

        The porn industry destroys women’s lives, so if AI becomes indistinguishable, then women won’t need to sell their bodies in that way anymore.

            • rabber@lemmy.ca · ↑0 · 10 days ago

            You want me to find a citation explaining why women selling their bodies is bad?

              • bigb@lemmy.world · ↑0 · 10 days ago

              Somehow you’re both partially wrong. There are people who have been badly abused by the porn industry. There’s always a need to make sure people are safe. But there’s nothing wrong if someone wants to willingly perform in pornography.

                • rabber@lemmy.ca · ↑0 · 10 days ago

                There aren’t very many pornstars who don’t regret it. But you can find countless examples of regret.

                  • Nalivai@lemmy.world · ↑0 · 9 days ago

                    But that’s mostly because of you people. You make their lives miserable with pointless moralisation. You are the reason the industry is full of shady monsters; you made it that way with your constant religious fervour.

  • Sans_Seraph@lemmy.world · ↑0 · 9 days ago

    What I find incredible is that they named this thing WanX, which can alternatively be pronounced “Wanks”. Nominative determinism at its finest.