Florencia (she/her)@lemmy.blahaj.zone to Technology@lemmy.world · English · 17 days ago

**Grok's 'spicy' video setting instantly made me Taylor Swift nude deepfakes** (www.theverge.com)
SnausagesinaBlanket@lemmy.world · edited · 17 days ago

> Under what law?

The TAKE IT DOWN Act. On April 28, 2025, Congress passed S. 146, the TAKE IT DOWN Act, a bill that criminalizes the nonconsensual publication of intimate images, including "digital forgeries" (i.e., deepfakes), in certain circumstances.
michaelmrose@lemmy.world · 17 days ago

Is providing it over a private channel to a single user "publication"? I suspect that you will have to directly regulate image generation.
SnausagesinaBlanket@lemmy.world · 17 days ago

> you will have to directly regulate image generation

It's already being done to help prevent fake CSAM. That should have been standard from the start.
Ulrich@feddit.org · 17 days ago

Hmm, interesting, thanks. Has anyone been charged or convicted with this law yet?
Cethin@lemmy.zip · 16 days ago

Definitely not convicted; that would be remarkably fast for a law passed months ago. However, your insistence that it hasn't happened yet so can't happen is insane. Every conviction under a new law has a first case, before which it had never happened.
Ulrich@feddit.org · edited · 16 days ago

> your insistence that it hasn't happened yet so can't happen is insane

It would be insane if that were what I had insisted, but I didn't. You just made it up.
Cethin@lemmy.zip · 15 days ago

> Based on what? Who have you seen be convicted of making deepfake porn? Under what law?

Then you're provided a law under which it'd be illegal, and you reply:

> Hmm, interesting, thanks. Has anyone been charged or convicted with this law yet?

This seems to heavily imply you don't believe it's illegal until someone's been convicted.