catculation@lemmy.zip to Technology@lemmy.world · English · 8 months ago
ASCII art elicits harmful responses from 5 major AI chatbots (arstechnica.com)
33 comments
ArmokGoB@lemmy.dbzer0.com · edited 3 months ago
deleted by creator
pufferfischerpulver@feddit.de · 8 months ago
True, but people generally understand hammers. LLMs? Not so much.
ArmokGoB@lemmy.dbzer0.com · edited 3 months ago
deleted by creator
Akrenion@programming.dev · 8 months ago
A lot of politicians want hardware-level backdoors. It's been declared unconstitutional several times in different countries, but they keep trying.
WallEx@feddit.de · 8 months ago
That would be soooo bad, almost as bad as making a law against encryption.
General_Effort@lemmy.world · 8 months ago
A better example would be something like The Anarchist Cookbook. Possession is illegal in some places.