ugjka@lemmy.world to Technology@lemmy.world · English · 8 months ago
Somebody managed to coax the Gab AI chatbot to reveal its prompt (infosec.exchange)
291 comments
AdmiralRob@lemmy.zip · English · 8 months ago
Technically, it didn’t print part of the instructions, it printed all of them.