mozz@mbin.grits.dev to Technology@beehaw.org · 9 months agoSomeone got Gab's AI chatbot to show its instructionsmbin.grits.devimagemessage-square95fedilinkarrow-up1287arrow-down10file-text
arrow-up1287arrow-down1imageSomeone got Gab's AI chatbot to show its instructionsmbin.grits.devmozz@mbin.grits.dev to Technology@beehaw.org · 9 months agomessage-square95fedilinkfile-text
minus-squaresweng@programming.devlinkfedilinkarrow-up1·9 months agoCan you explain how you would jailbfeak it, if it does not actually follow any instructions in the prompt at all? A model does not magically learn to follow instructuons if you don’t train it to do so.
Can you explain how you would jailbfeak it, if it does not actually follow any instructions in the prompt at all? A model does not magically learn to follow instructuons if you don’t train it to do so.