• sweng@programming.dev · 9 months ago

    Can you explain how you would jailbreak it, if it does not actually follow any instructions in the prompt at all? A model does not magically learn to follow instructions if you don’t train it to do so.