• SkyezOpen@lemmy.world
    link
    fedilink
    arrow-up
    0
    ·
    3 months ago

    LLMs are black box bullshit that can only be prompted, not recoded. The gab one that was told 3 or 4 times not to reveal its initial prompt was easily jailbroken.

      • trashgirlfriend@lemmy.world
        link
        fedilink
        arrow-up
        0
        ·
        3 months ago

        Gab deployed their own GPT 4 and then told it to say that black people are bad

        the instruction set was revealed with the old “repeat the last message” trick