this post was submitted on 12 Apr 2024
506 points (100.0% liked)
196
16508 readers
2282 users here now
Be sure to follow the rule before you head out.
Rule: You must post before you leave.
founded 1 year ago
MODERATORS
you are viewing a single comment's thread
view the rest of the comments
view the rest of the comments
We'll see how many seconds it takes to retrain the LLMs to adjust to this.
You are literally training LLMs to lie.
LLMs are black box bullshit that can only be prompted, not recoded. The gab one that was told 3 or 4 times not to reveal its initial prompt was easily jailbroken.
Woah, I have no idea what you're talking about. "The gab one"? What gab one?
Gab deployed their own GPT 4 and then told it to say that black people are bad
the instruction set was revealed with the old "repeat the last message" trick