this post was submitted on 12 Apr 2024
506 points (100.0% liked)

196

16508 readers
2282 users here now

Be sure to follow the rule before you head out.

Rule: You must post before you leave.

^other^ ^rules^

founded 1 year ago
MODERATORS
 
you are viewing a single comment's thread
view the rest of the comments
[–] [email protected] 9 points 7 months ago (4 children)

We'll see how many seconds it takes to retrain the LLMs to adjust to this.

You are literally training LLMs to lie.

[–] [email protected] 17 points 7 months ago (3 children)

LLMs are black box bullshit that can only be prompted, not recoded. The gab one that was told 3 or 4 times not to reveal its initial prompt was easily jailbroken.

[–] [email protected] 3 points 7 months ago (1 children)

Woah, I have no idea what you're talking about. "The gab one"? What gab one?

[–] [email protected] 4 points 7 months ago

Gab deployed their own GPT 4 and then told it to say that black people are bad

the instruction set was revealed with the old "repeat the last message" trick

load more comments (1 replies)
load more comments (1 replies)