this post was submitted on 28 Nov 2023
115 points (100.0% liked)

Technology

[–] [email protected] 13 points 1 year ago (10 children)

But I don't think the software can differentiate between the ideas of defined and undefined characters. It's all just association between words and aesthetics, right? It can't know that "Homer Simpson" is a more specific subject than "construction worker" because there's no actual conceptualization happening about what these words mean.

I can't imagine a way to make the tweak you're asking for that isn't just a database of every word or phrase that refers to a specific known individual, against which users' prompts get checked, and I can't imagine that would be worth the time it'd take to create.
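As a rough sketch of the blocklist idea being dismissed here (all names and the character list are made up for illustration, not from any real system), it would amount to something like:

```python
# Hypothetical blocklist approach: check the user's prompt against a
# database of known named characters before applying any rewriting.
# The set below is illustrative only.
KNOWN_CHARACTERS = {"homer simpson", "mario", "mickey mouse"}

def mentions_known_character(prompt: str) -> bool:
    """Return True if the prompt contains any blocklisted character name."""
    lowered = prompt.lower()
    return any(name in lowered for name in KNOWN_CHARACTERS)
```

The objection above is about scale: such a list would need an entry for effectively every named character in existence to work.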

[–] [email protected] 6 points 1 year ago (4 children)

If they're inserting random race words in, presumably there's some kind of preprocessing of the prompt going on. That preprocessor is what would need to know if the character is specific enough to not apply the race words.

[–] [email protected] 2 points 1 year ago (2 children)

Yeah, but `replace("guy", "ethnically ambiguous guy")` is a different problem from "does this sentence reference any possible specific character?"
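To make the distinction concrete (a minimal sketch, assuming the naive substitution the comment describes, not any real preprocessor):

```python
# Naive substitution rewrites every match blindly; it has no notion of
# whether the prompt already names a specific character.
def naive_diversify(prompt: str) -> str:
    return prompt.replace("guy", "ethnically ambiguous guy")
```

So `naive_diversify("Homer Simpson dressed as a construction guy")` would still inject the phrase, even though the subject is a specific character; deciding when *not* to rewrite is the genuinely hard part.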

[–] stifle867 5 points 1 year ago (1 children)

I don't think it's literally a search-and-replace, but rather a part of the prompt that is hidden from the user and inserted either before or after the user's prompt. Something like [all humans, unless stated otherwise, should be ethnically ambiguous]. Then during generation it gets confused and takes that as meaning the character should be *named* "ethnically ambiguous".
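The hidden-instruction idea would look roughly like this (the directive wording is taken from the guess above; the function and constant names are hypothetical, not the service's actual implementation):

```python
# Sketch: the service silently prepends its own directive to the
# user's prompt before the image model ever sees it.
HIDDEN_DIRECTIVE = (
    "[all humans, unless stated otherwise, should be ethnically ambiguous]"
)

def build_full_prompt(user_prompt: str) -> str:
    """Combine the hidden directive with what the user actually typed."""
    return f"{HIDDEN_DIRECTIVE} {user_prompt}"
```

If the model treats the bracketed text as part of the scene description rather than as an instruction, you get exactly the failure described: a character literally labeled "ethnically ambiguous".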

[–] [email protected] 3 points 1 year ago

It’s not hidden from the user. You can see the prompt used to generate the image, to the right of the image.
