this post was submitted on 27 Jan 2024

NotAwfulTech

Remember how we were told that genAI learns "just like humans", that the law has nothing to say about fair use, and that, I guess, all art now belongs to big tech companies?

Well, of course it's not true. By exploiting a few of the ways in which genAI *is not* like human learners, artists can filter their digital art in such a way that if a genAI tool consumes it, it actively degrades the model, undoing generalization and bleeding into neighboring concepts.

Can an AI tool be used to undo this obfuscation? Yes. At scale, however, doing so requires ever-increasing compute costs. This also looks like an improvable method, not a dead end -- adversarial input design is a growing field of machine learning, with more and more techniques becoming widely available. Think of it as a sort of "cryptography for semantics", in the sense that it imposes asymmetrical work on AI consumers (while leaving the human eye much less affected).

Now we just need labor laws to catch up.

Wouldn't it be funny if not only does generative AI not lead to a boring dystopia, but the proliferation and expansion of this and similar techniques to protect human meaning eventually put a lot of grifters out of business?

We must have faith in the dark times. Share this with your artist friends far and wide!

[–] [email protected] 11 points 11 months ago

@[email protected] can we ban the grifter?

corbin’s track record both on and off awful.systems indicates they aren’t any kind of grifter

in fact, myself and @[email protected] have previously had a conversation about this type of technology (Nightshade and Glaze) where I was initially quite excited about it, but David and others brought up a lot of the same points corbin did here. there were some very solid social points made around the tech too, beyond the licensing and technical points we’ve seen here — should we really be establishing the expectation that artists need to defend their work using this specific proprietary technology? that feels way too close to the bullshit the NFT grifters pulled, where artists could opt out of their work being stolen and sold as an NFT only by following a specific set of steps for each and every NFT market, which doesn’t work.

this kind of tech also opens the door for rent-seeking; techniques like Glaze and Nightshade can be broken by changes to generative models, which would keep artists on a treadmill, continually paying for the latest versions of these proprietary tools, or else. it feels rather like a protection racket run by whoever has the most access to the models — and that’ll always be the same assholes who run the generative AI.

so I ended up with the strong impression that this technology won’t make things better for artists and other folks who are being exploited by the AI industrial complex; that this might not be something with a purely technical solution. and I think I understand your strong reaction to some of the posts here, because that fucking sucks. there isn’t a clean engineering solution to this problem that my increasingly technofascist industry created, and when you grow up being told (by some of the same techfash fucks who’re now behind some of the worst use of technology I can think of) that all you do is engineering, it’s easy to feel helpless.

but we aren’t helpless. technofascism is structured to produce and exploit that feeling when it’s engaged with on a purely technical level, but the systems established by technofascism (LLMs, generative AI, cryptocurrencies, and others) are plainly ridiculous when viewed through any other lens. the technofascist goal isn’t to win on technical merit (there isn’t any), but to normalize ridiculousness. the only way I know to push back against that is social. on a small scale, that’s part of what sneering is — any asshole pushing this ridiculous shit should feel ridiculous doing it, as a lot of the crypto grifters felt when the public at large started sneering at crypto (thanks to the efforts of David, Amy, Molly, and many others). on a larger scale, we desperately need systemic change. as engineers, we’re constantly told we don’t need unions or solidarity, particularly with folks like artists who we’re told are unimportant. it is very intentional that attitudes like that enable technofascism.

if and when we have those social factors established, a version of these tools with less potential for exploitation might be worth considering. but I see it kind of like the relationship between fediverse software and its community — federation is generally a good thing, but it’s absolutely nothing (and would probably be a net negative) without posters who generally want the fediverse to be a cozy place to make good posts; the polar opposite of the utterly hostile commercialized thing the internet at large has become.