Disney's Loki faces backlash over reported use of generative AI: A Loki season 2 poster has been linked to a stock image on Shutterstock that seemingly breaks the platform's licensing rules. A promotional poster for the second season of Loki on Disney Plus has sparked controversy amongst professional designers following claims that it was created using generative AI.

[–] [email protected] 15 points 1 year ago (1 children)

Can we talk about how Shutterstock only allows their own AI-generated images? Stock image sites will be the first to face the guillotine of AI generation, and this is how they protect themselves?

Good riddance. I've got my video card and several Stable Diffusion models, which serve me far better than anything I could get for the prices they charge.
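
For anyone curious, here is a minimal sketch of what generating an image locally looks like, assuming the Hugging Face diffusers library, a CUDA-capable GPU, and the publicly hosted Stable Diffusion v1.5 weights (the model ID and prompt here are just examples, not anything from the article):

```python
# Minimal local text-to-image sketch with Hugging Face diffusers.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float16,  # half precision to fit in consumer VRAM
)
pipe = pipe.to("cuda")  # run the model on the local GPU

# One text prompt in, one image out; no stock site involved.
image = pipe("studio photo of a red bicycle on a white background").images[0]
image.save("bicycle.png")
```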

[–] [email protected] 14 points 1 year ago (1 children)

You're not a business whose sole purpose is to sell/license images. If you read the article, it explains that their models are trained using only images from their library, which seems like a sensible approach to avoiding copyright issues.

[–] [email protected] 7 points 1 year ago (3 children)

There are no copyright issues to avoid. Stable Diffusion is not suddenly illegal based on the images it trains on. It is roughly 4 GB of weights and numbers, not a many-petabyte database of images.
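
To make that concrete, here is a minimal sketch, assuming the safetensors library and a locally downloaded Stable Diffusion checkpoint (the filename below is only an example), that walks the file and finds nothing but named tensors of numbers:

```python
# Inspect a Stable Diffusion checkpoint: it contains only weight tensors,
# not any of the training images.
from safetensors import safe_open

total_params = 0
with safe_open("v1-5-pruned-emaonly.safetensors", framework="pt") as f:
    for name in f.keys():
        tensor = f.get_tensor(name)   # a plain array of floating-point numbers
        total_params += tensor.numel()

print(f"~{total_params / 1e9:.2f} billion numeric weights, zero stored images")
```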

Furthermore, Shutterstock cannot copyright its own AI-generated images, no matter how much it wants to sell them back to people. That's already been decided in the courts. So even if the model is trained only on their own images, if an image was fully generated by their AI, anybody is free to take it from their site and use it anywhere they want.

This is a dying industry trying desperately to hold on to its profit model.

[–] [email protected] 4 points 1 year ago (1 children)

Here we get to the very crucial distinction between "legal" and "moral".

It is not currently illegal to build a "database of weights and numbers" by crawling art and images without permission, attribution, or compensation, for the express purpose of creating similar works to replace the work of the artists whose artworks were used to train it and which they rely on to make a living.

That doesn't mean that it shouldn't be legislated.

Really not a fan of this "dying industry" talk in light of this.

[–] [email protected] 2 points 1 year ago* (last edited 1 year ago) (1 children)

It is morally right to be able to use others' copyrighted material without permission for analysis, criticism, research, satire, parody, and artistic expression like literature, art, and music. In the US, fair use balances the interests of copyright holders with the public's right to access and use information. There are rights people can maintain over their work, and the rights they do not maintain have always been to the benefit of self-expression and discussion. It would be awful for everyone if IP holders could take down any review, reverse engineering, or index they didn't like. That would be the dream of every corporation, bully, troll, or wannabe autocrat. It really shouldn't be legislated.

AI training isn't only for mega-corporations. After we've gone through and gutted all of our rights and protections like too many people want to do, we'll have handed corporations a monopoly on a public technology by making it prohibitively expensive for us to keep developing our own models. Mega-corporations will still have all their datasets, and the money to buy more. They might just make users sign predatory ToS too, granting them exclusive access to user data and effectively selling our own data back to us. People who could have had access to a corporate-independent tool for creativity, education, entertainment, and social mobility would instead be worse off, with fewer resources and rights than they started with.

I recommend reading this article by Kit Walsh, a senior staff attorney at the EFF if you haven't already. The EFF is a digital rights group who most recently won a historic case: border guards now need a warrant to search your phone.

You should also read this open letter by artists who have been using generative AI for years, some for decades. I'd like to hear your thoughts.

[–] [email protected] 1 points 1 year ago* (last edited 1 year ago) (1 children)

I have read that article and I have found it sorely insufficient at addressing the concerns of the artists who are having to deal with this new situation. The EFF is usually great but I cannot agree with them on this stance.

You speak of "IP holders" and "corporations", seemingly to give a connotation of overbearing nameless organizations to any attempt at legislation, but you don't have a single word to say about the independent artists who are being driven out of their artistic careers by this. It doesn't sound like you even considered what their side is like, just that you decided that it's "morally right" to have free access to everyone's works for AI training.

How fair is the "Fair Use" that lets artists get replaced by AIs trained on their works? Way too often, AI proponents argue from current legal definitions as if this were merely a matter of some philosophical mind games rather than people's lives. The law exists to ensure people's rights and well-being. It's not sufficient for something to fit the letter of the law if we want to judge it as just.

I did read this open letter, although I already wasn't expecting much, and I can only find it sappy, shallow, and disingenuous. They may say that they don't care about using AI to replicate others' works, but not only is that not sufficient to prevent it, it doesn't address all the artists' works that were still used without permission, attribution, or compensation, even if the resulting AI is used to produce works that don't resemble any other work in particular.

> We see a unique opportunity in this moment to shape generative AI’s development responsibly. The broad concerns around human artistic labor being voiced today cannot be ignored. All too often, major corporations and other powerful entities use technology in ways that exploit artists’ labor and undermine our ability to make a living.

But this has already failed. AI has already been developed and released irresponsibly. Corporations are already using it to exploit artists' labor. Many major models are themselves an exploitation of artists' labor. These are hollow words that don't even suggest a way to address the matter.

There is only one thing I want to hear from AI advocates if they intend to justify it. Not legal wording or technical details or philosophical discussions about the nature of creativity, because ultimately those don't address the material issues. Rather: how do they propose that the artists whose works they relied on ought to be supported? Because to scrape all their stuff and then turn around and say they are fated to be replaced, like many AI proponents do, is horribly callous, ungrateful, and potentially more damaging to culture than any licensing requirement would be.

[–] [email protected] 0 points 1 year ago (1 children)

> You speak of “IP holders” and “corporations”, seemingly to give a connotation of overbearing nameless organizations to any attempt at legislation, but you don’t have a single word to say about the independent artists who are being driven out of their artistic careers by this. It doesn’t sound like you even considered what their side is like, just that you decided that it’s “morally right” to have free access to everyone’s works for AI training.

It is morally right to be able to use copyrighted material for whatever allows people to express themselves and enables the fair, free flow of information. Artists are holders of IP in this case, but they are not corporations. Many seemingly want to go down the same path as abusive organizations like the RIAA. They seek to become abusers themselves and hobble people to keep them from participating in certain conversations. That isn't right.

> How fair is the "Fair Use" that lets artists get replaced by AIs trained on their works? Way too often, AI proponents argue from current legal definitions as if this were merely a matter of some philosophical mind games rather than people's lives. The law exists to ensure people's rights and well-being. It's not sufficient for something to fit the letter of the law if we want to judge it as just.

AIs aren't people; they are tools for people to use, and all people have a right to self-expression, which includes the training of AI. What some people want would give too much power over discourse to a few who have a financial and social incentive to be as controlling as possible. That kind of arrangement would be ripe for abuse and catastrophic for everyone else. Like print media vs. internet publication and TV/radio vs. online video, there will be winners and losers, but I think this will all be in service of a more inclusive, decentralized, and open media landscape.

> I did read this open letter, although I already wasn't expecting much, and I can only find it sappy, shallow, and disingenuous. They may say that they don't care about using AI to replicate others' works, but not only is that not sufficient to prevent it, it doesn't address all the artists' works that were still used without permission, attribution, or compensation, even if the resulting AI is used to produce works that don't resemble any other work in particular.

You simply don't have to compensate someone to analyze public data. That would be like handing someone a flyer for lessons and then trying to collect a fee because they got good at the same kind of thing you do. They put in all the work, and they do new stuff that's all their own.

> We see a unique opportunity in this moment to shape generative AI’s development responsibly. The broad concerns around human artistic labor being voiced today cannot be ignored. All too often, major corporations and other powerful entities use technology in ways that exploit artists’ labor and undermine our ability to make a living.

> But this has already failed. AI has already been developed and released irresponsibly. Corporations are already using it to exploit artists' labor. Many major models are themselves an exploitation of artists' labor. These are hollow words that don't even suggest a way to address the matter.

> There is only one thing I want to hear from AI advocates if they intend to justify it. Not legal wording or technical details or philosophical discussions about the nature of creativity, because ultimately those don't address the material issues. Rather: how do they propose that the artists whose works they relied on ought to be supported? Because to scrape all their stuff and then turn around and say they are fated to be replaced, like many AI proponents do, is horribly callous, ungrateful, and potentially more damaging to culture than any licensing requirement would be.

If I can't use legal wording, technical details, or philosophy, how am I supposed to explain? Your goal seems to be only to avoid or dismiss the complexity, nuance, or validity of any explanation. The best I can do is this: it isn't exploitation to analyze, reverse engineer, critique, or parody. It took us 100,000 years to get from cave drawings to Leonardo da Vinci. This is just another step, like the camera obscura. We're all standing on the shoulders of giants. We learn from each other, and humanity is at its best when we can all share in our advancements. Calling this exploitation is self-serving, manipulative rhetoric that unjustly vilifies people and misrepresents how these models actually work. And I never said anyone was fated to be replaced; you're putting words in my mouth.

Generative AI is free and open source. There is a vibrant community of researchers, developers, activists, and artists working on FOSS software and models for anyone to use. There's a worldwide network working for the public, oftentimes leading research and development, for free. We'd like nothing more than to have more people join us, because together we are stronger.

I understand you're passionate about this topic, and I respect your feelings. But I think you use manipulative language, personal attacks, and misrepresentation of arguments to get out of giving any explanation. You have not provided any support or reasoning, and you ignored the points and facts I presented. The way you talk to people isn't fair, and I don't really feel like continuing this discussion, but thanks for listening.

[–] [email protected] 1 points 1 year ago* (last edited 1 year ago) (1 children)

> You simply don't have to compensate someone to analyze public data.

To call this simply "analysis" is wildly disingenuous. AI isn't simply offering data about those works; it is using them to create effective replacements for those works, at the expense of those artists' careers. Even calling them "public" is reductive, because being publicly displayed in some manner does not mean they ceased to be copyrighted works.

You insist a lot on people's right to self-expression, but restricting certain ways of training AI doesn't mean people can't express themselves. Not only could they still use any other tool, they could simply get permission from the creators or train on public domain works.

Why is having to get permission to use works in AI training considered such a violation of people's rights? Just because older laws were written without consideration for the way AI functions (obviously)?

We also restrict the use of cameras in certain ways and yet photography is a flourishing medium. In a way one could call a camera an analysis tool too, but I don't think arguing this sort of technicality is productive.

Still, you don't seem to consider that the artists driven out of the market by AI will have their capability of expression stifled, simply by not being able to focus on creating.

> If I can't use legal wording, technical details, or philosophy, how am I supposed to explain?

Because you are using those to argue around the situation of the artists whose work was used, and still you will not offer a single word about their situation. Instead you call them "abusers" for trying to preserve their livelihoods. I did offer you arguments for why the law is lacking, but I'm tired of arguing that human rights take precedence over tools, that comparing AI with human creativity is simplistic, and that whatever the inner workings of AIs may be, they don't change the material effects of their use. I'm pretty sure I've had those arguments with you in particular already.

Which is why I want to get to the point: "What about the artists' livelihoods?"

Any other explanation that doesn't address this is missing a crucial aspect. Considering how much AI had to rely on original works, it's strange that the source of those original works is not even given any consideration.

But back to this again:

> You simply don't have to compensate someone to analyze public data. That would be like handing someone a flyer for lessons and then trying to collect a fee because they got good at the same kind of thing you do.

See, this is the kind of talk that gets me incensed, and which makes your talk of "manipulative language" ironic and disingenuous. Are you really going to call the artist's role "handing someone a flyer for lessons"? Not even the lesson proper, since their work is the reference material? There is no possible way such an analogy could be made in good faith; it reveals a profound disregard for the role of the artists in all this. So much for standing on the shoulders of giants. No surprise you want to call artists seeking to protect themselves "abusers" too. It doesn't seem to me like the words of someone honestly interested in candid, open-minded, and respectful discussion.

[–] [email protected] -2 points 1 year ago (1 children)

You're ignoring and misrepresenting what I said again so you can just repeat yourself. I've already covered all of this. If I explain it for a third time you'll just do this again.

[–] [email protected] 1 points 1 year ago (1 children)

Even though I said I was only interested in your take on a single particular point, you completely disregarded it. Even though I said I had no interest in any other aspect, I still responded to several of your arguments.

At this point you are saying "you're ignoring and misrepresenting what I said" as a way to disregard what I said and pretend there are no unaddressed aspects. It's just not true. Ironically, you're saying that in order to ignore and misrepresent my words.

To be fair, you made it more than clear that trying to discuss the matter with you is wasted effort.

[–] [email protected] -1 points 1 year ago (1 children)

Why don't you tell me what this single particular point is instead of writing all of this?

[–] [email protected] 1 points 1 year ago (1 children)

Oh sorry, I thought you wanted me to take what you said into consideration. Is it still remotely unclear?

Pretty sure you are just playing coy now, but I am this stubborn.

Ahem.

How do you suggest that traditional artists ought to be supported in light of the likely possibility that AI trained on their works might make their careers financially unviable?

[–] [email protected] -2 points 1 year ago (1 children)

Yeah, I definitely covered this. The EFF article covered it too.

[–] [email protected] 1 points 1 year ago (1 children)

No, absolutely not. As expected, you are just bullshitting me.

Just to be thorough, this is how:

  • "Just evolve with the tech and use AI too" is a clueless response considering that artistic industries are already highly saturated. There won't even be enough positions for all the AI artists, which they will inevitably find out. It's also weirdly elitist, thinking of AI art as superior to any other form, the only one worthy still dedicating yourself too, an attitude that is already conceited when it happens between traditional and digital artists.

  • "It's just like the industrial revolution" not only is not a solution, rather it is a reference to time in history that, though romanticized, was very troubled. As it was, going from working artisanal crafts to getting their arms chewed by machinery didn't turn out great for the early industrial workers, and it might not ever have if not for people fighting for their rights. Worse than that, back then industries eventually freed people from working on fields to working on offices. Now AI can take them out of the offices, but to where? No, they won't all be AI engineers. How does that benefit the artist that is "freed" into having to work in a sweatshop?

  • Another common response, present in neither your comments nor the article, is "Universal Basic Income will solve this". Even trying to be fully open to the possibility, I have to point out that AI is here today, while UBI only gets trialed every couple of years; people report how great it is, and then we hear nothing about widespread implementation. It likely won't happen without fierce popular pressure.

So, the question of the artists' material conditions is not even remotely addressed.

[–] [email protected] -1 points 1 year ago

I don't know what this is.

[–] [email protected] 1 points 1 year ago* (last edited 1 year ago)

If that's correct, then it's even more understandable why they wouldn't want an avalanche of pictures anyone can use for free on their service of selling pictures.

[–] [email protected] 1 points 1 year ago* (last edited 1 year ago)

I don't get what your point is. Are you trying to generate images with Stable Diffusion and upload them to Shutterstock? Because that's the only situation in which the thing you're complaining about applies. Nobody is stopping you from generating images and using them. What they are doing is preventing you from generating them and then trying to profit from them on the Shutterstock platform, unless you use their tools. Why is this an issue, in your opinion?