This post was submitted on 21 Jan 2024
824 points (94.9% liked)

[–] [email protected] 24 points 9 months ago (1 children)
[–] [email protected] 4 points 9 months ago (1 children)

Yeah, that's what I'm saying - our current copyright laws are insufficient to deal with AI art generation.

[–] [email protected] 10 points 9 months ago (1 children)

They aren't insufficient; they're working just fine. In the US, fair use balances the interests of copyright holders with the public’s right to access and use information. There are rights people can maintain over their work, and the rights they do not maintain have always been to the benefit of self-expression and discussion. We shouldn't be trying to make that any worse.

[–] [email protected] 10 points 9 months ago (2 children)

Yep. Copyright should not include "viewing or analyzing the picture" rights. Artists want to start charging you or your software to even look at art they literally put out for free. If you don't want your art seen by a person or an AI, then don't publish it.

[–] [email protected] 3 points 9 months ago (1 children)

Copyright should absolutely include analyzing when you're talking about AI, and for one simple reason: companies are profiting off of the work of artists without compensating them. People want the rewards of work without having to do the work. AI has the potential to be incredibly useful for artists and non-artists alike, but these kinds of people are ruining it for everybody.

What artists are asking for is ethical sourcing for AI datasets. We're talking about paying a licensing fee or using free art that's opt-in. Right now, artists have no choice in the matter - their rights to their works are being violated by corporations. The music industry has already made it illegal to use songs in AI without the artist's permission. You can't just take songs and make your own synthesizer out of them, then sell it. If you want music for something you're making, you either pay a licensing fee of some kind (like paying for a service) or use free-use songs. That's what artists want.

When an artist who does art for a living posts something online, it's an ad for their skills. People want to use AI to take the artist out of the equation. And doing so will result in creativity only being possible for people wealthy enough to pay for it. Much of the art you see online, and almost all the art you see in a museum, was paid for by somebody. Van Gogh died a poor man because people didn't want to buy his art. The Sistine Chapel was commissioned by a Pope. You take the artist out of the equation and what's left? Just AI art made as a derivative of AI art that was made as a derivative of other art.

[–] [email protected] 1 points 9 months ago (1 children)

You should check out this article by Kit Walsh, a senior staff attorney at the EFF. The EFF is a digital rights group that recently won a historic case: border guards now need a warrant to search your phone.

[–] [email protected] 3 points 9 months ago (1 children)

MidJourney is already storing pre-rendered images made from and mimicking around 4,000 artists' work. The derivative works infringement is already happening right out in the open.

[–] [email protected] 3 points 9 months ago (1 children)

Something being derivative doesn't mean it's automatically illegal or improper.

First, copyright law doesn’t prevent you from making factual observations about a work or copying the facts embodied in a work (this is called the “idea/expression distinction”). Rather, copyright forbids you from copying the work’s creative expression in a way that could substitute for the original, and from making “derivative works” when those works copy too much creative expression from the original.

Second, even if a person makes a copy or a derivative work, the use is not infringing if it is a “fair use.” Whether a use is fair depends on a number of factors, including the purpose of the use, the nature of the original work, how much is used, and potential harm to the market for the original work.

Even if a court concludes that a model is a derivative work under copyright law, creating the model is likely a lawful fair use. Fair use protects reverse engineering, indexing for search engines, and other forms of analysis that create new knowledge about works or bodies of works. Here, the fact that the model is used to create new works weighs in favor of fair use, as does the fact that the model consists of original analysis of the training images in comparison with one another.

You are expressly allowed to mimic others' works as long as you don't substantially reproduce their work. That's a big part of why art can exist in the first place. You should check out that article I linked.

[–] [email protected] 2 points 9 months ago (1 children)

I actually did read it; that's why I specifically called out MidJourney here, as they're one I have specific problems with. MidJourney is currently caught up in a lawsuit partly because the devs were caught talking about how they launder artists' works through a dataset to then create prompts specifically for reproducing art that appears to be made by a specific artist of your choosing. You enter an artist's name as part of the generating parameters and you get a piece trained on their art. Essentially using an LLM to run an art-tracing scheme while skirting copyright law.

I wanna make it clear that I'm not on the "AI evilllll!!!1!!" train. My stance is specifically about ethical sourcing for AI datasets. In short, I believe that AI specifically should have an opt-in requirement rather than an opt-out requirement or no choice at all. Essentially Creative Commons-style licensing for works used in datasets, to ensure that artists are duly compensated for their works being used. This would allow artists to license out their portfolios for a fee or make them openly available for use, however they see fit, while still ensuring that they have the ability to protect their job as an artist from stuff like what MidJourney is doing.

[–] [email protected] 2 points 9 months ago (1 children)

> I actually did read it; that’s why I specifically called out MidJourney here, as they’re one I have specific problems with. MidJourney is currently caught up in a lawsuit partly because the devs were caught talking about how they launder artists’ works through a dataset to then create prompts specifically for reproducing art that appears to be made by a specific artist of your choosing. You enter an artist’s name as part of the generating parameters and you get a piece trained on their art. Essentially using an LLM to run an art-tracing scheme while skirting copyright law.

I'm pretty sure that's all part of the discovery from the same case where Midjourney is named as a defendant along with Stability AI; it isn't its own distinct case. It's also not illegal or improper to do what they are doing. They aren't skirting copyright law; what they're doing is explicitly allowed by it so that you can communicate without the fear of reprisals. Styles are not something protected by copyright, nor should they be.

> I wanna make it clear that I’m not on the “AI evilllll!!!1!!” train. My stance is specifically about ethical sourcing for AI datasets. In short, I believe that AI specifically should have an opt-in requirement rather than an opt-out requirement or no choice at all. Essentially Creative Commons-style licensing for works used in datasets, to ensure that artists are duly compensated for their works being used. This would allow artists to license out their portfolios for a fee or make them openly available for use, however they see fit, while still ensuring that they have the ability to protect their job as an artist from stuff like what MidJourney is doing.

You can't extract compensation from someone doing their own independent analysis with the aim of making non-infringing novel works, and you don't need licenses or permission to exercise your rights. Singling out AI doesn't make sense because it isn't a special system in that regard. That would be like saying Dolphin's developers have to pay Nintendo every time someone downloads their emulator.

[–] [email protected] 1 points 9 months ago (1 children)

You do realize that you basically just confirmed every fear that artists have over AI, right? That they have no rights or protections to prevent anybody from coming along and using their work to train an LLM to create imitation works for cheaper than they can possibly charge for their work, thereby putting them out of business? Because in the end, a professional in any field is nothing more than the sum of the knowledge and experience they've accrued over their career; a "style" as you and MidJourney put it. And so long as somebody isn't basically copy+pasting a piece, then it's not violating copyright, because it's not potentially harming the market for the original piece, even if it is potentially harming the market for the creator of said piece.

The Dolphin analogy is also incorrect (though an interesting choice considering they got pulled from the Steam store after the threat of legal action by Nintendo, but I think you and I feel the same way on that issue - Dolphin has done nothing wrong). A better analogy would be if Unreal created an RPGMaker-style tool for generating an entire game of any genre you want in Unreal Engine at the push of a button by averaging a multitude of games across different genres to generate the script. If they didn't get permission to use said games, either by paying a one-time fee, an ongoing fee, or using games that expressly give permission for said use, I'm sure the developers/publishers would be rather unhappy with Unreal. Could it be incredibly beneficial and vastly improve the process of creating games for the industry? Absolutely. If they released it for free, could it be used by anybody and everybody to make imitation games from Ubisoft or any other developer, and run the risk of strangling the industry with even more trash games with no soul in them? Also absolutely. And a big AAA publisher has a lot more ability to deal with knock-offs/competition like that than your average starving artist. The indie game scene is the strongest it's ever been thanks to the rise of digital storefronts, but how many great indie game developers go under after producing their first game and never make a second? The vast majority. Because indie games almost never make a profit, meaning they can't afford to make another.

The issue with AI is that it opens a whole can of worms in the form of creating an industrial-scale imitation generator that anybody can use at the push of a button. And the general public have long made known their disdain for properly compensating artists for the work that they do, and have already been gleefully doing what corporations do by using AI to avoid having to hire artists. This runs the risk of creating a chilling effect in the field of creativity and the arts, as your average independent artist can no longer afford to keep doing art thanks to the wonders of capitalism. There will always be people who do art as a hobby, but professional artists as we think of them today? Why go into debt by training at an art school if all your job prospects have been replaced because people generate art for free with some form of LLM instead of hiring artists? I myself never went into art beyond a hobby level despite wanting to because of how abysmal the job prospects were even 15 years ago. And I simply cannot afford to do it as much as I'd like (if at all) between work, the time investment, and the expense of it. And that's not even getting into the issues of LLM-generated porn of people, advertisements generated using the voices of dead (and still living) celebrities, scams made using the voices of relatives, and all the other ethical issues.

I used to work at a fish market with a kid who was a trained electrician, set to follow in the footsteps of his dad, who had been one of the highest-paid electricians in the US, except he gave up on it because the thing he liked doing the most in the field had been replaced with a machine by the time he graduated from technical school. Obviously the machine is more efficient (and probably safer), but instead of entering the field at all, he ended up working a job he hated and to this day has never found a job he has any passion for. What happens to art when the only professional artists are NEETs, who have minimal living expenses, and those hired by corporations and the wealthy? Are we going to get the fine art market on steroids, with the masses only having access to AI-generated art that will degrade in quality over time as the only new inputs are previous AI-generated pieces, unless there are enough hobby artists to provide sufficient new art, while the wealthy hold a monopoly on human-made art that the rest of us will probably never see?

This is all pure speculation, but it's the Jurassic Park question: "Your scientists were so preoccupied with whether or not they could, they didn't stop to think if they should."

[–] [email protected] -1 points 9 months ago (1 children)

> You do realize that you basically just confirmed every fear that artists have over AI, right? That they have no rights or protections to prevent anybody from coming along and using their work to train an LLM to create imitation works for cheaper than they can possibly charge for their work, thereby putting them out of business? Because in the end, a professional in any field is nothing more than the sum of the knowledge and experience they’ve accrued over their career; a “style” as you and MidJourney put it. And so long as somebody isn’t basically copy+pasting a piece, then it’s not violating copyright, because it’s not potentially harming the market for the original piece, even if it is potentially harming the market for the creator of said piece.

A professional career can't be reduced to a style. There's a lot more that goes into art than styles.

> The Dolphin analogy is also incorrect (though an interesting choice considering they got pulled from the Steam store after the threat of legal action by Nintendo, but I think you and I feel the same way on that issue - Dolphin has done nothing wrong). A better analogy would be if Unreal created an RPGMaker-style tool for generating an entire game of any genre you want in Unreal Engine at the push of a button by averaging a multitude of games across different genres to generate the script. If they didn’t get permission to use said games, either by paying a one-time fee, an ongoing fee, or using games that expressly give permission for said use, I’m sure the developers/publishers would be rather unhappy with Unreal. Could it be incredibly beneficial and vastly improve the process of creating games for the industry? Absolutely. If they released it for free, could it be used by anybody and everybody to make imitation games from Ubisoft or any other developer, and run the risk of strangling the industry with even more trash games with no soul in them? Also absolutely. And a big AAA publisher has a lot more ability to deal with knock-offs/competition like that than your average starving artist. The indie game scene is the strongest it’s ever been thanks to the rise of digital storefronts, but how many great indie game developers go under after producing their first game and never make a second? The vast majority. Because indie games almost never make a profit, meaning they can’t afford to make another.

Profit shouldn't be the sole motivator for creative endeavors. If a tool like the one you describe existed, we wouldn't have to be able to "afford" to make things. We could have more collaborative projects like SCP, but with richer, more fleshed-out detail. I certainly don't spend my time lamenting the fact that I can't monetize every one of my comments and posts.

> I used to work at a fish market with a kid who was a trained electrician, set to follow in the footsteps of his dad, who had been one of the highest-paid electricians in the US, except he gave up on it because the thing he liked doing the most in the field had been replaced with a machine by the time he graduated from technical school. Obviously the machine is more efficient (and probably safer), but instead of entering the field at all, he ended up working a job he hated and to this day has never found a job he has any passion for. What happens to art when the only professional artists are NEETs, who have minimal living expenses, and those hired by corporations and the wealthy? Are we going to get the fine art market on steroids, with the masses only having access to AI-generated art that will degrade in quality over time as the only new inputs are previous AI-generated pieces, unless there are enough hobby artists to provide sufficient new art, while the wealthy hold a monopoly on human-made art that the rest of us will probably never see?

Your problem is with Capitalism. Your friend is a victim of the capitalist logic of prioritizing cost-cutting over human well-being. The question of what happens to art under capitalism is also a valid one, as capitalism tends to reduce everything into a product that can be bought and sold, but I think the potential outcomes for art are less predetermined than you make them seem. As long as we keep encouraging and nurturing diverse voices, I think we can come out winners.

> This is all pure speculation, but it’s the Jurassic Park question: “Your scientists were so preoccupied with whether or not they could, they didn’t stop to think if they should.”

I don't think it's wrong to give people a free tool to expand their ability to communicate and collaborate.

[–] [email protected] 1 points 9 months ago (1 children)

See, I agree with pretty much everything you say here, because my problem (and that of the artists who are opposed to AI) is with Capitalism, full stop. People have to monetize skills in order to survive if they want to spend their time doing something they love. Even hobbies have now become "side hustles." Many indie game studios start out as a hobby, before the people working on the game use their savings to move to developing full-time as their job so they can work on their passion more. And then they don't turn a profit and have to go back to making smaller projects as hobbies while they do something else for work. This is where the fear is - artists love making art, but if you do it professionally, AI that mimics your art is basically on the same level as knock-off products of name-brand designs.

My issue with MidJourney, for example, wouldn't be an issue if the concern over taking business away from artists was made moot. You say that a professional career can't be reduced to a style, but then what is MidJourney doing, and what is the difference? Because 4,000 of the 5,000 "style" prompts that you can input into MidJourney are artists' names, and according to the Discord logs that list is apparently growing - somebody mentioned having a list of 15,000 new artists' names to add to the prompts after they scrape their art. You can say "make me an impressionist landscape," but you can also say "make me a landscape by Sara Winters." Would having MidJourney make you paintings by a specific artist and then selling them be okay? Is that just a style or is it copyright infringement? Because I can easily see the case where that could be considered damaging to Sara's (in this example) "market" as a professional, even if you aren't selling the paintings you make. Because MidJourney is explicitly providing the tools to create work that mimics her art with the intent of cutting her out of the equation. At that point, have we crossed the line into forgery?

We unfortunately live in a capitalist society and we have to keep that in mind. People need to market their skills as a job in order to afford the essentials to live. Beyond a certain point, the time investment in something demands that you make money doing it. AI as a tool has the capability to be absolutely monumentally helpful; we could even see a fundamental paradigm shift in how we think of the act of creativity. But it also has the possibility to be monstrously harmful, as we've already seen with faked nudes of underage teens and false endorsements for products and political campaigns. Somebody tried to threaten an artist by claiming they had the copyright to a picture the artist was working on live on Twitch, after they took a screenshot of it and supposedly ran it through some sort of image generator. There was even a DA who somebody tried to scam using an AI-generated copy of his son's voice, claiming that he was in prison. Letting it be unregulated is incredibly risky, and that goes for corporate AI uses as well. We need to be able to protect ourselves from them as much as we need to be able to protect ourselves from bad actors. And part of that is saying what is and what isn't an acceptable use of AI and the data that goes into training it. Otherwise, people are going to use stuff like Nightshade to attempt to protect their livelihoods from a threat that may or may not be imagined.

[–] [email protected] 0 points 9 months ago* (last edited 9 months ago)

> My issue with MidJourney, for example, wouldn’t be an issue if the concern over taking business away from artists was made moot. You say that a professional career can’t be reduced to a style, but then what is MidJourney doing, and what is the difference? Because 4,000 of the 5,000 “style” prompts that you can input into MidJourney are artists’ names, and according to the Discord logs that list is apparently growing - somebody mentioned having a list of 15,000 new artists’ names to add to the prompts after they scrape their art. You can say “make me an impressionist landscape,” but you can also say “make me a landscape by Sara Winters.” Would having MidJourney make you paintings by a specific artist and then selling them be okay? Is that just a style or is it copyright infringement? Because I can easily see the case where that could be considered damaging to Sara’s (in this example) “market” as a professional, even if you aren’t selling the paintings you make. Because MidJourney is explicitly providing the tools to create work that mimics her art with the intent of cutting her out of the equation. At that point, have we crossed the line into forgery?

You should read the article I linked earlier. There's no problem as long as you're not using their name to sell your works. Styles belong to everyone, no one person can lay claim to them.

Specific expressions deserve protection, but wanting to limit others from expressing the same ideas differently is both selfish and harmful, especially when they aren't directly copying or undermining your work.

> We unfortunately live in a capitalist society and we have to keep that in mind. People need to market their skills as a job in order to afford the essentials to live. Beyond a certain point, the time investment in something demands that you make money doing it. AI as a tool has the capability to be absolutely monumentally helpful; we could even see a fundamental paradigm shift in how we think of the act of creativity. But it also has the possibility to be monstrously harmful, as we’ve already seen with faked nudes of underage teens and false endorsements for products and political campaigns. Somebody tried to threaten an artist by claiming they had the copyright to a picture the artist was working on live on Twitch, after they took a screenshot of it and supposedly ran it through some sort of image generator. There was even a DA who somebody tried to scam using an AI-generated copy of his son’s voice, claiming that he was in prison. Letting it be unregulated is incredibly risky, and that goes for corporate AI uses as well. We need to be able to protect ourselves from them as much as we need to be able to protect ourselves from bad actors. And part of that is saying what is and what isn’t an acceptable use of AI and the data that goes into training it. Otherwise, people are going to use stuff like Nightshade to attempt to protect their livelihoods from a threat that may or may not be imagined.

We already have countless laws for the misuse of computer systems, and they adequately cover these cases. I'm confident we'll be able to deal with all of that and reap the benefits.

Open-source AI development offers critical solutions. By making AI accessible, we maximize public participation and understanding, foster responsible development, and prevent harmful control attempts. Their AI will never work for us, and look at just who is trying their hand at regulatory capture. I believe John Carmack put it best.

[–] [email protected] 2 points 9 months ago* (last edited 9 months ago)

It's sad some people feel that way. That kind of monopoly on expression and ideas would only serve to increase disparities and divisions, manipulate discourse in subtle ways, and in the end, fundamentally alter how we interact with each other for the worse.

What they want would score a huge inadvertent home run for corporations and swing the doors open for them to hinder competition, stifle undesirable speech, and monopolize spaces like nothing we’ve seen before. There are very good reasons we have the rights we have, and there's nothing good that can be said about anyone trying to make them worse.

Also, rest assured they'd collude with each other and only use their new powers to stamp out the little guy. It'll be like American ISPs busting attempts at municipal internet all over again.