this post was submitted on 14 Jul 2023
4 points (75.0% liked)

Videos


For sharing interesting videos from around the Web!

Rules

  1. Videos only
  2. Follow the global Mastodon.World rules and the Lemmy.World TOS while posting and commenting.
  3. Link directly to the video source, not, for example, an embedded video in an article.
  4. Don't be a jerk
  5. No advertising
  6. Avoid clickbait titles. (Tip: Use dearrow)


Hi! I am a former VFX artist for a few [adultswim] shows, and I have been using generative imagery for about a year now. My take is nuanced, with neither a cheering endorsement nor a condemnation of AI imagery. If anyone has questions or comments, please post them below. Thanks for watching.

top 6 comments
[–] [email protected] 1 points 1 year ago* (last edited 1 year ago) (2 children)

Considering the number of bad uses these images can be put to, what kind of safeguards can we put in place to stop them from being abused?

Preferably ones that aren't invasive to our privacy. After all, the better the images get, the less we will be able to believe unless we see it with our own eyes.

[–] [email protected] 2 points 1 year ago (1 children)

Can you give an example of what you mean by "bad uses"?

[–] [email protected] 1 points 1 year ago (1 children)

Of course. Let's say that in a while you cannot trust a video of someone committing a homicide, simply because it could be fake.

The opposite is also true: you can shatter the public's trust in an individual without him actually saying or doing any of the things he's accused of.

[–] [email protected] 1 points 1 year ago (1 children)

OK, thanks - so you are asking about protections from misinformation, deepfakes and such.

As technology improves, it may become downright impossible to tell real from fake with our own eyes, at which point what counts as "proof" becomes blurry. It will become a "this is why we can't have nice things..." situation, where innocents are at risk of harm (non-AI art getting rejected from competitions because it looks like AI art) and bad actors more often get away with shenanigans. Hopefully we're smart enough to figure out ways to avoid that kind of future.

However, I don't think restricting the technology itself, through legislation or otherwise, would be practical, nor would it be very effective. Forgery and deception are age-old concepts, and people aren't going to stop trying to cheat/lie/steal. Some people (VFX artists?) can probably already make a believable fake homicide. And just look at all the fake UFO footage out there: we don't really need AI to deceive people; AI just makes deception more accessible, and perhaps now within reach for some lowlife who needs to cheat to be successful in life. Most countries already have laws in place against fraud, forgery, and libel - things that hurt others. It would be very difficult to regulate "misinformation" itself, though, because it can overlap with legitimate uses such as art and entertainment.

Of course, it would be nice to have only "ethical" AI - and this is what you are starting to see in the commercial space - but it is pretty easy to bypass these restrictions (not endorsing this, just an example of a quick search result). Also, not all AI systems will even bother trying to be ethical, and once the technology is more accessible, bad actors could just make their own AI systems from scratch. I also think any attempt at restriction through legal means would significantly hinder legitimate research in the field and slow progress on what may be our best chance at overcoming humanity's biggest challenges (climate change, etc.).

I like to think of AI as an extension of the human intellectual tool set - so let's not treat it like guns or drugs (physical things) but rather like libraries or the internet. Regulated to a practical extent, yes, but not really restricted with regard to what it can do. The fact that the internet was not highly regulated or highly controlled during its inception is a major part of why it is the amazing global network we have today.

[–] [email protected] 1 points 1 year ago

OK, you make a great point. I just don't know that comparing it to a library would be wise; the gun metaphor is the correct one. It's not like we need to make any real effort to use these AIs - we just need a couple of prompts. Even the slowest of us could use them, whereas a library involves research and at least the hard work of collecting it all.

Yes, I agree with you on the restrictions; we would just end up hiding information in order to enforce those laws, so it would be worse in the long term. But I mean, the dangers of this technology are infinite. Should everyone have access to it, or just a few people who have gone through training and hold licenses?

Like I said, I feel like this is going to be a Wild West type of scenario, where those who are good with guns are free to do as they want.

What would be better: America's approach to guns, or the European one?

[–] [email protected] 2 points 1 year ago

what kind of safeguards can we put in place to stop them from being abused

Realistically, none. Stable Diffusion is open source, for better or worse, and that means people can tailor it to their needs.
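
To give a sense of how low the barrier is, here is a minimal sketch of running an open-source image model locally, assuming the Hugging Face diffusers library; the checkpoint and prompt are just illustrative examples, not anything specific to this discussion:

```python
# Minimal sketch: generating an image locally with an open-source model
# via the Hugging Face "diffusers" library. The checkpoint and prompt
# below are illustrative; any compatible checkpoint works the same way.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",  # example open checkpoint
    torch_dtype=torch.float16,
)
pipe = pipe.to("cuda")  # runs on a single consumer GPU

# One prompt in, one image out - no gatekeeping beyond downloading the weights.
image = pipe("a photorealistic street scene at night").images[0]
image.save("output.png")
```

Because the weights and code are freely downloadable, any content filter or watermarking added on top can simply be removed or retrained away by whoever runs it locally.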
