this post was submitted on 16 Oct 2023
98 points (91.5% liked)

Technology


Deepfake Porn Is Out of Control: New research shows the number of deepfake videos is skyrocketing, and the world's biggest search engines are funneling clicks to dozens of sites dedicated to the nonconsensual fakes.

all 35 comments
[–] [email protected] 113 points 1 year ago* (last edited 1 year ago) (1 children)

From my perspective, deep fakes will lead to a short but massive spike of harassment until everyone is aware of the technology and its capabilities. Once the technology reaches the mainstream and everyone can generate such content with ease, people will simply stop caring. If these videos are everywhere, it's easy to play one off as a fake. It might even help victims of actual revenge porn. Virtual nudity will become less of a big deal, probably even in real life.

From my perspective, the bigger issue with deep fakes is news. We already have a huge problem with lies on social media, and even on TV and in newspapers, and once we can no longer trust what we see, it will be incredibly hard to build up trust in any source.

Fake videos of politicians will be spread to harm their credibility, fake videos of war crimes to justify an attack. Or vice versa: if there's an authentic video of a crime, the offenders will simply deny its authenticity. But in contrast to Trump's "fake news" claims today, it will be more or less impossible for ordinary people to fact-check anything.

[–] [email protected] 21 points 1 year ago (1 children)

Although not related to porn, a lot of scam operations based in India already use this as a defense. It's extremely hard to get someone in that field in trouble, because you need evidence to justify a raid, and it can't be video or audio, since they claim the medium in question is a deepfake.

[–] [email protected] 77 points 1 year ago* (last edited 1 year ago) (2 children)

This is a sad article to read. I'm not a woman, nor am I a young adult growing up with all this technology that can be leveraged against me. Can you imagine being a junior high or high school student and having an anonymous classmate create deepfake porn of you from your yearbook photo? The children in your class gossiping about you, sharing your porn video/photo online with their friends, and you enduring that harassment? It's already well documented what damage too much pornography does to our psychological development; now imagine the consumers of this content being around the victim. That harassment can get so much worse.

I can't even begin to fathom what kind of psychological damage this will cause to the youth. I feel for women everywhere - this is a terrible thing people are doing with this technology. I can't imagine raising a daughter in this environment and trying to help her navigate this problem when some asshole creates deepfake porn of her. My niece is currently getting bullied in school - what if her bullies use these tools against her? This just makes my blood boil.

It's bad enough that, since social media rose and captured the attention spans of kids and teenagers, there has been a well-documented correlation with an increase in suicide rates since 2009 (around when Twitter went mainstream). https://www.health.com/youth-suicide-rate-increase-cdc-report-7551663 . Now there's a nonconsensual AI-generated porn era to navigate.

These are dangerous times. This opens people up to attack, and regulation that adds friction to accessing these tools is one of the most important next steps. Granted, outright bans never work (the persistent will always get their hands on it), but we need controls in place to limit access. Then we can address the root causes of these problems (e.g., proper systemic education, a modified sex education in schools covering things like consent, etc.).

EDIT:

Wanted to also add, after I posted this, that a common argument I hear parroted by people is this:

  • People are gonna do this AI generation anyway. It'll get to the point that you won't be able to tell what's real or not, so women can just deny it. You can't prove it's real anyway, so why bother?

This is another way of saying "boys will be boys" and ignoring the problem. The problem is harassment and violence against women.

[–] [email protected] 11 points 1 year ago (2 children)

Where did all the replies to this post go? There was an entire discussion that is now gone, and nothing in the modlog.

[–] [email protected] 9 points 1 year ago

After some testing, it might be that the parent commenter just deleted their comment, which nuked all the child comments. I can't remember if this is what Reddit does; I think it just says "Deleted by creator" but keeps the children. Could certainly be wrong, though.

[–] [email protected] 6 points 1 year ago* (last edited 1 year ago) (1 children)

Well, that doesn't bode well.

[–] [email protected] 7 points 1 year ago (2 children)

I’ve found that if someone deletes their comment, then everything below it disappears.

[–] [email protected] 7 points 1 year ago* (last edited 1 year ago) (1 children)

Yup, it appears that our entire comment chain got nuked. So it is now confirmed that if you delete the parent, then all children get removed as well.


For any reading this message, the context is that we tested it by me replying to OP's previous comment, then OP responding to me, then I deleted my comment to see if their comment also got deleted.

[–] [email protected] 2 points 1 year ago

From my testing it only removes them, but you should be able to go into them again by clicking on the reply in your inbox.

[–] [email protected] -1 points 1 year ago* (last edited 1 year ago) (1 children)

This is another way of saying "boys will be boys" and ignoring the problem.

I don't think that's at all similar. "Boys will be boys" is "we know it's bad, but we can't stop them."

The argument is... is it really bad? After all, isn't it the "scandal" that really causes the damage? It's not like any harm is done directly to the person; someone could've already done this to me, and I'd be none the wiser. It's when they start sharing it, and society reacts as if it's real and there's something scandalous, that there's a problem.

If we stop considering it scandalous... The problem kind of goes away... It's not much different than AI photoshopping a hat on someone that they may or may not approve of.

This opens persons up for attack, and regulation to increase friction to access these tools is one of the next most important steps to take.

I've never researched these tools or used them... But I'd wager that's going to be next to impossible. If you think the war on drugs was bad... A war on a particular genre of software would be so much worse.

Like a lot of things... I think this is a question of how do we adapt to new technology not how do we stop it. If I actually believed this was stoppable, I might agree with you... But it actually seems more dangerous if we try and make the tools hard to obtain vs just giving people plausible deniability.

You mentioned bullying; definitely empathetic to that. I don't know that this would really make things worse compared with the "I heard Katie ..." rumor crap that's been going on for decades. Feminism has argued for taking the power away by removing the taboo around women having sex lives, and that seems equally relevant here.

Either way, it really seems like a lot more research is needed.

[–] [email protected] 6 points 1 year ago (1 children)

"Just stop considering it scandalous" is a severe lack of imagination. Even if/when the stigma of "having a sex life" is gone, the great majority of people consider their sex life to be private. Video floating around that looks like you having sex is a very different thing to hearsay rumors.

Keep in mind that the exact same techniques could be used to sabotage adult relationships, marriages, careers, just as easily as teenage bullying. This isn't a problem society can shrug away by saying sex should be less stigmatized.

[–] [email protected] 0 points 1 year ago* (last edited 1 year ago)

Keep in mind that the exact same techniques could be used to sabotage adult relationships, marriages, careers, just as easily as teenage bullying.

And a "video" should ruin those things why?

Literally everything you listed is because society is making a big stink of things that don't matter.

Why should your job care ... even if it's real?

If somebody didn't cheat and there's no other reason to believe that than a ... suspiciously careless video of someone that looks like them... Why in the world should that end their relationship?

Not to mention, AI isn't going to get the details right. It's going to get the gist of it right, but anyone who's actually seen you naked is presumably going to be able to find some details that are off.

Also in terms of privacy, your privacy wasn't violated. Someone made a caricature of you.

Video floating around that looks like you having sex is a very different thing to hearsay rumors.

It's really not, the only reason it is, is because video has been trustworthy for the past century, now it's not.

I hope you folks down voting me have some magic ace up your sleeve, but I see no way past this other than through it. Just like when the atom bomb was invented, it's technology that exists now and we have to deal with it. Unlike the atom bomb, it's just a bunch of computer code and at some point pretty much any idiot is going to be able to get their hands on convincing versions of it. Also unlike the atomic bomb, it can't actually kill you.

[–] [email protected] 28 points 1 year ago (1 children)

AI and deepfakes aren't going to stop. Schools need to get with the times rather than pretending like it's the year 1960.

Teachers should be able to deliver meaningful punishments to students. If someone gets caught passing these around, then that person should catch some flak. And none of that punishing the victim and the bully like most schools do.

[–] [email protected] 0 points 1 year ago (1 children)

Schools won't care. They never did and never will. This will just be a new era for bullies to use.

They should actually teach about it. Maybe even teach how to use it.

[–] [email protected] 2 points 1 year ago

Well then start making videos of the staff that doesn't care.

[–] [email protected] 22 points 1 year ago (1 children)

That’s disgusting. Which particular sites are out of control?

[–] [email protected] 10 points 1 year ago* (last edited 1 year ago)

How would you police this without direct abuse?

It’s pretty easy to spot deep fakes, even now. The type of porn being created with deep fakes is just too unbelievable when it comes to the actors and actresses. They’re not deep-faking intimate love scenes; it’s nearly always straight-up hardcore pornography being made when deep fakes are involved. I feel like everything described as so evil is a straw-man argument. Hell, anyone who believes deep hardcore pornography is what actually happens in reality is a moron. The amount of bullshit incest porn on these same websites is just bonkers. That being said, I can see how it can affect some people.

But guess what? Humans tend to look similar, so how do you stop it when you don’t know whether it’s real or fake? How crazy easy will it be to create yet another advantage for those with power or financial success? Examples:

A politician is seeing a prostitute and abusing his status to do so. The prostitute records a secret sex tape of him raping her and threatening to have her arrested if she doesn’t submit to what he wants. This video goes public. The politician claims it’s a deep fake, and the prostitute is arrested anyway. Or the reverse: a prostitute deep-fakes the video and threatens the politician. A story had just come out about the politician glancing at a woman other than his wife before the deep fake, so the populace sides with the prostitute and the politician is arrested.

Or how about a woman who looks just like Taylor Swift decides she wants to work in pornography. Her likeness is immediately noticed and it’s part of her popularity, though not billed as such. T Swizzle claims it’s a deep fake made to disparage her, and the porn actress is ruined, if not sued into oblivion.

So many scenarios could go either way. You can’t ban the technology, because you’ll never legitimately be able to know which is which. And just as with cryptography, banning it won’t keep it away from those who would use it unlawfully.

So what’s the solution? Get over the lunacy of the whole event? What options do we really have? And since we have few, if any, options, all we’re doing is sending clicks to news sites that have nothing else to write about. I’m not saying it’s not a problem; I’m just not seeing a solution and don’t see a need to keep beating a dead horse.

[–] [email protected] 1 points 11 months ago

Society's views on sexuality will change before we will EVER get a serious handle on deepfakes. If you're rich and can afford the lawyers, go ham and sue, otherwise, time to just accept that humans are animals and animals fuck.

Whether or not someone is or is not in a porn video is less important than whether or not they can do whatever job or task they've been given.

Religious puritanical morons and prudes need to stfu and get over it, and the victims need to cope with the reality that this is never going away. They can spend their entire life and fortune on 'finding the one who did this', or just move on and put their energy into something worthwhile.

Even complaining about this is hysterically moronic. The 'big threat' is fake porn.

Fixing the child care system so that child abuse (emotional, physical, and sexual) gets reduced by even 1% would be an immensely more worthwhile task than any pursuit against a technology that is open source and widely available. Not to mention, even if it were made illegal in your country, good luck actually enforcing a law like that without going 110% dystopian, with a locked-down internet that would make current Chinese life look like a kindly Big Brother system.