this post was submitted on 03 Apr 2024
958 points (99.4% liked)

Technology
A judge in Washington state has blocked video evidence that’s been “AI-enhanced” from being submitted in a triple murder trial. And that’s a good thing, given the fact that too many people seem to think applying an AI filter can give them access to secret visual data.

[–] fuzzzerd 6 points 7 months ago (1 children)

This is what I was wondering about as I read the article. At what point does the post processing on the device become too much?

[–] [email protected] 1 points 7 months ago (1 children)

When it generates additional data instead of just interpolating captured data.
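To make that distinction concrete, here is a minimal sketch of what "just interpolating captured data" means: in classic bilinear upscaling, every output pixel is a weighted average of pixels the camera actually captured, so no value outside the recorded data can appear. (This is an illustrative toy implementation, not any specific product's pipeline.)

```python
import numpy as np

def bilinear_upscale(img, factor):
    """Upscale a 2D grayscale image by bilinear interpolation.

    Every output value is a weighted average of four real input
    samples -- the result never contains information that was not
    in the captured data, unlike a generative upscaler.
    """
    h, w = img.shape
    new_h, new_w = h * factor, w * factor
    # Map each output pixel back to a (fractional) input coordinate.
    ys = np.linspace(0, h - 1, new_h)
    xs = np.linspace(0, w - 1, new_w)
    y0 = np.floor(ys).astype(int)
    x0 = np.floor(xs).astype(int)
    y1 = np.minimum(y0 + 1, h - 1)
    x1 = np.minimum(x0 + 1, w - 1)
    wy = (ys - y0)[:, None]  # vertical blend weights
    wx = (xs - x0)[None, :]  # horizontal blend weights
    # Blend the four surrounding captured pixels.
    top = img[y0][:, x0] * (1 - wx) + img[y0][:, x1] * wx
    bot = img[y1][:, x0] * (1 - wx) + img[y1][:, x1] * wx
    return top * (1 - wy) + bot * wy
```

Because it only averages, the output can never be brighter or darker than the brightest or darkest captured pixel; a generative "enhancer" has no such guarantee, which is exactly where invented detail sneaks in.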

[–] fuzzzerd 1 points 7 months ago (1 children)

What would you classify Google or Apple portrait mode as? It's definitely doing something. We can probably agree that, at this point, it's still a reasonably faithful enhancement of what was really there, while a Snapchat filter that turns you into a dog is obviously too much. The question is: where on that spectrum does the AI or algorithm become too much?

[–] [email protected] 1 points 7 months ago* (last edited 7 months ago)

It varies; there are definitely generative pieces involved, but they try not to make it blatant.

If we're talking about evidence in court, then practically speaking what matters more is whether the photographer can testify to how accurate they think the photo is and how well it corresponds to what they saw. Any significantly AI-edited photo effectively becomes evidence only as strong as a diary entry written by a person on the scene: it backs up their testimony to a certain degree by checking the witness's consistency over time, rather than being trusted directly. The photo can lie just as much as the diary entry can, so it's a test of credibility instead.

If you use face swap, then those photos are likely unusable. Editing for color and contrast, etc.? Still usable. Upscaling depends entirely on what the testimony is about. Identifying a person who's just a pixelated blob? Nope, won't do. Same with verifying what a scene looked like, such as identifying very pixelated objects: not OK. But upscaling a clear photo that you just wanted to be larger, where the photographer can attest to who the subject is? Still usable.
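The "pixelated blob" case fails for an information-theoretic reason, not just a legal one: downscaling is many-to-one, so distinct originals can collapse to the identical low-resolution image, and no upscaler can tell them apart without guessing. A toy demonstration (hypothetical 4x4 "faces", not real image data):

```python
import numpy as np

def downscale_2x(img):
    """Average each 2x2 block into one pixel -- the same information
    loss a low-resolution camera applies to a distant subject."""
    h, w = img.shape
    return img.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

# Two clearly different 4x4 patterns...
a = np.array([[8, 0, 0, 8],
              [0, 8, 8, 0],
              [0, 8, 8, 0],
              [8, 0, 0, 8]], dtype=float)
b = np.array([[0, 8, 8, 0],
              [8, 0, 0, 8],
              [8, 0, 0, 8],
              [0, 8, 8, 0]], dtype=float)

# ...that collapse to the exact same 2x2 blob.
print(np.array_equal(downscale_2x(a), downscale_2x(b)))  # True
```

Any "enhancement" that picks one of the possible originals is generating evidence, not recovering it, which is presumably why the judge kept it out.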