this post was submitted on 22 Mar 2024
271 points (97.2% liked)

Technology


You’ve gone home with a Tinder date, and things are escalating. You don’t really know or trust this guy, and you don’t want to contract an STI, so… what now?

A company called Calmara wants you to snap a photo of the guy’s penis, then use its AI to tell you if your partner is “clear” or not.

Let’s get something out of the way right off the bat: You should not take a picture of anyone’s genitals and scan it with an AI tool to decide whether or not you should have sex.

top 29 comments
[–] [email protected] 37 points 8 months ago (1 children)

One reason this is suspicious from the start:

It's advertised not for checking yourself, but for checking your one-night partner. If it were marketed for self-checks, it would be buried in lawsuits over fake medical advice.

[–] [email protected] 15 points 8 months ago

Nah, they'd just throw up a disclaimer "Not true medical advice, consult a doctor for actual confirmation" and they'd probably be in the clear

[–] [email protected] 34 points 8 months ago (2 children)
[–] [email protected] 7 points 8 months ago

“Is sandwich.“

[–] [email protected] 1 points 8 months ago

"Meat popsicle confirmed."

[–] [email protected] 34 points 8 months ago (1 children)

This definitely won't be misused in any way that would completely destroy the good name of the person taking/in the frame of the image. It's just one "probable cause" search from a bad day.

[–] [email protected] 2 points 8 months ago

what are the chances they build a database to blackmail any individual they want in the future and just say it was leaked

[–] [email protected] 26 points 8 months ago (2 children)

I wouldn't trust calamari to identify anything tbh

[–] [email protected] 15 points 8 months ago

i'm almost certain there's a hentai like this

[–] [email protected] 6 points 8 months ago

It's a trap!

[–] [email protected] 20 points 8 months ago

I'm speechless, in a bad way.

[–] [email protected] 17 points 8 months ago (1 children)

I could care less who sees my junk. I also would not let someone take pictures of it so I can fuck them. I'm galaxies away from being that desperate.

[–] [email protected] 13 points 8 months ago (1 children)

couldn't care less*

Since I assume you mean you don't care.

[–] [email protected] 1 points 8 months ago
[–] [email protected] 14 points 8 months ago

Some STIs, in some situations, have a visible presentation that could be detected.

A false positive is a good thing here, a false negative is a bad thing here. There's no way this app will not have huge false negatives.
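The asymmetry between false positives and false negatives can be made concrete with a small confusion-matrix calculation. All numbers below are invented for illustration (Calmara has published no validated performance figures):

```python
# Hypothetical screening numbers -- purely illustrative, NOT real
# performance data for Calmara or any other product.
def confusion_rates(tp, fn, fp, tn):
    """Return (sensitivity, specificity, false_negative_rate)
    from raw confusion-matrix counts."""
    sensitivity = tp / (tp + fn)   # P(test positive | actually infected)
    specificity = tn / (tn + fp)   # P(test negative | not infected)
    fnr = fn / (tp + fn)           # share of infected people the test misses
    return sensitivity, specificity, fnr

# Suppose, out of 1000 genuinely infected users, the app flags 700 and
# wrongly "clears" 300; out of 9000 uninfected users it clears 8500.
sens, spec, fnr = confusion_rates(tp=700, fn=300, fp=500, tn=8500)
print(f"sensitivity={sens:.0%}  specificity={spec:.1%}  false-negative rate={fnr:.0%}")
```

Even with a sensitivity that sounds decent on paper, the false-negative rate is the complement of it, and every false negative here is someone told they're "clear" when they aren't.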

[–] [email protected] 12 points 8 months ago (3 children)

Downside is we have unique buttholes, so I assume that extends to other genitals. Fun new privacy attack here.

https://www.smithsonianmag.com/smart-news/why-scientists-created-smart-toilet-recognizes-your-bum-180974641/

[–] [email protected] 1 points 8 months ago

Wait, someone actually made Smart Pipe?

[–] [email protected] 1 points 8 months ago

Buttplugs to protect your privacy. But only if you are on the toilet for A, not for B.

[–] [email protected] 1 points 8 months ago

An untapped mobile device biometric.

[–] [email protected] 11 points 8 months ago

Hold on babe AI wants to see a picture of your shillelagh first 🤳

[–] [email protected] 9 points 8 months ago

Maybe they'll use the photo to match it against doctors' notes and clinical photos of the same penis or vagina, and then illegally link those to illegally obtained health records. Probably not, though.

[–] [email protected] 9 points 8 months ago

Not a hot dog.

[–] [email protected] 9 points 8 months ago

Obligatory Peep Show

[–] [email protected] 8 points 8 months ago

And to think, they started with an app to identify whether something was a hot dog or not.

[–] [email protected] 8 points 8 months ago

No more need for Ann Perkins to identify Joe's problem.

[–] [email protected] 7 points 8 months ago

Finally, using this we'll be able to train AI models so we can know what super-gonaherpes looks like

[–] [email protected] 4 points 8 months ago

This is the best summary I could come up with:


“With lab diagnosis, sensitivity and specificity are two key measures that help us understand the test’s propensity for missing infections and for false positives,” Daphne Chen, founder of TBD Health, told TechCrunch.

HeHealth is framed as a first step for assessing sexual health; then, the platform helps users connect with partner clinics in their area to schedule an appointment for an actual, comprehensive screening.

HeHealth’s approach is more reassuring than Calmara’s, but that’s a low bar — and even then, there’s a giant red flag waving: data privacy.

“It’s good to see that they offer an anonymous mode, where you don’t have to link your photos to personally identifiable information,” Valentina Milanova, founder of tampon-based STI screening startup Daye, told TechCrunch.

This sounds reassuring, but in its privacy policy, Calmara writes that it shares user information with “service providers and partners who assist in service operation, including data hosting, analytics, marketing, payment processing, and security.” They also don’t specify whether these AI scans are taking place on your device or in the cloud, and if so, how long that data remains in the cloud, and what it’s used for.

Calmara represents the danger of over-hyped technology: It seems like a publicity stunt for HeHealth to capitalize on excitement around AI, but in its actual implementation, it just gives users a false sense of security about their sexual health.


The original article contains 773 words, the summary contains 228 words. Saved 71%. I'm a bot and I'm open source!
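Chen's point about sensitivity and specificity can be illustrated with Bayes' rule: what a "clear" result actually tells you depends on both measures and on how common infection is. A minimal sketch, with all parameter values invented for illustration; note that many STIs (e.g. chlamydia, HIV) usually show no visible signs at all, so a photo-based test's sensitivity for them is effectively near zero:

```python
def negative_predictive_value(sensitivity, specificity, prevalence):
    """P(actually not infected | test says 'clear'), via Bayes' rule."""
    true_neg = specificity * (1 - prevalence)        # clears given to the uninfected
    false_neg = (1 - sensitivity) * prevalence       # clears given to the infected
    return true_neg / (true_neg + false_neg)

# Hypothetical values: 95% specificity, 10% prevalence, varying sensitivity.
# For invisible infections, the photo test's sensitivity approaches 0.
for sens in (0.9, 0.3, 0.0):
    npv = negative_predictive_value(sens, specificity=0.95, prevalence=0.10)
    print(f"sensitivity={sens:.0%} -> P(actually clear | app says 'clear')={npv:.1%}")
```

The point of the sketch: as sensitivity falls toward zero, an app's "clear" result carries almost no information beyond the background infection rate, which is exactly the false sense of security the article describes.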

[–] [email protected] 3 points 8 months ago* (last edited 8 months ago)

Is it open source and offline? I would only trust that they're not collecting all the photos people take with the app if so.

[–] [email protected] 2 points 8 months ago