Single reason why this is suspicious from the start:
It's advertised not for checking yourself, but your one-night partner. If it were advertised for self-checks, it would get bombarded with lawsuits over fake medical advice.
Nah, they'd just throw up a disclaimer "Not true medical advice, consult a doctor for actual confirmation" and they'd probably be in the clear
"Not Hotdog."
"Is sandwich."
"Meat popsicle confirmed."
This definitely won't be misused in any way that would completely destroy the good name of the person taking/in the frame of the image. It's just one "probable cause" search from a bad day.
What are the chances they build a database to blackmail any individual they want in the future and just say it was leaked?
I wouldn't trust calamari to identify anything tbh
i'm almost certain there's a hentai like this
It's a trap!
I'm speechless, in a bad way.
I could care less who sees my junk. I also would not let someone take pictures of it so I can fuck them. I'm galaxies away from being that desperate.
couldn't care less*
Since I assume you mean you don't care.
Thank you.
Some STIs, in some situations, have a visible presentation that could be detected.
A false positive is the safer failure mode here; a false negative is the dangerous one. There's no way this app won't have huge false negative rates.
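Since the bot summary further down quotes TBD Health on sensitivity and specificity, here's a minimal sketch of that asymmetry, with completely made-up confusion-matrix numbers (nothing here reflects the app's actual performance):

```python
# Hypothetical screening of 1,000 people; all counts are illustrative.
true_positive = 65    # infected, correctly flagged
false_negative = 35   # infected, waved through -- the dangerous case
true_negative = 850   # healthy, correctly cleared
false_positive = 50   # healthy, wrongly flagged -- annoying but safe

# Sensitivity: share of real infections the test catches.
sensitivity = true_positive / (true_positive + false_negative)
# Specificity: share of healthy people the test correctly clears.
specificity = true_negative / (true_negative + false_positive)

print(f"sensitivity: {sensitivity:.0%}")   # 65%
print(f"specificity: {specificity:.0%}")   # 94%
print(f"false all-clears per 1,000 screens: {false_negative}")
```

With these assumed numbers, 35 of every 100 infected users get a false all-clear, which is exactly the failure mode that matters if people treat the app as a green light.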
Downside is we have unique buttholes, so I assume that extends to other genitals. Fun new privacy attack here.
Buttplugs to protect your privacy. But only if you are on the toilet for A, not for B.
An untapped mobile device biometric.
Hold on babe AI wants to see a picture of your shillelagh first 🤳
Maybe they'll use the photo to match it with doctors' notes and medically taken photos of the same penis or vagina, and then illegally match those to illegally obtained health records. Probably not, though.
Not a hot dog.
Obligatory Peep Show
And to think, they started with an app to identify whether something was a hot dog or not.
No more need for Ann Perkins to identify Joe's problem.
Finally, using this we'll be able to train AI models so we can know what super-gonaherpes looks like
This is the best summary I could come up with:
“With lab diagnosis, sensitivity and specificity are two key measures that help us understand the test’s propensity for missing infections and for false positives,” Daphne Chen, founder of TBD Health, told TechCrunch.
HeHealth is framed as a first step for assessing sexual health; then, the platform helps users connect with partner clinics in their area to schedule an appointment for an actual, comprehensive screening.
HeHealth’s approach is more reassuring than Calmara’s, but that’s a low bar — and even then, there’s a giant red flag waving: data privacy.
“It’s good to see that they offer an anonymous mode, where you don’t have to link your photos to personally identifiable information,” Valentina Milanova, founder of tampon-based STI screening startup Daye, told TechCrunch.
This sounds reassuring, but in its privacy policy, Calmara writes that it shares user information with “service providers and partners who assist in service operation, including data hosting, analytics, marketing, payment processing, and security.” They also don’t specify whether these AI scans take place on your device or in the cloud, and if in the cloud, how long that data remains there and what it’s used for.
Calmara represents the danger of over-hyped technology: It seems like a publicity stunt for HeHealth to capitalize on excitement around AI, but in its actual implementation, it just gives users a false sense of security about their sexual health.
The original article contains 773 words, the summary contains 228 words. Saved 71%. I'm a bot and I'm open source!
Is it open source and offline? Only then would I trust that they're not collecting all the photos people take with the app.