this post was submitted on 04 Nov 2024
92 points (98.9% liked)

Fuck AI


"We did it, Patrick! We made a technological breakthrough!"

A place for all those who loathe AI to discuss things, post articles, and ridicule the AI hype. Proud supporter of working people. And proud booer of SXSW 2024.


OpenAI’s Whisper tool may add fake text to medical transcripts, investigation finds.

top 10 comments
[–] [email protected] 15 points 4 days ago

Regular transcription software is finally respectable (the early days of Dragon NaturallySpeaking were dark indeed). Who thought tossing AI into the mix was a good idea?

[–] [email protected] 13 points 4 days ago (1 children)

I work in judicial tech and have heard questions about using AI transcription tools. I don't believe AI should be used in this kind of high-risk area. The people asking whether AI is a good fit for court transcripts can be forgiven, because all they see is the hype; but if the people responding greenlight a project like that, there will be some incredibly embarrassing moments.

My other concern is that the court would have to run any such service locally. There are situations where a victim's name or other information is redacted. That information should not sit on an OpenAI server, and it should not be regurgitated back out when the AI misbehaves.
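For context, the open-weight Whisper models can be run entirely on local hardware instead of going through OpenAI's hosted API. A minimal sketch, assuming the open-source `openai-whisper` Python package and a hypothetical local audio file:

```python
# Minimal local-transcription sketch using the open-source openai-whisper package.
# The model weights are downloaded once and cached locally; the audio file itself
# ("hearing_excerpt.wav" is a hypothetical example path) never leaves the machine.
import whisper

model = whisper.load_model("base")            # small multilingual model, CPU-friendly
result = model.transcribe("hearing_excerpt.wav")
print(result["text"])                         # plain-text transcript for human review
```

This only addresses where the data goes, not the hallucination problem; the output would still need the same human review as any other transcript.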

[–] [email protected] 2 points 3 days ago (1 children)

Don't court stenographers basically use tailored voice models and voice-to-text transcription already?

[–] [email protected] 2 points 3 days ago

I don't get too technical with the court reporter software. They have their own license and receive direct support from their vendor. What I have seen is that there is an interpreting layer between the stenographer's machine and the software, literally called "magic" by the vendor, that works a bit like predictive text. In that setup, the stenographer is actively recording and interpreting the results as they go.
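To make the "predictive text" comparison concrete, here is a purely illustrative sketch of a stroke-to-text lookup layer. The chord dictionary and function names are invented for the example and have nothing to do with the vendor's actual software:

```python
# Toy illustration of an interpreting layer between steno strokes and text.
# The chord-to-word entries below are made up for the example; real steno
# dictionaries contain tens of thousands of entries maintained by the reporter.
STROKE_DICT = {
    "KORT": "court",
    "R-PT": "reporter",
    "S": "is",
    "T-PB": "in",
    "SEGS": "session",
}

def interpret(strokes):
    """Turn a sequence of steno chords into text, leaving unknown chords
    bracketed so the stenographer can spot and correct them in real time."""
    return " ".join(STROKE_DICT.get(chord, f"[{chord}]") for chord in strokes)

print(interpret(["KORT", "R-PT", "S", "T-PB", "SEGS"]))
# -> court reporter is in session
```

The point of the sketch is that the stenographer stays in the loop: anything the layer can't resolve is surfaced for correction rather than silently invented, which is the opposite of how a hallucinating model behaves.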

[–] [email protected] 10 points 4 days ago (1 children)

God I hope this isn't the AI plan that the NHS adopts

[–] [email protected] 8 points 4 days ago* (last edited 4 days ago)

This is the AI plan every healthcare entity worldwide will adopt.

No joke. They are desperate for shit like this.

[–] [email protected] 10 points 4 days ago* (last edited 4 days ago)

Private hospitals care about only one thing: profit. These error-ridden tools serve that purpose.

[–] [email protected] 4 points 3 days ago

Errors and hallucinations are definitely serious concerns, but my biggest concern would be privacy. If my GP is using AI, I no longer see my medical information as private, and that is unacceptable.

[–] [email protected] 4 points 4 days ago

If anyone needs to know the state of AI transcription, just turn on closed captioning for your local TV channel. It's atrocious, and I'm sorry that people who rely on closed captioning are subjected to it.

[–] [email protected] 1 points 3 days ago

Years ago, I worked in a tech role at a medical transcription company. It hadn't occurred to me back then that AI would render the transcriptionists' jobs irrelevant. This used to be an area where women in particular could make decent money after a bit of training, and there were opportunities for advancement into medical coding and even hospital administration.

I worked with some good people. Hope they landed on their feet.