As a medical doctor I extensively use digital voice recorders to document my work. My secretary does the transcription. As a cost-saving measure the process is soon intended to be replaced by AI-powered transcription, trained on each doctor's voice. As I understand it, the model created is not stored locally and I have no control over it whatsoever.

I see many dangers, as the model is trained on biometric data and could possibly be used to recreate my voice. Of course I understand that there are probably already recordings of me on the Internet, enough to recreate my voice, but that's beside the point. Also, this question is about educating them, not a legal one.

How do I present my case? I'm not willing to use a non-local AI to transcribe my voice. I don't want to be perceived as a paranoid nutcase. Preferably I want my bosses and colleagues to understand the privacy concerns and dangers of using a "cloud solution". Unfortunately they are totally ignorant of the field of technology, and the explanation/examples need to translate for the layperson.

[–] [email protected] 4 points 9 months ago (1 children)

I assume you'll be using Dragon Medical One. Nuance is a well-established organization, with users in a broad range of professions, and their medical product is extensively used by many specialists. The health system where I live has been phasing out transcriptionists in favor of it for a decade or so.

The only potential privacy concern a hospital would care about is whether the transcripts are stored on the vendor's servers, because they would contain sensitive information about patients. It will be impossible to get any administrator to care about your voice data.

This is a tide you are unlikely to be able to stem, but you could stop dictating and type your notes yourself.

[–] [email protected] 1 points 9 months ago (1 children)

I'm not sure what exact service will be used. I won't be able to type, as the IT environment is tightly controlled; they will even remove the keyboard as an input device for the medical records.

[–] [email protected] 1 points 9 months ago

I see that lasting until the first time some record ends up reading "backspace backspace backspace! No you stupid delete! Delete. Dee feet delete".