I agree, and I suspect this planned system might get scuttled before release due to legal problems. That's why I framed it in a non-legal way. I want my bosses to understand the privacy issue, both in this particular case and in future cases.
Yes, I agree. Broadening the scope a little, frankly I'm just waiting for a big leak of medical records. The system we use is a bird's nest of different software, countless APIs, and all sorts of database backends. Many systems stem from MS-DOS, just embedded in a slightly more modern integrated environment. There are just so many flaws that I'm amazed a leak hasn't happened (or at least surfaced) yet.
I'm not sure what exact service will be used. I won't be able to type, as the IT environment is tightly controlled; they will even remove the keyboard as an input device for the medical records.
I understand the fight will be hard, and I'm not getting into it if I can't present something they will understand. I'm definitely in a minority, both among the admin staff and my peers, the doctors. Most are totally ignorant of the privacy issue.
I work in Sweden, and it falls under GDPR. There probably are GDPR implications, but as I wrote, the question is not legal. I want my bosses to be aware of the general issue, and this is but the first of many similar problems.
The training data is to be collected per person, resulting in a model tailored to every single doctor.
Thanks for the advice, but I'm not against AI models transcribing me, just against a cloud model specifically trained on my voice without any control by me. A local model, or preferably a general local model, would be fine. What makes me sad is that the people behind this are totally ignorant of the problem.
That's correct! I'm not against using technology to cut costs or provide better healthcare. My question is entirely about the privacy implications.
Sure, but what about my peers? I want to get the point across and build an understanding of the privacy implications. I'm certain that this is just the first of many reforms made without proper analysis of the privacy implications.
It will not be possible to use my own software. The computer environment is tightly controlled. If this is implemented, my only input device for the medical records will be the AI transcriber (stupidity).
I'm a psychiatrist in the field of substance abuse and withdrawal. Sure, there's a shortage of us too, but I want the hospital to understand the problem, not just me getting to use an old-school secretary by threatening to go to another hospital.
My question is not a legal one. There are probably legal obstacles for my hospital in this case, but HIPAA is not applicable in my country.
I'd primarily like to get your opinions on how to effectively present my case to my bosses against using a non-local model for this.
Bad, but expected given that they are based in China. I use several of their cameras, but only after keeping them isolated from the Internet and segmented from the rest of my network. I only access the streams from my NAS, which in turn accesses the camera streams through a dedicated NIC.
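For illustration, that kind of segmentation can be sketched with a few firewall rules. This is a hypothetical minimal example, not my actual config: the interface names, subnets, and the NAS address are all placeholders.

```shell
# Hypothetical sketch: cameras live on an isolated segment with no route
# to the Internet; only the NAS (reached via a dedicated NIC) may pull
# their streams. Interfaces and addresses below are assumptions.

# eth1 = dedicated NIC facing the camera segment (192.168.50.0/24)
# eth0 = main LAN / Internet uplink

# Drop any traffic from the camera subnet heading out to the Internet
iptables -A FORWARD -s 192.168.50.0/24 -o eth0 -j DROP

# Allow the NAS (192.168.1.10) to pull RTSP streams from the cameras
iptables -A FORWARD -s 192.168.1.10 -d 192.168.50.0/24 -p tcp --dport 554 -j ACCEPT

# Deny everything else between the rest of the network and the cameras
iptables -A FORWARD -d 192.168.50.0/24 -j DROP
```

The point of the ordering is that the NAS's ACCEPT rule is evaluated before the catch-all DROP, so the cameras can be reached by exactly one host and nothing else.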
I don't know if it's common practice in other countries. In Sweden, where I work, it is. I think the rationale is the following:
Of course we have to review the transcribed result. At my hospital, all doctors carry smart cards and use the personal private key stored on them to digitally sign every transcribed medical record entry.
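The sign-then-verify flow behind that can be sketched roughly as below. This is purely illustrative: a symmetric HMAC key stands in for the smart card's asymmetric key pair, and the key and record text are made-up placeholders.

```python
import hmac
import hashlib

def sign_entry(key: bytes, entry: str) -> str:
    # Compute a keyed digest over the record entry; with a smart card this
    # would instead be an asymmetric signature made by the card's private key.
    return hmac.new(key, entry.encode(), hashlib.sha256).hexdigest()

def verify_entry(key: bytes, entry: str, signature: str) -> bool:
    # Constant-time comparison of the stored signature against a fresh one;
    # any change to the entry after signing makes this fail.
    return hmac.compare_digest(sign_entry(key, entry), signature)

key = b"doctor-card-private-key"  # placeholder for the card-held key
record = "Patient seen 2024-01-01; transcription reviewed and approved."
sig = sign_entry(key, record)

print(verify_entry(key, record, sig))              # True: entry unmodified
print(verify_entry(key, record + " edited", sig))  # False: tampering detected
```

The useful property for this discussion: once the doctor has reviewed and signed the transcription, any later modification of the entry is detectable, so the signature ties responsibility for the final text to the reviewing doctor, not to the transcription system.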