this post was submitted on 20 Jul 2023
36 points (90.9% liked)

Technology


ChatGPT Out-scores Medical Students on Complex Clinical Care Exam Questions::A new study shows AI's capabilities at analyzing medical text and offering diagnoses — and forces a rethink of medical education.

top 7 comments
[–] [email protected] 19 points 1 year ago (2 children)

That's interesting, but never forget that the difference between exams and real life is huge. Exam cases are almost always typical clinical presentations, with every small element pointing toward the overall picture.

In real life, there are almost always discrepancies, elements that don't make sense at all for the given case, and the whole point of residency experience is learning what to make of those contradictory elements: when to question nonsensical lab values, what to do when a situation doesn't fit any category of problems you learned to solve.

These are things I suspect generative AI, which by its nature predicts the word most likely to come next based on learned data, wouldn't be able to do.
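To make the "predict the most likely next word" point concrete, here is a deliberately tiny sketch of that idea as a bigram model. The corpus and function names are invented for illustration; real LLMs use neural networks over subword tokens, not raw bigram counts, but the core move of picking the statistically most common continuation is the same.

```python
from collections import Counter, defaultdict

# A made-up miniature "training corpus" for illustration only.
corpus = (
    "fever and cough suggest infection . "
    "fever and rash suggest infection . "
    "fever and cough suggest flu ."
).split()

# Count how often each word follows each other word.
bigrams = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    bigrams[prev][nxt] += 1

def predict_next(word):
    """Return the most frequent follower of `word` in the corpus, or None."""
    followers = bigrams[word]
    return followers.most_common(1)[0][0] if followers else None

print(predict_next("and"))      # "cough": seen twice vs "rash" once
print(predict_next("suggest"))  # "infection": seen twice vs "flu" once
```

Note that the model can only echo statistical regularities in its training data; a presentation that contradicts the usual pattern (the commenter's "discrepancies") just gets steamrolled by whatever was most common.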

[–] [email protected] 9 points 1 year ago (2 children)

How would the students fare if they had access to all the information available on the internet, used to train the AI, during the test?

[–] Tilted 5 points 1 year ago (1 children)

We should probably be training the students to use the AI as a tool.

[–] [email protected] 0 points 1 year ago

I’ve used ChatGPT a bit to see what it spits out in terms of medical education. I don’t trust it to be completely accurate, but for the things I’m able to verify, it does surprisingly well.

There are a number of databases doctors use with specifically verified content that is current and reliable. If you could isolate the AI to only that information, you could reduce the risk of it spitting out false information, and doctors could use it to spitball ideas or get help pulling protocols and guidelines.

I can definitely see language-model AI like this assisting clinical providers in the future. I could also see it used to further automate patient monitoring, which we already do quite a bit but still struggle to master. Current AI models can identify high-risk patients hours before a human can, and they improve outcomes. This will only continue, but it will certainly not replace humans in this equation anytime soon.
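The "isolate the AI to verified information" idea above is essentially what retrieval-grounded systems do: only return text that actually exists in a curated corpus, and refuse otherwise. A toy sketch, with invented guideline snippets and a naive word-overlap score standing in for a real medical database and embedding search:

```python
import re

# Hypothetical stand-ins for entries in a verified clinical database.
VERIFIED_GUIDELINES = [
    "Sepsis: begin broad-spectrum antibiotics within one hour of recognition.",
    "Anaphylaxis: give intramuscular epinephrine immediately.",
    "Stroke: obtain non-contrast head CT before thrombolysis.",
]

def words(text):
    """Lowercased word set, ignoring punctuation."""
    return set(re.findall(r"[a-z\-]+", text.lower()))

def lookup(query):
    """Return the verified snippet sharing the most words with the query,
    or None when nothing overlaps: refuse rather than free-generate."""
    q = words(query)
    best, best_score = None, 0
    for snippet in VERIFIED_GUIDELINES:
        score = len(q & words(snippet))
        if score > best_score:
            best, best_score = snippet, score
    return best

print(lookup("suspected stroke, when to get a head CT?"))
```

The key design choice is that the system can only quote the curated corpus verbatim; everything hinges on how good (and how current) that corpus is.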

[–] [email protected] -2 points 1 year ago

For a human, that's probably too much information to be useful. That's why ChatGPT is so powerful: it can sort through all that cruft and find "relevant" information.

It's an incredibly complicated set of if-then statements that leads it through a decision tree, ultimately responding to a prompt with whatever most commonly follows similar prompts on the internet.

It fails at knowing whether the information is useful, or even correct, however. And it inherits biases both from the people who wrote those if-then statements and from the data it was fed. Further, the narrow AI we have today has no agency, no creativity or intuition. It fakes all of these things to make us believe it's 'real'; that's what it's programmed to do.

[–] [email protected] -1 points 1 year ago (1 children)

I hope you're wrong. If there's one job I want AI to do, it's to improve health care.

There are many excellent doctors, but also many very average doctors. And even the best doctors seem to be biased towards the most common illnesses.

And I've read that many people with persistent pain, especially people of color, can't get pain medication because doctors suspect everyone of being an abuser. But giving it out like candy isn't great either.

We need AI doctors.

[–] [email protected] 2 points 1 year ago

Don't get me wrong, human doctors (humans in general, actually) have a lot of problems, and it would be great to have some kind of AI assistance for diagnosis or management. But I don't think generative AI like ChatGPT is actual AI: it's a probabilistic algorithm that spits out the word most likely to follow the last one it wrote, based on the material it was trained on. I don't think we need a doctor like that.