this post was submitted on 20 Jul 2023
36 points (90.9% liked)

ChatGPT Out-scores Medical Students on Complex Clinical Care Exam Questions::A new study shows AI's capabilities at analyzing medical text and offering diagnoses — and forces a rethink of medical education.

[–] [email protected] 19 points 1 year ago (6 children)

That's interesting, but never forget that the difference between exams and real life is huge. Exam cases are almost always typical clinical presentations, with every small element pointing toward the same overall picture.

In real life, there are almost always discrepancies: elements that don't make sense at all for the given case. The whole point of residency experience is learning what to make of those contradictory elements. When to question nonsensical lab values. What to do when a situation doesn't fit any category of problem you learned to solve.

These are the kinds of things I think generative AI, which by its nature just predicts whichever word is most likely to come next based on its training data, wouldn't be able to do.
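
If it helps to see what "predicting the next word" actually means, here is a minimal sketch using a public GPT-2 checkpoint from Hugging Face; the clinical prompt is made up purely for illustration and has nothing to do with the exam in the article.

```python
# Minimal sketch of next-token prediction: the model scores every word in its
# vocabulary and we look at which ones it thinks are most likely to come next.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

prompt = "The patient presents with fever, productive cough, and"  # made-up example
inputs = tokenizer(prompt, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # one score per vocabulary token, per position

# Probability distribution over just the next token
next_token_probs = torch.softmax(logits[0, -1], dim=-1)
top = torch.topk(next_token_probs, k=5)
for token_id, prob in zip(top.indices, top.values):
    print(f"{tokenizer.decode(token_id.item())!r}: {prob.item():.3f}")
```

Everything the model produces is just a chain of these "most likely next word" picks, which is the limitation being pointed out above.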

[–] [email protected] 9 points 1 year ago (3 children)

How would the students fare if, during the test, they had access to all the information available on the internet that was used to train the AI?

[–] Tilted 5 points 1 year ago (1 children)

We should probably be training the students to use the AI as a tool.

[–] [email protected] 0 points 1 year ago

I've used ChatGPT a bit to see what it spits out in terms of medical education. I don't trust it to be completely accurate, but on the points I'm able to verify, it does surprisingly well.

There are a number of databases doctors already use whose content is specifically verified, current, and reliable. If you could isolate the AI to only that information, you could reduce the risk of it spitting out false information, and doctors could use it to spitball ideas or to help pull up protocols and guidelines.

I could definitely see language-model AI like this being used to assist clinical providers in the future. I could also see it used to further automate patient monitoring, which we already do quite a bit but still struggle to master. Current AI models can identify high-risk patients hours before a human can, and they improve outcomes. This will only continue, but it certainly won't be replacing humans in this equation anytime soon.
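
For what it's worth, a rough sketch of what "isolating the AI to verified content" could look like is below: retrieve from a small curated set of passages and instruct the model to answer only from them. The passages, the TF-IDF retrieval, and the prompt wording are all placeholders I made up to show the shape of the idea, not how any real product or the study in the article does it.

```python
# Hedged sketch: restrict the model to a vetted corpus by retrieving the most
# relevant passages and building a prompt that forbids answering from anything else.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Stand-in for a verified clinical database (a real system would use an
# institution's own protocols or a licensed reference, not this list).
verified_passages = [
    "Suspected sepsis: obtain cultures and start broad-spectrum antibiotics within one hour.",
    "Community-acquired pneumonia: use CURB-65 to help guide the admission decision.",
    "DKA: IV fluids, an insulin infusion, and hourly glucose and potassium checks.",
]

def retrieve(question: str, passages: list[str], k: int = 2) -> list[str]:
    """Return the k passages most similar to the question (TF-IDF cosine similarity)."""
    matrix = TfidfVectorizer().fit_transform(passages + [question])
    scores = cosine_similarity(matrix[-1], matrix[:-1]).ravel()
    return [passages[i] for i in scores.argsort()[::-1][:k]]

def build_prompt(question: str) -> str:
    """Build a prompt that restricts the model to the retrieved, verified passages."""
    context = "\n".join(f"- {p}" for p in retrieve(question, verified_passages))
    return (
        "Answer using ONLY the verified passages below. "
        "If they don't cover the question, say you don't know.\n\n"
        f"Passages:\n{context}\n\nQuestion: {question}"
    )

print(build_prompt("How quickly should antibiotics be started in suspected sepsis?"))
```

The resulting prompt is what you'd hand to whatever language model is in use, so its answer stays grounded in the vetted material instead of whatever it absorbed from the open internet.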
