Passing the USMLE also doesn’t make you a doctor. There’s a reason why doctors have to go through residency: hands-on training is a huge part of becoming one.
Maybe where this tool COULD have use is medical education for everyday folks. Could be just me, but sometimes there’s medical jargon thrown around in my visits that my doctor and I don’t have time to go through. (Note: this is just my experience of US-based healthcare, where most doctors are pushed by administration to see as many patients as possible because healthcare in the US is a business, and more patients = $$$.) This could be a useful tool for having that jargon explained in layman’s terms.
This tool won’t replace doctors, but it could certainly be something that enhances the healthcare experience by being a part of patient education.
To be fair to the author, they state it can “pass the medical exam,” which is different from whether I should ask it for medical advice. I mean, it can’t examine me physically, so it can’t be my doctor.
On the other hand, there aren’t enough doctors where I live so I guess I’ll take it! Lol
I think this is the wording I have beef with: “DoctorGPT is a Large Language Model that can pass the US Medical Licensing Exam. This is an open-source project with a mission to provide everyone their own private doctor.” The “provide everyone their own private doctor” part definitely makes it sound like it can replace a doctor.
I’m just worried about the combination of medical misinformation and the people who don’t understand how LLMs work and will take everything an LLM spits out as truth. There are already a ton of examples in the news of people who really don’t understand how they work (like that one professor who failed an entire class because ChatGPT told him all the papers the class wrote were plagiarized). We also just saw with COVID how much harm medical misinformation can do.
Maybe I’m just being pessimistic and paranoid, but god am I scared of how tools like this could cause people to inadvertently harm themselves and the people around them.
You’re rightfully paranoid. LLMs are creative entities, and there are no repercussions for making up diseases. Like the chef one that recommended ingredients which, combined, would yield chlorine gas. Yummy.
Perhaps it knew that someone would die as a result and it did so intentionally. LLMs do have personalities and wit—and they know our human weaknesses.
> How is that giving medical advice? that's a mission statement.
As a mission statement for a software project, it very clearly indicates an intent to have the software act as a doctor.
> If my mission statement is to provide everyone with a private lawyer, then is that giving lawful advice?
If you have a software project with that as your mission statement, then it’s a pretty good indication you are aiming at the unlicensed practice of law (which involves “legal advice” but, despite “legal” and “lawful” being synonymous in other circumstances, exactly not “lawful advice”).
> As a mission statement for a software project, it very clearly indicates an intent to have the software act as a doctor.
I do understand that interpretation. However, the mission statement seems vague enough that there are many ways to achieve it. Not saying it’s good as it currently is, but I’m trying to understand how that mission statement equates to giving medical advice.
Let's say that intent is true: is the intent the same as actually providing the medical/legal advice in this case?
It’s called “DoctorGPT”. The mission in its mission statement is “to provide everyone their own private doctor”. What do people go to doctors for? Medical advice.
> an open-source project with a mission to provide everyone their own private doctor
Perfect.