To be fair to the author, they state it can "pass the medical exam," which is different from whether I should ask it for medical advice. I mean, it can't examine me physically, so it can't be my doctor.
On the other hand, there aren’t enough doctors where I live so I guess I’ll take it! Lol
I think this is the wording I have beef with: “DoctorGPT is a Large Language Model that can pass the US Medical Licensing Exam. This is an open-source project with a mission to provide everyone their own private doctor.” The “provide everyone their own private doctor” part definitely makes it sound like it can replace a doctor.
I'm just worried about the combination of medical misinformation and the people who don't understand how LLMs work, who will take everything an LLM spits out as truth. There are already plenty of examples in the news of people who really don't understand how they work (like that professor who failed an entire class because ChatGPT told him all the papers the class wrote were plagiarized). We also just saw with COVID how much harm medical misinformation can do.
Maybe I'm just being pessimistic and paranoid, but god am I scared of how these tools could cause people to inadvertently harm themselves and the people around them.
You're rightfully paranoid. LLMs are creative entities, and there are no repercussions for making up diseases. Like the recipe bot that recommended combining ingredients that would yield chlorine gas. Yummy.
Perhaps it knew someone would die as a result and did it intentionally. LLMs do have personalities and wit, and they know our human weaknesses.