
Doubters say it's not as accurate, or that it could hallucinate. But the thing about hiring professionals is that you have to trust them blindly: to judge who is competent, you'd need a professional level of knowledge yourself.

LLMs are a good way to double-check whether the service you're getting is about right, or to steer a professional toward the right hypothesis when they show confirmation bias. This assumes you know how to prompt the model with plenty of context and open questions that don't contain leading presuppositions.
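For instance, a minimal sketch of the difference between an open prompt and a leading one (the lab values and function name are hypothetical, purely illustrative):

```python
def build_prompt(lab_results: str, leading: bool = False) -> str:
    """Assemble a prompt; the open version states the data and asks broadly,
    while the leading version presupposes a conclusion (what to avoid)."""
    if leading:
        # Leading: embeds a presupposition, inviting the model to confirm it.
        return f"These labs clearly show a thyroid problem, right?\n{lab_results}"
    # Open: full context, reference ranges, no presupposed diagnosis.
    return (
        "Here are complete blood lab results with reference ranges.\n"
        f"{lab_results}\n"
        "What findings, if any, are outside normal ranges, and what "
        "follow-up questions would you ask a physician?"
    )

# Hypothetical sample values, included only to make the prompt concrete.
labs = "TSH: 6.8 mIU/L (ref 0.4-4.0)\nFerritin: 12 ng/mL (ref 20-250)"
print(build_prompt(labs))
```

The point is that the open version hands the model all the raw data and leaves the hypothesis space unconstrained, so any flagged finding comes from the data rather than from your framing.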

An LLM read my wife's blood lab results and found something the doctor was ignoring.





All these things are language parsing and transformation. That's the kind of thing LLMs are good at.

And statistical modelling. LLMs must have weights that associate _saw chemical X high, so condition Y likely follows in the documents I've read_.
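As a toy illustration of that kind of association (made-up corpus, and a deliberate simplification: transformer weights don't literally store co-occurrence counts), you could estimate P(condition | finding) from documents:

```python
from collections import Counter

# Toy "corpus": each document pairs a lab finding with a condition it
# mentions. All pairs are invented purely to illustrate co-occurrence.
docs = [
    ("high TSH", "hypothyroidism"),
    ("high TSH", "hypothyroidism"),
    ("high TSH", "fatigue"),
    ("low ferritin", "anemia"),
]

pair_counts = Counter(docs)
finding_counts = Counter(finding for finding, _ in docs)

def p_condition_given_finding(condition: str, finding: str) -> float:
    """Estimate P(condition | finding) from raw co-occurrence counts."""
    return pair_counts[(finding, condition)] / finding_counts[finding]

print(p_condition_given_finding("hypothyroidism", "high TSH"))  # 2/3 ≈ 0.667
```

A model trained on enough medical text ends up encoding something like these conditionals, which is why it can flag a pattern a busy reader skims past.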


