
When a human makes a mistake, we call it a mistake. When a human lies, we call it a lie. In both cases, we blame the human.

When an LLM does the same, we call it a hallucination and blame the human.



Which is the correct reaction, because an LLM isn't a human and can't be held accountable.



