
I wonder how many hallucinated facts are in there. It looked like a good resource until I learned it's LLM-generated. https://news.ycombinator.com/item?id=45479268


What's the base rate nowadays for hallucinations? 10%? 15%?

It's unlikely that a large volume of generated text stays at the per-claim base rate (errors compound across claims), so the total number of wrong facts might be higher than we expect.
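As a rough back-of-the-envelope sketch (the rates and claim counts here are assumptions for illustration, not figures from the thread): even a modest per-claim error rate makes at least one wrong fact almost certain in a long document.

    # Assumed numbers, purely illustrative
    per_claim_error_rate = 0.10   # assumed 10% chance any single claim is hallucinated
    num_claims = 50               # assumed number of independent factual claims

    expected_errors = per_claim_error_rate * num_claims
    prob_at_least_one = 1 - (1 - per_claim_error_rate) ** num_claims

    print(f"Expected wrong facts: {expected_errors:.1f}")        # 5.0
    print(f"P(at least one wrong fact): {prob_at_least_one:.3f}") # ~0.995

This treats claims as independent, which is optimistic; if one hallucinated "fact" gets reused later in the text, errors correlate and cluster rather than averaging out.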



