
The facts that a tool can break, or that the company manufacturing it lies about its abilities, are annoying, but they do not imply that the tool is useless.

I experience LLM "reasoning" failures several times a day, yet I still find LLMs useful.


