
LLMs do reason. We already have AGI.

Sometimes they mess up when reasoning. That's still reasoning.


