Producing code is like producing syntactically correct algebra. It has very little value on its own.

I’ve been trying to pair on system design with ChatGPT, and it feels just like talking to a person who’s confident and regurgitates trivia but doesn’t really understand. There’s no sense of self-contradiction, doubt, or curiosity.

I’m very, very impressed with the language abilities, and the regurgitation can be handy, but has an LLM made a single novel discovery? Even a (semantic) simplification of a complicated theory would be valuable.
