
I mostly agree with this. LLMs are just another tool, and throughout our history we've learned to use and adapt to many other tools just fine.

With the caveat that our field in particular is one of the few that requires continuous learning and adaptation, so tech workers are in a way better predisposed to this line of thinking and to tool adoption without some of the potentially harmful side effects.

To pick on spell check: it has been shown that we can develop a dependency on it and thereby lose some of our own ability to spell and reason about language. But is that a bad thing? I don't know.

What I do know is that humans have been outsourcing our thinking for a long time. LLMs are another evolution in that process, just another way to push cognitive load onto a tool, as we've done with stone tablets, books, paper notes, digital notes, Google, etc.


