Hacker News

Is that weird? LLMs will just repeat what is in their training corpus. If most of the internet is recommending something wrong (like this conditional move "optimization"), then that is what they will recommend too.


Not weird, but important to note.



