
Well, it's sort of true in that people stick these LLMs together and they produce intelligent-seeming outputs in ways that the people building them don't fully understand. Kind of like how evolution stuck a bunch of biological neurons together without needing to fully understand how they work.

