It kinda is, if you want not just performance on synthetic benchmarks but also good coverage of the long tail. This is where GPT4 excels, and also why I pay for it. Transformers are basically fancy associative memories. A smaller model, much like a smaller search index, simply cannot contain as much nuanced information, for hard, immutable, information-theoretic reasons.
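
To make the capacity intuition concrete, here's a back-of-envelope sketch. The ~2 bits of recoverable content per parameter is a rough estimate sometimes cited in the literature, and the parameter counts are made up for illustration; the exact constant doesn't matter, since the ceiling scales linearly with parameter count either way:

    # Crude information-theoretic ceiling: a network cannot encode
    # more than (parameters x effective bits per parameter) of content.
    # All numbers below are illustrative assumptions, not published figures.

    def capacity_bits(n_params: float, bits_per_param: float = 2.0) -> float:
        return n_params * bits_per_param

    small = capacity_bits(7e9)    # hypothetical 7B-parameter model
    large = capacity_bits(1e12)   # hypothetical 1T-parameter model

    print(f"small model ceiling: {small / 8 / 2**30:.1f} GiB of content")
    print(f"large model ceiling: {large / 8 / 2**30:.1f} GiB of content")
    print(f"ratio: {large / small:.0f}x")

Whatever the true bits-per-parameter constant turns out to be, the ratio here is ~143x: the smaller model's ceiling on memorized long-tail facts is over two orders of magnitude lower, which is why no amount of fine-tuning closes the gap on obscure knowledge.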
