
BNNs certainly have their uses, but I think people have generally found that it's a better use of compute to fit a larger model on more data than to try to squeeze more juice out of a given small dataset + model. Usually there is more data available; it's just somewhat tangentially related. LLMs are the ultimate example of how training on tons of tangentially related data can be worthwhile for almost any task.

