
> At the current rate of change, ...

We've seen that the rate of change went up hugely when LLMs came around, but it was much lower before that. It could also slow down again for the foreseeable future.

LLMs are only as good as their training materials. But a lot of what programmers do is not documented anywhere: it happens in their heads, in response to what they see around them, and it never ends up in anything scraped from the web or from books.

Maybe what is needed is for organizations to start producing materials for AI to learn from, rather than assuming that everything they need can be found on the web? How much of the effort to "train" AI is just letting models consume the web, and how much is consciously trying to create new learning materials for AI?



It could slow down again. We don't know. But the people working at OpenAI seem to believe the models will keep improving for the foreseeable future, and the "we'll run out of training data" argument seems overblown.



