
You can assume that already-published open weights models are available at $0, regardless of how much money was sunk into their original development. These models will look increasingly stale over time but most software development doesn't change quickly. If a model can generate capable and up-to-date Python, C++, Java, or JavaScript code in 2025 then you can expect it to still be a useful model in 2035 (based on the observation that then-modern code in these languages from 2015 works fine today, even if styles have shifted).
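
As a rough illustration (a generic sketch, not taken from any particular 2015 codebase): JavaScript that was idiomatic in 2015, with ES2015 classes, arrow functions, and promises, still runs unmodified on a 2025 Node.js runtime:

    // ES2015-era code, idiomatic circa 2015; runs unchanged on modern Node.js.
    class Greeter {
      constructor(name) { this.name = name; }
      greet() { return `Hello, ${this.name}`; }
    }

    // Arrow function returning a promise, also 2015-era style.
    const wait = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

    wait(100).then(() => console.log(new Greeter('2015').greet()));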


>2025-2035

Depending on other people to maintain backward compatibility so that you can keep coding like it’s 2025 is its own problematic dependency.

You could certainly do it, but it would be limiting. Imagine you had a model trained only on examples from before 2013 and your boss wanted you to take over maintenance of a React app: React was first released in 2013, so the model would never have seen it.
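
To make that gap concrete, here is a hypothetical component in the style the boss would expect (the names are invented for illustration). JSX shipped with React's 2013 release, so a model trained only on pre-2013 code has never seen syntax like this:

    // Hypothetical React component in TypeScript (TSX).
    // The HTML-in-JavaScript JSX syntax below did not exist before 2013.
    function TodoItem({ text, done }: { text: string; done: boolean }) {
      return <li className={done ? 'done' : 'pending'}>{text}</li>;
    }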


You're all entertaining the strange idea of a world where no open-weight coding models are trained in the future. Even in a world where VC spending vanished completely, coding models are such a valuable utility that, at the very least, companies and individuals would crowdsource them on a recurring basis, keeping them up to date.

The value of this technology has been established; it's not going away anytime soon.


SOTA models cost hundreds of millions to train. I doubt anyone is crowdsourcing that.

And that’s assuming you already have a lot of the infrastructure in place.
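
For a sense of scale, a back-of-envelope compute estimate (every number below is an assumption: the common 6*N*D FLOPs rule of thumb, 40% utilization, ~1e15 FLOP/s per accelerator, $2 per GPU-hour; real budgets also cover failed runs, data, and staff):

    // Rough training-cost arithmetic, compute only. All inputs are assumptions.
    const params = 400e9;                 // N: model parameters
    const tokens = 15e12;                 // D: training tokens
    const flops = 6 * params * tokens;    // ~3.6e25 FLOPs (6*N*D rule of thumb)
    const effFlopsPerSec = 1e15 * 0.4;    // per-GPU throughput at 40% utilization
    const gpuHours = flops / effFlopsPerSec / 3600;  // ~25 million GPU-hours
    const dollars = gpuHours * 2;         // ~$50M at $2/GPU-hour, compute alone
    console.log(Math.round(gpuHours), Math.round(dollars));

That lands at tens of millions of dollars for a single run, before experiments, infrastructure, and salaries, which is how the total reaches hundreds of millions.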


I think FAANG and the like would probably crowdsource it, given that (according to the hypothesis presented) they would only have to do it every few years, and they are ostensibly realizing improved developer productivity from these models.


I don’t think the incentive to open-source a $200 million LLM exists the same way it does for frameworks like React.

And for closed-source LLMs, I’ve yet to see any verifiable metrics indicating that “productivity” increases are having any external impact: new products released, new games on Steam, new startups founded, etc.

Certainly not enough to justify bearing the full cost of training and infrastructure.


2013 was pre-LLM. If devs continue relying on LLMs and their training stops (which I find unlikely), the tools around the LLMs will still continue to evolve, and new language features will get less attention, used only by people who don't like using LLMs. Then it becomes a popularity race between new languages (and new language features) and LLMs steering 'old' programming languages and APIs. It's not always the best technology that wins; often it's the most popular one. You know what happened during the browser wars.



