
The difference between OpenAI and the next best model seems to be increasing, not decreasing. Maybe Google's Gemini could be competitive, but I don't believe open source will ever match OpenAI's capability.

Also, OpenAI gets significant discounts on compute due to favourable deals with Nvidia and Microsoft. And they could design their servers better for their homogeneous needs. They are already working on an AI chip.



Being ahead in a race doesn’t mean you’re going to win. Open source models will win eventually, because they have the lowest marginal cost to run.

People will figure out what OpenAI is doing and duplicate it. There are many people working at OpenAI; it’s going to leak out.


Did you even read my comment? I specifically highlighted why OpenAI might be cheaper in the long run. One reason is that they are already working on a chip that would be better suited to running a single model.


They are not going to beat NVIDIA. Making a chip for one model is not really a good idea, there are more efficiency gains to be made by improving the model and using a general purpose AI chip, rather than keeping the model architecture static and building a special purpose chip for it. Regardless, whatever OpenAI can do, NVIDIA can do better, and on more recent process nodes because they have the volume.


No, because Nvidia's chips have to work for all models. Nvidia has other constraints to satisfy for its users, like general instruction sets, security, etc., which OpenAI doesn't have.

E.g., since they have a fixed model that they know will receive billions of requests, they could even use an analogue chip, which is significantly cheaper and faster for inference. The chip in [1] could achieve 10-100x FLOPS/watt for fixed models compared to Nvidia, even in its first generation.

[1]: https://www.nature.com/articles/s41586-023-06337-5
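To get a feel for what a FLOPS/watt advantage means for serving cost, here is a rough back-of-envelope sketch. All the specific numbers (FLOPs per token, efficiency figures, electricity price) are illustrative placeholders, not taken from [1] or from any vendor spec:

```python
# Back-of-envelope: how FLOPS/watt translates into electricity cost
# per million tokens served. All numbers are hypothetical placeholders.

def energy_cost_per_million_tokens(flops_per_token, flops_per_watt, usd_per_kwh):
    """Electricity cost (USD) to serve 1M tokens of inference."""
    joules_per_token = flops_per_token / flops_per_watt  # 1 watt = 1 joule/second
    kwh = joules_per_token * 1_000_000 / 3.6e6           # 3.6 MJ per kWh
    return kwh * usd_per_kwh

# Assume ~2 FLOPs per parameter per token for a 175B-parameter dense model,
# an effective GPU efficiency of 1e12 FLOPS/watt, and $0.10/kWh.
gpu = energy_cost_per_million_tokens(350e9, 1e12, 0.10)

# A hypothetical fixed-model analogue chip at 50x better FLOPS/watt:
analog = energy_cost_per_million_tokens(350e9, 50e12, 0.10)

print(f"GPU:    ${gpu:.4f} per 1M tokens")
print(f"Analog: ${analog:.4f} per 1M tokens")
print(f"Ratio:  {gpu / analog:.0f}x")
```

The point of the sketch is only that the FLOPS/watt ratio passes straight through to the energy cost per token, so a 10-100x efficiency gain is a 10-100x cut in the electricity component of inference cost; it says nothing about fabrication cost, utilization, or memory bandwidth, which dominate in practice.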



