It's pretty interesting and mildly shocking that everyone is just making the same 'who needs a new JS library' joke.
What about closed source tooling? How do you expect an AI to ever help you with something it doesn't have a license to know about? Not everything in the world can be anonymously scraped into the yearly revision.
If AI is going to stay we'll have to solve the problem of knowledge segmentation. If we solve that, keeping it up to date shouldn't be too bad.
>What about closed source tooling? How do you expect an AI to ever help you with something it doesn't have a license to know about? Not everything in the world can be anonymously scraped into the yearly revision.
This is not a novel problem. Proprietary toolchains already suffer from decreased resources on public forums like Stack Overflow; AI did not create this knowledge segmentation, it is scraping this public information after all.
>It's pretty interesting and mildly shocking that everyone is just making the same 'who needs a new JS library' joke.
Surely the proprietary toolchain is itself the 'new JS library'?
Most developers I know don't enjoy working with esoteric commercial solutions that suffer from poor documentation, when there exists an open-source solution that is widely understood and freely documented.
I do not see why AI code generation further incentivizing the use of the open-source solution is a problem.
> This is not a novel problem. AI did not create this knowledge segmentation, it is scraping this public information after all.
I think you misunderstand the situation. You as a person can be privy to private knowledge. Relying on AI so heavily that you can no longer bring that private knowledge to bear is the novel situation.
> I do not see why AI code generation further incentivizing the use of the open-source solution is a problem.
Maybe you don't get it and have never experienced it, but there's a massive amount of development done against unreleased APIs or hardware: game engines, firmware, etc. I doubt Apple is going to publish new SDKs for their new widgets long before any devs use them.
I am curious, have you used a code generation tool that is linked to your IDE?
If I ask it, "why is the structure I am feeding into the `do_cool_stuff()` function not working?", it won't say "sorry, I can't figure that out because I don't know the library that implements `do_cool_stuff()`". It will:
1. read my code where I construct the data to feed into the function,
2. see that the function is imported,
3. then read the file containing the linked `do_cool_stuff()` function to understand exactly what data structure it expects.
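The steps above can be sketched in miniature. Everything here is invented for illustration: `do_cool_stuff()`, its module, and the keys it expects are hypothetical stand-ins for a private library the tool can only understand by reading its source.

```python
# --- lib/cool.py: the "linked" implementation the assistant can read ---
# (hypothetical proprietary code; nothing here is a real library)
def do_cool_stuff(config: dict) -> str:
    # Reading this body is what reveals the expected structure:
    # a dict with a "name" (str) and a "retries" (int) key.
    if "name" not in config or "retries" not in config:
        raise KeyError("config needs 'name' and 'retries' keys")
    return f"{config['name']} x{config['retries']}"

# --- my code: the call site the assistant starts from ---
broken = {"title": "widget"}   # wrong key; fails with KeyError at runtime
fixed = {"name": "widget", "retries": 3}

print(do_cool_stuff(fixed))    # succeeds once the structure matches
```

No training data about this "library" exists anywhere; the answer to "why is my structure not working" is recoverable purely by following the import to the implementation, which is the point being made.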
Crucially, the linked function can be anything from my own personal super secret UI framework for building widgets, to the `lib` folder containing my proprietary driver code for my unreleased hardware. This is fundamentally the LLM figuring out how to do "development […] against unreleased APIs or hardware".
I don't really know what "private knowledge" you are referring to, given you talk about Apple developing widgets against their new unreleased framework. The code is not really "private"; certainly to you or I it might be, but Apple owns the code on both sides of the interface, no? What precludes their internal generative AI tooling from doing the equivalent of Ctrl-Clicking through to the linked `do_cool_stuff()`?
If anything, this is perhaps a benefit of genAI; for security reasons, I am sure your average programmer at Apple is expected to work solely against the internal documentation regarding how `do_cool_stuff()` works, and has no access to its actual implementation. Yet an in-house inference server could certainly have access to both sides, without the risk of code leakage that comes with directly sharing the other side of an interface with the consuming developers.
To me, in this context, "closed source" implies something the LLM does not have access to the implementation details for, i.e. the source is unavailable to the user invoking the LLM; in that case, I think my original analysis holds re: genAI further encouraging development on top of open-source technologies.