It will definitely get more complex, and abstraction layers have been a red herring for a while.
Programmers crave more subjective control over the code. But programmers do not lift their tooling and thinking to the level required to make it happen, because that is a long, grueling process.
So instead we substitute in house-of-cards-style abstractions that temporarily satisfy the requirements, destined to collapse under their own weight.
Experienced programmers say 'don't build abstractions'; instead we should compress code down into reusable chunks. Dig ditches instead of stacking a house of cards.
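To make the contrast concrete, here is a minimal Python sketch; the discount example and every name in it are hypothetical, purely for illustration. The first half builds an interface and a factory that only ever serve one case; the second just extracts a small reusable function.

    # House-of-cards style: a speculative interface + factory for a single use case.
    from abc import ABC, abstractmethod

    class DiscountStrategy(ABC):
        @abstractmethod
        def apply(self, price: float) -> float: ...

    class PercentageDiscount(DiscountStrategy):
        def __init__(self, percent: float):
            self.percent = percent

        def apply(self, price: float) -> float:
            return price * (1 - self.percent / 100)

    def make_discount(kind: str, **kwargs) -> DiscountStrategy:
        # Today this factory can only ever return one concrete thing.
        return PercentageDiscount(**kwargs)

    # 'Digging a ditch': compress the repetition into one reusable chunk instead.
    def apply_percentage_discount(price: float, percent: float) -> float:
        return price * (1 - percent / 100)

    print(make_discount("percentage", percent=10).apply(100.0))  # 90.0
    print(apply_percentage_discount(100.0, 10))                  # 90.0

Both halves compute the same result; the first one just buys indirection for flexibility nobody has asked for yet.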
Neither makes any real change to the art of programming, and continuing the status quo will forever 'incentivize' abstractions, because there is no tooling for squeezing in the subjective structures required in every program.
We so dearly love the ideal of mathematical certainty in code, but it puts a limit on our view of programming.
Frameworks are also abstractions. It’s basically: hey, we’re handling X for you so you don’t have to “reinvent the wheel”.
Low/no code platforms are just another layer (could even say multiple layers) of this.
The idea of using an LLM to write code for you is conceptually similar. You’re leveraging code you didn’t write. Does it fit the rest of your codebase in style? In approach? But it’s worse because at least a framework or platform is packaged and versioned and you can update it. If your codebase has a bunch of LLM generated code in it, you’re fully responsible for that code. And the bigger that surface is, the more trouble you could be in.
I think LLMs and the recent jumps in that area are really cool, and I do think some successful products and companies will be built on helping people use them, but based on what I know about them, I feel like calling them “AI” is a misnomer. There’s no actual intelligence or sapience; it’s still just pattern matching.
And if this is just a building block, the equivalent of getting synapses firing in the brain, and they are able to scale it up to a point where something resembling real cognition is happening, then all jobs are in trouble. I don’t see how it’d be unique to software development.
AI as a 'nomer' really reminds me of what we used to call 'impressions'. It just gives you the impression of a result.
People use the word AI to refer to a picture, video or audio clip that (in effect) does an 'impression', like an actor doing an impersonation.
The abstraction thing is a big topic. Indirection, frameworks and LLMs are abstractions that are 'flat', or expand outwards, on the architectural blueprint of the code. These are ideal, or semi-ideal, code ideas that are understood objectively, built from a set of primitives that someone, somewhere chose.
The other 'abstraction' is the hierarchy of abstraction, from electricity, to binary, to machine code, to assembly, to C, to C++.
I think there is another “couple of layers” of abstraction above programming languages as we have them. But there is a grueling no-man's-land of work to get to the 'next step up'.
If we can get above 'high-level programming languages', then we can capture some subjectivity from the programmer.
I really struggle to think in flat abstractions or blueprints/maps of code, to my own detriment. But it does make sense as a concept.