Well, they are going to be disappointed when they realize we haven't even figured out the catalog yet.
Software is half a century old, and we are still using flint to produce it. The only reason it looks fancy is that it makes more sparks, a lot faster.
We're just starting to get out of the alchemy stage right now. Wise men and magicians everywhere are telling royalty that if they only pay them treasures, they will reveal the future and show them miracles. Meanwhile, it turns out that NSS has a straightforward vulnerability that everyone just somehow missed.
If you look at the history of chemistry, mechanical engineering, or astronomy then you kind of get the impression that we're probably 150-300 years away from software development working the way everyone already imagines it working.
I believe we lack a solid common ontology: an agreement on what individual pieces of data mean. Without it, individual pieces of software are either incompatible or inconsistent and must constantly be built anew, resulting in all kinds of bugs.
> If you look at the history of chemistry, mechanical engineering, or astronomy then you kind of get the impression that we're probably 150-300 years away from software development working the way everyone already imagines it working.
If it is to happen, a serious plateauing of "progress" will need to occur all the way up the layers of the tech stack, to the point where app developers or data engineers are working with tools stable enough that previous generations would at least recognise them, if not be able to work with them.
Engineering maturity "nirvana" for software will come not from us getting more experienced or better at it, but from "progress" stalling in all the tools we use.
Software changes a lot basically because it is easy to change, and because we as an industry value constant work towards making it even easier to keep changing, e.g. cloud, containers, DevOps, agile, etc.
In the physical world, things have stabilised and reached a kind of local optimum across the vast majority of areas. Sure, there are still incremental improvements happening, and occasional revolutionary improvements in different areas. Mostly, though, the bits being radically changed are the ones that happen to touch computers, downstream of the constant churn happening in software.
Another factor: when I think of physical engineering (I used to work in civil/structural engineering in the 90s), the scale of the problems being solved hasn't changed for nearly all the work going on. Most building sites are the same size they were, most materials are the same, some regulations have changed; ubiquitous CAD tools seem to be the major change.
So maybe, when we can't make transistors smaller, can't make CPUs faster, and can't increase memory or storage, things will slow down. As each layer of the stack plateaus, the layer above it eventually shifts from working on new capabilities to making what it already does more efficient, and that will eventually (it will take a while) bubble all the way up. Progress slows down a lot, and we end developers and engineers find ourselves working with a stable set of tools very much in a local (or even global) optimum.

Also, maybe when/if the world's population stabilises (in decades, centuries, whenever), that might put an eventual limit on how much user data can be mined from people, and the scale of the data we deal with might stabilise too, even beyond the point where we've stopped collecting more because of the capacity limits above.
Heh, I started off trying to disagree with your quoted statement by saying I don't think it would ever stabilise like other disciplines. But by laying out one way it could happen, I think I may have ended up agreeing with you :)