No software company I have ever worked at had an excess of worker productivity. There was always at least 3-5X as much work to be done, bugs to be fixed, and features to be implemented as there were engineers to do it. Backlogs just grew and grew until you gave up and mass-closed issues because they were 10 years old.
If AI coding improves productivity, it might move us closer to having 2X as much work as we can possibly do instead of 3X.
I don't think you can judge "work needing to be done" by looking at the backlog. Tickets are easy to enter. If they were really important, they'd get done or people would be hired to do them (employed or contracted). 10 year old issues that never got attention were just never that important to begin with.
This "fallacy" is from 1891 and assumes jobs that require virtually no retraining. A farm worker could ion theory clean the factory floor or do one small step in an assembly line within a week.
Nowadays we already have bullshit jobs that keep academics employed. Retraining takes several years.
With "AI" the danger is theoretically limited because it creates more bureaucracy and reduces productivity. The problem is that it is used as an excuse for layoffs.
What a strange thing to ask for a citation on when CEO pay, stock buybacks, and corporate dividends are at all-time highs while worker pay and, honestly, the basic ability to afford to live continue to crater.
Productivity is up and labor wages are up. That's why I asked. It wasn't an attempt at a rebuttal; it was a request for reading material, as it's a heterodox view.
The normal conversation is that productivity growth has slowed and the divide has increased, not that more productivity creates lower outcomes in real terms.
It's economic jargon for what people are paid per hour of work (which can include non-direct compensation such as healthcare and pensions), adjusted for inflation. For economists, "real" just means divided by the CPI, as opposed to "nominal", which is the actual dollar amount paid at the time.
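As a toy illustration of that adjustment (made-up numbers, just to show the arithmetic): if the nominal wage rose from $20/hr to $30/hr while the CPI rose from 100 to 140 over the same period, then the real wage went from $20/hr to $30 / 1.40 ≈ $21.43/hr in base-year dollars -- a much smaller real gain than the nominal one.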
I mean, I hate a lazy "citation needed" FUD attack as much as (really, likely way more than) anyone, but with a bit more context I do think a citation is needed, as the correct citation in the other direction is (as someone else noted) Jevons paradox: when you make it easier to do X, people can use X in ever more contexts, and things that previously needed something way harder than X suddenly become possible. The result -- as much in software development as in any other field: it seems like every year it becomes MUCH easier to do things, due to better tools -- always seems to be MORE demand, not less. We even see the slow raising of "table stakes" for software, such that a website or app is off-putting and lame to a lot of users if it isn't doing the things that require at least some effort: instead of animated transitions and giant images, maybe the next phase is that the website has to be an interactive white-glove AI assistant experience--or some crazy AR-capable thing--requiring tons of engineering effort to pull off, but now possible for your average website due to AI coding.

Meanwhile, the other effects you are talking about all started before AI coding even sort of worked well, and so have very little to do with AI: they are more related to monetary policy shifts, temporary pandemic demand spikes, and that R&D tax law change.