On that note, my personal hypothesis - even more controversial than the article's "IT was unproductive" hypothesis, which holds that IT was a rounding error relative to earlier major improvements - is this: in many areas, particularly office work and all kinds of everyday errands, IT is anti-productive, as in it makes people less productive on net.
The core hypothesis behind my belief is that introducing computers to replace a class of tasks - up to and including a whole job type - just shifts the workload, diffusing it across many people, where previously it was concentrated in a smaller number of specialists. Think of the things you use Word, Excel, PowerPoint, Outlook, etc. (or their equivalents from other vendors) for - before software ate it, a lot of that used to be someone's job. Now it's just tacked onto everyone's workload, distracting people from doing the actual job they were paid to do.
That would seem like an obviously stupid thing to do, so why would businesses all fall for it? I argue it's because even as shifting the workload makes everyone in the company less productive on net, it looks like an improvement to accounting. Jobs with salaries are legible, clearly visible on the balance sheets. So is the money saved by eliminating them. But the overall productivity drop caused by smearing that same work across the rest of the company? That's incremental, not obviously quantifiable. People and their salaries stay the same. So it all looks like introducing software and obsoleting some jobs saves everyone money - but then, somehow, everyone is experiencing a "productivity paradox". Except it's not a paradox if you ignore the financial metrics with their low resolution - focusing on what happens to the work itself, it seems that IT improvements are mostly a lie.
If I understand correctly, it would be something like this: you used to get a dedicated secretary, but now most of those roles are handled by a computer. So, in a sense, everyone has had their workload mildly increased. But worse than that, the workload is typically of a different nature, so, for example, excessive meetings are now easy to generate.
I would also add that it may be a net benefit that fewer roles are needed. But that net benefit overwhelmingly goes to the owners of the company. And that's what we've been seeing for the last 30+ years: the very wealthy have become much more wealthy while everyone else is worse off. (*)
(*) Growing wealth inequality is very complex and I'm sure it would be happening anyway. I'm not saying computers cause wealth inequality, but they don't seem to be doing much good in fixing it either.
> But worse than that, the workload is typically of a different nature, so, for example, excessive meetings are now easy to generate.
That, but also:
- Secretaries were better at this work because it was their specialization, and they enjoyed the efficiencies that come from focusing on a single, specific kind of work.
- Those increments of extra work add up.
- Moving that work to everyone else means you now have highly paid specialists doing less and less of the specialized work they're paid for. In many cases (programming among them), context switching is costly, so the extra work disproportionately reduces their capacity to do the thing they're good at.
This all adds up to a rather significant loss of productivity.