I think it's just RAM reaching the comfortable level, like other things did.
Back when I had a 386 DX 40 MHz with 4MB RAM and 170MB disk, everything was at a premium. Drawing a game at a decent framerate at 320x200 required careful coding. RAM was always scarce. That disk space ran out in no time at all, and even faster once CD drives showed up.
I remember spending an inordinate amount of time squeezing out more disk space, and using stuff like DoubleSpace to make more room.
Today I've got a 2TB SSD and that's plenty. Once in a while I notice I've got 500GB worth of builds lying around, do some cleaning, and problem solved for the next 6 months.
I could get more storage but it'd be superfluous, it'd just allow for accumulating more junk before I need a cleaning.
RAM is similar, at some point it ceases to be constraining. 16GB is an okay amount to have unless you run virtual machines, or compile large projects using 32 cores at once (had to upgrade to 64GB for that).
16G is just enough that I only get two or three OOM kills a day. So, it's pathetically low for my usage, but I can't upgrade because it's all soldered on now! 64G or 128G seems like it would be enough to not run into problems.
What are you doing where you're having OOM kills? I think the only time that's ever happened to me on a desktop computer (or laptop) was when I accidentally generated an enormous mesh in Blender.
I also have 64GB on my home PC and Firefox tends to get into bad states where it uses up a lot of RAM/CPU too. Restarting it usually fixes things (with saved tabs so I don't lose too much state).
But outside of bugs I can see why we're not at 100GB - even with a PopOS VM soaking up 8GB and running Firefox for at least a day or two with maybe 30 tabs, I'm only at 21GB used. Most of that is Firefox and Edge.
yeah, it's definitely a cache/extension thing; usually when it gets near the edge I also restart. I do wish there were a way to set a max cache size for Firefox.
> Drawing a game at a decent framerate at 320x200 required careful coding.
320x200 is 64,000 pixels.
If you want to maintain 20 fps, then you have to render 1,280,000 pixels per second. At 40 MHz, that's 31.25 clock cycles per pixel. And the IPC of a 386 was pretty awful.
That's also not including any CPU time for game logic.
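The cycle budget above can be sanity-checked in a few lines (pure back-of-the-envelope arithmetic, ignoring wait states and the cost of writing to VGA memory):

```python
# Cycle budget for a 320x200 game at 20 fps on a 40 MHz 386.
width, height = 320, 200
fps = 20
clock_hz = 40_000_000

pixels_per_frame = width * height           # 64,000 pixels
pixels_per_second = pixels_per_frame * fps  # 1,280,000 pixels/s
cycles_per_pixel = clock_hz / pixels_per_second

print(pixels_per_frame, cycles_per_pixel)   # 64000 31.25
```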
Most PC games of the VGA DOS era did exactly that, though.
But, well, a lot can be done in 30 cycles. If it's a 2D game, then you're mostly blitting sprites. If there's no need for translucency, each row can be reduced to a memcpy (i.e. probably REP MOVSD).
Something like Doom had to do a lot more tricks to be fast enough. Though even then it still repainted all the pixels.
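The row-as-memcpy idea from above can be sketched in a few lines of Python (a toy illustration, not DOS-era code: `fb` and `sprite` are assumed to be flat bytearrays, and the slice assignment plays the role of the per-row memcpy / REP MOVSD):

```python
SCREEN_W = 320  # mode 13h-style linear framebuffer width

def blit_opaque(fb, x, y, sprite, w, h):
    """Copy an opaque w*h sprite into a flat framebuffer.

    With no translucency there is no per-pixel test, so each sprite
    row collapses to a single bulk copy of w bytes."""
    for row in range(h):
        dst = (y + row) * SCREEN_W + x
        fb[dst:dst + w] = sprite[row * w:(row + 1) * w]
```

A color-keyed or translucent sprite would instead need a per-pixel branch, which is exactly where the ~30-cycle budget gets tight.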
For most users that is true. I think there were several applications that drove the demand for more memory, then the 32bit -> 64bit transition drove it further but now for most users 16GB is plenty.
16 GB RAM is above average. I've just opened BestBuy (US, WA, and I'm not logged in so it picked some store in Seattle - YMMV), went to the "All Laptops" section (no filters of any kind) and here's what I get on the first page: 16, 8, 12, 8, 12, 4, 4, 4, 4, 8, 8, 8, 8, 16, 16, 4, 4, 8. Median value is obviously 8 and mean/average is 8.4.
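Those figures check out; a quick verification of the median and mean claimed above:

```python
from statistics import mean, median

# RAM (GB) of the first page of BestBuy "All Laptops", as listed above.
ram_gb = [16, 8, 12, 8, 12, 4, 4, 4, 4, 8, 8, 8, 8, 16, 16, 4, 4, 8]
print(median(ram_gb))          # 8.0
print(round(mean(ram_gb), 1))  # 8.4
```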
I'd say that's about enough to comfortably use a browser with a few tabs on an otherwise pristine machine with nothing unusual running in the background (and I'm not sure about the memory requirements of all the typically preinstalled crapware). Start opening more tabs or install some apps and 8GB of RAM is going to run out real quick.
And it goes as low as 4, which is a bad joke. That's appropriate only for quite special low-memory uses (like a thin client, preferably based on a special low-resource GNU/Linux distro) or an "I'll install my own RAM anyway so I don't care what comes stock" scenario.
I agree that 4 GiB is too low for browsers these days (and has been for years), but that is only because the web is so bloated. But 4 GiB would also be a waste on any kind of thin client. Plenty of local applications should run fine on that with the appropriate OS.
Compute resource consumption is like a gas, it expands to fill whatever container you give it.
Things didn't reach a comfortable level; Moore's Law just slowed down a lot, so bloat slowed at the same pace. When developers can get a machine with twice as many resources every year and a half, things feel uncomfortable real quick for everybody else. When developers can't... things stop being so uncomfortable for everyone.
However, there is a certain amount of physical reality that has capped needs. Audio and video have limits to perceptual differences; with a given codec (which are getting better as well) there is a maximum bitrate where a human will be able to experience an improvement. Lots of arguing about where exactly, but the limit exists and so the need for storage/compute/memory to handle media has a max and we've hit that.