That may not actually be a practical approach to the problem. Years (and jobs) ago, we rolled significant caching into Vegas (the video editor), allowing the app to use up to 8GB of RAM for cached frames, even on 32-bit builds. There was no way for the OS to do this sort of work for us; the best the operating system could have done was cache file reads for us.
Caching the output of computation was incredibly valuable to our customers, and the caching the OS could have done on its own would have had little to no benefit for them.
The same could be said for Chrome. Maybe there is value in a globally coordinated OS caching manager, but the OS shouldn't have sole responsibility for caching. Such an approach is clearly suboptimal for the user in at least one use case (see the example above).
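For concreteness, here's a minimal sketch of the kind of application-level cache I'm describing: a byte-budgeted LRU cache keyed by frame index that holds the output of expensive rendering. This is purely illustrative (it's not Vegas's or Chrome's actual code, and every name in it is made up), but it shows why only the application can do this kind of caching: the OS can cache the bytes of the source file, not the rendered result.

    #include <cstddef>
    #include <cstdint>
    #include <list>
    #include <unordered_map>
    #include <vector>

    // Hypothetical sketch of an application-level cache for rendered frames.
    // The point: the app is caching the *output of computation*, which an OS
    // file cache can't reproduce -- it only sees the bytes of the source media.
    class FrameCache {
    public:
        explicit FrameCache(std::size_t budget_bytes) : budget_(budget_bytes) {}

        // Returns the cached frame if present and marks it most recently used.
        const std::vector<std::uint8_t>* get(std::int64_t frame_index) {
            auto it = index_.find(frame_index);
            if (it == index_.end()) return nullptr;
            lru_.splice(lru_.begin(), lru_, it->second);  // promote to front
            return &it->second->pixels;
        }

        // Stores a rendered frame, evicting least-recently-used frames
        // until the cache fits back inside its byte budget.
        void put(std::int64_t frame_index, std::vector<std::uint8_t> pixels) {
            if (auto it = index_.find(frame_index); it != index_.end()) {
                used_ -= it->second->pixels.size();
                lru_.erase(it->second);
                index_.erase(it);
            }
            used_ += pixels.size();
            lru_.push_front(Entry{frame_index, std::move(pixels)});
            index_[frame_index] = lru_.begin();
            evict_to(budget_);
        }

        // Lets the app hand memory back, e.g. when the system is under pressure.
        void set_budget(std::size_t budget_bytes) {
            budget_ = budget_bytes;
            evict_to(budget_);
        }

    private:
        struct Entry {
            std::int64_t frame_index;
            std::vector<std::uint8_t> pixels;
        };

        void evict_to(std::size_t limit) {
            while (used_ > limit && !lru_.empty()) {
                used_ -= lru_.back().pixels.size();
                index_.erase(lru_.back().frame_index);
                lru_.pop_back();
            }
        }

        std::size_t budget_;
        std::size_t used_ = 0;
        std::list<Entry> lru_;  // front = most recently used
        std::unordered_map<std::int64_t, std::list<Entry>::iterator> index_;
    };

The decision that matters is the budget: the application knows that a cache hit saves an expensive re-render, which is information the OS simply doesn't have.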
There are exceptions, but in almost all cases, an application that hacks around the OS's RAM management is disproportionately slowing down other applications in order to make itself seem faster.
Shaming such RAM-hogging applications is not just legitimate, it's very much necessary. Otherwise all applications would just reserve as much RAM as they could convince the OS to give them, even if they have no real use for it.
There are no checks for this in place, so users need to understand the technical implications and punish RAM hogs by avoiding them.
If your application can use RAM to disproportionately speed itself up, and therefore on average speed up the workflow of most users, then that's legitimate too, and an informed user will see the value.
You're talking about a professional software suite designed to run on workstations that are often dedicated solely to that suite.
Chrome has to cooperatively share resources with other applications, and with aggressive caching at the application level, the only strategy left to the OS is to start swapping.
I'm responding to the assertion: "The OS should use all available RAM it practically can for things like cache. Not the applications."
Applied absolutely, it results in deeply sub-optimal behavior in a large number of cases.
Other examples abound: image decode caching via Glide for scrolling lists, Skia output tile caching in a browser, texture caching in a game engine, reference frame caching in a video decoder, glyph caching in a word processor, block caching in a constructive solid modeller, result caching in a spreadsheet, composition caching in a presentation tool, and so on.
Caching is an enormously effective tool for applications and operating systems. If an operating system removed it as a tool for applications, it likely wouldn't be a competitive operating system.
Should applications abusively and single-mindedly monopolize memory usage? In some cases, maybe. In general, I think we'd tend to be on the same side of things. I like to see well-behaved applications (that make conservative and efficient use of system resources) and robustly managed operating systems (that don't let apps walk all over them).
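In that spirit, here's a hedged, Windows-flavored sketch (for concreteness, since the Vegas example above was a Windows app) of one thing a well-behaved app can do: check overall memory load and hand cache memory back before the OS is forced to swap. GlobalMemoryStatusEx is a real Win32 call; the threshold, the budget values, and the set_budget() hook from the earlier sketch are all made-up illustrations.

    #include <windows.h>

    #include <cstddef>

    // Hypothetical policy: shrink our own cache before the OS has to swap.
    // GlobalMemoryStatusEx is a real Win32 call; the 75% threshold and the
    // generous/modest budgets are illustrative numbers only.
    std::size_t pick_cache_budget(std::size_t generous_bytes, std::size_t modest_bytes) {
        MEMORYSTATUSEX status{};
        status.dwLength = sizeof(status);
        if (!GlobalMemoryStatusEx(&status)) {
            return modest_bytes;  // can't tell, so stay conservative
        }
        // dwMemoryLoad is the system-wide percentage of physical memory in use.
        return (status.dwMemoryLoad < 75) ? generous_bytes : modest_bytes;
    }

    // Called periodically from housekeeping, using the cache sketched earlier:
    //   cache.set_budget(pick_cache_budget(/*generous=*/8ull << 30, /*modest=*/512ull << 20));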