It's an interesting possibility for sure, but to me these concepts aren't linked. With the recent LK-99 craze, I learned that the theoretical limit on computing efficiency is many, many orders of magnitude beyond today's hardware. So chips can, in theory, get far more efficient. If we build a computer that's 1000x more efficient, do you still think we need to throw the same 20% of our resources at it? What would we have those 1000x more capable computers do? The question we need to ask is: what can we do with computing, what would it give us, and how much energy does that cost?
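For concreteness, here's a rough back-of-envelope sketch (my own numbers, not anything from the thread): the Landauer limit gives the thermodynamic floor for erasing one bit, and I'm plugging in an assumed ~1e-14 J per bit-level operation as a stand-in for current hardware, which varies a lot by chip and workload.

```python
# Back-of-envelope comparison: Landauer limit vs. an assumed cost for today's silicon.
import math

k_B = 1.380649e-23                            # Boltzmann constant, J/K
T = 300.0                                     # room temperature, K
landauer_j_per_bit = k_B * T * math.log(2)    # ~2.9e-21 J to erase one bit

# Assumed ballpark for modern hardware (illustrative only, not a measured figure).
assumed_modern_j_per_bit = 1e-14

gap = assumed_modern_j_per_bit / landauer_j_per_bit
print(f"Landauer limit:        {landauer_j_per_bit:.2e} J/bit")
print(f"Assumed modern cost:   {assumed_modern_j_per_bit:.0e} J/bit")
print(f"Theoretical headroom:  roughly {gap:.0e}x")
```

Even with that crude assumption the gap comes out around six or seven orders of magnitude, which is the "many, many orders of magnitude" point above.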
I don't want to sound like "640K ought to be enough for anybody", but saying there's a fixed percentage of energy that should go towards computing is weird to me.
There's not a fixed percentage. There's an optimal balance that likely changes depending on the environment.
But you do sound like you're saying "640K is enough for everybody". The reason to devote more resources to cognition is precisely that we can't imagine the possibilities that open up when more clever thinking is applied to our limited resources. In the same way, an ape with 10% of its energy allocated to cognition (guessing) can't even begin to imagine the magic unlocked by its descendants that gambled on 20%. Hell, apes can look at us now and still can't understand us.
In this conversation, you're the ape who's blindly suggesting there's little worthwhile in expending resources on global computation, and I'm the ape who's blindly suggesting there probably is. Neither of us can honestly predict what might happen, good or bad.