> plus there is all the energy needed for me to live in a modern civilization and make the source material available to me for learning (schools, libraries, internet)
To be fair, this is true of LLMs too, and arguably more true for them than it is for humans. LLMs would've been pretty much impossible to achieve w/o massive amounts of digitized human-written text (though now ofc they could be bootstrapped with synthetic data).
> but a modern human uses between 20x and 200x as much energy in supporting infrastructure as the food calories they consume, so we're at about 1 to 10 GWh, which according to GPT5 is in the ballpark for what it took to train GPT3 or GPT4
But if we're including all the energy for supporting infrastructure for humans, shouldn't we also include it for GPT? Mining the metals, fabricating the chips, etc.? Also, "modern" is carrying a lot of weight here. Pre-modern humans were still pretty smart and presumably nearly as efficient in their learning, despite using much less energy.
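
For what it's worth, the quoted 1 to 10 GWh range is easy to sanity-check. Here's a minimal back-of-envelope sketch, assuming ~2500 kcal/day of food energy and ~30 years of "training time" (both numbers are my assumptions, not from the quote):

```python
# Back-of-envelope check of the quoted 1-10 GWh figure.
# Assumed inputs (not from the thread): ~2500 kcal/day of food energy
# and ~30 years of learning (childhood through early adulthood).

KCAL_TO_KWH = 4184 / 3.6e6   # 1 kcal = 4184 J; 1 kWh = 3.6e6 J

food_kwh_per_day = 2500 * KCAL_TO_KWH                   # ~2.9 kWh/day
years = 30
food_total_gwh = food_kwh_per_day * 365 * years / 1e6   # ~0.03 GWh of food energy

for multiplier in (20, 200):   # infrastructure overhead range from the quote
    print(f"{multiplier}x overhead: ~{food_total_gwh * multiplier:.1f} GWh")
# Prints roughly 0.6 GWh and 6.4 GWh -- the same order of magnitude
# as the quoted "about 1 to 10 GWh".
```

So the quoted range holds up as an order-of-magnitude estimate, but only if you grant the 20x-200x infrastructure multiplier for the human and not for the GPU cluster, which is exactly the asymmetry being pointed out above.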