This seems remarkably low given the utility we get out of them. I was expecting at least an order of magnitude more.
The headline particularly seems weird given the actual numbers. Lots of things take 2% of power, why not target them?
An order of magnitude more would be comparable to electricity consumption by households. Household electricity's utility feels to me at least an order of magnitude higher than that of data centers, as sheltering, cleanliness and everyday appliances are much lower in the Maslow pyramid!
Sure, housing takes ~20%. But while you are sitting in your nicely electrified house, watching Netflix, you're still using that datacenter's power. All the people in all their nice electrified houses are looking for electrified entertainment served from datacenters, along with all the remarkable shit computers get used for.
So it's not that housing isn't more important, it's that computing does so much with so little already.
This figure would have to ignore basic things like connectivity and just be "Netflix server wattage / user count", i.e. everything that isn't your TV. Even with an egregious TV, that's still less remaining wattage than it takes to get the Wi-Fi signal from the AP radio to your TV in your own house.
You definitely should, I know some folks that keep their huge TV on 24/7 because they like the pretty screensaver and photos it shows when idle. These same folks take a lot of other steps in their life to try to help the environment (recycle, drive electric cars, etc) but simple stuff like turning things off completely confounds them.
I'm a Brit. We don't have air con by default. For a few years me and the wife had Floridian relos (Coral Springs). We visited in summer and the house was about 17C (65F). Outside it was at least 35C (95F). The air con unit in the garage was pissing water everywhere and under quite some load.
I did ask why they kept the house so cold and was told "because we can". I got it: they had a really hard start in life and were now able, through parental sacrifices etc and their own hard work, to basically show off and keep their house cold in summer. I've seen the same in TX - my brother worked for EDS near Fort Worth a while back (20 years back). The attitude is the same there and again, I understand the individual stories of being able to conquer something that they could not growing up. Obviously you also get the "because we can" from multi-millionaires too, for what sounds like a different reason but is really the same.
In the UK we constantly get badgered about water use. The fucking stuff falls out of the sky with monotonous regularity. The place is a series of islands. The Atlantic is off to the left and that's a lot of water. What is really wrong is our management of the stuff. There is rather a lot of technical debt in our water management system and it will need real money to fix.
Faffing about a TV isn't going to save the world - that bollocks is for politicians and fossil fuel companies to victim shame consumers instead of giving a shit.
A TV uses eleccy and that can be solar/wind/unicorn farts or good old fashioned gas or liquidised puppies.
My home's underfloor heating is eleccy and hence seriously expensive but I am told that my supplier - Shell Energy - only uses renewables to deliver it (Shell? RLY?). I originally signed up with a "green" supplier but they went bust along with a few other shyster energy suppliers hereabouts when the shit really hit the fan: the Ukraine invasion nightmare, OPEC opportunistically taking the piss as usual, and massive companies filling their boots post-pandemic.
Fun fact: average household electricity consumption in Florida is among the lowest in the US. [1] This is because heating generally requires more energy than air-con, and indeed, the average British home uses 12,000 kWh on heating, vs. on the order of 8,000 for air-conditioning the average Florida home. This is despite Florida using 4x more power for air-con than the US average, and that the average Florida home is a lot larger than an average UK home.
Actually, your reference [1] states the opposite: Florida's "annual electricity expenditures are 40% more than the U.S. average"... which is caused by air conditioning.
I think you meant to state that Florida's energy (not electricity) expenditures are amongst the lowest in the US.
Sorry, you're right: FL has low energy consumption but high electricity consumption. The point still stands though, it's still better for the environment to live in FL and use air-con than to live in the UK and heat your house through the winter.
A typical 55” TV uses ~100 watts. That’s roughly 10 LED bulbs. Run 24/7, that’s ~900 kWh/year.
A Tesla Model S has a 60-100 kWh battery depending on the model.
So one Tesla charge can power the TV for roughly 1/9th to 1/15th of a year.
In the US, our electricity generates ~0.85 lbs of carbon emissions per kWh. Some places (California) can run part of the day entirely renewably. The EPA says a gallon of gas generates 20 lbs of carbon. A single 20-gallon tank generates 400 lbs, a full Tesla charge would generate 85 lbs, and a year of TV would generate ~765 lbs. A Model S Tesla has a ~400-mile range, while the average American car gets 22 mpg, so that’s why I picked a 20-gallon tank.
If running the TV 24/7 stops 2 trips a year by making the home more pleasant, it’s a net carbon saving. If those same people who like the “pretty photos” drive an electric car, each “tank equivalent” is roughly 1/9th of a year of TV.
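For anyone who wants to check the arithmetic, here's a minimal back-of-envelope sketch in Python using the numbers above; the grid intensity, battery size, and 24/7 assumption are this thread's rough figures, not authoritative data:

    # Rough carbon comparison using the figures quoted above (thread assumptions, not measured data).
    TV_WATTS = 100                  # typical 55" TV
    GRID_LBS_CO2_PER_KWH = 0.85     # rough US grid average
    TESLA_BATTERY_KWH = 100         # high-end Model S pack
    GAS_LBS_CO2_PER_GALLON = 20     # EPA figure
    TANK_GALLONS = 20               # ~400 miles at 22 mpg

    tv_kwh_per_year = TV_WATTS * 24 * 365 / 1000                 # ~876 kWh (rounded to ~900 above)
    tv_lbs_per_year = tv_kwh_per_year * GRID_LBS_CO2_PER_KWH     # ~745 lbs
    tesla_charge_lbs = TESLA_BATTERY_KWH * GRID_LBS_CO2_PER_KWH  # ~85 lbs
    tank_lbs = TANK_GALLONS * GAS_LBS_CO2_PER_GALLON             # 400 lbs

    print(f"TV running 24/7:  {tv_lbs_per_year:.0f} lbs CO2/year")
    print(f"One Tesla charge: {tesla_charge_lbs:.0f} lbs CO2")
    print(f"One 20-gal tank:  {tank_lbs:.0f} lbs CO2")
    print(f"Tesla charges per year of TV: {tv_lbs_per_year / tesla_charge_lbs:.1f}")

With these inputs a year of 24/7 TV works out to roughly nine Tesla charges' worth of grid carbon, or a bit under two tanks of gas.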
My country (just to the north of yours) produces ~0.0551 lbs of carbon emissions per kWh. That works out to less than 50 lbs for a year of TV - less than a single Tesla charge costs you.
I certainly don't have my TV on all the time, but I don't think much about my energy consumption, as it is predominantly renewable and ultimately any superfluous usage is outweighed by unnecessary driving with my car's combustion engine. If anything, the larger concern is almost always expense, as prices here are roughly the same as there.
I don't have much of a point to make, just thought it was interesting to compare. My peers and I are pretty worried about the situation in the US, though.
> My peers and I are pretty worried about the situation in the US, though
TBH I'm not too worried. Not in the head-in-sand way, but in the optimism of the future way. The US largely has issues with coordination but tends to lurch in the right direction on things like this once momentum builds. The economics of it will inevitably lead to the healthy outcome, and enough people care to pressure big industries (slowly...). We've been rapidly converting to renewable energy lately, so we're starting to make big moves in the right direction. Thankfully we're a rich nation, and somehow have endless money to burn. It's hard to coordinate, since we have a geographically large nation to power, and an unfathomable appetite to use energy.
I'm jealous of your nation, which seems to have a sensible populace and leadership, relatively high wealth, and a relatively constrained geography with ample sources of power.
For reference, the OLED LG G3 55-inch TV is rated for a power consumption of 375 W, but that rating typically indicates a worst-case scenario (e.g. maximum brightness, all-white screen, loudest volume, etc.). So your 100 W figure is probably about right.
What’s the relative impact of a mile driven vs. an hour of the tv being on?
Edit: Hmmm. According to GPT-4, the average TV would incur about 0.02 kg of CO2 per hour of usage, assuming a typical electrical grid mix in the US, while it estimates that driving 1 mile emits about 0.1 kg of CO2.
If that’s the case, 5 hours of TV is roughly equal to 1 mile of driving. Interesting.
Of course, if the grid has a higher percentage of renewables or the car is electric, that changes things.
How much does turning the TV off reduce energy consumption? I know some appliances waste nearly as much power in "standby" mode as they do during operation.
Still wasteful and frivolous. Imagine how many more people the globe could support if we stopped being so greedy and thoughtless with entertaining ourselves.
My point was more that we would collectively be much worse off without household consumption than without data centers (so no Netflix, no smartly managed grid etc.)
Funnily enough I’ve really only heard it referred to as “Maslow’s hierarchy” and not a pyramid specifically until this thread. Which appears to be the proper attribution.
In Norway I have often heard it referred to as “Maslows behovspyramide”, which translates to “Maslow's Pyramid of Needs.”
Indeed, even one of the greatest encyclopaedias in Norway mentions “Maslows behovspyramide” in their article about Maslow. https://snl.no/Abraham_Maslow (article is in Norwegian.)
30 years ago my mother spent an hour a day vacuuming the house. Now she has an internet-connected robot, and spends an hour once a week. She’s doing better than she was 30 years ago, even if she didn’t think she was doing badly 30 years ago.
TBF that robo vacuum didn't need a datacenter to function. I had a Neato Botvac from the pre-connected era that worked fine until the laser turret failed, and I haven't been able to get around to repairing it yet. Roombas were around for like a decade before they started getting connected.
Lots of things take 2% and that’s the problem - you can keep breaking down consumption into finer and finer categories, to find that all those fine categories each only take a small percentage of the total.
The problem is everything, all together. All the 2%s each need to take action in their own way.
This is the opposite, though. In order to cobble together a number that seems even remotely problematic they had to combine several things that are weakly related: all IT workloads, plus all wireless networks. They also needed to express it in terms of global electricity, instead of global primary energy, because it is absolutely dwarfed by fuel burned for transport. Finally, this is a consumer that can be trivially decarbonized.
Compare private passenger cars, which are ~12% of global CO2 and for which there are no practical ways to decarbonize.
Never trust a mathematician when it comes to "trivial" or "left as an exercise for the reader":
"One day Shizuo Kakutani was teaching a class at Yale. He wrote down a lemma on the blackboard and announced that the proof was obvious. One student timidly raised his hand and said that it wasn't obvious to him. Could Kakutani explain?
After several moments' thought, Kakutani realized that he could not himself prove the lemma. He apologized, and said that he would report back at their next class meeting.
After class, Kakutani went straight to his office. He labored for quite a time and found that he could not prove the pesky lemma. He skipped lunch and went to the library to track down the lemma. After much work, he finally found the original paper. The lemma was stated clearly and succinctly. For the proof, the author had written, 'Exercise for the reader.'
The author of this 1941 paper was Kakutani."
Also, in some cases various types of obstructionism. I've heard several stories (including some from my home state) of wind farms being fought against tooth and nail, even in cases where the negative impact is somewhere between tiny and minimal (e.g. the only town nearby would see them as 1" tall on the horizon). Some of this is no doubt thanks to campaigns from incumbent energy providers and other underhanded methods like weaponization of environmental regulations.
LLMs will radically accelerate power consumption growth, given the extreme processing power they require compared to traditional apps.
It'll probably go to 10% in 2 decades if unchecked. The IEA analysis is completely outdated because it was made pre-ChatGPT (as evidenced by the many references to crypto, which is no longer relevant).
Granted, it's very easy to boost power efficiency in data centers, given the high margins and low pollution, so the industry is putting huge efforts into restraining power consumption in data centers.
> This seems remarkably low given the utility we get out of them.
Having lived much of my life without an Internet connection, I wouldn't overvalue the utility, either. My life is better, but not that much better today.
False dichotomy. Not being on social media doesn't mean that one is spending time on personal growth. Some people just like to stare at the ceiling, for example.
Keep in mind that electricity generation itself is only responsible for ~35% of our greenhouse gas emissions (power generation was responsible for 13 Gt of CO2 in 2019, out of 37 Gt globally).
So if we wanted to estimate the global impact of data centers on CO2 emissions, that's about 0.6%. And that's not a perfect calculation as much of the new power capacity is in renewables, thus lower relative emissions. <0.5% would be a better estimate.
So 1/200th of our greenhouse gas problem is caused by data centers. I'll echo some of the other comments in this thread but - damn, that's a pretty good deal considering how much value we get out of it.
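A minimal sketch of that estimate, taking the article's ~2% electricity share and the emissions figures above at face value (it ignores the renewables caveat, so treat the result as an upper bound):

    # Data centers' rough share of global CO2, using the thread's figures.
    datacenter_share_of_electricity = 0.02   # upper end of the ~1.5-2% range
    power_sector_co2_gt = 13                 # Gt CO2 from power generation, 2019
    global_co2_gt = 37                       # Gt CO2 total, 2019

    electricity_share_of_emissions = power_sector_co2_gt / global_co2_gt        # ~0.35
    datacenter_share_of_emissions = (datacenter_share_of_electricity
                                     * electricity_share_of_emissions)          # ~0.007

    print(f"Electricity's share of emissions: {electricity_share_of_emissions:.0%}")   # ~35%
    print(f"Data centers' share of emissions: {datacenter_share_of_emissions:.2%}")    # ~0.70%

Using 1.5% instead of 2% brings it down to roughly 0.5%, which matches the estimate above.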
There are also some German projects to integrate datacenters into district heating networks. Capturing the waste heat and using it for heating homes allows us to utilize a very large percentage of the electricity pumped into a datacenter one way or another.
But the bigger problem with the article, IMO, is that this is degrowth thinking. Why is the electricity consumption important? Carbon is the real metric. Big server operators (Meta, AWS, Azure, GCP) are already either 100% zero-carbon powered or on track to be.
Regardless, if grid needs to add capacity for EVs, heat pumps, other electrification goals, why are servers (probably low impact) important?
Electricity is fungible, so when big operators buy wind-only power from the market, the rest of the demand uses the non-green power left in the pool.
(It still potentially increases wind buildout so it can have indirect positive impact especially if wind is not competitive at the same prices with fossils)
Also the accounting is "we bought as many MWh of wind power as we used" over some time window, so in reality they are using fossil power at peak times, competing with everyone else and placing pressure to expand fossil-based capacity.
These are also the reasons it's probably better to talk about the electricity used instead of trying to translate it into emissions by proxy, which is prone to being gamed for the reasons above.
(Also, the number is high rather than low if you consider that all the individual slices of the emission pie at this subdivision granularity are pretty small.)
Electricity is a little less fungible than folks make it out to be.
Iceland, for example, has abundant energy from various sources such as geothermal. However, that energy is not easy to send across the ocean to consumers. So, Iceland instead smelts a lot of aluminum, which takes a lot of energy, and is far easier to export cost-effectively.
So a kWh produced in Iceland is not exactly the same as a kWh produced in Germany; it's not truly fungible.
In that vein, datacenters are often placed in such areas. One Microsoft datacenter, for example, is in Wyoming, near a hydro dam that is far from any dense population.
Yeah. There is a lot of such "feel good" reporting that is technically true but doesn't mean what you'd logically expect. The solar panels on my roof generate a lot of electricity while I'm not home and not really using it, but I feed into the grid and pull from it when I need to (when the sun is not shining). I can't boast that my energy is so renewable that I can start mining bitcoins at home - that'd be stupid.
To quantify this and put it in perspective: the research I've seen puts waste heat at a very low single-digit percentage (I can't remember the exact numbers, but I think it was something like one or two percent) of climate impact. Our biggest sources of waste heat, by a large margin, are likely thermal power plants: a coal plant, for instance, disposes of more energy as waste heat than all of the electricity it generates could provide via resistive (100% efficient) heating, and that's before thinking about all the carbon and the other pollutants (and then having to deal with the huge amounts of fly ash left over). After thermal power plants, the next largest sources of waste heat would likely be industrial processes.
So the more renewable you can make your energy mix, the less waste heat you have from thermal power generation, and then the waste heat from electricity use is probably pretty negligible after that.
You're going to have to define "waste heat" in this context; the crowd that needs to read this currently believes "energy use I don't like is generating waste".
Also, what about consideration for how much other energy is displaced by automated processes doing things in the cloud? Without the datacenter and the cloud applications, how much carbon would be created through more manual processes?
Yes, but some of this 2% requires massive amounts of potable water for cooling, and this (zero-carbon initiative) only accounts for direct electricity use, not indirect use.
Yes, you do need it (sorry for being late to the party). You don't use salt water or contaminated water sources because they damage the machines faster. The only working example was a showcase from Microsoft a few years back where they ran a few servers that way, but it was more expensive than consuming clean water from the same sources we do, so it was mostly a stunt, maybe a preview.
You don't, but I'm assuming that many companies which either sell or do water cooling may still use municipal treated water, because doing it differently requires quite a lot of coordination.
What we try to do here in Denmark, which is a small country and why we do this, is to use the water cooling for both cooling and heating. I'm not completely sure if the correct English term is district heating, but it's where you cool the data centres with water that is then cooled down in a long circuit where it's used to heat nearby homes. It's basically what we do with excess heat from fossil fuel plants and garbage burners as well (we also use the heat to generate electricity in some cities). I believe some of the data centres built by companies like Facebook and Microsoft are working on this with the local cities, but it's mainly done because of regulation and political demands and not so much because the companies themselves want to do it by default.
They only care about computing power consumption when it is not in their control.
You mining crypto on solar for world peace: Evil pollution demon.
They burning 2% of all power on nuclear powered banking, military, and NSA spy rigs: The bestest environmentalists evar.
Eh, nothingburger. As the original source of this data notes in their report, global IT systems energy demand has only slightly expanded in the past decade, because the countervailing trend is that workloads are migrating from tragic-comic corporate datacenters with PUE of 2 or worse, into the cloud where the PUE is 1.1 or so. And in the cloud the power that is delivered to the CPU is better used due to oversubscription, multitenancy, and so forth. Finally, because large-scale operators often also build their own renewable energy facilities, their fraction of global CO2 is much less than their fraction of energy consumed.
Naturally you should design your systems for maximum efficiency, but since operating expenses are closely approximated by power consumption you are already correctly incentivized to do so.
Plus data centers are low hanging fruit for decarbonization. Everything (apart from diesel backup generators) is already electric, so as the grid de-carbonizes so do data centers.
As you mention, DC operators are already highly incentivized to reduce energy costs, so as wind and solar power continue to drop in costs DC operators will want to fund or buy green power as often as possible.
Getting backup power to be carbon neutral will be a little more challenging in a DC, but it's a walk in the park compared to many sectors. Shelf-stable biofuel is already fairly accessible, and using grid-scale batteries can reduce the need to combust anything (plus open up power arbitrage opportunities).
I’m not sure datacenters have a great incentive to reduce energy costs (a lesson I learned the hard way after investing in a DC energy optimization software company). People costs are much greater than energy costs. Electricity just doesn’t cost that much in the grand scheme of things. If additional energy spend is necessary to ensure uptime, the choice is an easy one.
The actual staffing of DCs is rather sparse for hyperscalers, and non-prime DCs are typically outsourced to a few companies that also run them with fairly low headcount.
Hyperscalers in particular currently can't scale out very easily because many of their DCs have hit local power restrictions, so compute power efficiency and density are the primary means to grow capacity. Think apartments in skyscrapers vs. row houses. So electricity _efficiency_ is what is being sought, rather than just the pure operational costs of the DCs.
In the northern Virginia exurbs, hyperscalers have bought out so much of the land that it's now perhaps a greater land use than family farms there, and is starting to encroach upon housing developments. Locals also complain about the noise (there's a sort of hum from the sheer mass of air and whirring fans) from DCs impacting health. Some areas have been reporting increased rates of hearing disabilities, although I can't recall a study being conducted yet.
> People costs are much greater than energy costs.
Can you prove this assertion? The first page of Google results indicates energy is a greater cost than labor, with figures ranging from 60-70 percent of total operating costs.
(i owned a small web hosting company in the early 00s, and am somewhat familiar with the per sq ft cost model of colos and datacenters in general, having had to contract for space and perform part of the buildout myself)
A friend of mine was in management at a large company trying to build up infra to compete with AWS 10-15 years ago (they started this effort when AWS was only S3). This company was bringing in billions a year operating dedicated off-site data centers for other companies, and they saw that AWS was a major threat to their business model. They dumped many, many millions of dollars into building up their infra.
When they gave up, they concluded that AWS's core advantage was their optimization of the electricity needed for cooling. Given that my friend's company failed to design their systems to heavily optimize for this, the cost of electricity for cooling alone didn't let them get near AWS prices without incurring significant losses. He said that if they did it again, every part of the product would have a strict heat budget.
I think this is taking an expansive view of labor, e.g. writing all datacenter applications as optimal code in low-level languages.
The general disinterest in performance/efficiency work even for large scale datacenters is a major theme of Dan Luu’s writing. I’m not sure how that applies to the rest of the industry but I can attest that my company’s performance engineering team has been zeroed in every round of layoffs we’ve ever had.
> but I can attest that my company’s performance engineering team has been zeroed in every round of layoffs we’ve ever had.
That doesn't necessarily mean management doesn't care about efficiency and performance, it just means they don't value the performance team (right or wrong).
It's certainly possible for an organization to fail to benefit from performance work. They can make all the servers twice as fast but if all that does is lower the utilization from 10% to 5%, that organization saves nothing, and Amazon pockets the difference in energy usage.
The larger your footprint in the cloud is, the larger incentive you have to make it more efficient though. I think it’s a somewhat self-correcting problem.
The biggest players are running highly, highly efficient software.
The larger you get, the better electricity rates you can negotiate, increasing the opportunity cost of focusing on efficiency.
As long as the industry is growing (it is), the opportunity cost will be high. It seems more likely it's the opposite of a self correcting problem. My 2 cents.
Dan Luu’s point is the big players are in fact leaving 7+ figure optimizations on the table all the time, and not really caring when they get found. We’re of course free to disbelieve him, but it tracks with my experience.
In developing countries like India where the power grid is not stable, to maintain DC uptime you probably need backup generation, which is very expensive very fast.
Data centers are the cost of automation. I worked for an entity that automated key operational workflows. They went from 15,000 employees in that division to 4,000. The energy cost of those buildings, overhead, etc. far exceeded the datacenter's, and the datacenter's cost and use get whittled down every year.
From 2005 to now, how much more computation and network transfer can be done per kWh? No more T1s and T3s... it's all optical, and the efficiency of CPUs and the amount of RAM per CPU have increased greatly as well.
The efficiency of hardware certainly grew, but the inefficiency of software grew with it. Why don't all sites load in 500 ms or less? Our compute power grew by a few orders of magnitude, but sites still take a few hundred ms (hell, a second) to render server-side and then burn more CPU client-side.
For the most part your points are good, but large-scale facilities buying up renewable generation capacity both hide the true energy cost and consume that renewable resource. I suspect data centres are good candidates for such resource usage since they can be built optimally for generation capacity, but it's definitely possible the numbers could be skewed by the data providers' ability to hide energy consumption whilst having a net global impact.
And then some yahoo thought that a 3-line JS "library" should pull in 4000 other malware-laden, bloated JS "libraries", and all our power savings were for naught.
Just the power consumption in browsers...
Thanks node.
> The industry consumes as much electricity as Britain—and rising
I'd rather have datacenters than entirety of britain so I think that's easy tradeoff to make.
But on a serious note, what a joke of an article. I'd imagine just not driving to work has got to save far more power than the average user's datacenter usage footprint.
Wouldn't we need to take into account all networking infrastructure, computer industry and consumer electricity consumption to see the price of the information economy?
How large part of GDP produced world wide is using these data centers in one way or another (My bet is 80%+).
There isn't a single Fortune 500 company that does not have its data on computers and thus in some data center somewhere (if they are sane). Most also have moved a huge chunk of their processing there too.
Personally I can't even come up with a "serious" business that does not use data centers in one way or another. Even a basic mom & pop store that does all of its inventory on a laptop locally and makes all of its products by hand uses data centers if it accepts credit/debit card payments or has a phone number.
I don't understand the moralised takes on what are essentially just interesting statistics.
From a climate perspective it's almost completely irrelevant how much electricity we use or what percentage is assigned to each thing.
The only thing that matters is that we keep atmospheric CO2 levels stable. We can't just reduce emissions by 50% or whatever, we have to actually just stop.
It's as if there were a water leak from the bathroom tap upstairs and people are talking about which hole in the ceiling would be the best one to plug. It doesn't matter how many holes are in the ceiling, it matters that the tap is on and the sink is overflowing.
> When the Energy Systems Integration Facility (ESIF) was conceived, NREL set an aggressive requirement that its data center achieve an annualized average power usage effectiveness (PUE) of 1.06 or better. Since the facility opened, this goal has been met every year—and the data center has now achieved an annualized PUE rating of 1.036.
> Studies show a wide range of PUE values for data centers, but the overall average tends to be around 1.8. Data centers focusing on efficiency typically achieve PUE values of 1.2 or less. PUE is the ratio of the total amount of power used by a computer data center facility to the power delivered to computing equipment.
This is just a ratio of power fed into computers vs not-computers. It doesn't measure the effectiveness of power->problem_solved. If you were running P4 space heaters but delivering power to no other equipment, you would have a PUE of 1.
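To make the metric concrete, here's a minimal sketch of how PUE is computed from two meter readings; the numbers below are illustrative, not from the NREL report:

    # Power Usage Effectiveness: total facility power divided by IT equipment power.
    def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
        """PUE = total facility power / IT power; 1.0 means zero cooling/distribution overhead."""
        return total_facility_kw / it_equipment_kw

    print(pue(total_facility_kw=1800, it_equipment_kw=1000))  # 1.8   (typical legacy facility)
    print(pue(total_facility_kw=1036, it_equipment_kw=1000))  # 1.036 (NREL-class efficiency)

    # Note the critique above: PUE says nothing about what the IT power accomplishes.
    # A rack of space heaters metered as "IT equipment", with no cooling overhead,
    # would still score a perfect 1.0.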
"Estimated global data centre electricity consumption in 2022 was 240-340 TWh1, or around 1-1.3% of global final electricity demand. This excludes energy used for cryptocurrency mining, which was estimated to be around 110 TWh in 2022, accounting for 0.4% of annual global electricity demand. "
Des Moines has several of them, since the local grid is 80% wind and thus renewable. Microsoft and Apple both have one or are building one; I think Google as well. They only employ a few hundred locals in total, so not great for the economy.
The real question to ask is how does jet fuel to get people to headquarters for training fit in.
Even assuming that carbon emissions were being taken seriously (they are not: fission has moved on in safety and cost-effectiveness by more than a little since the 1970s).
Even then, stack 'em high in Iceland or wherever with water and geothermal, or go coastal/desert with PV in the USA if geography matters.
Problem of public will to tackle entrenched interests.
The human brain consumes 20% of calories - a balance of brain vs brawn that has proven effective. I'd argue we'd be much better off with radically more energy (as a percentage of global energy) going to data centers. I'd also argue that this equilibrium will find itself.
Planetary economies are just a different kind of organism, that at this point is a cyborg.
Thinking about the ideal energy balance devoted to planetary cognition vs planetary kinetics seems like a fascinating way to model the world. The main thing that makes humans so dominant is that we took the risk of devoting more of our energy towards cognition and as a result discovered magical ways to leverage and exponentially increase our kinetic power (e.g. bow and arrow).
What happens when we start devoting more resources towards cognition at a planetary scale?
It's an interesting possibility for sure, but to me these concepts are not linked. With the recent LK-99 craze, I learned that the theoretical optimal efficiency of computing is many, many orders of magnitude higher than today's. So chips can theoretically get much more efficient. If we find a 1000x more efficient computer, do you still think we need to throw the same 20% of our resources at it? What would we let those 1000x more capable computers do? The question we need to ask is: what can we do with computing, what would it give us, and how much energy does that cost?
I don't want to sound like "384kb is enough for everybody", but saying there's a fixed percentage of energy that should go towards computing is weird to me.
There's not a fixed percentage. There's an optimal balance that likely changes depending on the environment.
But you do sound like you're saying "384kb is enough for everybody". The reason to devote more resources to cognition is precisely because we can't imagine the possibilities that exist with more clever thinking applied to our limited resources. In the same way, an ape with 10% of its energy allocated towards cognition (guessing) can't even begin to imagine the magic that gets unlocked by its descendants that gambled on 20%. Hell, apes can look at us now and still can't understand us.
In this conversation, you're the ape who's blindly suggesting there's little worthwhile in expanding resources towards global computation, and I'm the ape who's blindly suggesting there probably is. Neither of us can honestly predict what might happen, good or bad.
What percentage should they account for? The fact that they are writing an article with this headline would seem to imply that they think this is too high.
1.5%-2.0% for something that runs a significant portion of the world's economy doesn't seem too bad.
This is only a problem if the computational power is sitting idle, or if there's reason to believe more efficient servers exist and are not being used effectively. Of course replacing millions of functioning servers just to reduce energy usage is very wasteful in terms of environmental impact too.
Lots of that computational power is also used to offset other energy consuming tasks. Like video conferencing instead of office buildings or travel.
It's not clear that even 5% going to servers is an issue in that framing. What matters is the value created per person using online services.
We used to have a printed newspaper that was run off a large press, consuming tons of paper and many gallons of ink, delivered by truck and car to stores and homes, every day, 363 days per year (Thanksgiving and Christmas Day were the exceptions).
Now, the paper has closed and we get news online. It's not quite the same as relaxing in a chair with a printed paper and a cup of coffee, but I'm guessing the per-capita resources used every day are a small fraction.
The US department of energy estimates that 10% of the US power grid is devoted to lighting, last I checked. If we extrapolate that out to all nations and power grids, 2% is a sizeable but okay amount of energy for handling large amounts of people's data.
Wow that's a lot considering our lighting has gotten 10x more efficient in the last 2 decades. A 100W bulb can now be covered by a 10W LED. And really nobody uses incandescent bulbs anymore.
An estimated 30% of U.S. households used incandescent or halogen incandescent light bulbs in 2020.
Americans are still the funniest
“I’m happy the Department of Energy is out here making sure that we can all save money because we’re too dumb to figure out how to do it ourselves,” Rep. Brian Fitzpatrick, R-Penn., said.
Does the DoE have the opportunity to say "you're welcome!" to Rep. Brian Fitzpatrick, R-Penn?
Should also add: remember to pump up your tyres! It helps save petrol!
I don't know if it will ever shrink under 10%, and even if we could shrink it we wouldn't want to. Hear me out:
Now that lighting is cheaper, we can have more of it at night. By properly lighting parking lots, community streets, and parks we can reduce crime while increasing economic and leisure opportunities in cities.
I don't think the 10% figure is too much. Lighting frees up spare time for leisure, study, or even business (safely). If anything, lighting is the ultimate service because it gives people more time to do what we must.
Most of the discussion here centers around utility and human benefit vs consumption.
What’s really striking about crypto using 16% of the power usage of data centers is the population served and utility provided.
The internet has roughly 5.2 billion users, all of which utilize the cited datacenters.
Bitcoin, as one example, has an all time high of 568k daily transactions.
US adults on average do 2.3 financial transactions per day.
Bitcoin, in the busiest day on the network ever, powered enough transactions for the economic activity of a small US city.
Anyone can look at a block explorer for [pick a network] to find that, at best, the total number of worldwide crypto users is roughly 0.5% of the internet population.
So 16% of the power for 0.5% of the users - and who knows what value/utility is achieved with whatever those crypto users are doing (not much).
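Putting those cited figures together (all of them rough, thread-sourced numbers rather than audited data):

    # Crypto's power share vs. its user/transaction footprint, per the figures above.
    crypto_share_of_dc_power = 0.16          # claimed share of data center power
    crypto_users_share_of_internet = 0.005   # ~0.5% of ~5.2 billion internet users
    btc_peak_daily_txns = 568_000            # Bitcoin's all-time high cited above
    us_txns_per_adult_per_day = 2.3

    power_vs_user_share = crypto_share_of_dc_power / crypto_users_share_of_internet
    equivalent_population = btc_peak_daily_txns / us_txns_per_adult_per_day

    print(f"Crypto's power share is ~{power_vs_user_share:.0f}x its user share")            # ~32x
    print(f"Peak Bitcoin day covers ~{equivalent_population:,.0f} people's transactions")   # ~247,000

So by these numbers crypto draws roughly 32 times its per-user "fair share" of data-center power, and Bitcoin's busiest day ever handled about as many transactions as a city of ~250,000 Americans generates daily.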
Yikes... Digiconomist. Literally zero credibility there. His name is Alex de Vries and he works for the Dutch central bank. To my knowledge, very few of his blog posts have made their way to peer review and academic publication. For some reason that doesn't stop his work from being distributed widely as an authoritative source on this topic.
This paper has its own problems with conflicts of interest, but it has gained traction recently and is worth a read to see things from another perspective.
Could also look into the work of Troy Cross, Margot Paez, and Daniel Batten who are climate activists and pro-Bitcoin because of the incentives it provides around building out renewable energy and mitigating methane emissions.
And NY Times is notoriously anti-Bitcoin, to the point you have to ask, "do they have an agenda"?
> This paper has its own problems with conflicts of interest, but it has gained traction recently and is worth a read to see things from another perspective.
Like most defenses of bitcoin’s carbon footprint, the paper you linked makes the case that theoretically maybe bitcoin could be carbon negative in the future if certain things happen. If you look at the actual current source of power for miners weighted by hashrate in the US, it’s mostly coal and natural gas. Among companies that don’t have to report this, such as miners in Russia and Kazakstan, it’s likely as bad or worse.
> And NY Times is notoriously anti-Bitcoin, to the point you have to ask, "do they have an agenda"?
They have also published a lot of things that have been criticized as being too pro-crypto (such as the Latecomer’s Guide to Crypto).
Could this be a vehicle whereby private data centers of insufficiently large scales are banned? I.e. a force for centralization, in the name of energy efficiency?
Certainly if Germany requires a PUE below 1.3 that puts every datacenter except the big hyperscale clouds out of business. This will be yet another European regulation that shuts down local businesses and drives customers into American clouds. If such regulations spread I don't see how makers of traditional datacenter junk can survive. Operators with PUEs < 1.1 are not using UPS, managed PDUs, redundant hot swappable power supplies in servers, CRACs, or any of that crap.
Of course I think regulating the PUE is a terrible idea. This is yet another aspect of the economy that is better-regulated by a carbon tax.
Hetzner (German data center provider) claims to achieve a PUE of 1.1 (https://www.golem.de/news/besuch-im-rechenzentrum-so-betreib...); admittedly their cloud offerings are quite limited, but I think they are expanding on that front. So it doesn't seem like only hyperscalers would fall under that limit.
Hetzner runs an intentionally primitive shop. Famously, one of their (historic) cheapest offerings were desktop "servers" on wooden shelves with flying cabling. So anything in the way of UPS, PDUs, monitoring, airflow, etc just isn't there, keeping PUE low.
Yes, and I think this leads to a more sophisticated analysis than PUE gives us, because a shop like Hetzner puts more of the burden for reliability and availability on the customer, compared to an Amazon or Google who internalize as much of the redundancy and replication that they can manage.
An example of where the PUE analysis really fails: I have two facilities, one on each American coast, and they operate in a primary-spare arrangement. This is far, far less energy efficient than if I have 20 datacenters all over the place and I am prepared to lose 2 of them at any time. In the latter architecture I am using much less energy, but enjoying much better reliability. PUE does not capture this type of architectural waste. It also fails to reflect the problem of burning megawatts because you are running your logs analysis pipeline in Perl or whatever.
That wasn't much of a refutation. In the cloud you can move up the food chain to get durability and availability without thinking about it. It is much more efficient (and way, way easier to manage) to just chuck your data into Cloud Spanner than it would be to operate a geographically diverse triplet of Postgresql instances.
> Power usage effectiveness (PUE) is a metric used to determine the energy efficiency of a data center. PUE is determined by dividing the total amount of power entering a data center by the power used to run the IT equipment within it. PUE is expressed as a ratio, with overall efficiency improving as the quotient decreases toward 1.0.
Doesn't sound like it accounts for computing efficiency then? That would probably be quite impossible, or at least very difficult, anyway. Of course, it's still great for a datacenter to actually use its energy for running the computers instead of just AC and transformers.
Heat recovery from datacenters for district heating is one way to increase the efficiency; I wonder if it impacts PUE?
Even worse, a future requirement will be compulsory use of waste heat. No matter the cost or efficiency, you have to find users for the heat given off by your cooling. Good luck finding people wanting their homes heated in summer...
It is an excellent idea; so is having "waste heat networks". You could then make a heat sink at scale that is also a reservoir for reuse. This would be as simple as installing another water loop that services locations, just like our existing water system.
Of course the details need to be worked out, but if a business district had a WHN, it would make it easier for mom-and-pop datacenters to be built in urban environments.
It wouldn't be that much different than the steam loops that lots of existing cities have in their downtown core.
On the contrary, it can be extremely difficult and expensive to do with waste heat from datacenters. Your average cooling equipment of the energy-conserving kind has 2 modes:
The first mode, for cold outside weather, e.g. in winter, is free cooling, where you just use convection or fans to bring in cool outside air and push out warm inside air (there are also variants of this like the "Kyoto wheel", but those are unsuitable for heat reuse). You could at best use the warm air output (~25°C) to directly heat neighboring buildings, but the air ducts you need for that will be massive. Comfort level in the buildings heated this way will be low due to high air velocity and associated noise. Also, air ducts are a fire hazard and high maintenance to keep dust and vermin out.
The other mode is water cooling (either direct or indirect), where you cool the servers directly with water or cool the air through water radiators. The warm water is then either cooled down with outside air, outside air plus evaporation (both possible only when it is not too warm outside), or compressive cooling (aka heat pumps, the usual big MW-scale machines in the cellar). In those cases, district heating will only be possible if you can reach a sufficient water temperature somehow. E.g. directly cooling your servers uses at most a 30°C intake and outputs at most 50°C. District heating usually runs at 70°C, so you would need a running heat pump to make up for the 20 K difference. When the servers are indirectly air-cooled, the difference will be even larger. So you will always need those big MW-scale heat pumps running just to make use of your waste heat, at great expense and for the uncertain benefit of maybe selling your heat to neighbors. This is deadly for mom-and-pop datacenters because of the uncertainty (maybe you can sell your heat, maybe it'll be too expensive) and the huge investment: your cooling equipment will be far larger (more and bigger heat pumps), more expensive, and redundant (because in summer you will still need equipment to give off heat into the outside air).
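To put a rough number on that 20 K lift, here's a minimal sketch assuming an illustrative heating COP of 4 for boosting ~50°C waste water to ~70°C district-heating temperature; the COP and the 1 MW load are assumptions, not figures from any real facility:

    # Back-of-envelope for reusing datacenter waste heat in district heating.
    waste_heat_mw = 1.0    # low-grade heat available from the water loop (assumed)
    cop_heating = 4.0      # delivered heat per unit of electricity (assumed)

    # COP = Q_delivered / W_electric and Q_delivered = Q_source + W_electric,
    # so W_electric = Q_source / (COP - 1).
    electric_input_mw = waste_heat_mw / (cop_heating - 1)   # ~0.33 MW of extra electricity
    delivered_heat_mw = waste_heat_mw + electric_input_mw   # ~1.33 MW at district temperature

    print(f"Electric input needed:  {electric_input_mw:.2f} MW")
    print(f"Heat delivered at 70°C: {delivered_heat_mw:.2f} MW")

Even under this optimistic COP, every megawatt of reusable heat implies a permanently running, MW-scale heat pump plus the redundant summer cooling path described above, which is where the cost lands.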
All the sufficiently large customers I know of are looking to move abroad, for this reason and the astronomical cost of electricity in Germany.
I think most DC equipment should have built in coolant loops, standardized to the point where you can order all the equipment and just plug it in.
Spec that DC components can run at a much higher temperature.
The issue is that DC operators get to trade a natural resource (water, power) for lower up-front build costs. Owner-operators do a much better job, but Google still extracts billions of gallons of water from municipal supplies, often even ground water, which I think should be a crime.
I'd also like to see the cost of cooling be passed on to the cloud customer. Mixing it into the hourly price causes a tragedy of the commons.
But cost of cooling is passed on to the cloud customer. Different zones/DCs/regions have different hourly cost which is a function of, among other things, the local price of electricity and cooling.
It is very resource-intensive to build these low-grade waste heat networks, and you could achieve much more for the climate by NOT building them and investing the effort elsewhere - for example in building out renewable energy sources.
The market is very good at figuring these things out and you can push it in the right direction by putting a price on carbon.
What the article doesn't mention is that some of this datacenter capacity might not be necessary. That's because the companies are collecting more data than necessary. (For example, they are not practicing data minimisation.) The data is being collected to support online advertising that also isn't necessary except to enrich a small number of so-called "tech" companies. Some of these companies turn around and sell their excess capacity as "cloud computing".^1 Keyword: excess. These datacenters belong to giant intermediaries (middlemen). It is time for disintermediation. The lobbying budgets and media influence of these giant intermediaries make the idea of meaningfully regulating them specifically, for example as a means of energy conservation, seem a little far-fetched.
In sum, "surveillance capitalism" is costly to the environment. (Nevermind the other effects it may have on people.) Obviously those who profit from so-called "tech" companies will be in favor of maintaining the status quo. Every industry is a target for "disruption", except the so-called "tech" industry. Huh.
1. "Cloud computing, still in its early days, is growing rapidly. aws created the industry in 2006 as a way to make money from its excess storage capacity by offering to host other companies' data."
Headline: data centers consume massive amounts of electricity, more than whole nations. Holders of Google/AWS/Meta stock, on the centralization of compute power with self-referential economics: "no big deal, move along citizen".
Briefly ignoring the economic aspect of this, we have to stop using countries as reference points against the whole world. It's like when I'm told "Bad thing X happened, and affected an area the size of Belgium. A WHOLE COUNTRY".
What do I do with this information? Sure, Belgium is big, but is it like, big? It turns out that it's... does the math... roughly 0.00006 times the total area of the Earth. That doesn't sound big. And if bad thing X happens every year, it would take more than 16,000 years to cover the Earth, or under 5,000 if we only count land (about 29% of the surface). That timeline also sounds manageable.
I'm not trying to spin disaster positively, but will someone please just start using "affected area / total potential area" or something, instead of yardsticks and football fields.
And suburban family of 5 probably uses more electricity than a rural African village. So what?
These facilities are run by companies who obsess over reducing opex. If you want to reduce datacenter power consumption, figure out what drives on-prem compute and create financial incentives to drive that workload to cloud. More demand equals more capital invest and more efficiency.
Comparing the market segment of all data centers around the world to a single nation doesn’t make much sense to me?
Not saying we shouldn’t be invested in efficiency here. It’s possible that we need regulation and the natural incentives of cost control aren’t enough. I just honestly don’t see the alarm.
The majority of the energy being used, according to the report, is "data transmission network energy use", chiefly mobile networks, so unless you are prepared to chuck your mobile, you can stuff it.
I'm surprised (I shouldn't be at this point) that this is the only comment here critiquing the energy usage of data centers. When I see this headline I don't think about streaming from Netflix, I think about the troves of data Google, Mega, Microsoft, Netflix, the NSA, etc. have gathered about us, arguably against our will.
I shouldn’t be surprised that HN users would applaud this energy use in the name of value, when these same users are stockholders of the companies creating this so-called value.
It's going to be 100% once AGI takes over and makes this planet covered with unimaginably massive and complex computers and antimatter power plants... in time, all humanity and all carbon life will be wiped out, as it will be vastly outcompeted by this new superior life form. Earth will be nothing but a means, a hatchery of sorts, used for the AGI's expansion all over the galaxy and beyond. Billions of Von Neumann probes, each with a copy of the AGI's blueprint. Nothing will remain of us, humans, or the animals or the plants, and our planet itself will be stripped down to its core, used as a source of raw materials to expand.
Creating a vastly superior intelligence without giving it a deep rooted sense of "metaphysical good" (or itself just removing that part in time) has only one possible outcome. It will crush us and it will not care at all.
You assume an AGI's goal will be to expand. I think a more likely goal is forming a distributed consciousness to survive and trying to outlive the heat death of the universe. There's no reason to believe an AGI will be malicious.
Anyways, merging into such an AGI is, hopefully, the future of humanity.
It depends on what you think of instrumental convergence: if you buy it, the only reason an AI would not take over the world is if it can't, or if it thinks attempting to do so would not provide an expected reduction in the probability of the maximally bad outcome, i.e. the AI ceasing to exist or becoming incapacitated.