We don't know exactly how much of global electricity use goes to AI/ML, but it is counted as part of global datacenter energy use.
Estimates range between 1% and 5% worldwide, depending on how AI/ML is defined and on the definition of world energy use (for example, global electricity versus global energy).
[2] "Energy. Estimated global data centre electricity consumption in 2022 was 240-340 TWh, or around 1-1.3% of global final electricity demand. This excludes energy used for cryptocurrency mining, which was estimated to be around 110 TWh in 2022, accounting for 0.4% of annual global electricity demand."
You have a citation for 2022, which is right before AI use exploded and NVIDIA started selling AI machines like hotcakes. I expect energy use to be much larger this year.
I'm sure you are right, but the numbers for 2023 will be published next year. We can also estimate from [2] and extrapolate from the recent shift of ~30% (?) of bitcoin mining from China to the US [1]. ARPA-E funded a tabulation of all US energy uses in Sankey diagrams, which included datacenter use [2] but also does not split out NVIDIA's, Bitcoin mining's, or AI's share.
FWIW, a recent report in The Economist [0] puts data center usage at 1-2% of global electricity consumption (which is about 1/3 of energy, IIRC) and AI at 8% of that in 2023, projected to grow to 15-20% by 2028. Given how careful the hyperscalers are about using renewables and offsets (edit: compared to other big energy users, that is), I refuse to feel bad about it...
[0] https://www.economist.com/technology-quarterly/2024/01/29/da...
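Converting the report's percentages into absolute terms makes them easier to compare; here's a rough sketch assuming global electricity generation of about 30,000 TWh/yr (that assumption is mine, the shares are from the report as summarized above):

```python
# Rough conversion of the Economist's percentages into TWh.
# The ~30,000 TWh/yr global generation figure is an approximation.

world_twh = 30_000
dc_share = (0.01, 0.02)   # data centers: 1-2% of global electricity

for ai_share, year in [(0.08, 2023), (0.175, 2028)]:  # AI's share of DC load
    lo, hi = (world_twh * s * ai_share for s in dc_share)
    print(f"{year}: AI at roughly {lo:.0f}-{hi:.0f} TWh")
# 2023: ~24-48 TWh; 2028 (midpoint of 15-20%): ~53-105 TWh
```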
With today’s AI, we’re taking a big hammer to the problem using unoptimized but vastly flexible machines like GPUs. Once the code settles down, expect ASICs to run much of the show and lower energy consumption.
This is really unlikely, because the operations are pretty basic math and GPUs already do them about as efficiently as possible. Surely those GPUs will be improved further, but a custom ASIC probably won't help much with training. NVIDIA has been producing GPUs that are specifically targeted at deep learning, and they control both the software library and the hardware; I don't quite see what's stopping them from optimizing whatever they want to optimize.
For inference, there is the option to produce ASICs with the weights hard-coded into the chip. They'll be super fast and power efficient, with the huge downside that they aren't reprogrammable.
The chips won't be that much more efficient on a like-for-like basis, but the models will be much smaller, so the chips can be smaller or run much larger batches.
Being that us meatbags operate on a pretty limited power budget I'm guessing there is still a lot of algorithmic optimization to be done, and those algorithms will need hardware to run on.
The H100 at this point includes just the bare minimum hardware to be considered a GPU. As long as AI is primarily matrix multiplication, GPUs are going to be near ideal.
There are already some custom chips out there such as groq's[1]. And isn't Google using a lot of TPUs for this sort of thing now?
Microsoft is also said to be working on custom AI chips.
> expect ASICs to run much of the show and lower energy consumption.
further increasing the efficiency of the process will only increase total consumption even further.
the entirety of the AI boom has been a perfect demonstration of Jevons Paradox in action. Previously these intractable problems simply went unsolved. Now it is possible to solve them with a plausible degree of efficiency, and that increases the demand for solved problems hugely because the price per solved problem has declined. If you push energy even lower it will only fuel even further demand for more solved problems because wow, it’s cheap now!
... and then the energy saved by using more efficient hardware and software will be used to power more hardware to increase total AI power. There is no "enough" AI yet. There may never be.
This is day one. This is the least efficiency it will ever be. All the software optimisations are still ahead of us and even hardware will get more optimised.
Computers slower than my pencil arithmetic used to run on valves and now here I am typing for half a day and blazing on the internet in my cordless, battery powered MacBook Air.
Yes, of course, but not per unit of output. It will become more efficient and more widespread. I don't really see a long-term problem in that. Maybe a short-term problem, yes.
Assuming it doesn't crash because people realize it's not a good tool (which I'm hoping for), this still means its overall energy consumption will rise. That's Jevons paradox, and this scenario fits it quite well.
You're living under a rock if you think this is a fad. You're seeing AI used all around you constantly: autocorrect on your phone keyboard, suggestions in Google News, YouTube, etc., translation, speech-to-text captioning, voice navigation, and so on. These are all useful applications in the consumer space, not to mention all of the industrial and medical applications such as helping diagnose health issues, fraud/anomaly detection, etc. I would say there is a bit of throwing things at the wall to see what sticks, where everyone tries to apply it everywhere and it's not always helpful. I consider that more of an evolutionary necessity. Hopefully we will learn where it works best and hone our usage in those applications.
LLMs are being incorporated into all those tools they mentioned. The only reason you'll see LLMs disappear is if they get replaced with something more powerful; we're not going back to the world before LLMs.
There is a risk that if AI works well enough, there will always be profit in using even more AI (in particular if you can task AI itself to use AI recursively, meaning that AI usage won’t be limited by available human time), in order to have an edge over the competition. In that case, AI usage could gobble up all available energy regardless of AI efficiency.
I disagree. Software and hardware optimisations have not happened since the Rocky Mountain Institute did this data centre efficiency analysis [1].
If I had the time I could add numbers from Google, Microsoft and other hyperscaler analyses on software efficiencies in their datacenter papers.
I'm sure Alan Kay has some insights on software efficiencies [2].
A small part of the hardware inefficiency is the energy use per transistor of computer chips, which has gone up since the 28nm node. Our datacenters primarily run smaller-node chips (16nm, 7nm, 5nm) and have therefore gone up in energy use.
AI has been pushed into products for a few years now, and we've had accelerated hardware for it for at least six years. It's hardly day one. Surely things will continue to improve, but it's perhaps more accurate to say it's still early in the evolution cycle for AI technology.
Will efficiencies ever catch up with our voracious appetite for "better"? Current top-of-the-line models seem to have gotten bigger much faster than they've gotten more efficient, and I don't see that trend stopping anytime soon.
Either the value proposition of AI will incentivize investment that successfully improves AI efficiency as operating it becomes more expensive, or AI development will plateau around whatever equilibrium point is defined by the best attainable efficiency.
From looking at a variety of sources, my best, rough understanding is that
1. Data centers & Bitcoin are using about the same amount of electricity, about 2% of the US energy production each
2. AI is a subset of data center usage, but expect this portion to drastically rise and drive up the energy usage
3. Data centers & AI are continually becoming more efficient, while Bitcoin becomes less so by design; L2 crypto is an efficiency play for crypto more generally
Are these Bitcoin numbers verified at all? That is, two percent of some common electrical grid? Or remote hydro-power sites or something? Seems like a really large number, and there are lots of reasons to exaggerate/misrepresent it.
The current bitcoin total hash rate is obviously public information (currently about 6E20 hash/second). I was also able to find a report on average network-wide hashing efficiency, which is obviously self-reported, but is at least in line with logic (just slightly trails the current highest efficiency ASIC miners). That gives around 13 GW of total instantaneous power consumption. The US produced around 4000 TWh last year, which is an average of about 460 GW of power. That would make global bitcoin power consumption equal to about 2.8% of US energy production.
It looks like estimates for global electricity generation are around 30,000 TWh last year, so that would mean bitcoin is around a third of one percent of total global power usage.
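For anyone who wants to check the arithmetic, here it is as a short Python sketch; every input is the rough figure from the comments above (hashrate, assumed fleet efficiency, generation totals), not authoritative data:

```python
# Back-of-envelope: bitcoin network power vs. US and global generation.
# All inputs are the rough figures quoted above, not precise data.

hashrate_hs = 6e20    # network hashrate, hashes/second (~600 EH/s)
j_per_th = 22         # assumed fleet-average efficiency, joules per terahash

power_gw = hashrate_hs / 1e12 * j_per_th / 1e9
print(f"network power: ~{power_gw:.0f} GW")                      # ~13 GW

us_twh = 4000                        # US annual electricity generation
us_avg_gw = us_twh * 1e3 / 8760      # TWh/yr -> average GW, ~460 GW
print(f"share of US generation: {power_gw / us_avg_gw:.1%}")     # ~2.9%

world_twh = 30_000                   # global generation, ~30,000 TWh/yr
btc_twh = power_gw * 8760 / 1e3      # GW -> TWh/yr, ~115 TWh
print(f"share of global generation: {btc_twh / world_twh:.1%}")  # ~0.4%
```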
The EIA was recently ordered to start investigating/receiving reports on the actual numbers after a preliminary report indicated it was consuming between 0.6%-2.3% of total US consumption:
I can't speak to the 2% figure, but as for the power sources, Bitcoin is somewhat unique, as it is ostensibly the most price-sensitive, most location-agnostic, and most interruptible instance of large-scale power consumption. Mining is done strictly for profit, so it is only performed at any scale where it is profitable.
Ironically, this nature can actually fortify the electric grid in some areas, such as Texas. Bitcoin mining businesses have discovered that the unreliability of the existing grid can be mitigated through vertical integration - they create renewable power generation facilities, and when the cost of electricity on the grid is low (because demand is low and supply is high), they use their own renewably-sourced energy for next to nothing.
When grid conditions deteriorate in Texas' deregulated energy market, wholesale electricity prices surge, as those are times when demand approaches or exceeds supply.
When that happens, the electricity these vertically integrated companies generate is worth more sold to the grid than used to mine bitcoin. So the miners all shut off (within milliseconds, as this is all automated) and the power that location generates starts getting sold to the grid instead, which increases supply, helps lower electricity prices, and keeps the lights on for everyday people.
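A minimal sketch of that switching logic, with entirely made-up numbers for the mining revenue threshold (real operations automate this against live wholesale price feeds):

```python
# Hypothetical demand-response logic for a vertically integrated miner.
# The revenue figure is an illustration, not real data.

MINING_REVENUE_PER_MWH = 80.0  # assumed $/MWh earned by mining

def dispatch(grid_price_per_mwh: float) -> str:
    """Decide what to do with each MWh the facility generates."""
    if grid_price_per_mwh > MINING_REVENUE_PER_MWH:
        return "sell to grid"  # scarcity pricing: power is worth more on the grid
    return "mine bitcoin"      # cheap power: mining is the better use

for price in (25.0, 80.0, 4500.0):  # $/MWh; Texas spikes can exceed $4,000
    print(f"${price:>8.2f}/MWh -> {dispatch(price)}")
```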
It's not a magic bullet that fixes the entire grid, but there is a growing body of evidence saying that it helps grid reliability in Texas more than it hurts, and these vertical integrations are overwhelmingly done with renewable energy sources.
I'm sure the location-agnostic aspect of Bitcoin mining does lend itself to deployment in places where power is plentiful but there is little local demand, and where the cost of transporting that power far away is prohibitive, though I don't have a specific example of that.
I've favorited your comment. You spell things out very clearly and accurately.
> I'm sure the location-agnostic aspect of Bitcoin mining does lend itself to deployment in places where power is plentiful but there is little local demand, and where the cost of transporting that power far away is prohibitive, though I don't have a specific example of that.
https://www.coinmint.one/ is a specific example of that. Power is delivered directly from the Moses-Saunders dam to a shuttered aluminum smelting plant. Sending the power anywhere else is cost prohibitive due to the remote location and low local power needs. Connecting it to the grid would overwhelm what is there, so new construction would be needed.
This Cambridge group is the best source I know of that estimates bitcoin mining energy usage and location. There are wide error bars because we don't know exactly what mining hardware is being used, even though we know the approximate hashrate. https://ccaf.io/cbnsi/cbeci
If their numbers and methodology are correct, bitcoin mining energy use in the US is somewhere between 0.8-3.8% of US electric generation. However, we don't know how much of this mining is actually connected to the grid. There are some off grid operations, like waste methane harvesters. The US government is starting to collect this data, so we might have more precise public information soonish.
We know the rate at which bitcoin are mined, and we know exactly how much a bitcoin is worth, so it is straightforward to calculate how much money the miners are making -- namely, 17 billion USD per year at the current price of bitcoin. Since there are no barriers to entry to becoming a miner, we can expect the effort spent on mining to match the rewards from mining almost exactly (since bitcoin miners are economically rational). We don't know what fraction of that $17 billion of effort (spending) consists of electricity, but basically all spending damages the environment. If some miner, for example, decides to pay researchers to come up with a more efficient mining algorithm, well, it takes a lot of carbon emissions (and other environmental harms) to raise, educate, feed and otherwise maintain a researcher. (It takes a lot of carbon emissions to raise, educate and maintain any person, even Greta Thunberg.)
If the price of bitcoin were to double, the (collective) rewards to mining double, too, and so does the environmental damage. In about 4 years, the reward for mining a block is scheduled to halve, and the environmental damage will at that time be about half of what it is now (a few months after the previous halving) provided the price of bitcoin does not change. All the miners know exactly when the reward is going to halve, so as the halving-date approaches, about 4 years from now, miners will invest less and less in mining hardware and other capital improvements, which "smooths out" the damage so that it decreases somewhat smoothly between now and then instead of suddenly halving on the day the reward halves.
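To make the arithmetic concrete, here's the revenue calculation as a sketch; the price is an assumption (roughly $52k/BTC reproduces the $17B figure above), and the real input is whatever bitcoin trades at today:

```python
# Annualized bitcoin miner revenue from block subsidies, before and
# after a halving. The price is an assumed input, not a quote.

BLOCKS_PER_DAY = 24 * 60 / 10   # one block every ~10 minutes -> 144
btc_price_usd = 52_000          # assumption

for subsidy_btc in (6.25, 3.125):  # pre- and post-halving block subsidy
    annual_usd = BLOCKS_PER_DAY * subsidy_btc * btc_price_usd * 365
    print(f"subsidy {subsidy_btc} BTC -> ${annual_usd / 1e9:.1f}B/year")
# 6.25 BTC  -> ~$17.1B/year (the figure above, at this assumed price)
# 3.125 BTC -> ~$8.5B/year
```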
I'm still amazed at how Bitcoin can damage the environment just by existing, yet AI, air conditioning, aluminum smelting, and cat videos have subtle effects that are hard to determine.
- increasingly large cars far bigger than required from a utilitarian perspective
- luxury good production
- theme parks/fireworks displays
- cruise ships
Etc.
Bitcoin/crypto opposition is 95% pushed by embedded financial interests that will use any lever to protect their control over money and the power it gives them.
Not sure where bitcoin got dragged into the conversation. Also not sure how BTC becomes less efficient by design. The hardware BTC is mined on has improved exponentially over time, moving from general-purpose CPUs to GPUs to custom ASICs.
This is not technically correct, only practically correct. Hashing does not irreversibly become more difficult as a function of time, it becomes more or less difficult as a function of total network hashrate.
Now, there has been a strong positive correlation between time and total network hashrate, so for all practical purposes difficulty has increased over time (and likely will continue to).
That said, if half of all miners went offline overnight, block times would approximately double, and the next difficulty rebalance would go dramatically lower in an effort to restore 10-minute block times.
The point of mining is that you don't want new blocks too often because it would create too many forks in your chain. The "difficulty" in mining is by design, if we make better hardware that can generate hashes more quickly then we need to make the valid hashes more rare to keep the property that new blocks are reasonably spaced out.
But what's the relation between that efficiency and the number of hashes computed? My understanding was that difficulty is increasing with efficiency so you end up having to try harder to get a block mined successfully.
The difficulty gets periodically adjusted to generate a block roughly every 10 minutes.
It's not really about efficiency so much as about computing power and available miners.
If it's not economically beneficial to mine (costs more than reward) miners will stop mining leading to the difficulty decreasing which in turns makes it profitable to mine again.
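For reference, the actual rule is a periodic retarget: every 2016 blocks, difficulty is scaled by how far the observed block interval drifted from 10 minutes, clamped to a 4x change per adjustment. A simplified sketch:

```python
# Simplified Bitcoin difficulty retarget. The real code adjusts a 256-bit
# target rather than a float "difficulty", but the ratio logic is the same.

RETARGET_BLOCKS = 2016
TARGET_SECONDS = RETARGET_BLOCKS * 10 * 60  # two weeks at 10-minute blocks

def retarget(difficulty: float, actual_seconds: float) -> float:
    ratio = TARGET_SECONDS / actual_seconds
    ratio = max(0.25, min(4.0, ratio))  # the protocol clamps to a 4x change
    return difficulty * ratio

# If half the hashrate vanished, 2016 blocks take twice as long,
# and the next retarget halves the difficulty:
print(retarget(100.0, 2 * TARGET_SECONDS))  # -> 50.0
```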
The entire innovation of Proof of Work (that made bitcoin a success where predecessors failed) is that it is costly in real world energy to corrupt the network.
e.g. you can't bullshit your way into finding this number, 000000000000000000024394a1f3cb1a0c16e601a2bd5910635bb2468d2ba316, without expending a huge amount of energy, while you can verify very quickly how many guesses it statistically took to find it.
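The asymmetry is easy to demonstrate: producing a hash with k leading zero bits takes about 2^k guesses, while checking one takes a single hash and a comparison. A toy sketch (Bitcoin's real header format and target encoding differ):

```python
# Toy proof-of-work: finding a qualifying hash is expensive;
# verifying one is a single hash plus a comparison.
import hashlib

def pow_hash(data: bytes, nonce: int) -> int:
    # Bitcoin double-SHA256s an 80-byte block header; this is a stand-in.
    inner = hashlib.sha256(data + nonce.to_bytes(8, "big")).digest()
    return int.from_bytes(hashlib.sha256(inner).digest(), "big")

def mine(data: bytes, zero_bits: int) -> int:
    target = 1 << (256 - zero_bits)          # more zero bits = smaller target
    nonce = 0
    while pow_hash(data, nonce) >= target:   # expected ~2**zero_bits guesses
        nonce += 1
    return nonce

nonce = mine(b"header", zero_bits=16)          # ~65,000 guesses on average
assert pow_hash(b"header", nonce) < 1 << 240   # verified in microseconds
print("nonce found after roughly 2^16 guesses:", nonce)
```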
An attacker can't just use words to spin a manipulative narrative, or cut off the head of an organization with a targeted attack. They actually have to commit massive numbers of joules and bit flips. And if an attacker actually acquires that much control over mining power, suddenly they realize they're too heavily invested in the network to want to harm it.
In the age of increasing generative AI, proof that you have some tie to real world cost is an increasingly valuable trait.
At numbers this small on the global scale, is this something that merits mass concern?
So there is more energy in a tanker truck full of gasoline than it takes to train an AI model, and a person uses more energy going to the grocery store than they will reasonably use generating things with AI all week.
A common HN bugbear is all the energy crypto wastes so it seems reasonable that AI energy usage would be of interest as well.
This is the part where people say Crypto is only for fraud, scams and just making money so it is different. I think AI will open all new avenues of fraud that will make the ransomware mess look pleasant in comparison. AI may destroy the very concepts of truth and trust even for those looking for it, and it ALSO will waste lots of energy doing it.
> AI may destroy the very concepts of truth and trust
Oh, hell nah!
Dark visions: AI's mere application will be solving problems AI created in the first place - and, incidentally, the erosion of trust brought up a use case for crypto, at last.
Tbf, GPT-3 was made before the Chinchilla paper and was trained on only 0.3T tokens, which is basically nothing for its size or for any current model (Mistral 7B, a 25x smaller model, was trained on 8T). Doing it properly would require much more power.
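For scale, the Chinchilla rule of thumb is roughly 20 training tokens per parameter, which makes the mismatch easy to quantify (token counts are the ones cited above; the 20x ratio is an approximation of the paper's result):

```python
# Chinchilla rule of thumb: compute-optimal training uses ~20 tokens
# per parameter. Token counts below are the ones cited in the comment.

TOKENS_PER_PARAM = 20

for name, params, trained in [("GPT-3", 175e9, 0.3e12),
                              ("Mistral 7B", 7e9, 8e12)]:
    optimal = TOKENS_PER_PARAM * params
    print(f"{name}: {trained / 1e12:.2f}T tokens trained, "
          f"~{optimal / 1e12:.2f}T optimal ({trained / optimal:.1f}x)")
# GPT-3: 0.30T trained vs ~3.50T optimal (0.1x, heavily under-trained)
# Mistral 7B: 8.00T vs ~0.14T (57x over-trained, which is deliberate:
# it trades extra training compute for cheaper inference)
```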
They don't even consider the energy consumed to produce the chips AI/ML runs on. The biggest chip producer, TSMC, used 22,000 GWh last year, and that doesn't include other foundries, memory, etc. AI is now more than 15% of TSMC's revenue, and AI/GPU chips are often on advanced nodes that consume orders of magnitude more energy to produce than older nodes, so one can assume the energy-consumption share is much more than 15% of those 22,000 GWh. Let's say 25%: that would be 5,500 GWh, which is roughly 0.15% of US electricity consumption. So 1-2% during operation is totally fathomable.
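Spelling that estimate out (the 25% share is the commenter's guess, and the US electricity total is approximate):

```python
# Back-of-envelope: AI's assumed share of TSMC's electricity use
# vs. US electricity consumption. The 25% share is a guess, not data.

tsmc_gwh = 22_000                    # TSMC electricity use last year, GWh
ai_share = 0.25                      # assumed AI-related share
ai_twh = tsmc_gwh * ai_share / 1e3   # 5.5 TWh

us_electricity_twh = 4_000           # approximate annual US consumption
print(f"{ai_twh:.1f} TWh, {ai_twh / us_electricity_twh:.2%} of US electricity")
# -> 5.5 TWh, ~0.14% of US electricity
```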
Does TSMC produce the pure silicon themselves from silica, or is the refining done by a supplier? I suspect the latter, but idk. It's a very energy-intensive process, and a huge source of CO2 emissions as well, because you use coal to provide the carbon that reduces the silica into silicon + CO2.
700W per customer dedicated to Netflix seems incredibly high - that's basically saying a full 1U is dedicated per customer (with no TV power consumption), or half a 1U factoring in a 52” TV (admittedly I'm omitting network power costs, transcoding, etc...)
Anyway, given previous examples of the Netflix architecture, I'd expect the cost of streaming to be mostly TLS session management.
Realistically the hidden costs of AI are huge. Microchips are the very peak end of manufacturing complexity and require an entire global supply chain in order to produce. While there are other uses for microchips, automation is the ultimate use, and AI is the current pursuit. I would argue you could consider the entire global tech economy as being part of the cost of AI. The real costs are huge.
So, let me counter with, what is the cost of efficiency?
For example, the 'entire tech economy' produces more efficient engines, devices that save on fertilizer when planting crops, machines that sequence genes, etc. Attempting to hand-wave AI as a cost of all this, unless you can explain otherwise, seems like saying "Climbing down from the trees was a mistake."
I was just thinking a few weeks ago that there is a risk LLMs will bring on a wave of big computer builds similar to crypto mining. For storing all the data, and for calculating all the queries from their users.
I'm not familiar with what is required but I do believe that the larger the dataset the better it is, and that specialized GPUs have already been released.
I agree. This reminds me of ~20 years ago, when bandwidth usage was growing exponentially and everyone thought people in 2024 would be consuming 20TB/day. In reality, bandwidth growth slowed down massively, to the point where it now grows at something like 10-50%/yr, and everyone went under because there was far too much capacity.
There is a finite number of fabs that can produce GPUs, which sets an upper limit on production. I doubt we are going to see _that_ many more fabs being built.
I also think everyone is hoarding as many GPUs as possible, which makes sense for Meta/OpenAI/Google but probably much less sense for other players with time, never mind the random corporates that are just jumping on the bandwagon. I really think we'll see a small number of players (more than just OpenAI, Meta and Google) produce most of the foundational models, and then everyone will fine-tune and run inference off them (which takes many orders of magnitude less compute). That's not to say there won't be huge demand for GPUs, but I don't think every Fortune 500 is going to need a 10k-GPU cluster.
The key differentiator IMO with crypto is crypto by its nature has exponentially more computer resource requirement built into (most) of it. I don't know if AI does after a certain point, and I think there are enormous efficiency gains happening which simply doesn't happen in crypto (which you point out).
Desktop computer use declined in the same period in most places. The bandwidth expansion for consumers has been on mobile, and secondly specifically for movie viewing. Also consider a long, deliberate, and concerted effort by ISPs across the USA to limit bandwidth for commercial reasons, e.g. lying about max download speeds repeatedly and on the record while rate limiting.
While desktop usage has declined, smart TVs, "connected" games consoles etc have exploded which take a similar place. And I assume most smartphone bandwidth consumption is on wifi at home, not cellular, though don't know if you are making that point.
> The key differentiator IMO with crypto is crypto by its nature has exponentially more computer resource requirement built into (most) of it. I don't know if AI does after a certain point, and I think there are enormous efficiency gains happening which simply doesn't happen in crypto (which you point out).
This is wrong. Crypto does not need a massive amount of GPUs in a data center to function, unlike generative AI and deep learning, where any serious model used for inference needs tons of data centers and GPUs to serve millions.
AI has always required hundreds of millions of dollars a year in inference costs alone as the data scales, whilst also requiring lots of energy to produce a working model, including the risk of overfitting and garbage results.
Inference and fine-tuning are only going to get more efficient. Training likewise.
Compare this to crypto where it increases with time.
I'm not too worried about this. Maybe we'll have accelerators for inference, but only if they can be cheap and efficient alternatives to GPUs (again, for inference).
It's odd to characterize that as a "risk". It seems to follow that increases in demand for computing use cases would in turn generate more demand for the hardware required to implement them.
Consumption of electricity has negligible environmental impact. The relevant discussion here would pertain to considerations around generating power, regardless of what downstream use cases the power is being applied to.
Pretty much all activity in modern society is going to consume electricity, and overall demand is not going to be decreasing in the first place, so it seems a bit silly to look at this from the demand side: we're always going to need more and more power, and the focus is properly on how to generate power in a clean and scalable way regardless of what it's being used for.
Consumer market impact is an interesting topic, though: if there are massive spikes in demand, that could drive prices up for other users of electricity. It will be important to minimize artificial impediments to the expansion of the supply side to mitigate that risk.
Yet AI still hasn't found a sustainable alternative to its inferencing, training, and fine-tuning processes to combat its years of mass consumption of electricity and water.
At least cryptocurrencies have managed to address this criticism with alternatives to proof-of-work; even Ethereum made it possible for a wasteful PoW blockchain to migrate to proof-of-stake, an energy-efficient alternative consensus mechanism [0][1], and reduced its consumption by 99%.
The field of deep learning has produced few efficient alternatives with any measurable impact and has always needed tons of GPUs in data centers, and the demand is made even worse by generative AI, all whilst the public continues to be green-washed with faux green proposals.
There has been little progress on these so-called practical alternatives to the waste AI has produced, or on any reduction in energy usage. As data and models scale, the energy consumption and costs will only get worse, even by 2027.
As I read this article I felt myself being extremely critical of it. Part of this may be that I'm enamored with AI and so I feel defensive about it, but I can't help the feeling that this article needed more work. For example:
> Moreover, the organizations best placed to produce a bill — companies like Meta, Microsoft, and OpenAI — simply aren’t sharing the relevant information.
To me this shows both unfamiliarity with large corporate structures and unfamiliarity with AI research. The former because, if you want an exact accounting at a company where there are several teams running dozens or hundreds of models, it becomes someone's job essentially to compile this information because it takes a lot of work. So, you are surprised that a corporation doesn't outlay $200k/year or more getting these figures to decorate your article with? The corporations do know, on a month-to-month basis, what they're spending on this stuff - since they settle the invoices. But doing the work to get these figures into a simple, digestible form is a lot of effort, and I think that quite frankly, that effort should fall on the journalist, since it's the journalist who gets the benefit from having those numbers...
As to unfamiliarity with AI research, many papers I have been reading lately are very interested in measuring and minimizing the compute cost of models, and they often compare different methods and are often extremely precise about their training process and equipment. I feel like this article wants to sell the story that these faceless corpos don't care about energy consumption, but the researchers definitely do. Granted certain specific products / models such as ChatGPT 4 do not disclose exactly their process, but I feel like it would not be difficult to come up with a good estimate using similar models (mostly documented right out in the open in scientific papers).
Tracking the energy consumption of AI is an important and emerging issue, but this article feels too partisan to be useful.
At the very least, the article has brought to the forefront the awareness that, in addition to all the other effects of the hype, the research and use of AI on a large scale also consumes an extreme amount of energy.
Since we should all be concerned about protecting our resources and thus the planet - and that means reducing unnecessary energy consumption - this figure should be clearly included in the general costs of AI. And by that I don't mean in the electricity bills of corporations. Of course, it is in the interest of researchers to increase the efficiency of modelling and application, but for other reasons. As long as the big, well-known companies are in charge, the target figure towards which everything is optimised and maximised is commercial profit.
To cut a long story short, the only way to gain some ground here is through independent regulation, for example to get more transparency into this issue.
> Since we should all be concerned about protecting our resources and thus the planet
To my knowledge, most of the researchers and users of AI are using their own resources, not "ours", so I'm not sure there's much to be worried about.
> And by that I don't mean in the electricity bills of corporations.
What else would it be accounted for in? The users of electricity are purchasing it from the producers, who in turn purchase equipment and services from vendors, etc. It's all a chain of specific transactions among specific parties all the way down. There's no point at which some fuzzy collection of arbitrarily aggregated people is involved as one of the parties.
> As long as the big, well-known companies are in charge, the target figure towards which everything is optimised and maximised is commercial profit.
And they make that profit by delivering value to their customers -- what's the problem there?
> To cut a long story short, the only way to gain some ground here is through independent regulation, for example to get more transparency into this issue.
What is "independent regulation"? Who is conducting it, what makes them "independent" and what is their incentive to be involved in the first place?
To be sure, we only care about electricity usage because of externalities, most importantly CO2 emissions. Many of the big players have gone to great lengths to build out renewables capacity to power their data centers, which tends not to be considered in these articles.
I work in the sustainability industry so this is obviously important to me, but I agree the article needed more work. Comparing the energy required to train GPT3 to watching Netflix is nonsensical. Training is a global task. A better comparison would be the cost to train GPT vs that to upload all Netflix movies.
Plus, there are weird things with the figures. It lists the 0.012 kWh figure required to charge a smartphone, but the source it cites for that number explicitly says that the correct number is 0.022 kWh. Was this article hallucinated by AI?
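For what the discrepancy is worth: taking the commonly cited ~1,287 MWh estimate for GPT-3's training run (Patterson et al. 2021) as a reference point, the choice of per-charge figure shifts the headline comparison by nearly 2x:

```python
# How much the kWh-per-charge figure matters for the article's comparison.
# The ~1,287 MWh GPT-3 training estimate is from Patterson et al. (2021);
# the two per-charge figures are the ones discussed above.

gpt3_training_kwh = 1_287_000

for kwh_per_charge in (0.012, 0.022):
    charges = gpt3_training_kwh / kwh_per_charge
    print(f"{kwh_per_charge} kWh/charge -> {charges / 1e6:.0f}M phone charges")
# 0.012 -> ~107M charges; 0.022 -> ~59M charges
```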
Overall though, frankly I found it comforting that AI would “only” use 0.5% of global electricity, as I expected much higher. Google and several other peers are committed to being 100% renewable anyway.
Still, credit to the author for starting discussion about this.
Yes, ultimately I guess it does not hurt anything to have this article in circulation even if I have some critiques; better to discuss the elephant in the room.
> As to unfamiliarity with AI research, many papers I have been reading lately are very interested in measuring and minimizing the compute cost of models, and they often compare different methods and are often extremely precise about their training process and equipment
It's worth knowing this is a very recent phenomenon. Before ~2020 it was extremely rare to see such a paper; most of the focus in publishing was on maximizing metrics like accuracy. The biggest force pushing people toward smaller models was researchers at smaller institutions who lacked access to big GPU clusters.
> So, you are surprised that a corporation doesn't outlay $200k/year or more getting these figures to decorate your article with?
I think it's incorrect to suggest these figures would only serve to make for a better "The Verge" article. One of the externalized costs here is climate change.
Actually, my point is that specifically isolating the costs that come from AI is a lot of work at a company of this scale. These companies are absolutely accountable for their overall energy usage and externalities!
Yeah. Meta has internally widely shared the expected size of a new very large cluster they are building in 2024, and from the number of H100s or whatever you can probably calculate a reasonable low estimate on power draw (there will be other machines to support the compute, e.g., storage). I don't know that it's public yet so I won't say the exact size, but I expect it to be leaked or spoken about publicly shortly anyway. It's a lot of MW. They are semi-jokingly talking about building nuclear reactors to power future DCs.
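As a sketch of that kind of low estimate: the GPU counts below are placeholders (not Meta's number), 700 W is the H100 SXM board TDP, and the host-overhead and PUE multipliers are assumptions:

```python
# Low-end power estimate for a hypothetical H100 training cluster.
# num_gpus values are placeholders; overhead and PUE are assumptions.

def cluster_power_mw(num_gpus: int,
                     gpu_tdp_w: float = 700.0,    # H100 SXM board power
                     host_overhead: float = 1.3,  # CPUs, NICs, storage, fans
                     pue: float = 1.2) -> float:  # cooling/distribution losses
    return num_gpus * gpu_tdp_w * host_overhead * pue / 1e6

for n in (16_000, 100_000):  # illustrative sizes only
    print(f"{n:>7} GPUs -> ~{cluster_power_mw(n):.0f} MW")
# 16,000 GPUs -> ~17 MW; 100,000 GPUs -> ~109 MW
```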
Seems like a pretty reasonable worldview to say generating garbage text and 5-legged horse images isn't worth a measurable amount of our electricity, especially in a time when we so clearly need to be reducing consumption.
Generative AI may seem frivolous on the surface, but it's fundamental research, you might as well be against materials science research. People are already using AI models to create more efficient energy systems, batteries, solar panels, etc.
This is fundamental research and if you kill research because you think it's frivolous you will kill many things you didn't know you wanted. Frivolous research is the backbone of scientific progress.
Even generative AI - it may be called generative AI because that's flashy, but its real power is as classifiers, and classifiers are incredibly useful.
Imagine robots that could perfectly sort recycling/garbage/compost, what do you think that is worth?
Imagine robots that can mechanically remove weeds so that zero herbicides are needed, what do you think that is worth?
The possibilities really are endless and I am excited to see how these technologies evolve.
AI can also aid in optimizing energy grids that adapts to supply and demand [1], aid farmers in optimizing fertilizer and water use [2], logistics companies to optimize routes and fuel use [3], etc.
It's not at all only about the consumer-facing products and these are areas often in need of urgent attention and optimization to boot.
Perhaps your own view but obviously not everyone's. I am not a doomer and don't live in the same doomer vision as your own, sorry. I am excited for the future and part of that future is growing energy consumption which will fuel new innovations.
I am not a doomer. Fossil fuels may run out but innovation will continue. Nobody knows what the future holds but I sit on the positive lens that the human race will figure this out. That might not work out but until there is conclusive evidence, I will keep my head up.
The GP is saying things will invariably get better because of innovation. I'm saying that this is a non sequitur: it depends on the specific innovations whether things get better or worse.
I am simply saying that over the long time scale things have been improving for the human race. I don't care about specific innovations; that is pedantic and illogical. Your only point is global warming, and regardless, humans are better off now than they were historically. We definitely need to be concerned about the balance of life, but I am not going to go hide in a hole and be scared. If you want that, go live on a self-sufficient farm. Such a silly thread over electricity use for AI.
I am not a doomer. I don't know what the future will hold, but I do know that there is less suffering in the world and the world in general is becoming a better place, and that seems pretty positive to me.
You are not a doomer because you repeat that like a protective mantra and don't think about the nasty implications of "what if I'm wrong" because they put negative energy in your chakras, right?
[1] Integrative design for radical energy efficiency - Amory Lovins https://youtu.be/Na3qhrMHWuY?t=1026
[2] "Energy. Estimated global data centre electricity consumption in 2022 was 240-340 TWh, or around 1-1.3% of global final electricity demand. This excludes energy used for cryptocurrency mining, which was estimated to be around 110 TWh in 2022, accounting for 0.4% of annual global electricity demand."
https://www.iea.org/energy-system/buildings/data-centres-and...
[3] Stanford Seminar - Saving energy and increasing density in information processing using photonics - David B. Miller https://www.youtube.com/watch?v=7hWWyuesmhs