Batteries aren't produced at remotely enough scale to be viable for grid storage. To put this in perspective, the world uses about 60 TWh of electricity per day. By comparison, global lithium-ion battery production capacity was 1.1 TWh per year [1]. And remember, production capacity is distinct from actual production figures: it's typical for actual production to run at ~50% of capacity.
Intermittent sources don't just fluctuate daily, they also fluctuate seasonally. Even just 3 days of storage amounts to an impossible number of batteries to provision, even assuming battery production capacity keeps growing. Not to mention, diverting even modest amounts of battery output to grid storage would severely hamper EV adoption, which would increase emissions.
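To make the scale concrete, here's a back-of-envelope sketch using the figures above (the 60 TWh/day, 1.1 TWh/yr, and ~50% utilization numbers are this comment's own estimates, and the sketch ignores production growth):

```python
daily_demand_twh = 60        # world electricity use per day (figure above)
storage_days = 3             # a modest seasonal buffer
capacity_twh_per_yr = 1.1    # global Li-ion production capacity (figure above)
actual_output = 0.5 * capacity_twh_per_yr   # actual output ~50% of nameplate

storage_needed = daily_demand_twh * storage_days   # 180 TWh
years = storage_needed / actual_output
print(f"{storage_needed} TWh needed -> ~{years:.0f} years of current output")  # ~327 years
```

Even if every cell produced went to the grid instead of EVs, three days of storage would take centuries at today's rates.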
There's a reason most plans for a primarily wind and solar grid assume some technological breakthrough will solve storage: hydrogen, compressed air, alternative battery chemistries, etc. show up constantly in plans for a primarily renewable grid.
Modelling in Australia with simulations that multiply up current wind and solar (using real-time data of actual renewable generation) [1] showed that well over 99% of demand can be delivered with only about five hours of storage, so we're not really talking about days.
There will likely always be some gas peaking, but we're talking less than a percent of annual generation (maybe a few single-digit percent in some places where solar isn't as good, but still not much).
Australia is a hot country with lots of sunshine, lots of windy coastline and not a lot of people per square mile. And peak demand (summer days, for air conditioning) corresponds to peak solar irradiance.
Europe, for example, is the other way around for most of the above.
> Modelling in Australia with simulations that multiply up current wind and solar (using real-time data of actual renewable generation) [1] showed that well over 99% of demand can be delivered with only about five hours of storage, so we're not really talking about days.
That's still a gap two orders of magnitude larger than existing standards:
> Meanwhile, reliability standards in industrialized countries are typically very high (e.g., targeting <2–3 h of unplanned outages per year, or ~99.97%). Resource adequacy planning standards for “1-in-10” are also high: in North America (BAL-502-RF-03), generating resources must be adequate to provide no more than 1 day of unmet electricity demand—or in some cases 1 loss of load event—in 10 years (i.e., 99.97% or 99.99%, respectively).
Even leaving 1% of demand unserved means shortfalls orders of magnitude more frequent than those standards allow. Figures like "fulfills 99% of electricity demand" might sound promising, until you compare them against the standards of reliability modern society expects of the electrical grid.
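A crude sketch of the gap, treating "1% of demand unmet" as "1% of hours unserved" (a simplification, but it gives the order of magnitude):

```python
hours_per_year = 8760
shortfall_99 = 0.01 * hours_per_year          # 87.6 h/yr unserved at 99%
target_9997 = (1 - 0.9997) * hours_per_year   # ~2.6 h/yr allowed at 99.97%
target_9999 = (1 - 0.9999) * hours_per_year   # ~0.9 h/yr allowed at 99.99%
print(shortfall_99 / target_9997)   # ~33x the allowed shortfall
print(shortfall_99 / target_9999)   # ~100x
```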
And that's in Australia, quite literally the best-case scenario for renewables. By comparison, in Germany even 12 hours of storage would only satisfy 80-90% of demand.
The thing is, you can calculate when there will be shortfalls, and you can plan for them almost perfectly.
We now know the weather pretty much a day before it happens. Covering that 1% does mean maintaining natural gas facilities, but it's really not that big of a deal.
Using natural gas means climate change still progresses. Not to mention you'll be paying all the overhead of maintaining natural gas plants while only using them a fraction of the time, so the net cost per watt-hour will be very high.
> "Batteries aren't produced at remotely enough scale to be viable for grid storage."
The idea that if we can't have a renewable grid identical to the fossil fuel grid, then we may as well stick with the fossil fuel grid even if it means the end of the world, is a bit weird.
The UK's biggest energy need is heating, but the housing stock in the UK is famously shitty: 38% of homes were built before 1946, and it's the worst value for money of any developed country [1]. It isn't well insulated, triple-glazed, heat-pump fitted, or using local industrial waste heat for home heating.
Heat is harder to move than electricity, but easy and cheap to store - this 2019 pilot project can store 130 MWh of heat for up to a week [2], something we couldn't reasonably or cheaply do with 130 MWh of electricity.
It's possible that energy and electricity requirements could be reduced meaningfully without dropping quality of life, and that meaningful amounts of energy could be stored in heat and synthetic gas[3] rather than in more expensive electric charge storage.
[Is this used Nissan Leaf at 30kWh for £2,000 the cheapest battery storage I could buy in the UK right now?[4]]
Nobody, really nobody, is asking for the whole world's consumption to be stored in batteries.
This is such a bullshit argument it really paints the rest of your comment in a bad light.
You can get very far by just storing up to 12 hours OF NIGHT TIME at a local level. Who cares about the 5% of the time when we have to burn natural gas to stabilise the grid?
95% renewable is orders of magnitude better than today. Anyone saying different is literally a grifter.
Also, battery storage costs are projected to fall by 50% within 5-6 years. Batteries are already cost-effective, and a lot of grid storage is being built right now.
If we are heating with electricity and cannot use gas storage to make up for the seasons, chemical batteries are nowhere near cost-effective, nor even physically available or projected to be.
You can look up how much energy Europe stores in gas fields to get through the winter. Divide that number by what you think is a reasonable sCOP for your heat pump. Put that result next to the total battery capacity ever produced, and I'll even let you add any other convertible energy storage capacity. You will find a gap of orders of magnitude, and Powerwalls in every home are not going to cover it. As you said, they'll cover a day, maybe a few, which leaves us three months short in the season where we have virtually no home solar.
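To make that concrete, a sketch with placeholder numbers; both inputs are assumptions to check against real data (EU underground gas storage is commonly quoted around 100 bcm, very roughly 1,000 TWh thermal), not sourced figures:

```python
gas_storage_twh_thermal = 1000   # rough EU seasonal gas storage (assumption)
scop = 3.0                       # seasonal COP of a decent heat pump
electricity_needed = gas_storage_twh_thermal / scop   # ~333 TWh of electricity
all_batteries_ever_twh = 3       # generous guess at cumulative Li-ion output (assumption)
print(electricity_needed / all_batteries_ever_twh)    # ~111x gap
```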
Of course, burning that gas to generate electricity for use in heat pumps would get you the same heat for half the gas, even before you include any wind power (or hydro etc.).
That then doubles your storage of gas that you may or may not need to burn, depending on the weather.
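A quick sanity check on the "half the gas" claim; the efficiency and sCOP values here are typical assumptions, not measurements:

```python
boiler_eff = 0.90   # condensing gas boiler
ccgt_eff = 0.55     # combined-cycle gas plant
scop = 3.0          # heat pump seasonal COP

heat_per_kwh_gas_boiler = boiler_eff      # 0.90 kWh heat per kWh of gas
heat_per_kwh_gas_ccgt = ccgt_eff * scop   # 1.65 kWh heat per kWh of gas
print(heat_per_kwh_gas_boiler / heat_per_kwh_gas_ccgt)   # ~0.55: roughly half the gas
```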
So really the path to fully renewable just goes through a series of win-wins on the journey to fully phase out fossil fuels.
Sure, switching to heat pumps doubles (probably triples) your effective storage, or, equivalently, cuts the energy you need to store as gas by ~66%. But I thought we were discussing decarbonizing the storage too. Looking at that amount of energy, (chemical) electrical storage isn't even in the right ballpark yet, and neither are the other forms.
Because fuel costs only matter so much. You will still need exactly the same number of natural gas (or similar) plants with 12 hours of battery as with 1 hour, to cover seasonal events in most places on the planet. Until your battery backup can supply a week or more of power, you have simply created an inevitable disaster in the making.
No one is building a natural gas plant to staff it and let it sit idle for 95% of the time. The natural gas burned is only a fraction of its input costs.
Battery storage is headed in the right direction but the fact almost all articles on the subject can’t even get the units correct as it would betray how ridiculously small the deployments actually are is quite telling in itself.
The grifters are those pretending magic natural gas backing plants are going to pop up out of nowhere, while not including that capital or maintenance expense when quoting intermittent power source costs.
Right now those sources have been able to cherry-pick the cheap and easy problems to solve, since they've been using someone else's power when they can't meet demand. Eventually you run out of that, though.
Cheap intermittent sources have their place, and should be used maximally wherever possible. For example every watt of hydro production should have a watt of solar or wind built on top of it. Store the water for when the intermittent sources can’t keep up with demand.
> No one is building a natural gas plant to staff it and let it sit idle for 95% of the time. The natural gas burned is only a fraction of its input costs.
Serious question: why do you think that’s true? If it costs X per year to run it 5% of the year and you save more than X with this strategy, then the maths is simple and someone will build it. Several energy companies could probably be convinced to each pay a share so no one is left footing the whole bill but everyone benefits from the existence of the facility. If the maths works, potentially even some of the cost could be passed on to the taxpayer.
In the UK we already have a couple of facilities that operate exactly like this, Cruachan for example (it’s not gas, it’s water). Over the years, ways to improve its utilisation have been found, but it’s still sitting there at a relatively low portion of its capacity so that it can black start the grid if it’s ever needed.
Because it has been, at least thus far. Perhaps in the hazy future this will change, and some regulatory/capacity/energy market will evolve into making such things profitable by paying someone to build underutilized power plants. I know of no such market currently.
I'm only somewhat familiar with the US market, not the UK. But a single plant is really not interesting for the discussion at hand. It can be considered a cost of doing business to have such a plant be useful for "black starts" - but that's all a single plant will ever be useful for. If it's ever being used for such a purpose you've already lost the game.
The scale is what matters. A single power plant that is 1% of your grid capacity being utilized 5% of the time is an expense that can probably be justified. Hundreds of power plants that match 100% (or close to it) of your grid capacity used 5% of the time would be an economically unjustifiable expense as you've effectively built your entire generation capacity twice.
Right now that's what we would be talking about building since every regional grid seems to experience week (or longer) periods where intermittent power generation is extremely unreliable due to weather events. It's not 100%, but it's close to it. You need to plan for the 1000 year event for something as critical as a national grid or folks literally start dying and the economic impact is astronomical.
I don't know exactly what capacity factor you'd need for a reasonable intermittent-to-dispatchable ratio, but it's certainly quite a lot higher than most would seemingly believe. Once batteries get to the point of backing the entire grid for a single night while the wind doesn't blow, there might be signs of change. In most US markets where batteries are considered huge successes, they have only recently (in the past year or two) transitioned from providing ancillary services to actual energy production for regular daily use during the duck curve.
This can all be solved in time and in theory with a number of technologies and additional grid interconnection. But the trends simply are not as positive as one would like to see when you start delving into primary sources.
> But a single plant is really not interesting for the discussion at hand.
Maybe, but the UK has 4, and is building another 5 in the next 5 years.
Average grid consumption is somewhere around 30 GW, and the existing facilities have around 30 GWh of storage. The additional 5 should bring around another 100 GWh.
So we're already at 1 hour's worth of average grid consumption stored, and by 2030 we'll be at around 4 hours, and that's assuming absolutely zero energy from other sources (wind, solar, nuclear, fossil fuel, biomass, other countries). Although, to be fair, the existing facilities can only deliver at around 3 GW, and the new facilities will only bring that up to 6 GW.
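Working from this comment's own (approximate) figures:

```python
avg_demand_gw = 30
existing_storage_gwh = 30
new_storage_gwh = 100
print(existing_storage_gwh / avg_demand_gw)                      # ~1 hour today
print((existing_storage_gwh + new_storage_gwh) / avg_demand_gw)  # ~4.3 hours by 2030
# Caveat: discharge is power-limited (~3 GW now, ~6 GW after the new builds),
# so those hours can't all be delivered at full grid demand.
```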
I’m not sure if you’ve ever been to the UK, but a whole week without either sun or wind seems a bit unlikely, especially when half of the UKs wind comes from offshore wind farms.
Stick in a few more of these, and keep a couple of the existing fossil fuel plants around in case of emergencies, and I can definitely see how this continues to be just a “cost of doing business”.
I appreciate the situation in the US may be worse.
> This is such a bullshit argument it really paints the rest of your comment in a bad light.
I responded to a comment stating that excess storage can be stored in batteries: https://news.ycombinator.com/item?id=43249008 I'd suggest reading the comments people are responding to before calling them bullshit.
> You can get very far by just storing up to 12 hours OF NIGHT TIME at a local level.
"only" 12 hours of storage is 30 TWh of storage, at the world's current electricity consumption rates. This is an immense amount of storage, amounting to decades worth of global battery production. And that's ignoring the fact that the vast majority of batteries are going to electric vehicles, not grid storage. It's true that battery production is growing, but electricity demand will similarly grow as fossil fuel use in transportation and industrial processes are electrified. Out of all of our fossil fuel use, electricity production is only ~40%. Not to mention, poorer countries are developing and will eventually start deploying refrigeration and air conditioning on similar scales as developed countries.
Let's say it is 30 TWh a day in 2030 - you can assume 50% less energy usage during the night, making it 20 TWh during the daytime and 10 TWh at night.
This excludes large wind farms, which add to base load if you average over the world. There is always wind somewhere around you.
Realistically, if we reach 5 TWh of storage we are able to be >90% renewable.
Having 5 TWh of storage is of course not an easy feat. If estimates are correct, we will have 6.5 TWh of annual battery production in 2030. If we assume 10% of that goes to grid storage, we would need about a decade to reach a 90%+ renewable grid.
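Sketching that out with this comment's own estimates:

```python
production_2030_twh_per_yr = 6.5   # projected annual battery production in 2030
grid_share = 0.10                  # fraction assumed to go to grid storage
target_twh = 5                     # storage needed for >90% renewable (claim above)
years = target_twh / (production_2030_twh_per_yr * grid_share)
print(f"~{years:.0f} years")       # ~8 years, i.e. roughly a decade
```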
There is no faster method. It is realistic.
> 12 hours of night time is not 30TWh
> The world is currently at 26TWh A DAY
From your link:
> The global electricity consumption in 2022 was 24,398 terawatt-hour (TWh)
24,398 / 365 is 66.8 TWh of electricity used per day. And again, that's current electricity consumption: before industrial processes are electrified, before poor countries adopt air conditioning at the same rates as rich ones, and before transport fully switches to EVs.
> Let's say it is 30 TWh a day in 2030 - you can assume 50% less energy usage during the night, making it 20 TWh during the daytime and 10 TWh at night
That's not how the consumption curve works. Even in the summer, the ratio of daytime to nighttime energy use isn't that high. And in the winter it's inverted, with nighttime energy use exceeding daytime use.
Why should I? I have no idea how fast adoption rates are; we can calculate what it would cost for the current grid to be feasible and work based off of that.
Ah yeah, the magical 107 TWh of hydrogen capacity. How far off is that? That's a pipe dream.
This is a plan for anything past 2050. I'm talking right now.
Here are high-ball numbers for taking a 2,000 sq ft house in California off the grid:
- 30 panels ~ 10 kW: $20K
- batteries ~ 10 kWh: $8K
- permits + labor: $20K (California...)
- 100+ kWh EV with V2H bidirectional charging: $50K
- comparable ICE car (offset): -$40K
- heat pump water heater: $1.5K
- heat pump furnace: $15K
- induction range: $2K
That adds up to $76.5K. Typical PG&E bills are $500-1000 per month; budget $200/month for gas (again, California prices). That's roughly 64-109 months until break-even, which is less than the expected lifetime of the panels + battery.
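The same arithmetic as a checkable sketch (the line items are from the list above; all are rough California estimates):

```python
costs = {
    "panels": 20_000, "batteries": 8_000, "permits_labor": 20_000,
    "ev": 50_000, "ice_offset": -40_000, "water_heater": 1_500,
    "heat_pump": 15_000, "induction": 2_000,
}
total = sum(costs.values())               # $76,500
for monthly in (500 + 200, 1000 + 200):   # PG&E bill + gas budget
    print(f"${total:,} / ${monthly}/mo -> {total / monthly:.0f} months")
# ~109 months at $700/mo, ~64 months at $1,200/mo
```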
For another $10-20K you can add a propane backup, but I assume extended storms are rare enough to just charge the car elsewhere and drive the electrons home a few times a year. A fireplace is about $5K installed.
Not going full off-grid is cheaper. So is scaling up to beyond one house.
LLMs should not be used as a reliable source of numbers for research like this. You keep saying how obvious this is and how trivial it is to research. Maybe just post a quality research link instead, in that case?
I am suggesting it as a way to do a back of the envelope calculation that can be thoroughly checked manually. It's very easy to check the numbers yourself.
## Upfront Capital Cost
- *Nuclear*: Very high (£4,000-6,000/kW), with 10+ year construction time
- *Natural Gas (CCGT)*: Low to moderate (£500-1,000/kW), with 2-3 year construction time
- *Wind + Battery*: Moderate for turbines (£1,000-1,500/kW) plus substantial battery costs
- *Solar + Battery*: Moderate for panels (£800-1,200/kW) plus large battery costs, especially for winter
## Plant Lifespan
- *Nuclear*: Typically 60 years, with possible extensions; 2+ builds over 100 years
- *Natural Gas*: 25-30 years; requires 3-4 rebuilds over 100 years
- *Wind + Battery*: 25 years for turbines, 10-15 years for batteries; multiple replacements needed
- *Solar + Battery*: 25-30 years for panels (with declining output), 10-15 years for batteries
## Fuel & Operating Costs
- *Nuclear*: Low fuel cost, high operating cost (staffing, maintenance, safety)
- *Natural Gas*: Major cost is fuel (price volatility), plus potential carbon costs
- *Wind + Battery*: No fuel cost, moderate turbine O&M, plus battery replacement costs
- *Solar + Battery*: No fuel cost, low panel O&M, plus battery replacement costs
## Levelized Cost (No subsidies)
- *Nuclear*: £90-120/MWh
- *Natural Gas*: £50-60/MWh (without carbon cost), £100+/MWh with high carbon prices
- *Wind + Battery*: Base wind £40-50/MWh, potentially exceeding £100-150/MWh with storage for 90% CF
- *Solar + Battery*: Base solar £40-50/MWh, potentially exceeding £150-200/MWh with storage
## Reliability / Capacity Factor
- *Nuclear*: ~90% capacity factor, suited for baseload
- *Natural Gas*: 80-90% if run as baseload, highly flexible
- *Wind + Battery*: 35-50% raw CF for wind alone, requires battery + overbuild for 90% CF
- *Solar + Battery*: 10-15% raw CF in UK, requires massive overbuild and storage for 90% CF
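As a rough illustration of how those inputs combine, here is a toy LCOE calculation; the discount rate and plant parameters are illustrative assumptions (real figures like the ranges above embed financing, carbon, and integration costs that this ignores):

```python
def lcoe(capex_per_kw, life_yrs, fixed_om_per_kw_yr, fuel_per_mwh, cf, rate=0.07):
    # capital recovery factor annualizes the upfront cost over the plant's life
    crf = rate * (1 + rate) ** life_yrs / ((1 + rate) ** life_yrs - 1)
    mwh_per_kw_yr = 8760 * cf / 1000   # annual output per kW of capacity
    return (capex_per_kw * crf + fixed_om_per_kw_yr) / mwh_per_kw_yr + fuel_per_mwh

print(lcoe(5000, 60, 100, 10, 0.90))   # nuclear-ish inputs -> ~£68/MWh
print(lcoe(750, 28, 25, 45, 0.85))     # CCGT-ish inputs -> ~£57/MWh
```

The outputs are very sensitive to the discount rate and capacity factor, which is part of why published ranges vary so widely.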