Hacker News

The paper you cited shows that 1.5x overproduction can get the US to 75% of electricity from solar and wind without building any storage.

Every big construction project is cheaper in China than in the US. China can build a solar farm for $0.83/watt [1], half the cost I cited up-thread for an American solar farm. But I don't think that $0.83/watt is a credible construction cost for a solar farm in the US in 2020. Nor is $3.17/watt a credible cost for constructing a nuclear plant in the US in 2020.

I am weary of hearing about how the next reactor design is going to be affordable and take only 5 years to build. That's what I heard from Westinghouse and Areva/EDF when the AP1000 and EPR were still theoretical reactors. I believed it for a while. I don't believe it any more in 2020. I don't believe it about molten salt reactors, traveling wave reactors, or whatever theoretical reactor is currently cherished by Ted Talkers. I'm willing to start believing it again after a reactor design has entered commercial operation in the United States or European Union and lived up to its claims.

I don't want to write off nuclear power entirely. It's much safer than fossil power, both in terms of acute human health risks and climate risks. I still couldn't advise any American utility that new nuclear plants are a fiscally prudent part of decarbonization planning for the next decade.

[1] https://www.bloomberg.com/news/articles/2019-12-26/china-s-b...



> The paper you cited shows that 1.5x overproduction can get the US to 75% of electricity from solar and wind without building any storage.

Right, without storage you have to scale back the percentage of electricity you get from intermittent sources. Like I said, without storage, renewables do not present a valid option for a carbon free source of energy.

If you want a US example of nuclear plant construction, you can take the Diablo Canyon plant [1]. This was not only built in the US recently, but also in an earthquake-prone area, and thus needed more robust construction. It cost $13.8 billion in 2018 dollars. Plants constructed earlier are even more cost effective, even when adjusted for inflation. The Donald Cook plant [2] produces about as much as the AP1000, for only $3.35 billion in 2007 dollars. That's about $4.25 billion in 2020 dollars. And this isn't an anomaly; plenty of plants built during the 70s were similarly cost effective [3] [4]. You're picking an individual plant to inflate the cost of nuclear dramatically.
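The 2007-to-2020 dollar conversion above can be sketched with a simple CPI ratio; the CPI-U annual averages below are approximate values assumed for illustration, not official figures:

```python
# Rough inflation adjustment via the ratio of annual-average CPI-U values.
# These CPI figures are approximations for illustration only.
CPI_2007 = 207.3
CPI_2020 = 258.8

cost_2007_billion = 3.35
cost_2020_billion = cost_2007_billion * (CPI_2020 / CPI_2007)
print(f"~${cost_2020_billion:.2f} billion in 2020 dollars")
```

This lands near the "about $4.25 billion" figure quoted for Donald Cook; small differences come from which price index and base months you pick.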

You're absolutely right that the next reactor design isn't going to be affordable. First-of-a-kind production is always the most expensive. Nuclear power gets cheaper with repeated construction of the same design: you don't need to rebuild the manufacturing pipeline for components, staff become experienced in construction, and so on.

This is why France's nuclear program was one of the most successful. They standardized on 3 designs and set up serial production of those same 3 designs. The EPR is expensive precisely because it's one of the first large plants that France has built in decades, and it doesn't have that advantage of serial production.

This is also why nuclear plants built during the 1970s in the US are much more cost effective than the ones built today. The US was building a lot of nuclear plants, and so it benefited from this economy of scale.

1. https://en.wikipedia.org/wiki/Diablo_Canyon_Power_Plant

2. https://en.wikipedia.org/wiki/Donald_C._Cook_Nuclear_Plant

3. https://en.wikipedia.org/wiki/Byron_Nuclear_Generating_Stati...

4. https://en.wikipedia.org/wiki/Braidwood_Nuclear_Generating_S...


75% of all electricity consumption is too low to be a "valid option" for decarbonizing electricity? We have different ideas of what's valid.

> If you want a US example of a nuclear plant construction, you can take the Diablo Canyon plant [1]. This is not only built in the US recently, but also in an earthquake prone area and thus needed more robust construction. It cost 13.8 billion in 2018 dollars. ... You're picking an individual plant to inflate the cost of nuclear dramatically.

Diablo Canyon unit 1 construction began in 1968 and unit 2 in 1970 according to the page you linked. That's hardly recent. I was actually trying to be charitable to American nuclear by talking about the new Vogtle units without mentioning the billions spent at V.C. Summer on two incomplete AP1000 reactors that will not enter service at all.

I agree that 1970s era nuclear plants cost less. Part of that was that there was more experience with building them, so the average was lower. Another reason is that when we look at nuclear plants running today and tabulate cost by year of construction, we're cherry picking the projects that went well.

I count 41 American reactors that were cancelled while under construction on this page:

https://en.wikipedia.org/wiki/List_of_cancelled_nuclear_reac...

37 of them started construction in the 1970s. That's a high project failure rate compared to any other generation source.


> 75% of all electricity consumption is too low to be a "valid option" for decarbonizing electricity? We have different ideas of what's valid.

The US already generates ~40% of its electricity from carbon-free sources. An increase to 75% represents only about a 60% reduction in fossil fuels from the current status quo. This is absolutely not a solution. We're still advancing climate change. Delaying a disaster is not even remotely close to the same thing as averting it.
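The arithmetic behind that "~60% reduction" figure, sketched out using the approximate shares cited above:

```python
# If ~40% of US electricity is already carbon free, fossil's share is ~60%.
# Raising the carbon-free share to 75% would leave 25% fossil.
fossil_now = 1 - 0.40
fossil_after = 1 - 0.75

reduction = (fossil_now - fossil_after) / fossil_now
print(f"fossil generation falls by about {reduction:.0%}")
```

That works out to roughly 58%, i.e. the "about 60%" reduction described above.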

> 37 of them started construction in the 1970s. That's a high project failure rate compared to any other generation source

Yes, because of Three Mile Island. The cancellations are concentrated among reactors that started construction in the 1970s because projects begun in the 1960s finished before Three Mile Island, and the US largely stopped starting new nuclear projects in the 1980s and 1990s.

This pattern of cancellations demonstrates the fact that the high failure rate is due to political pressure, not economic unsuitability of nuclear reactors. Serialized production yields costs per kilowatt hour well below your estimates for renewables even without the cost of storage.

And no, if we look at reactors that are no longer in operation we don't see a much higher cost [1] [2] [3] [4] [5].

Serialized nuclear construction is by far the cheapest and most effective way to eliminate carbon emissions from electricity generation. Intermittent sources are cheap until they saturate demand during their hours of production; then costs skyrocket as storage becomes necessary. And even more importantly, nuclear is the only proven way of accomplishing this.

1. https://en.wikipedia.org/wiki/Rancho_Seco_Nuclear_Generating...

2. https://en.wikipedia.org/wiki/Big_Rock_Point_Nuclear_Power_P...

3. https://en.wikipedia.org/wiki/Crystal_River_Nuclear_Plant

4. https://en.wikipedia.org/wiki/Kewaunee_Power_Station

5. https://en.wikipedia.org/wiki/Oyster_Creek_Nuclear_Generatin...


The 75% figure is from solar and wind alone. The balance can come from any combination of nuclear, other renewables, and fossil fuels. It doesn't mean 25% electricity supplied by fossil fuels.

You consider France's electricity sector a decarbonization success story, right? It generates "only" 72% of electricity from nuclear power: https://en.wikipedia.org/wiki/Electricity_sector_in_France

The United States also gets 7% of its electricity from hydro and geothermal:

https://www.eia.gov/tools/faqs/faq.php?id=427&t=3

Plus 19.7% from existing American reactors, though that number is going to drift downward as retirements continue to outpace construction.

I think that the residual fossil demand would be significantly less than 25%. I can't say how much exactly. That needs another study. In the absence of storage, neither nuclear nor geothermal plants are good at supplying peak demands. Hydro can serve a peaking role to some extent even without building new pumped storage but it's constrained by reservoir volumes, minimum downstream flow rates, seasonal snow melt, etc.


> In the absence of storage, neither nuclear nor geothermal plants are good at supplying peak demands.

This is false. Nuclear and geothermal power can both satisfy peak demand just fine. Nuclear plants do take time to alter the thermal output of the reactor, but they can easily reduce the electric output through overcooling: same MWt, but lower MWe. Geothermal plants just pump less water into the borehole. Better yet, nuclear plants can direct the excess energy to things like desalination. This is much easier than trying to satisfy peak demand with solar or wind, where peak demand often occurs when the sun isn't visible or when wind output doesn't line up with demand.

Again, France at its peak generated over 85% of its electricity from nuclear power. I'm not sure where you're getting this myth that nuclear plants can't satisfy variable demand.


Nuclear plants can throttle down. But the economics of nuclear power make it prohibitive to satisfy peak demand from nuclear reactors. Even France relies on gas plants and power imports to satisfy its electricity demand peaks, despite being a net electricity exporter over the course of a full year.

The CAISO grid had an average power demand of 25 GW in 2019 but a peak demand of 46 GW (Table 1.1):

http://www.caiso.com/Documents/2019AnnualReportonMarketIssue...

Keeping enough nuclear reactors operating to supply the most demanding hours of the most demanding season would have extraordinarily high marginal costs for the last few terawatt hours.


This is true not because of any shortcoming of nuclear, but because demand is not uniform. The same need to have excess capacity during off-peak hours in order to have sufficient capacity during peak hours exists with fossil fuels. If peak demand is 130 GW and trough demand is 100 GW, you need 130 GW of capacity.
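The sizing point can be illustrated with the hypothetical 130 GW / 100 GW figures above:

```python
# Installed capacity must cover the peak, so fleet utilization is capped
# by the shape of the demand curve, regardless of generation technology.
peak_gw = 130
trough_gw = 100
avg_gw = (peak_gw + trough_gw) / 2  # crude midpoint estimate of average demand

capacity_needed_gw = peak_gw
utilization = avg_gw / capacity_needed_gw
print(f"best-case fleet utilization: {utilization:.0%}")
```

Even with this generous flat-ramp assumption, some of the fleet sits idle most of the time; a peakier real demand curve makes utilization worse.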


Nuclear has never been inexpensive enough to be the first choice for meeting peak demand in the absence of storage. Some pumped hydro plants were built in the 20th century to be charged by nuclear generation so nuclear could effectively supply peaks too. Peaking batteries could also be charged by nuclear power.

Gas turbines have a construction cost under $1/watt in the US while Vogtle's AP1000s were estimated at $6/watt even before all the cost overruns started. If you're going to leave an asset idle most of the time, much better to idle a sub-$1/watt asset than a $6/watt asset.
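A back-of-the-envelope version of the idle-asset argument, spreading construction cost over lifetime output. The 10% capacity factor and 30-year life are assumptions for illustration, and financing and O&M costs are ignored:

```python
# Capital cost per MWh = construction cost per MW of capacity, divided by
# lifetime MWh produced per MW at a given capacity factor.
def capital_cost_per_mwh(cost_per_watt, capacity_factor, life_years):
    lifetime_mwh_per_mw = capacity_factor * 8760 * life_years
    cost_per_mw = cost_per_watt * 1_000_000
    return cost_per_mw / lifetime_mwh_per_mw

for label, cpw in [("gas turbine", 1.0), ("AP1000 (pre-overrun estimate)", 6.0)]:
    print(f"{label}: ~${capital_cost_per_mwh(cpw, 0.10, 30):.0f}/MWh")
```

At peaker-style utilization, the capital component alone carries the full 6x construction-cost ratio through to cost per MWh.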


This is just factually wrong. France has generated the majority of its electricity from nuclear since the 80s, and Belgium now generates the majority of its power from nuclear. Nuclear has repeatedly been used to satisfy peak demand.

Nobody doubts that fossil fuels are cheaper. That's why fossil fuels are still in widespread use outside of France and a handful of countries with lots of hydroelectric power. But if we want to halt climate change we need to eliminate, not just reduce, usage of fossil fuels. Any plan that uses renewables as a significant source of energy involves either the continued use of fossil fuels or staggering amounts of energy storage.


Per the beginning of this long thread, wind and solar can supply 75% of annual US electrical demand without storage. France supplies about that much of its annual demand from nuclear power and Belgium supplies a bit over half of annual demand with nuclear power. Both countries rely on fossil generated electricity for meeting demand peaks, both via domestic generating plants and cross-border imports from foreign fossil plants. There was never a year when France met its peak electrical demand without fossil power.

The example of France proves that electrical generation could have been largely decarbonized in the 20th century, if other leaders had made a concerted push to reduce fossil fuel dependency like France's leaders did. It's tragic that other major economies did not do the same. But even France did not eliminate all fossil generation. 10% of French electricity generation is fossil as of 2017 [1].

Getting the USA's electrical generation down to only 10% fossil would be a vast improvement. I don't think that a contemporary optimized plan for getting there involves much new nuclear power even though a 20th century plan would have. A cost optimized plan from 1985 certainly would have called for a lot more nuclear power. The costs of building American solar and wind farms have plummeted since 1985. The costs of building American nuclear plants have not.

[1] https://www.world-nuclear.org/information-library/country-pr...


France's peak electricity demand is more than 40% higher than its trough demand [1]. Most years France generates ~10% of its electricity from fossil fuels, so the substantial majority of its peak demand was fulfilled with nuclear energy. France's share of fossil fuels actually used to be lower than 10%; the more recent uptick in fossil fuels accompanied a drop in the share of nuclear power generation [2]. Renewable energy production increased, but its intermittency led to an increase in fossil fuel consumption. A real-world example of how the shortcomings of renewables compared to nuclear power result in more carbon emissions.

As per your own comments, solar and wind even with substantial overproduction leave a quarter of electricity demand unfulfilled. The places fortunate enough to have hydroelectric or geothermal power available could go carbon free, but the rest are left supplying a quarter of their energy with fossil fuels. The overproduction puts their price well above the cost of nuclear when you don't cherry-pick one of the most infamous cost overruns as a representative example. And let's just be generous and ignore the ecological devastation caused by covering 2-4% of the Earth's land surface in solar panels.
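One way to make the overproduction penalty concrete is to compare construction cost per watt of average delivered output. Every number here is an assumption for the sketch: the $1.66/W US solar and $6/W AP1000 figures come from earlier in the thread, and the capacity factors are round illustrative values:

```python
# Cost per watt of *average* output: construction cost per watt of nameplate
# capacity, divided by capacity factor, times any overproduction factor.
def cost_per_avg_watt(cost_per_watt, capacity_factor, overproduction=1.0):
    return cost_per_watt * overproduction / capacity_factor

solar = cost_per_avg_watt(1.66, 0.25, overproduction=1.5)  # assumed 25% CF
nuclear = cost_per_avg_watt(6.00, 0.90)                    # assumed 90% CF
print(f"solar ~${solar:.2f}/avg-watt, nuclear ~${nuclear:.2f}/avg-watt")
```

The comparison flips depending on which nuclear construction cost you accept, which is exactly what this thread is arguing about.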

So we have a more expensive option that leaves a quarter of electricity demand to be fulfilled by other sources (mostly fossil fuels), requires massive amounts of land to be cleared and covered in solar panels, and is subject to the challenges of intermittent power generation. And we have nuclear, which is cheaper, much less land intensive, and generates consistent energy. The superior choice is unambiguous. And real world examples demonstrate this. Look at the disparity between France and Germany: the former is the poster child of nuclear, the latter the poster child of renewables. France's carbon intensity of electricity is usually less than a quarter of Germany's. We've already put nuclear and renewables to the test, and nuclear proves far superior.

1. https://www.services-rte.com/en/view-data-published-by-rte/d...

2. https://en.wikipedia.org/wiki/Electricity_sector_in_France#/...


I'll be interested again when France builds new reactors cheaper than new renewables. Flamanville 3 and Olkiluoto 3, even if they don't have further cost overruns, are going to produce electricity at a higher cost per MWh than German solar farms completed in the same year.

If France does build 6 more EPRs, a proposal floated last year, they will have a chance to prove that the problems to date were merely FOAK learning experiences.


France doesn't need to build new reactors, largely because its existing reactor fleet is still enough to fulfill demand. That's one of the biggest advantages of nuclear power: it lasts for the better part of a century, not a decade or two. A serial run of reactor production makes more sense once the disparity between supply and demand is greater.

I'll be interested in renewables once Germany's carbon intensity of electricity is on the same order of magnitude as France's. They need to drop from ~500 grams of CO2 per kWh to under a hundred.



