Nuclear power has long been viewed as an expensive energy source, but this is not necessarily the case. In South Korea, construction costs dropped by 50% over the course of building 28 reactors between 1971 and 2008. France saw a similar decline in nuclear construction costs from 1960 to 1970. In the United States, however, the opposite has occurred: two reactors in South Carolina and Georgia have seen estimated costs skyrocket from $9.8 billion to $30.3 billion in recent years.

The key concept here is the "learning rate": the rate at which a technology's cost falls each time its cumulative deployment doubles. In the early days of nuclear, the learning rate in the US was 23%, but by the mid-1970s it had fallen to -94%, meaning costs were rising steeply with each doubling. Several factors drove this reversal: supply chain issues, the acceptance of plate tectonics (which tightened seismic design requirements), and the partial meltdown of a reactor at the Three Mile Island (TMI) plant in 1979. TMI in particular triggered a massive increase in regulatory standards and engineering requirements, resulting in a 300% increase in median construction costs.

However, the US is beginning to return to its pre-disruption deployment and learning rates, and the cost of nuclear construction is already dropping. If the US invests around $1 trillion in nuclear energy by 2050, it could supply 85% of current energy consumption for less than half the cost of the CARES Act stimulus. There is nothing inherently expensive about nuclear, and as the learning curve comes alive again, nuclear is set to get much, much cheaper.
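To make the learning-rate figures above concrete: under the standard experience-curve (Wright's law) model, unit cost falls by the learning rate each time cumulative output doubles, so a 23% learning rate compounds into large savings while a negative rate compounds into rising costs. A minimal sketch of that arithmetic (the function name and parameters are illustrative, not from the source):

```python
import math

def unit_cost(initial_cost, learning_rate, cumulative_units, initial_units=1):
    """Experience-curve (Wright's law) cost model.

    Cost falls by `learning_rate` (a fraction, e.g. 0.23 for 23%)
    every time cumulative output doubles relative to `initial_units`.
    A negative learning_rate models costs that rise with deployment.
    """
    # Exponent b such that doubling output multiplies cost by (1 - learning_rate).
    b = math.log2(1 - learning_rate)
    return initial_cost * (cumulative_units / initial_units) ** b

# At a 23% learning rate, four doublings (16x cumulative output)
# multiply cost by 0.77^4, i.e. roughly a 65% reduction.
improving = unit_cost(100.0, 0.23, 16)

# At a -94% learning rate, each doubling multiplies cost by 1.94,
# so costs nearly double with every doubling of deployment.
worsening = unit_cost(100.0, -0.94, 2)
```

This illustrates why the sign of the learning rate dominates long-run cost projections: the same formula that halves costs over a few doublings at +23% roughly doubles them per doubling at -94%.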
Blockchain's potential to restructure markets, and the concept of carbon credits.
Blockchain technology is being used to create a more transparent and efficient carbon credits market. The global carbon credit market...