I see that Sabine Hossenfelder and others have videos out about the latest cost blowout, and delays, in the ITER project, which is only a research project for fusion with no prospect of ever actually generating electricity.
For reasons I have outlined before, I am firmly on the sceptical side of the question of whether fusion will ever be a practical source of energy.
But this latest problem did make me think about how odd it is that high-tech ideas for future energy (fusion, or even new fission reactors) and a much more modest-tech idea (large-scale deployment of renewable energy plants that already work, but still face practical problems in replacing old-style generators) share a similar issue: both are hampered by continual changes in technology that make planning their development (or deployment) very difficult.
I mentioned in a recent post how, even over the (nearly) two decades of writing this blog, you can see how ideas for new renewable energy have been floated, sometimes partially developed, and then abandoned: in many cases surpassed by the steady efficiency gains and manufacturing improvements in the "been around forever, but getting way better all the time" sources (mainly solar panels and wind generators). And now we are at the point where we know we need renewable energy deployed very rapidly to cut CO2 before we bake the world even further, yet energy storage still seems stuck at the "too many ideas" stage, with no one really knowing the best way to deploy it for maximum efficiency and the best cost outcomes. A large part of the problem is surely that some ideas (molten salts, hot rocks or sand, chemical flow batteries, etc) will be beaten out of contention by improvements in competing systems, since nearly all storage ideas are still undergoing a great deal of research, development and technological improvement.

Hence, it may sound like a good idea to subsidise (say) Tesla Powerwalls for domestic use on a massive scale, but I would presume that all battery storage is likely to be better, cheaper and safer in (say) five or ten years' time, so just how much money is it wise to spend now on the current model?
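To make that trade-off a bit more concrete, here is a rough back-of-the-envelope sketch (with entirely made-up illustrative numbers, not real Powerwall pricing or any forecast of battery costs): the same subsidy budget buys more storage if you wait for assumed cost declines, but the storage bought now starts displacing fossil generation immediately.

```python
# Back-of-the-envelope comparison: subsidise battery storage now vs. wait.
# All numbers below are illustrative assumptions, not real prices or forecasts.

budget = 1_000_000_000        # hypothetical subsidy budget, in dollars
cost_now_per_kwh = 700        # assumed installed cost today ($/kWh)
annual_cost_decline = 0.10    # assumed 10% per-year decline in installed cost
years_waited = 5              # how long the spend is delayed

# Assumed future cost after compounding the annual decline
cost_later_per_kwh = cost_now_per_kwh * (1 - annual_cost_decline) ** years_waited

kwh_now = budget / cost_now_per_kwh
kwh_later = budget / cost_later_per_kwh

print(f"Storage bought now:          {kwh_now / 1e6:.1f} GWh")
print(f"Storage bought in {years_waited} years:     {kwh_later / 1e6:.1f} GWh")
print(f"Extra capacity from waiting: {(kwh_later / kwh_now - 1) * 100:.0f}%")

# The catch the raw numbers hide: the capacity bought now displaces fossil
# generation for those five years, a benefit the "wait" option forgoes.
```

On these made-up assumptions, waiting five years buys roughly 70% more storage for the same money; whether that beats five years of earlier emissions cuts is exactly the judgement call the subsidy question turns on.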
On the fusion question, I have seen it said (I presume reliably) that a large part of ITER's problem is that it was designed around the magnet technology current (I think) a couple of decades ago, which has since been surpassed by big improvements in the field. Hence it is in one sense already a white elephant, and it becomes more white elephant-y with every year that a cost increase or repair delays its operation.
I would guess that the same could be a significant issue in the field of new fission reactor designs - what company wants to spend a ton of money on a design that might work, only to be soon out-competed by an alternative new design on cost, efficiency or safety?
I guess there is likely a simple name for this in economics, or some management field - this race between technological development and its deployment on the one hand, and redundancy on the other - but I don't know what it is.
I also don't really know the solution.
What I do think, though, is that the billions spent on a research reactor for a source of energy that may never be economically viable could surely have gone a very long way towards working out the best way to store energy from renewables, and would likely have produced some good answers a great many years before ITER is even switched on.
I was always skeptical about the ability to maintain the magnetic field containment consistently. One small failure and the plant goes down for months, if not years. That is just an intuition - I lack the expertise to know how reliable the containment can be - but in most human devices shxt happens from time to time, and in the Tokamak configuration one small shxt can be catastrophic.
Fusion is a dead letter. It’s simple. It’s easy. You don’t need high temperatures. But it will never be an energy spinner. They have the basic physics wrong.