The promising energy source is not a silver bullet for the climate crisis and has dangers all its own.
I awoke on December 13th to news about what could be the most significant scientific breakthrough since the Food and Drug Administration authorized the first Covid vaccine for emergency use two years ago. This time, however, the achievement had nothing to do with that ongoing public health crisis. Instead, as the New York Times and CNN alerted me that morning, at stake was a new technology that could potentially solve the worst dilemma humanity faces: climate change and the desperate overheating of our planet. Net-energy-gain fusion, a long-sought-after panacea for all that’s wrong with traditional nuclear-fission energy (read: accidents, radioactive waste), had finally been achieved at the Lawrence Livermore National Laboratory in California.
“This is such a wonderful example of a possibility realized, a scientific milestone achieved, and a road ahead to the possibilities for clean energy,” exclaimed White House science adviser Arati Prabhakar.
The New York Times was quick to follow Prabhakar’s lead, boasting that fusion is an “energy source devoid of the pollution and greenhouse gasses caused by the burning of fossil fuels.” Even Fox News, not exactly at the top of anyone’s list of places focused on climate change, jumped on the bandwagon, declaring fusion “a technology that has the potential to accelerate the planet’s shift away from fossil fuels and produce nearly limitless, carbon-free energy.”
All in all, the reviews for fusion were positively glowing and it seemed to make instant sense. After all, what could possibly be wrong with something that might end our reliance on fossil fuels, even as it reduced the risks posed by our aging nuclear industry? The message, repeated again and again in the days that followed: this was a genuine global-warming game-changer.
After all, in the fusion process, no atoms have to be split to create heat. Gigantic lasers are used, not uranium, so there’s no toxic mining involved, nor do thousands of gallons of cold water have to be pumped in to cool overheated reactors, nor will there be radioactive waste byproducts lasting hundreds of thousands of years. And not a risk of a nuclear meltdown in sight! Fusion, so the cheery news went, is safe, effective, and efficient!
Or is it?
The Big Catch
On a very basic level, fusion is the stuff of stars. Within our sun, hydrogen nuclei fuse to form helium, releasing energy that reaches us as heat and sunlight. Inside the walls of the Livermore Lab, this natural process was imitated by firing 192 gigantic lasers at a tube the size of a baby’s toe. Inside that cylinder sat a tiny diamond capsule containing the hydrogen fuel. When the laser pulses shot through the small hole, they vaporized that diamond quicker than the blink of an eye, creating a flood of invisible x-rays that compressed a small pellet of deuterium and tritium, the heavy isotopes of hydrogen.
“In a brief moment lasting less than 100 trillionths of a second, 2.05 megajoules of energy — roughly the equivalent of a pound of TNT — bombarded the hydrogen pellet,” explained New York Times reporter Kenneth Chang. “Out flowed a flood of neutron particles — the product of fusion — which carried about 3 megajoules of energy, a factor of 1.5 in energy gain.”
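For readers who want to sanity-check the scale of those numbers, here is a minimal back-of-the-envelope sketch in Python. It relies only on the textbook physics of the deuterium-tritium reaction (about 17.6 MeV released per fusion), not on anything specific to the Livermore shot:

```python
# Rough check: how many D-T fusion reactions does ~3 megajoules represent?
# Assumes the standard reaction D + T -> He-4 (3.5 MeV) + n (14.1 MeV).

EV_TO_JOULES = 1.602e-19      # joules per electron volt
MEV_PER_REACTION = 17.6       # textbook energy release per D-T fusion
SHOT_OUTPUT_JOULES = 3.0e6    # ~3 megajoules, the output figure quoted above

joules_per_reaction = MEV_PER_REACTION * 1e6 * EV_TO_JOULES
reactions = SHOT_OUTPUT_JOULES / joules_per_reaction
print(f"~{reactions:.1e} individual fusion reactions in one shot")  # ~1e18
```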
As with so many breakthroughs, there was a catch. First, 3 megajoules isn’t much energy. After all, it takes 108 megajoules to keep a single 100-watt light bulb burning for 300 hours. So, Livermore’s fusion development isn’t going to electrify a single home, let alone a million homes, anytime soon. And there was another nagging issue with this little fusion creation: it took about 300 megajoules of electricity to power up those 192 lasers. Simply put, at the moment, the lasers require 100 times more energy to charge than the reaction they trigger ends up producing.
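Those two “gains” are easy to conflate, so here is the arithmetic spelled out, a simple sketch using only the figures reported above (2.05 megajoules of laser light on target, about 3 megajoules of fusion output, and roughly 300 megajoules drawn from the grid to charge the lasers):

```python
# Two very different "energy gains" from the same experiment.

laser_on_target_mj = 2.05    # laser energy actually delivered to the pellet
fusion_output_mj = 3.0       # fusion energy released
wall_plug_input_mj = 300.0   # electricity needed to charge the 192 lasers

target_gain = fusion_output_mj / laser_on_target_mj    # ~1.5, the celebrated figure
facility_gain = fusion_output_mj / wall_plug_input_mj  # ~0.01, a 100x net loss

print(f"target gain: {target_gain:.2f}")
print(f"whole-facility gain: {facility_gain:.3f}")

# For scale: 3 MJ runs a 100-watt bulb for about eight hours.
print(f"bulb-hours: {fusion_output_mj * 1e6 / 100 / 3600:.1f}")
```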
“The reality is that fusion energy will not be viable at scale anytime within the next decade, a time frame over which carbon emissions must be reduced by 50% to avoid catastrophic warming of more than 1.5°C,” says climate expert Michael Mann, a professor of earth and environmental science at the University of Pennsylvania. “That task will only be achievable through the scaling up of existing clean energy — renewable sources such as wind and solar — along with energy storage capability and efficiency and conservation measures.”
Tritium Trials and Tribulations
The secretive and heavily secured National Ignition Facility where that test took place is the size of a sprawling sports arena. It could, in fact, hold three football fields. Which makes me wonder: how much space would be needed to do fusion on a commercial scale? No good answer is yet available. Then there’s the trouble with that isotope tritium needed to help along the fusion reaction. It’s not easy to come by and costs about as much as diamonds, around $30,000 per gram. Right now, even some of the bigwigs at the Department of Defense are worried that we’re running out of usable tritium.
“Fusion advocates often boast that the fuel for their reactors will be cheap and plentiful. That is certainly true for deuterium,” writes Daniel Clery in Science. “Roughly one in every 5,000 hydrogen atoms in the oceans is deuterium, and it sells for about $13 per gram. But tritium, with a half-life of 12.3 years, exists naturally only in trace amounts in the upper atmosphere, the product of cosmic ray bombardment.”
Fusion boosters brush this unwelcome fact aside, pointing out that “tritium breeding” — a process in which tritium is endlessly produced in a loop-like fashion — is entirely possible in a fully operating fusion reactor. In theory, this may seem plausible, but you need a sizable stock of tritium to jump-start the process, and doubt abounds that there’s enough of it out there to begin with. On top of that, the reactors themselves will have to be lined with a lot of lithium, itself an expensive chemical element at $71 a kilogram (copper, by contrast, is around $9.44 a kilogram), to allow the breeding process to work correctly.
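For a sense of why so much lithium is needed, the breeding reaction turns a neutron plus a lithium-6 nucleus into tritium plus helium. Here is a rough mass-balance sketch; it ignores neutron losses, lithium-7 side reactions, and every engineering inefficiency, so real breeding blankets would need far more:

```python
# Idealized mass balance for tritium breeding: n + Li-6 -> T + He-4.
# Ignores neutron losses and Li-7 reactions; real requirements are much higher.

LI6_ATOMIC_MASS = 6.015          # atomic mass units
TRITIUM_ATOMIC_MASS = 3.016      # atomic mass units
LI6_NATURAL_ABUNDANCE = 0.075    # only ~7.5% of natural lithium is lithium-6

li6_grams = LI6_ATOMIC_MASS / TRITIUM_ATOMIC_MASS     # ~2 g of Li-6 per gram of tritium
natural_li_grams = li6_grams / LI6_NATURAL_ABUNDANCE  # ~27 g of natural lithium
print(f"~{natural_li_grams:.0f} g of natural lithium per gram of tritium bred")
```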
Then there’s also the commonly repeated misstatement that fusion doesn’t create significant radioactive waste, a haunting reality for the world’s current fleet of nuclear plants. True, plutonium, which can be used as fuel in atomic weapons, isn’t a byproduct of fusion, but tritium is itself a radioactive isotope of hydrogen, and its atoms are remarkably good at permeating metals and finding ways to escape tight enclosures. Obviously, this will pose a significant problem for those who want to continuously breed tritium in a fusion reactor. It also presents a concern for people worried about radioactivity making its way out of such facilities and into the environment.
“Cancer is the main risk from humans ingesting tritium. When tritium decays it spits out a low-energy electron (roughly 18,000 electron volts) that escapes and slams into DNA, a ribosome, or some other biologically important molecule,” David Biello explains in Scientific American. “And, unlike other radionuclides, tritium is usually part of water, so it ends up in all parts of the body and therefore can, in theory, promote any kind of cancer. But that also helps reduce the risk: any tritiated water is typically excreted in less than a month.”
If that sounds problematic, that’s because it is. This country’s above-ground atomic bomb testing in the 1950s and 1960s was responsible for most of the man-made tritium that’s lingering in the environment. And it will be at least 2046, 84 years after the last American atmospheric nuclear detonation in Nevada, before tritium there will no longer pose a problem for the area.
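That 2046 figure follows directly from tritium’s half-life, and the arithmetic is worth seeing. A minimal sketch, using the 12.3-year half-life cited above and 1962, the year of the last U.S. atmospheric test in Nevada:

```python
# Radioactive decay: what fraction of 1962-era tritium survives to 2046?

HALF_LIFE_YEARS = 12.3           # tritium's half-life
years_elapsed = 2046 - 1962      # 84 years after the last Nevada atmospheric test

remaining = 0.5 ** (years_elapsed / HALF_LIFE_YEARS)
print(f"{remaining:.1%} of the original tritium remains")  # ~0.9%
```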
Of course, tritium also escapes from our existing nuclear reactors and is routinely found near such facilities where it occurs “naturally” during the fission process. In fact, after Illinois farmers discovered their wells had been contaminated by the nearby Braidwood nuclear plant, they successfully sued the site’s operator Exelon, which, in 2005, was caught discharging 6.2 million gallons of tritium-laden water into the soil.
In the United States, the Nuclear Regulatory Commission (NRC) allows the industry to monitor for tritium releases at nuclear sites; the industry is politely asked to alert the NRC in a “timely manner” if tritium is either intentionally or accidentally released. But a June 2011 report issued by the Government Accountability Office cast doubt on the NRC’s archaic system for assessing tritium discharges, suggesting that it’s anything but effective. (“Absent such an assessment, we continue to believe that NRC has no assurance that the Groundwater Protection Initiative will lead to prompt detection of underground piping system leaks as nuclear power plants age.”)
Consider all of this a way of asking: if the NRC isn’t doing an adequate job of monitoring the tritium leaks already occurring with regularity at the country’s nuclear plants, how the heck will it do a better job of tracking the stuff at future fusion plants? And as I suggest in my new book, Atomic Days: The Untold Story of the Most Toxic Place in America, the NRC is plain awful at just about everything it does.
Instruments of Death
All of that got me wondering: if tritium, vital for the fusion process, is radioactive, and if they aren’t going to be operating those lasers in time to put the brakes on climate change, what’s really going on here?
Maybe some clues lie (as is so often the case) in history. The initial idea for a fusion reaction was proposed by English physicist Arthur Eddington in 1920. More than 30 years later, on November 1, 1952, the first full-scale U.S. test of a thermonuclear device took place as part of “Operation Ivy” in the Marshall Islands in the Pacific Ocean. It yielded a mushroom-cloud explosion from a fusion reaction equivalent in power to 10.4 megatons of TNT. That was roughly 450 times more powerful than the atomic bomb the U.S. had dropped on the Japanese city of Nagasaki only seven years earlier to end World War II. It created an underwater crater 6,240 feet wide and 164 feet deep.
“The Shot, as witnessed aboard the various vessels at sea, is not easily described,” noted a military report on that nuclear experiment. “Accompanied by a brilliant light, the heat wave was felt immediately at distances of thirty to thirty-five miles. The tremendous fireball, appearing on the horizon like the sun when half-risen, quickly expanded after a momentary hover time.”
Nicknamed “Ivy Mike,” the bomb was a Teller-Ulam thermonuclear device, named after its creators Edward Teller and Stanislaw Ulam. It was also the United States’ first full-scale hydrogen bomb, an altogether different beast than the two awful nukes dropped on Japan in August 1945. Those bombs utilized fission in their cores to create massive explosions. But Ivy Mike gave a little insight into what was still possible for future weapons of annihilation.
The details of how the Teller-Ulam device works are still classified, but historian of science Alex Wellerstein explained the concept well in the New Yorker:
“The basic idea is, as far as we know, as follows. Take a fission weapon — call it the primary. Take a capsule of fusionable material, cover it with depleted uranium, and call it the secondary. Take both the primary and the secondary and put them inside a radiation case — a box made of very heavy materials. When the primary detonates, radiation flows out of it, filling the case with X rays. This process, which is known as radiation implosion, will, through one mechanism or another… compress the secondary to very high densities, inaugurating fusion reactions on a large scale. These fusion reactions will, in turn, let off neutrons of such a high energy that they can make the normally inert depleted uranium of the secondary’s casing undergo fission.”
Got it? Ivy Mike was, in fact, a fission explosion that initiated a fusion reaction. But ultimately, the science of how those instruments of death work isn’t all that important. The takeaway here is that, since first tried out in that monstrous Marshall Islands explosion, fusion has been intended as a tool of war. And sadly, so it remains, despite all the publicity about its possible use some distant day in relation to climate change. In truth, any fusion breakthroughs are potentially of critical importance not as a remedy for our warming climate but for a future apocalyptic world of war. Despite all the fantastic media publicity, that’s how the U.S. government has always seen it and that’s why the latest fusion test to create “energy” was executed in the utmost secrecy at the Lawrence Livermore National Laboratory. One thing should be taken for granted: the American government is interested not in using fusion technology to power the energy grid, but in using it to further strengthen this country’s already massive arsenal of atomic weapons.
Consider it an irony, under the circumstances, but in its announcement about the success at Livermore — though this obviously wasn’t what made the headlines — the Department of Energy didn’t skirt around the issue of gains for future atomic weaponry. Jill Hruby, the department’s undersecretary for nuclear security, admitted that, in achieving a fusion ignition, researchers had “opened a new chapter in NNSA’s science-based Stockpile Stewardship Program.” (NNSA stands for the National Nuclear Security Administration.) That “chapter” Hruby was bragging about has a lot more to do with “modernizing” the country’s nuclear weapons capabilities than with using laser fusion to end our reliance on fossil fuels.
“Had we not pursued the hydrogen bomb,” Edward Teller once said, “there is a very real threat that we would now all be speaking Russian. I have no regrets.” Some attitudes die hard.
Buried deep in the Lawrence Livermore National Laboratory’s website, the government comes clean about what these fusion experiments at the $3.5 billion National Ignition Facility (NIF) are really all about:
“NIF’s high energy density and inertial confinement fusion experiments, coupled with the increasingly sophisticated simulations available from some of the world’s most powerful supercomputers, increase our understanding of weapon physics, including the properties and survivability of weapons-relevant materials… The high rigor and multidisciplinary nature of NIF experiments play a key role in attracting, training, testing, and retaining new generations of skilled stockpile stewards who will continue the mission to protect America into the future.”
Yes, despite all the media attention to climate change, this is a rare yet intentional admission, surely meant to frighten officials in China and Russia. It leaves little doubt about what this fusion breakthrough means. It’s not about creating future clean energy and never has been. It’s about “protecting” the world’s greatest capitalist superpower. Competitors beware.
Sadly, fusion won’t save the Arctic from melting, but if we don’t put a stop to it, that breakthrough technology could someday melt us all.
In a dramatic scientific and engineering breakthrough, researchers at the Bay Area's Lawrence Livermore National Laboratory recently achieved the long-sought goal of generating a nuclear fusion reaction that produced more energy than was directly injected into a tiny reactor vessel. By the very next day, pundits from across the political spectrum were touting that breakthrough as a harbinger of a new era in energy production, suggesting that a future of limitless, low-impact fusion energy was perhaps a few decades away. In reality, however, commercially viable nuclear fusion is only infinitesimally closer than it was back in the 1980s, when a contained fusion reaction—i.e., one not occurring in the sun or from a bomb—was first achieved.
While most honest writers have at least acknowledged the obstacles to commercially scaled fusion, they typically still underestimate them—as much so today as back in the 1980s. We are told that a fusion reaction would have to occur "many times a second" to produce usable amounts of energy. But the blast of energy from the Livermore fusion reactor actually lasted only one tenth of a nanosecond—that's a ten-billionth of a second. Other fusion reactions (with a net energy loss) have apparently operated for a few nanoseconds, but reproducing this reaction over a billion times every second is far beyond what researchers are even contemplating.
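One way to make that gap concrete is to treat it as a repetition-rate problem. The sketch below assumes roughly 3 megajoules of output per shot (the Livermore figure) and the ten-shots-per-second rate often floated in inertial-fusion power-plant concepts; that rate is a design aspiration from the literature, not anything NIF can do, since it manages on the order of one shot per day:

```python
# Average thermal power = energy per shot x shots per second (MJ/s == MW).

shot_yield_mj = 3.0     # ~3 MJ per shot, per the Livermore result
target_rate_hz = 10     # a commonly floated power-plant target, not NIF's capability

print(f"{shot_yield_mj * target_rate_hz:.0f} MW thermal at 10 shots/second")  # ~30 MW

# NIF today: on the order of one shot per day.
print(f"{shot_yield_mj * 1e6 / 86_400:.0f} watts average at one shot per day")  # ~35 W
```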
We are told that the reactor produced about 1.5 times the amount of energy that was input, but this only counts the laser energy that actually struck the reactor vessel. That energy, which is necessary to generate temperatures over a hundred million degrees, was the product of an array of 192 high-powered lasers, which required well over 100 times as much energy to operate. Third, we are told that nuclear fusion will someday free up vast areas of land that are currently needed to operate solar and wind power installations. But the entire facility needed to house the 192 lasers and all the other necessary control equipment was large enough to contain three football fields, even though the actual fusion reaction takes place in a gold or diamond vessel smaller than a pea. All this just to generate the equivalent of about 10-20 minutes of energy that is used by a typical small home. Clearly, even the most inexpensive rooftop solar systems can already do far more. And Prof. Mark Jacobson's group at Stanford University has calculated that a total conversion to wind, water and solar power might use about as much land as is currently occupied by the world's fossil fuel infrastructure.
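How long 3 megajoules powers a home depends entirely on the electrical draw you assume, so here is that claim checked under a few assumptions: at the average U.S. household draw of roughly 1.2 kilowatts it lasts about 40 minutes, while the 10-20 minute figure corresponds to a home drawing 3-5 kilowatts. Either way, the answer is minutes:

```python
# How long does ~3 MJ of fusion output power a home, at various assumed draws?

FUSION_OUTPUT_JOULES = 3.0e6

for assumed_draw_kw in (1.2, 3.0, 5.0):   # average U.S. household vs. heavier draws
    minutes = FUSION_OUTPUT_JOULES / (assumed_draw_kw * 1000) / 60
    print(f"at {assumed_draw_kw:.1f} kW: ~{minutes:.0f} minutes")
```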
Long-time nuclear critic Karl Grossman wrote on Counterpunch recently of the many likely obstacles to scaling up fusion reactors, even in principle, including high radioactivity, rapid corrosion of equipment, excessive water demands for cooling, and the likely breakdown of components that would need to operate at unfathomably high temperatures and pressures. His main source on these issues is Dr. Daniel Jassby, who spent 25 years as a research physicist at Princeton's pioneering fusion lab. The Princeton lab, along with researchers in Europe, has led the development of a more common device for achieving nuclear fusion reactions, a doughnut-shaped or spherical vessel known as a tokamak. Tokamaks, which contain much larger volumes of highly ionized gas (actually a plasma, a fundamentally different state of matter), have sustained substantially more voluminous fusion reactions for several seconds at a time, but have never come close to producing more energy than is injected into the reactor.
The laser-mediated fusion reaction achieved at Livermore occurred at a lab called the National Ignition Facility, which touts its work on fusion for energy but is primarily dedicated to nuclear weapons research. Prof. M. V. Ramana of the University of British Columbia, whose recent article was posted on the newly revived ZNetwork, explains, "NIF was set up as part of the Science Based Stockpile Stewardship Program, which was the ransom paid to the US nuclear weapons laboratories for forgoing the right to test after the United States signed the Comprehensive Test Ban Treaty" in 1996. It is "a way to continue investment into modernizing nuclear weapons, albeit without explosive tests, and dressing it up as a means to produce 'clean' energy." Ramana cites a 1998 article that explained how one aim of laser fusion experiments is to try to develop a hydrogen bomb that doesn't require a conventional fission bomb to ignite it, potentially eliminating the need for highly enriched uranium or plutonium in nuclear weapons.
While some writers predict a future of nuclear fusion reactors running on seawater, the actual fuel for both tokamaks and laser fusion experiments consists of two isotopes of hydrogen known as deuterium—which has an extra neutron in its nucleus—and tritium—with two extra neutrons. Deuterium is stable and relatively common: roughly one out of every 5,000-6,000 hydrogen atoms in seawater is deuterium, and it sells for about $13 per gram. It is also a necessary ingredient (as a component of "heavy water") in heavy-water nuclear reactors. Tritium, however, is radioactive, with a half-life of 12.3 years, and is typically a costly byproduct ($30,000 per gram) of an unusual type of nuclear reactor known as CANDU, mainly found today in Canada and South Korea. With half the operating CANDU reactors scheduled for retirement this decade, available tritium supplies will likely peak before 2030, and a new experimental fusion facility under construction in France will nearly exhaust the available supply in the early 2050s. That is the conclusion of a highly revealing article that appeared in Science magazine last June, months before the latest fusion breakthrough. While the Princeton lab has made some progress toward recycling tritium, fusion researchers remain highly dependent on rapidly diminishing supplies. Alternative fuels for fusion reactors are also under development, based on helium-3 or boron, but these require temperatures up to a billion degrees to trigger a fusion reaction. The French facility plans to experiment with new ways of generating tritium, but these also significantly increase the radioactivity of the entire process, and a tritium gain of only 5 to 15 percent is anticipated; the more downtime between experimental runs, the less tritium it will produce. The Science article quotes Jassby, formerly of the Princeton fusion lab, saying that the tritium supply issue essentially "makes deuterium-tritium fusion reactors impossible."
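To see why a decaying inventory "peaks before 2030" once reactor production slows, here is a toy projection. The starting stockpile and production numbers below are placeholders chosen only to illustrate the shape of the curve, not figures from the Science article:

```python
# Toy model: tritium inventory under radioactive decay plus shrinking CANDU production.
# All starting values are illustrative placeholders, not real-world data.

import math

HALF_LIFE_YEARS = 12.3
decay_per_year = math.log(2) / HALF_LIFE_YEARS   # ~5.6% of the stock decays each year

inventory_kg = 25.0       # placeholder starting stockpile
production_kg = 1.5       # placeholder annual CANDU output
RETIREMENT_FACTOR = 0.95  # assume production shrinks 5%/year as reactors retire

for year in range(2023, 2051):
    inventory_kg += production_kg - decay_per_year * inventory_kg
    production_kg *= RETIREMENT_FACTOR
    if year % 9 == 0:   # print a few sample years
        print(year, f"{inventory_kg:.1f} kg")
```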
So why all this attention to the imagined potential of fusion energy? It is yet another attempt by those who believe that only a mega-scaled, technology-intensive approach can be a viable alternative to our current fossil fuel-dependent energy infrastructure. Some of the same interests continue to promote the false claims that a "new generation" of nuclear fission reactors will solve the persistent problems with nuclear power, or that massive-scale capture and burial of carbon dioxide from fossil-fueled power plants will make it possible to perpetuate the fossil-based economy far into the future. It is beyond the scope of this article to systematically address those claims, but it is clear that today's promises of a new generation of "advanced" reactors are not much different from what we were hearing back in the 1980s, '90s, or early 2000s.
Nuclear whistleblower Arnie Gundersen has systematically exposed the flaws in the "new" reactor design currently favored by Bill Gates, explaining that the underlying sodium-cooled technology is the same as in the reactor that "almost lost Detroit" due to a partial meltdown back in 1966, and has repeatedly caused problems in Tennessee, France, and Japan. France's nuclear energy infrastructure, long touted as a model for the future, is increasingly plagued by equipment problems, massive cost overruns, and some sources of cooling water no longer being cool enough, due to rising global temperatures. An attempt to export French nuclear technology to Finland came online more than a decade later than anticipated, at many times the original estimated cost. As for carbon capture, we know that countless, highly subsidized carbon-capture experiments have failed and that the vast majority of the CO2 currently captured from power plants is used for "enhanced oil recovery," i.e., increasing the output of existing oil wells. The pipelines that would be needed to actually collect CO2 and bury it underground would be comparable to the entire current infrastructure for piping oil and gas, and the notion of permanent burial will likely prove a pipe dream.
Meanwhile, we know that new solar and wind power facilities are already cheaper to build than new fossil-fueled power plants, and in some locations are even less costly than continuing to operate existing plants. Last May, California was briefly able to run its entire electricity grid on renewable energy, a milestone that had already been achieved in Denmark and South Australia. And we know that a variety of energy storage methods, combined with sophisticated load management and upgrades to transmission infrastructure, are already helping solve the problem of the intermittency of solar and wind energy in Europe, California, and other locations. At the same time, awareness is growing about the increasing reliance of renewable technology, including advanced batteries, on minerals extracted from Indigenous lands and the global South. Thus a meaningfully just energy transition needs both to be fully renewable and to reject the myths of perpetual growth that emerged from the fossil fuel era. If the end of the fossil fuel era portends the end of capitalist growth in all its forms, it is clear that all life on earth will ultimately be the beneficiary.
This post has been updated with the correct year that the United States signed the Comprehensive Nuclear Test Ban Treaty.