U.S. Energy Secretary Jennifer Granholm announces a major scientific breakthrough on fusion energy from researchers at the National Nuclear Security Administration's Lawrence Livermore National Laboratory, in Washington, DC, on December 13, 2022.

(Photo by Olivier Douliery/AFP via Getty Images)

Nuclear Fusion Could Save Us! But There's a Catch: It Won't

The promising energy source is not a silver bullet for the climate crisis and has dangers all its own.

I awoke on December 13th to news about what could be the most significant scientific breakthrough since the Food and Drug Administration authorized the first Covid vaccine for emergency use two years ago. This time, however, the achievement had nothing to do with that ongoing public health crisis. Instead, as the New York Times and CNN alerted me that morning, at stake was a new technology that could potentially solve the worst dilemma humanity faces: climate change and the desperate overheating of our planet. Net-energy-gain fusion, a long-sought-after panacea for all that’s wrong with traditional nuclear-fission energy (read: accidents, radioactive waste), had finally been achieved at the Lawrence Livermore National Laboratory in California.

“This is such a wonderful example of a possibility realized, a scientific milestone achieved, and a road ahead to the possibilities for clean energy,” exclaimed White House science adviser Arati Prabhakar.

The New York Times was quick to follow Prabhakar’s lead, boasting that fusion is an “energy source devoid of the pollution and greenhouse gasses caused by the burning of fossil fuels.” Even Fox News, not exactly at the top of anyone’s list of places focused on climate change, jumped on the bandwagon, declaring fusion “a technology that has the potential to accelerate the planet’s shift away from fossil fuels and produce nearly limitless, carbon-free energy.”

All in all, the reviews for fusion were positively glowing and it seemed to make instant sense. After all, what could possibly be wrong with something that might end our reliance on fossil fuels, even as it reduced the risks posed by our aging nuclear industry? The message, repeated again and again in the days that followed: this was a genuine global-warming game-changer.

After all, in the fusion process, no atoms have to be split to create heat. Gigantic lasers are used, not uranium, so there’s no toxic mining involved, nor do thousands of gallons of cold water have to be pumped in to cool overheated reactors, nor will there be radioactive waste byproducts lasting hundreds of thousands of years. And not a risk of a nuclear meltdown in sight! Fusion, so the cheery news went, is safe, effective, and efficient!

Or is it?

The Big Catch

On a very basic level, fusion is the stuff of stars. Within our sun, hydrogen nuclei fuse together to form helium, releasing heat in the form of sunlight. Inside the walls of the Livermore Lab, this natural process was imitated by firing 192 gigantic laser beams at a tube the size of a baby’s toe. Inside that cylinder sat a diamond capsule encasing hydrogen fuel. When the laser light shot through the small hole, it vaporized that diamond quicker than the blink of an eye. In doing so, it created a flood of invisible x-rays that compressed a small pellet of deuterium and tritium, the heavy isotopes of hydrogen.

“In a brief moment lasting less than 100 trillionths of a second, 2.05 megajoules of energy — roughly the equivalent of a pound of TNT — bombarded the hydrogen pellet,” explained New York Times reporter Kenneth Chang. “Out flowed a flood of neutron particles — the product of fusion — which carried about 3 megajoules of energy, a factor of 1.5 in energy gain.”

As with so many breakthroughs, there was a catch. First, 3 megajoules isn’t much energy. It would keep a single 100-watt light bulb lit for just over eight hours, so Livermore’s fusion development isn’t going to electrify a single home, let alone a million homes, anytime soon. And there was another nagging issue with this little fusion creation as well: it took about 300 megajoules of electricity to power up those 192 lasers. Simply put, at the moment, they require 100 times more energy to charge than the fusion reaction ended up producing.
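The arithmetic behind that catch is easy to verify. Here is a minimal back-of-the-envelope sketch in Python, using the figures reported above (the roughly 300-megajoule draw to charge the lasers is a rough published estimate, not a precise measurement):

```python
# Energy figures reported for the December 2022 National Ignition Facility shot.
energy_to_target_mj = 2.05   # laser energy delivered to the fuel pellet
energy_released_mj = 3.0     # fusion energy that came out
energy_from_grid_mj = 300.0  # rough electricity drawn to charge the 192 lasers

# "Scientific gain": energy out versus laser energy delivered to the target.
target_gain = energy_released_mj / energy_to_target_mj
print(f"target gain: {target_gain:.2f}")  # 1.46 -- the reported "factor of 1.5"

# "Wall-plug gain": energy out versus electricity actually consumed.
wall_plug_gain = energy_released_mj / energy_from_grid_mj
print(f"wall-plug gain: {wall_plug_gain:.2f}")  # 0.01 -- 100x more in than out

# For scale: how long would 3 megajoules run a single 100-watt light bulb?
seconds = energy_released_mj * 1e6 / 100  # 100 W = 100 joules per second
print(f"hours of light from one bulb: {seconds / 3600:.1f}")  # 8.3 hours
```

The gap between the two gain figures is the whole story: the headline "net gain" counts only the laser light that hit the target, not the electricity that fed the lasers.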

“The reality is that fusion energy will not be viable at scale anytime within the next decade, a time frame over which carbon emissions must be reduced by 50% to avoid catastrophic warming of more than 1.5°C,” says climate expert Michael Mann, a professor of earth and environmental science at the University of Pennsylvania. “That task will only be achievable through the scaling up of existing clean energy — renewable sources such as wind and solar — along with energy storage capability and efficiency and conservation measures.”

Tritium Trials and Tribulations

The secretive and heavily secured National Ignition Facility where that test took place is the size of a sprawling sports arena. It could, in fact, hold three football fields. Which makes me wonder: how much space would be needed to do fusion on a commercial scale? No good answer is yet available. Then there’s the trouble with tritium, the hydrogen isotope needed to help along the fusion reaction. It’s not easy to come by and costs about as much as diamonds, around $30,000 per gram. Right now, even some of the bigwigs at the Department of Defense are worried that we’re running out of usable tritium.

“Fusion advocates often boast that the fuel for their reactors will be cheap and plentiful. That is certainly true for deuterium,” writes Daniel Clery in Science. “Roughly one in every 5,000 hydrogen atoms in the oceans is deuterium, and it sells for about $13 per gram. But tritium, with a half-life of 12.3 years, exists naturally only in trace amounts in the upper atmosphere, the product of cosmic ray bombardment.”

Fusion boosters brush this unwelcome fact aside, pointing out that “tritium breeding” — a process in which a reactor continuously produces its own tritium supply — is entirely possible in a fully operating fusion reactor. In theory, this may seem plausible, but you need a good deal of tritium to jumpstart the process in the first place, and doubt abounds that there’s enough of it out there to begin with. On top of that, the reactors themselves will have to be lined with a lot of lithium, itself an expensive chemical element at about $71 a kilogram (copper, by contrast, is around $9.44 a kilogram), to allow the process to work correctly.

Then there’s also a commonly repeated misstatement that fusion doesn’t create significant radioactive waste, a haunting reality for the world’s current fleet of nuclear plants. True, plutonium, which can be used as fuel in atomic weapons, isn’t a natural byproduct of fusion, but tritium is a radioactive form of hydrogen, and its tiny atoms are adept at permeating metals and finding ways to escape tight enclosures. Obviously, this will pose a significant problem for those who want to continuously breed tritium in a fusion reactor. It also presents a concern for people worried about radioactivity making its way out of such facilities and into the environment.

“Cancer is the main risk from humans ingesting tritium. When tritium decays it spits out a low-energy electron (roughly 18,000 electron volts) that escapes and slams into DNA, a ribosome, or some other biologically important molecule,” David Biello explains in Scientific American. “And, unlike other radionuclides, tritium is usually part of water, so it ends up in all parts of the body and therefore can, in theory, promote any kind of cancer. But that also helps reduce the risk: any tritiated water is typically excreted in less than a month.”

If that sounds problematic, that’s because it is. This country’s above-ground atomic bomb testing in the 1950s and 1960s was responsible for most of the man-made tritium that’s lingering in the environment. And it will be at least 2046, 84 years after the last American atmospheric nuclear detonation in Nevada, before tritium there will no longer pose a problem for the area.
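That 2046 date follows directly from tritium’s 12.3-year half-life: 84 years is nearly seven half-lives, by which point less than 1% of the original tritium remains. A quick check in Python, assuming simple exponential decay:

```python
half_life_years = 12.3       # half-life of tritium
years_elapsed = 2046 - 1962  # last U.S. atmospheric test (1962) to 2046

# After n half-lives, a fraction (1/2)^n of the original tritium remains.
half_lives = years_elapsed / half_life_years
fraction_remaining = 0.5 ** half_lives

print(f"half-lives elapsed: {half_lives:.1f}")          # 6.8
print(f"fraction remaining: {fraction_remaining:.1%}")  # 0.9%
```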

Of course, tritium also escapes from our existing nuclear reactors and is routinely found near such facilities where it occurs “naturally” during the fission process. In fact, after Illinois farmers discovered their wells had been contaminated by the nearby Braidwood nuclear plant, they successfully sued the site’s operator Exelon, which, in 2005, was caught discharging 6.2 million gallons of tritium-laden water into the soil.

In the United States, the Nuclear Regulatory Commission (NRC) allows the industry to monitor for tritium releases at nuclear sites; the industry is politely asked to alert the NRC in a “timely manner” if tritium is either intentionally or accidentally released. But a June 2011 report issued by the Government Accountability Office cast doubt on the NRC’s archaic system for assessing tritium discharges, suggesting that it’s anything but effective. (“Absent such an assessment, we continue to believe that NRC has no assurance that the Groundwater Protection Initiative will lead to prompt detection of underground piping system leaks as nuclear power plants age.”)

Consider all of this a way of saying that, if the NRC isn’t doing an adequate job of monitoring tritium leaks already occurring with regularity at the country’s nuclear plants, how the heck will it do a better job of tracking the stuff at fusion plants in the future? And as I suggest in my new book, Atomic Days: The Untold Story of the Most Toxic Place in America, the NRC is plain awful at just about everything it does.

Instruments of Death

All of that got me wondering: if tritium, vital for the fusion process, is radioactive, and if those lasers won’t be producing usable power in time to put the brakes on climate change, what’s really going on here?

Maybe some clues lie (as is so often the case) in history. The initial idea for a fusion reaction was proposed by English physicist Arthur Eddington in 1920. More than 30 years later, on November 1, 1952, the first full-scale U.S. test of a thermonuclear device — the “Mike” shot of Operation Ivy — took place in the Marshall Islands in the Pacific Ocean. It yielded a mushroom-cloud explosion from a fusion reaction equivalent in its power to 10.4 megatons of TNT. That was roughly 450 times more powerful than the atomic bomb the U.S. had dropped on the Japanese city of Nagasaki only seven years earlier, at the end of World War II. It created an underwater crater 6,240 feet wide and 164 feet deep.

“The Shot, as witnessed aboard the various vessels at sea, is not easily described,” noted a military report on that nuclear experiment. “Accompanied by a brilliant light, the heat wave was felt immediately at distances of thirty to thirty-five miles. The tremendous fireball, appearing on the horizon like the sun when half-risen, quickly expanded after a momentary hover time.”

Nicknamed “Ivy Mike,” the bomb was a Teller-Ulam thermonuclear device, named after its creators Edward Teller and Stanislaw Ulam. It was also the United States’ first full-scale hydrogen bomb, an altogether different beast than the two awful nukes dropped on Japan in August 1945. Those bombs utilized fission in their cores to create massive explosions. But Ivy Mike gave a little insight into what was still possible for future weapons of annihilation.

The details of how the Teller-Ulam device works are still classified, but historian of science Alex Wellerstein explained the concept well in the New Yorker:

“The basic idea is, as far as we know, as follows. Take a fission weapon — call it the primary. Take a capsule of fusionable material, cover it with depleted uranium, and call it the secondary. Take both the primary and the secondary and put them inside a radiation case — a box made of very heavy materials. When the primary detonates, radiation flows out of it, filling the case with X rays. This process, which is known as radiation implosion, will, through one mechanism or another… compress the secondary to very high densities, inaugurating fusion reactions on a large scale. These fusion reactions will, in turn, let off neutrons of such a high energy that they can make the normally inert depleted uranium of the secondary’s casing undergo fission.”

Got it? Ivy Mike was, in fact, a fission explosion that initiated a fusion reaction. But ultimately, the science of how those instruments of death work isn’t all that important. The takeaway here is that, since first tried out in that monstrous Marshall Islands explosion, fusion has been intended as a tool of war. And sadly, so it remains, despite all the publicity about its possible use some distant day in relation to climate change. In truth, any fusion breakthroughs are potentially of critical importance not as a remedy for our warming climate but for a future apocalyptic world of war. Despite all the fantastic media publicity, that’s how the U.S. government has always seen it and that’s why the latest fusion test to create “energy” was executed in the utmost secrecy at the Lawrence Livermore National Laboratory. One thing should be taken for granted: the American government is interested not in using fusion technology to power the energy grid, but in using it to further strengthen this country’s already massive arsenal of atomic weapons.

Consider it an irony, under the circumstances, but in its announcement about the success at Livermore — though this obviously wasn’t what made the headlines — the Department of Energy didn’t skirt around the issue of gains for future atomic weaponry. Jill Hruby, the department’s undersecretary for nuclear security, admitted that, in achieving a fusion ignition, researchers had “opened a new chapter in NNSA’s science-based Stockpile Stewardship Program.” (NNSA stands for the National Nuclear Security Administration.) That “chapter” Hruby was bragging about has a lot more to do with “modernizing” the country’s nuclear weapons capabilities than with using laser fusion to end our reliance on fossil fuels.

“Had we not pursued the hydrogen bomb,” Edward Teller once said, “there is a very real threat that we would now all be speaking Russian. I have no regrets.” Some attitudes die hard.

Buried deep in the Lawrence Livermore National Laboratory’s website, the government comes clean about what these fusion experiments at the $3.5 billion National Ignition Facility (NIF) are really all about:

“NIF’s high energy density and inertial confinement fusion experiments, coupled with the increasingly sophisticated simulations available from some of the world’s most powerful supercomputers, increase our understanding of weapon physics, including the properties and survivability of weapons-relevant materials… The high rigor and multidisciplinary nature of NIF experiments play a key role in attracting, training, testing, and retaining new generations of skilled stockpile stewards who will continue the mission to protect America into the future.”

Yes, despite all the media attention to climate change, this is a rare yet intentional admission, surely meant to frighten officials in China and Russia. It leaves little doubt about what this fusion breakthrough means. It’s not about creating future clean energy and never has been. It’s about “protecting” the world’s greatest capitalist superpower. Competitors beware.

Sadly, fusion won’t save the Arctic from melting, but if we don’t put a stop to it, that breakthrough technology could someday melt us all.

© 2023 TomDispatch.com