Can Decarbonizing the Electric Grid Help Avert Climate Catastrophe?

A Presidential Session Spotlight from the AMS 104th Annual Meeting

By Katie Pflaumer, AMS Staff

Significantly reducing greenhouse gas emissions requires transitioning primarily to carbon-free sources for energy generation, but many challenges stand in the way. What are these challenges, and how can the weather, water, and climate sector help meet them?

A Presidential Session at the 104th AMS Annual Meeting addressed those questions with panelists Debbie Lew (Executive Director at ESIG, the Energy Systems Integration Group), Alexander “Sandy” MacDonald (former AMS President and former director of the NOAA Earth Systems Research Laboratory), Aidan Tuohy (Director of Transmission Operations and Planning at EPRI, the Electric Power Research Institute), and Justin Sharp (then Owner and Principal of Sharply Focused, now Technical Leader in the Transmission and Operations Planning team at EPRI). Here are some key points that arose from the session, titled, “Transition to Carbon-Free Energy Generation,” introduced by NSF NCAR’s Jared Lee, and moderated by MESO, Inc.’s John Zack.

Key Points

  • Decarbonizing the electric grid is key to reducing U.S. greenhouse gas emissions.
  • Wind and solar are now the cheapest forms of energy generation; adoption is increasing, but not fast enough to catch up with the likely growth in demand. 
  • Energy demand is rapidly increasing, driven by the expansion of data centers, AI applications, crypto mining, and the electrification of transportation and heating. Hydrogen production might greatly increase future loads. 
  • “Massive buildouts” of both renewable energy plants and transmission infrastructure are required to reduce emissions. 
  • A reliable and affordable power system with large shares of wind and solar generation requires accurate historical weather information to inform infrastructure buildout, and accurate forecasts to support operations. 
  • To avoid expensive infrastructure that’s only used during peak times, electricity pricing must incentivize consumers to avoid excessive use during periods of high demand. This requires accurate weather forecasting. 
  • Connecting the three main national grids together into a “supergrid” could improve transmission and grid flexibility, significantly reducing emissions.

The need for carbon-free energy is urgent

Greenhouse gas emissions are still increasing sharply. As a result, global temperatures are rising faster than even the most pessimistic models predicted a few decades ago, noted Lee in his introductory remarks to the panel. The U.S. is the second-largest carbon emitter globally, despite having a much smaller population than the other top emitters, China and India.

If we don’t solve the greenhouse gas problem by mid-century, warned MacDonald, we will soon hit 700 ppm of carbon dioxide in the atmosphere. If that happens, “We’re back to the Miocene era,” he said, referencing an exceptionally hot period around 12.5 million years ago. “Northern Hemisphere land temperatures will be 11 degrees Fahrenheit warmer. Arctic temps will be 17°F warmer, which is probably going to launch a huge permafrost thaw … The ocean will be 80% more acidic. So we are in an urgent situation.”

What’s the path to a more sustainable future? Decarbonizing the grid.

The energy sector is one of the top sources of U.S. emissions—and reducing emissions there will have knock-on effects in buildings and transportation. Lee noted that wind and solar power have dropped dramatically in price, becoming the cheapest forms of energy generation available. This has led to an increase in adoption: renewables are now second only to natural gas in terms of electrical power generated in the United States. Yet natural gas generation is still growing fast, and still far exceeds that from renewables.

Therefore, Lew said in her talk, we need “massive buildouts of [wind, solar, and battery] resources … doubling or even tripling the amount of installed capacity. We’re going to be electrifying buildings, transportation, industry [and] massively building out transmission and distribution networks … And we’re going to be using fossil fuel generators for reliability needs.” Doing this could get us to 80-90% fossil-free energy production.

Bridging the gap

But what about that last 10–20%?

“We need some kind of cost-effective, clean, firm resource” to fill in the gaps and act as a bridge fuel—a resource that’s available 24/7 no matter the weather or season—said Lew. This resource might end up being hydrogen, advanced nuclear energy, or even green bioenergy with carbon capture and sequestration to offset emissions from natural gas. “We need all options on the table.”

Weather? Or not?

Transitioning to renewables without building in reliability and resilience will lead to blackouts and other power outages, Tuohy noted. These would have major economic consequences, reduce the political viability of renewables, and lead to unjust allocation of energy.

A resilient grid, he said, requires enough energy production to meet future demand; adequate transmission and delivery infrastructure to meet future needs and to balance supply with demand moment-to-moment every day; reliability despite constant shifts in energy production; and the ability to prevent a problem in one place from causing cascading outages across the system. 

Making a new, wind- and solar-dependent grid truly work means balancing—and forecasting—energy availability and demand across the nation, accounting for the current and predicted weather at each solar and wind energy site, as well as how climate change will affect resource availability. This means a massive meteorological infrastructure must be created.

Read our upcoming post from Justin Sharp to learn more about how weather and renewable energy must work together.

“[This is] an operational need, not a research project … There’s an imperative to have dedicated, accurate, and expertly curated weather information to support the energy transition.”

—Justin Sharp

Uncertainty

Demands on the grid are now subject to extreme variability, not just from weather and climate, Tuohy said. For example, demand projections from 2022 versus 2023 were radically different because of new energy-intensive data centers coming online.

“We’ve gone from a kind of deterministic system — [in which we] had good sense of, our peak demand’s going to happen in July—to a far more stochastic and variable type, both on the demand and the supply side,” said Tuohy. We have a lot of data and computational tools, but we must be able to bring those datasets together effectively so we can analyze and predict change. “We need to … develop tools that account for [uncertainty].”

Changing behavior

The infrastructure required for the necessary expansion of renewable energy generation will be expensive. Keeping the cost manageable means not wasting money to build extra infrastructure that’s only useful during times of peak demand. That means we need to avoid high peaks in energy use.

We know that people can be a lot more conscientious about energy consumption if they think it will save them money. Yet many consumers are currently sheltered from the financial consequences of overloading the grid. “There’s tremendous flexibility in load if you … expose consumers to better price signals,” Lew said.

Consumers could be financially incentivized, for example, to choose off-peak times to turn on a heater or charge an electric vehicle. Such programs should be carefully designed to minimize negative impacts on vulnerable consumers, but the fact remains that to keep those consumers safe, the climate crisis must be confronted.

Supergrid to the rescue?

The main problem with a renewable energy grid, the speakers acknowledged, is transmission—both connecting new generators and moving energy based on supply and demand. “You’ve got to be able to move wind and solar energy around at continental scales,” said MacDonald. A study by ESIG suggested that simply adding a 2-gigawatt transmission line connecting the Texas power grid with the Eastern U.S. power grid would effectively act like 4 GW of extra electricity generating capacity across the two regions, because their grids experience risk and stress at different, complementary times.
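
To see why a wire can act like a power plant, note that the two regions’ peak demands occur at different times. Here is a minimal sketch in Python, with made-up load curves (purely illustrative, not data from the ESIG study), showing how interconnection lowers the combined capacity requirement:

```python
import numpy as np

hours = np.arange(24)

# Hypothetical 24-hour load profiles in GW (illustrative only, not ESIG data):
# Region A peaks with afternoon air conditioning, Region B in the evening.
region_a = 60 + 15 * np.exp(-((hours - 16) ** 2) / 8)
region_b = 80 + 20 * np.exp(-((hours - 20) ** 2) / 8)

isolated = region_a.max() + region_b.max()    # each grid must cover its own peak
interconnected = (region_a + region_b).max()  # a shared line lets peaks offset

print(f"Capacity needed if isolated:       {isolated:.1f} GW")
print(f"Capacity needed if interconnected: {interconnected:.1f} GW")
print(f"Effective capacity gained:         {isolated - interconnected:.1f} GW")
```

Because the peaks don’t coincide, the interconnected system needs less total capacity than the two isolated grids; the difference behaves like free generation.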

A 2016 paper MacDonald and colleagues published in Nature Climate Change suggests that U.S. electricity-sector carbon emissions could be decreased by 80% — with current technology and without increased electricity costs — if the United States can implement a “supergrid.” That means connecting all three major electrical grids currently serving the continental United States. When it’s sunny in San Jose and snowing in Cincinnati, you could transmit solar-produced energy to keep Ohio homes warm, rather than generating extra power locally. 

It will take a lot of effort, but “if we [start implementing a supergrid] now, in a 40-year transition, we can preserve the environment we have,” MacDonald said. “If we wait until the 2040s, we are basically going to devastate the planet’s life for thousands of years.”

You can view all the AMS 104th Annual Meeting presentations online. Watch this Presidential Session.

Photo at top: Harry Cunningham on Pexels (@harry.digital)

BEST: Capturing the Worst Tornado Winds

Greenfield, Doppler on Wheels, and what happens where a twister meets the ground

By Katie Pflaumer, AMS Staff

Featured image: The Greenfield tornado, south of the town. Photo credit: Lauren Baca.

On 21 May, 2024, a powerful tornado hit the town of Greenfield, Iowa. A mobile team from the NSF BEST project was able to capture radar and instrument data, measuring one-second gusts among the highest ever recorded. Karen Kosiba, PhD, Principal Investigator (PI) of the BEST project, and Jen Walton, founder of AMS partner organization Girls Who Chase, were both part of the team who intercepted the Greenfield tornado. We spoke with them about what it was like, and what their valuable data might yield.

The tornado that hit Greenfield was fast, narrow, and violent, cutting a 44-mile path through southwestern Iowa. Moving into town from the southwest, it had already destroyed wind turbines and family farms, with multiple vortices visibly rotating around its center. 

But as it neared Greenfield, where it would kill five people, the tornado was obscured by a cloak of rain. Racing toward the town with her colleagues, Jen Walton told me, “We could see nothing but a wall of white ahead of us.” They were trying to put themselves right in the path of a hidden monster.

Karen Kosiba wouldn’t have seen it anyway, although she was less than a quarter of a mile from the vortex. “I [almost] never look out the window,” she told me. Her attention was glued to the radar screen. As Principal Investigator on the NSF-funded BEST (Boundary-layer Evolution and Structure of Tornadoes) project, her job was to track the path of the tornado on radar so their team could get close enough to obtain high-resolution dual-Doppler radar and weather instrument data of the tornadic winds closest to the surface of the earth. 

They had sped through Greenfield, and her mobile radar vehicle was now parked just to the east of town, hoping for a clear line of sight in the hilly, tree-covered terrain. “I’m operating the radar, we’re basically scanning through this [storm], tracing the path of the tornado, and it was getting more and more obvious it was going to go through Greenfield,” she said.

Dr. Karen Kosiba reading the radar screen in a DOW vehicle. Photo credit: Jen Walton/FARM Facility.

DOW(n) Low with Tornadoes

Obtaining high-resolution data from tornadoes is incredibly difficult using stationary instruments and radars — especially for near-surface conditions. The earth’s curvature and obstacles like trees and topography mean that far-away radars simply can’t get a good view of where a twister meets the ground. Beam spreading also gives far-away radars coarser spatial resolution. Josh Wurman invented the Doppler on Wheels (DOW) network of truck-mounted Doppler radars — now part of the University of Illinois’ Flexible Array of Radars and Mesonets (FARM) Facility — in the 1990s to address challenges like these. DOWs have been used all over the world to study everything from hurricanes to flooding and wildfires.
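
The resolution penalty of distance is simple geometry: a radar beam of fixed angular width spreads linearly with range. A back-of-envelope sketch (assuming a nominal 1-degree beamwidth for illustration, not an actual DOW specification):

```python
import math

def beam_width_m(range_km: float, beamwidth_deg: float = 1.0) -> float:
    """Approximate linear width of a radar beam at a given range
    (small-angle approximation: width = range * beamwidth in radians)."""
    return range_km * 1000.0 * math.radians(beamwidth_deg)

# A distant fixed radar versus a DOW parked a few kilometers from the tornado:
for r_km in (100, 50, 10, 3):
    print(f"At {r_km:3d} km, a 1-degree beam is ~{beam_width_m(r_km):6.0f} m wide")
```

At 100 km the beam is nearly 2 km across, far too coarse to resolve near-surface tornado structure; at 3 km it narrows to roughly 50 m.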

FARM missions currently involve some combination of their four DOWs, a variety of support vehicles equipped with mesonets, and quickly deployable weather stations (Pods), as well as weather balloon-borne instrumentation. The equipment has advanced greatly since the ’90s, Kosiba says. “We scan fast, with really short gates that get us fine-resolution … dual-pol data, which is important for understanding debris signatures and inferring microphysics.” 

The BEST project (which Kosiba co-leads with Wurman) deploys DOWs, Pods, and weather balloons to study boundary-layer tornado winds. “We’re looking at … near-surface wind profiles, and how those vary as a function of tornado structure,” said Kosiba. “We’re also looking at thermodynamics — the relative humidity and temperature, more or less buoyant air, where it originates from — and how that affects tornado intensity, structure, and longevity. Is [the tornado] intensifying, weakening, going on for a long or short time?” It’s the kind of assignment the DOWs were made for.

“Some, rare, observations show that tornado winds can exceed 300 mph, and that the most intense winds are very near the ground, where they are especially hard to measure. In order to mitigate the hazards posed by tornadoes, it is critical to better understand their basic structure and intensity.”

—Excerpt from NSF Boundary-layer Evolution and Structure of Tornadoes (BEST) project grant description

In Greenfield

As TV screens and tornado sirens blared warnings to the town of Greenfield, the BEST team frantically tried to find a place to deploy as the tornado bore down. 

“It was evolving too quickly,” Kosiba told me. One DOW raced to get about 10 miles out, while Kosiba’s DOW truck tried to get closer — and Jen Walton and colleagues went even closer to the tornado, attempting to drop a Pod. Pods are placed in the projected path of the tornado, with the hope that they will obtain surface wind observations from within the radius of maximum winds. Positioning the Pod was difficult with a storm moving at close to 45 mph. 

“As we drove back west toward Greenfield … it was absolutely pouring, making it difficult to make out any features of the tornado-producing storm entering town. But as we pulled up and began to deploy the Pod, the rain bands took on a left-to-right motion indicative of rotation,” said Walton. “That’s when we knew we were in the bear’s cage — chaser slang for the mesocyclone portion of a supercell where a tornado can typically be found, if there is one. As we took GPS coordinates and prepared to depart, debris began falling slantwise out of the rain. We knew it was time to go.”

As it turned out, the Pod team wasn’t the only group having a close encounter. Kosiba’s DOW vehicle ended up directly in the path of a weaker tornado that was forming as they collected data near Greenfield. “The storm was going through a cyclic thing, and there was a new tornado forming very near us. It got windy and rainy.” Although they noticed this in real time, there wasn’t much they could do except keep collecting data. Luckily, the new tornado didn’t strengthen until after it had passed their location.

As so often happens with this work, for Kosiba at least, there was no time even to be nervous. “Tornadoes are so fast, and you’re so focused on getting people in the right place, in a safe place, and getting the data, so there’s no time to think about anything other than that.” 

What was harrowing was driving into Greenfield once the tornado had passed. “There’s clearly a path of destruction … In that narrow region [where the tornado went through], it was pretty raked over. People were still coming out of their houses, animals were still trying to get oriented.”

Rare Data from a Disaster

The radar data from the BEST team is high-resolution enough that researchers will be able to examine how specific structures in Greenfield failed in the high winds. “Measuring low-level winds very close to a town is very rare … we can see in a very localized area what these structures experienced,” Kosiba said. These grim analyses could assist damage assessors after future storms, and perhaps even help those who build and maintain structures to make them safer.

“We’re in the preliminary stages of inventorying what we’ve got and what we can do,” said Kosiba. “But it’s a rich and unusual dataset.”

DOW8 vehicle in Greenfield after the tornado’s passage. Photo credit: Maiana Hanshaw/FARM Facility.

Strongest Winds Ever?

During the storm, the team was concerned only with acquiring good data. When they actually looked at the Greenfield readings, however, they were surprised to note winds of around 270 miles per hour, with gusts well above that. These one-second wind speeds are difficult to pinpoint exactly, said Kosiba, as the particles measured by radar — “debris, raindrops, grass, two-by-fours” — are all moving differently through the air and at different angles to the radar beam. “We’re trying to give a range, which puts this event at 309–318 mph.” The two strongest known tornadoes, El Reno in 2013 and Bridge Creek in 1999, both had DOW-measured wind speeds within that range.
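
One reason a range is quoted rather than a single number: a Doppler radar senses only the component of a scatterer’s motion along the beam, so debris moving at an angle to the beam reads low. A simple sketch of that geometry (the angles here are arbitrary examples):

```python
import math

def radial_velocity(true_speed_mph: float, angle_deg: float) -> float:
    """Along-beam (measured) component of a scatterer moving at the given
    angle to the radar beam: v_radial = v_true * cos(angle)."""
    return true_speed_mph * math.cos(math.radians(angle_deg))

for angle in (0, 15, 30):
    print(f"Debris moving {angle:2d} degrees off-beam: a 318-mph wind "
          f"reads ~{radial_velocity(318, angle):.0f} mph")
```

Combining many such imperfect scatterers is why the team reports a band of speeds rather than a single point value.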

Yet the Greenfield tornado was “only” deemed an EF4 by the National Weather Service (indicating three-second wind speeds up to 200 mph). This is likely because the EF scale is based on the structural damage a tornado leaves, not radar/instrument measurements. To receive the highest rating, EF5, a tornado has to damage structures to a degree that only an EF5 could. “It’s possible there was nothing [in its path] that could have sustained an EF5 level of damage,” said Kosiba. 
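
For context, the EF scale’s published wind ranges (estimated three-second gusts) map to ratings as sketched below; as Kosiba notes, though, actual EF ratings are assigned from surveyed damage, not directly from measured winds:

```python
# EF-scale wind ranges in mph (estimated 3-second gusts), per the NWS:
EF_SCALE = [
    (0, 65, 85),
    (1, 86, 110),
    (2, 111, 135),
    (3, 136, 165),
    (4, 166, 200),
    (5, 201, float("inf")),
]

def ef_for_gust(gust_mph: float):
    """Return the EF number whose wind range contains a given 3-second gust.
    Real ratings come from damage surveys; this is only the wind-range lookup."""
    for rating, low, high in EF_SCALE:
        if low <= gust_mph <= high:
            return rating
    return None  # below the EF0 threshold

print(ef_for_gust(190))  # 4: winds near 200 mph still fall in the EF4 range
```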

In addition, the highest wind gusts measured by the DOW team were for very short intervals, often less than one second, rather than longer-period averages. Due to the relative dearth of close-up measurements, we don’t know enough to say how unusual such high wind speeds near the surface really are.

Chasing the Data

“Twisters,” the long-anticipated sequel to the 1996 movie “Twister,” has hit movie screens, highlighting the awe of dangerous storms–and the divisions sometimes drawn between scientific researchers and those who chase storms because it’s their passion. As researchers and storm chasers who work together to get vital information about tornadoes, what do Kosiba and Walton think?

Jen Walton deploys a Pod of weather instruments in the path of the Greenfield tornado. Photo credit: BEST/FARM Facility.

“In my opinion, storm chasers are fonts of historical knowledge and expertise that are underutilized by the scientific community, and this is something I’m discussing with AMS and the broader research community,” said Walton. “We get a bad rap for being adrenaline junkies seeking our next thrill, and of course some folks are. But many people, myself included, would love to have more tangible ways to contribute in addition to already serving as eyes on the ground for the National Weather Service and/or working with local broadcast meteorologists. When Karen mentioned the opportunity to support the BEST Project, I jumped at the opportunity to use my own knowledge and expertise to contribute to work I know will truly make a difference in people’s lives – and even though my 2024 looked very different than a typical season, my time in the field with the DOWs is an experience I wouldn’t trade.”

“This kind of data collection is high risk but high payoff. You have to be out in the field to do it,” said Kosiba. “People who storm chase can make very valuable parts of the scientific team. Jen knows storm structure and forecasting … We want people who know what they’re looking at, who can think about exits; they need to be able to make some autonomous decisions out there. … If you just have a textbook understanding of storms, you have to get ramped up [on the practical side]. But people who’ve been looking at these storms for a long time and making decisions, that’s a great skill.”

To learn more about Girls Who Chase, listen to podcast interviews with experts like Dr. Kosiba, or even start your storm chasing education, check out girlswhochase.com.

To learn more about BEST and the DOWs, AMS Members and Weather Band members can watch our 23 July, 2024 webinar featuring Drs. Kosiba and Wurman and moderated by Jen Walton: Tornado on the Ground: DOW insights from 2024 tornadoes, including the Greenfield, IA EF4.

Asian American and Pacific Islander Heritage Month Spotlight: Dr. Tetsuya “Ted” Fujita

By AMS President Anjuli S. Bamzai

Blossoming cherry trees are stars of springtime in Washington, D.C., and the most popular place to see them is the Tidal Basin. Their bloom is one of the most joyful events of the year, awaited with much anticipation by tourists, meteorologists, local businesses, and the National Park Service.

Celebrating the friendship between the Japanese and American peoples, the Tidal Basin cherry trees were a gift from the Mayor of Tokyo to the United States in 1912. While the precise timing of peak bloom varies from year to year (April 4 on average, driven largely by winter and early-spring temperatures), it has been occurring earlier due to warming trends. Furthermore, a combination of rising sea level and sinking land has necessitated plans for a new seawall, which will require many existing trees to be removed. The government of Japan, however, has promised new trees to replace them.

This year’s beautiful blossoms strongly reminded me of the remarkable contributions of Japanese Americans — in particular Japanese American meteorologists. Our science would be especially bereft without the contributions of several scientists who, after receiving their advanced degrees at the University of Tokyo in the so-called “Syono school” of dynamic meteorology, immigrated to the U.S. from postwar Japan. Among them were Tetsuya Fujita, Akio Arakawa, Akira Kasahara, Kikuro Miyakoda, Takio Murakami, Katsuyuki Ooyama, Michio Yanai, and of course, Syukuro ‘Suki’ Manabe, one of the three recipients of the Nobel Prize in Physics in 2021.

In celebration of AAPI Heritage Month, in this post I showcase the contributions of the legendary Dr. Tetsuya Theodore ‘Ted’ Fujita. Nicknamed “Mr. Tornado,” he linked tornado damage with wind speed and, in 1971, developed the Fujita scale for rating tornado intensity based on ground and/or aerial damage surveys. He is also recognized as the discoverer of downbursts and microbursts, serious threats to aviation safety; his discoveries have made flying safer.

Fujita (left) with John McCarthy, Inaugural Director of NCAR-RAP/RAL, in 1982. After studying tornadoes for over two decades, Fujita had just seen his first one in person. Photo: Texas Tech, found in Fujita’s memoir, “Memoirs of an Effort to Unlock The Mystery of Severe Storms During the 50 Years, 1942–1992,” in the Texas Tech Southwest Collection/Special Collections Library.

But let’s take a step back. How did Fujita get interested in tornadoes in the first place? In part, his involvement was yet another legacy of the Manhattan Project: Fujita began his life’s work studying damage in Hiroshima and Nagasaki in the aftermath of the atomic bombs.

Fujita was working as an assistant professor of physics at Meiji College of Technology in Tobata, exactly halfway between the two cities. A couple of years earlier, in compliance with his dying father’s wishes, he had opted to pursue his studies in mechanical engineering in Tobata rather than Hiroshima. In the month following the bombings, Fujita and his team of students went on an observational mission to study the blast zones at both sites. At Nagasaki, Fujita studied the burn marks on various objects with the goal of estimating the position of the atomic bomb when it exploded. At ground zero, most trees, though scarred black by radiation, were still standing upright while buildings were in ruins. Seen from above, the damage looked like a giant starburst pattern.

After WWII ended, he joined the University of Chicago. In a stroke of genius, the Japanese American meteorologist studied the debris and damage of tornadoes before cleanup, drawing comparisons between severe weather and the nuclear shock waves he had examined some twenty-five years earlier at Hiroshima and Nagasaki. He led the development of the Fujita Scale to categorize tornado intensity, a modified version of which remains in use today.

Following the Super Outbreak of 3–4 April, 1974, which covered over 2,600 miles and produced nearly 150 tornadoes in an 18-hour period, Fujita carried out aerial and ground damage surveys covering over 10,000 miles. Through meticulous analysis of the observational data, he demonstrated the existence of smaller tornadoes — suction vortices — within the parent tornado. The aerial surveys also led to the discovery of microbursts.

Photo: Dr. Fujita as a professor of Geophysical Sciences at the University of Chicago, photo taken in April 1961. Special Collections Research Center, University of Chicago Library.

You can read more about his discovery of the downburst and its contributions to aviation safety (including his work as a principal investigator for the Northern Illinois Meteorological Research On Downbursts [NIMROD] project) here.

In 2000, two of his former students organized the “Symposium on the Mystery of Severe Storms: A Tribute to the work of T. Theodore Fujita,” held at the 80th AMS Annual Meeting. They were none other than Gregory S. Forbes of The Weather Channel and Roger M. Wakimoto of UCLA, both distinguished meteorologists in their own right. Roger was, of course, AMS President in 2017–2018. The photo below shows the three of them at a University of Chicago event in the early 1980s.

Dr. Roger Wakimoto (left), Dr. Ted Fujita (middle) and Dr. Gregory Forbes (right), taken in the early 1980s when all were at the University of Chicago. Photo Courtesy of Roger Wakimoto, honorary member of the AMS.

You can read the proceedings of the Symposium here to get a fuller sense of Fujita’s immense contributions to atmospheric science. In this short piece, I have barely scratched the surface.

You can also learn about Fujita through the PBS American Experience series, which profiles the events and people that have shaped America over the course of its history. Fujita is featured in the episode titled “Mr. Tornado.”

Featured image: Cherry blossoms surround the Tidal Basin in Washington, D.C. Photo: National Park Service, Kelsey Graczyk

Anjuli is grateful to Katherine ‘Katie’ Pflaumer for providing useful edits.

Be There: Estimating Wind Speeds of Tornadoes and Other Windstorms

By James LaDue, NOAA/NWS Warning Decision Training Division (symposium co-chair)

Did you know that the AMS is co-branding a standard with the American Society of Civil Engineers (ASCE), and that you can be involved as a member? For the past several years, the two organizations have partnered to develop a standard on wind speed estimation for tornadoes and other severe storms. To learn more about this standard and the methods it covers, the standards committee on Wind Speed Estimation is hosting a symposium this Thursday at the AMS 104th Annual Meeting, aptly named “Estimating Wind Speeds of Tornadoes and Other Windstorms.” At the symposium you will learn how you can be involved in the process.

Ever since the EF scale was implemented in 2007, damage surveyors have found room for improvement. They formed a grassroots stakeholder group in 2010 and published a paper in 2013 highlighting areas needing improvement. Then, after the Joplin, Missouri, tornado of 2011, an investigation led by NIST recommended that a committee be formed to improve the EF scale. But damage surveys aren’t the only way to estimate wind speeds. New methods for estimating winds in severe storms were maturing quickly: Doppler radar, tree-fall patterns left behind by tornadoes, probabilistic wind speed forensics, multispectral passive remote sensing, and in-situ observations. Many of these methods can also be applied to other types of windstorms.

The committee on Wind Speed Estimation, begun within the ASCE in 2015, is devoted to refining all of these methods into an ANSI standard (American National Standard). Composed of engineers, meteorologists, architects, forest ecologists, an arborist, and an emergency manager, we are now deep in the internal balloting phase on the standard’s individual chapters. While the ASCE provides the logistical support for our committee, the AMS joined the effort and the standard is co-branded by both organizations. The process by which a standard forms is one of the most rigorous vetting processes in the STEM fields, and it often takes a decade or more. We’ve been conducting internal ballots for several years, and this may last a couple more. Once the internal balloting phase is over, the standard goes to a public comment phase.

The Wind Speed symposium is designed to let you know how and why we have this standards process, how the methods in the standard are designed, and how you can be involved, especially once the public comment period commences. We begin with a panel discussion to give you a chance to engage with the committee, followed by more in-depth presentations on the methods. There are also oral and poster presentations on new science that could further advance the standard and its application. We hope to see you there!

Featured image: Photo of tornado with dust cloud near power lines in Matador, TX, taken 21 June 2023. Image credit: James LaDue.

The Estimating Wind Speeds of Tornadoes and Other Windstorms Symposium will be held Thursday, 1 February, 2024 at the AMS 104th Annual Meeting, in Baltimore and online. Learn more about the Symposium and view the program.

AMS 2024 Session Highlight: Transition to Carbon-Free Energy Generation

The AMS 2024 Presidential Panel Session “Transition to Carbon-Free Energy Generation” discusses crucial challenges in the energy enterprise’s transition to renewables, and the AMS community’s role in solving them. Working in the carbon-free energy sector on research and development including forecasting and resource assessment, grid integration, and weather and climate effects on generation and demand, the session’s organizers know what it’s like to be on the frontlines of climate solutions. We spoke with all four of them–NSF NCAR’s Jared A. Lee, John Zack of MESO, Inc., and Nick P. Bassill and Jeff Freedman of the University at Albany–about what to expect, and how the session ties into the 104th Annual Meeting’s key theme of “Living in a Changing Environment.” Join us for this session Thursday, 1 February at 10:45 a.m. Eastern!

What was the impetus for organizing this session?

Jared: With the theme of the 2024 AMS Annual Meeting being, “Living in a Changing Environment,” it is wonderfully appropriate to have a discussion about our in-progress transition to carbon-free energy generation, as a key component to dramatically reduce the pace of climate change. But instead of merely having this be yet another forum in which we lay out the critical need for the energy transition, we organized this session with these panelists (Debbie Lew, Justin Sharp, Alexander “Sandy” MacDonald, and Aidan Tuohy) to shine a light on some real issues, hurdles, and barriers that must be overcome before we can start adding carbon-free energy generation at the pace that would be needed to meet aggressive clean-energy goals that many governments have by 2040 or 2050. The more that the weather–water–climate community is aware of these complex issues, the more we as a community can collectively focus on developing practical, innovative, and achievable solutions to them, both in science/technology and in policy/regulations. 

Jeff: We are at an inflection point in the growth of renewable energy generation, with hundreds of billions of dollars committed to funding R&D efforts. Moving toward renewable energy generation goals requires an informed public, and policy makers provided with the necessary information and options.

Required fossil fuel and renewable energy production trajectories to meet renewable energy goals. Graphic by Jeff Freedman, using data from USEIA.

Since both energy generation and demand will increasingly be dominated by what the weather and climate are doing, it is important that we take advantage of the talent we have in our community of experts to support these efforts. We are only 16 years out from a popular target date (2040) for reaching 100% renewable energy generation. That’s not very far away. Communication and the exchange of ideas regarding problems and potential solutions are key to generating public confidence in our ability to reach these goals within these timelines, without disruption to the grid or economic impacts on people’s wallets.

What are some of the barriers to carbon-free energy that the AMS community is poised to help address?

Jeff and John: From a meteorological and climatological perspective, we have pretty high confidence in establishing what the renewable energy resource is in a given area. … We have, for the most part, developed very good forecasting tools for predicting generation out to at least the next day. But sub-seasonal (beyond a week) and seasonal forecasting for renewables remains problematic. We know that the existing transmission infrastructure needs to be upgraded, thousands of miles of new transmission need to be built, siting and commissioning timelines need to be shortened, and we need to coordinate the retirement of fossil fuel generation with its simultaneous replacement by renewables to ensure grid stability. This panel will discuss some of the potential solutions we have at hand, and the best pathway(s) forward.

On the other hand, meeting the various state and federal targets for 100% renewable energy generation also raises other unresolved issues, such as: how will we accelerate the necessary mining, manufacturing, construction, and operation by a factor of nearly five in order to achieve these power generation goals? Not to mention how all this is affected by financing, the current patchwork of … regulatory schemes, NIMBY issues, and a constantly changing landscape of policy initiatives (depending on how the political wind is blowing–sorry for the pun!). And of course, there is the question of the “unknown unknowns!”

What will AMS 104th attendees gain from the session?

Nick: Achieving the energy transition is fundamental for the health and success of all societies globally, and indeed, may be one of the defining topics of history books for this time. With that said, the transition to carbon-free energy will not be a straight line, and many factors are important for achieving success. This session should provide an understanding of the current status of our transition, and what obstacles and key questions need to be overcome and answered, respectively, to complete our transition.

Header photo: Wind turbines operating on an oil patch in a wind farm south of Lubbock, Texas. Photo credit: Jeff Freedman.

About the AMS 104th Annual Meeting

The American Meteorological Society’s Annual Meeting brings together thousands of weather, water, and climate scientists, professionals, and students from across the United States and the world. Taking place 28 January to 1 February, 2024, the AMS 104th Annual Meeting will explore the latest scientific and professional advances in areas from renewable energy to space weather, weather and climate extremes, environmental health, and more. In addition, cross-cutting interdisciplinary sessions will explore the theme of Living in a Changing Environment, especially the role of the weather, water, and climate enterprise in helping improve society’s response to climate and environmental change. The Annual Meeting will be held at the Baltimore Convention Center, with online/hybrid participation options. Learn more at annual.ametsoc.org

Flying the Fastest Skies

How fast can an airliner go? Monday night a Virgin Atlantic Boeing 787-9 reached 801 m.p.h. en route from Los Angeles to London. Matthew Cappucci of the Washington Post reported that the jet reached this amazing speed—a record for the Boeing 787-9 and probably the highest speed for a non-supersonic commercial flight—while cruising at 35,000 feet over central Pennsylvania.

Clearly the plane was hurled along by an intense jet streak; Cappucci showed a sounding at 250 mb—a level nearly as high as the plane—that night over Long Island: the jet stream was moving at 231 m.p.h. This is what pushed the aircraft’s ground speed more than 200 m.p.h. beyond its top airspeed. (The plane’s record speed was relative to the ground, not the swiftly moving air around it.) The Post article states that the sounding “sets the record for the fastest 250-millibar wind speed ever recorded over New York and, probably, the country.”
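
The arithmetic behind that parenthetical is worth making explicit: a plane’s ground speed is (roughly) its true airspeed plus the tailwind component along its route. A quick sketch using the article’s numbers:

```python
# Numbers reported in the article:
ground_speed_mph = 801   # record ground speed over central Pennsylvania
jet_stream_mph = 231     # 250-mb wind from the Long Island sounding

# Implied true airspeed if the tailwind were fully aligned with the route:
true_airspeed_mph = ground_speed_mph - jet_stream_mph
print(f"Implied true airspeed: ~{true_airspeed_mph} mph")
# ~570 mph, a typical 787 cruise speed: the record is a ground-speed record,
# not a measure of the airplane moving faster through the surrounding air.
```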
This raises the other question of speed: just how fast can a jet stream go? It turns out the question is not so easy to answer. To find out, we e-mailed an experienced weather records sleuth, Arizona State University’s Randy Cerveny, who is the World Meteorological Organization’s rapporteur of weather and climate extremes. Cerveny replied,

I had set up a WMO committee this past summer to look into that very question—the strongest tropospheric winds (and so the strongest winds recorded on the planet). As we started to look at the data, we found that by far the strongest tropospheric winds are found east of Japan in the Pacific and normally occur right at this time of the year. They are associated with the normal area when polar and subtropical jets merge. The second area of max tropospheric winds are over New Hampshire and has the same thing happen—polar and subtropical jets merge. BUT unfortunately we ran into serious problems with the quality of extreme tropospheric wind measurements. My experts say that right now the quality of the data for those upper air extreme winds is not good enough to support an investigation for global fastest tropospheric winds. So we are not investigating that record until (and if) NCEI and other groups can establish a viable record for an extreme. We have seen data (again, not good to accept) that has winds in excess of 133 m/s or 297 miles per hour. It is likely that some of those values ARE good but we are still quality-controlling the radiosonde extreme dataset.

With that in mind, we dug into the AMS journals archive and found a February 1955 Journal of Meteorology article by Herbert Riehl, F. A. Berry, and H. Maynard detailing research flights into the jet stream over the Mid-Atlantic states. They record one case of a 240-knot jet stream (276 m.p.h.) and another of 210 knots (241 m.p.h.), each representing averages over 28 miles of flight path.

These can’t be counted as definitive—Riehl et al. emphasized the difficulties of their measurement process. And Cerveny emphasizes that, “No measurement that we have seen at extreme values has been judged of sufficient quality to warrant a full evaluation at this time.”

So for now, just sit back and enjoy the flight.