No Do-Overs for Plainfield, Please

If ever there was a day meteorologists might like to do over, it was exactly 20 years ago today, on 28 August 1990. Somehow, on an afternoon originally projected to carry no more than a moderate risk of severe weather, an F-5 tornado—the only tornado of that strength recorded in August in U.S. history—struck northern Illinois, killing 29 people in the small town of Plainfield, just 30 miles southwest of Chicago.
The sky turned black, and few people knew what happened as the rain-wrapped tornado ripped through the landscape. Almost no one saw the funnel. (Paul Sirvatka, just then starting up his College of DuPage storm chasing program, was a rare exception.) Even though another tornado was spotted earlier in the afternoon in northern Illinois, no sirens wailed in Plainfield until too late. No tornado warnings were issued until too late.
Tom Skilling (WGN-TV) broadcast a report this week for the 20th anniversary of the tragic tornado, explaining why warnings would likely be much better should similar weather now threaten the Chicago area.

The gist of the story: back in 1990, Chicagoland didn’t have NEXRAD Doppler radar or the other advances in observing and modeling that have arrived since. The aftermath also led to a reorganization of the overworked Weather Service meteorologists in Illinois, narrowing the purview of the Chicago office and adding more offices to help cover the state.
While most stories in the media (for example, also here) have been showing why 20 years have made a repeat of Plainfield’s helplessness less likely, Gino Izzo of the NWS Chicago office decided to have a do-over anyway–on the computer. At his presentation for the AMS Conference on Broadcast Meteorology in June, Izzo described how he reran the severe weather forecasts for 28 August 1990 using the North American Regional Reanalysis and the most up-to-date model of 2010, the Weather Research and Forecasting (WRF) model from UCAR.
With a nested 4 km grid at its most detailed, Izzo ran the WRF overnight–it took 10 hours on the available computer in the office–and found that, given the observational limits of 1990, even the latest, greatest numerical forecasting doesn’t really improve the severe weather outlooks for the Plainfield disaster. The WRF moved the likely areas of severe weather (not tornadoes necessarily, but probably winds associated with a bow echo) too far eastward. Only when the model’s time horizon got within a few hours of the killer tornado did the severe weather outlook for northern Illinois start to look moderate, as the model began to slow the eastward progress of the cold front.
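For readers curious about the mechanics, a nested WRF run like Izzo’s is configured through the model’s namelist.input file. The fragment below is purely illustrative–the domain counts, grid spacings, and ratios here are assumptions, not Izzo’s actual settings–but it shows how a telescoping nest reaching 4 km grid spacing is declared:

```fortran
&domains
 max_dom                = 3,                   ! outer domain plus two nests
 dx                     = 36000, 12000, 4000, ! grid spacing (m) per domain
 dy                     = 36000, 12000, 4000,
 parent_grid_ratio      = 1, 3, 3,            ! each nest refines its parent 3:1
 parent_time_step_ratio = 1, 3, 3,
 time_step              = 180,                ! seconds, set by the coarse grid
/
```

Each 3:1 refinement roughly multiplies the work per unit area by 27 (three times the points in x, three in y, and three times the time steps), which helps explain why a 4 km nest could keep an office machine busy for 10 hours overnight.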
Audio and slides from Izzo’s striking presentation are available on the AMS meeting archive. The message is pretty clear: no matter how good the models get, Doppler radar, wind profilers, aircraft-based soundings, and satellites make a huge difference in our severe weather safety these days.
Of course, with or without better warnings, a repeat of the Plainfield disaster would be potentially catastrophic. The area has more than doubled its population since 1990. And 28 August just happened to be one day before school resumed for the fall—few people were at the high school that was totally destroyed that day. Even just a day’s difference, let alone two decades, could have been critical. Nobody wants a do-over.

The Aerographer's Advice

Ray Boylan, former chair of the AMS Broadcast Board, who died yesterday at age 76, was a Navy enlisted man who found his way into meteorology by a fluke. Maybe that’s why he never lost a homespun attitude toward celebrity and science that we ought to remember.

Ray Boylan at his first station after retiring from Navy hurricane hunting.

After training at airman’s prep in Norman, Oklahoma, in 1953, Boylan was casting about for his next assignment when he noticed that the Aerographer’s Mate school was in Lakehurst, New Jersey—near home and, best of all, near his girlfriend.
So it was off to New Jersey for a career in meteorology. The Navy service was his only formal training in weather, and included some 2,000 hours as a hurricane hunter flight meteorologist. First assignment—flying straight into Camille in 1969:

Back in those days the Navy had the low level mission and the Air Force had the high level mission. Whatever the lowest cloud level was, we went in below those so that I could see the sea surface to keep the wind just forward of the left wing. That’s how we navigated in. I remember a fellow at a Rotary meeting who asked,
‘How many times do you hit downdrafts?’
‘Just once.’

Knowing that viewers experience rain as a simple yes-or-no proposition, Boylan resisted using probability-of-precipitation percentages (PoPs) on the air, according to today’s Charlotte Observer obituary:

“I’d rather say, ‘It’s going to be scattered like fleas on a bulldog’s back – and if you’re close, you’ll get bitten.’ Or, ‘like freckles scattered across a pretty girl’s face.’”

Lamenting the hype of local TV news these days, Boylan told the WeatherBrains on their 19 February 2008 podcast,

One of the things I see now is that every weather system that approaches a TV market is a storm system. Not every weather system is a storm system, but that vernacular is there.

Sometimes the medium gets in the glow of the medium’s eye. It’s kind of a narcissistic thing. The media looks at itself as absolutely invaluable. And it can be invaluable, but not if the media thinks so.

The work of the on-air forecaster is not to impress, but lies in

Trying to get the forecast as right as you possibly can. Building the trust of the audience so that they’ll forgive even when you are wrong. And there’s no one out there in our business who hasn’t been wrong, and won’t be again, including myself. … If you can build that confidence and trust base, they’ll forgive you some of the small ones if you get the big one.

Speaking of building trust, Lakehurst turned out pretty well for Boylan. Fifty-five years later he would say, “The science and the girl are still with me.”
(Click here to download the audio of the full 20-minute WeatherBrains interview with Boylan.)

Climatology: Inverting the Infrastructure

Atmospheric science may not seem like a particularly subversive job, but from an information science perspective, it involves continually dismantling the very infrastructure it requires to survive. At least that’s the way Paul Edwards, Associate Professor of Information at the University of Michigan, described climatology (and one sister science) in an interesting hour-long interview on the radio show “Against the Grain” last week. (Full audio is also available for download.)
In the interview Edwards describes how the weather observing and forecasting infrastructure works (skip to about the 29 minute mark if that’s familiar), then notes that climatology is the art of undoing all that:

To know anything about the climate of the world as a whole we have to look back at all those old [weather] records. …But then you need to know about how reliable those are. [Climate scientists] unpack all those old records and study them, scrutinize them and find out how they were made and what might be wrong with them–how they compare with each other, how they need to be adjusted, and all kinds of other things–in order to try to get a more precise and definitive record of the history of weather since records have been kept. That’s what I call infrastructural inversion. They take the weather infrastructure and they flip it on its head. They look at its guts.
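
A toy example may make the flavor of this “inversion” concrete. Suppose an old station record contains a documented break–say, a station move. One classic style of adjustment (sketched here in Python on invented numbers, not any real dataset or operational method) differences the record against a stable neighbor and removes the artificial offset:

```python
# Toy illustration on invented numbers -- not a real dataset or an
# operational homogenization method. A station record has a documented
# break (say, a station move after month 3); we estimate the artificial
# offset by differencing against a stable reference neighbor.
station   = [10.1, 10.3, 10.2, 11.9, 12.1, 12.0]  # jump after index 2
reference = [10.0, 10.2, 10.1, 10.3, 10.5, 10.4]  # neighbor, no break

diff = [s - r for s, r in zip(station, reference)]
break_idx = 3  # known from station metadata -- the "guts" of the record

# The shift in the mean station-minus-reference difference estimates
# the artificial offset introduced by the move.
before = sum(diff[:break_idx]) / break_idx
after  = sum(diff[break_idx:]) / (len(diff) - break_idx)
offset = after - before  # about 1.5 degrees in this invented series

# Remove the offset from the post-break segment to homogenize the series.
adjusted = station[:break_idx] + [s - offset for s in station[break_idx:]]
```

Real homogenization must also detect undocumented breaks and weigh many neighbors, which is exactly the layered modeling Edwards has in mind.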

In his book, A Vast Machine: Computer Models, Climate Data, and the Politics of Global Warming, Edwards points out that people don’t realize how much of this unpacking–and with it multiple layers of numerical modeling–is necessary to turn observations into usable, consistent data for analysis and (ultimately) numerical weather and climate predictions. The relationship between data and models is complicated:

In all data there are modeled aspects, and in all models, there are lots of data. Now that sounds like it might be something specific to [climate] science, but …in any study of anything large in scope, you’ll find the same thing.

In part because of this “complicated relationship” between observations and models, there’s a lot of misunderstanding about what scientists mean when they talk about “uncertainty” in the technical sense rather than in the colloquial sense of “not knowing.” Says Edwards,

We will always keep on finding out more about how people learned about the weather in the past and will always find ways to fix it a little bit. It doesn’t mean [the climate record] will change randomly or in some really unexpected way. That’s very unlikely at this point. It means that it will bounce around within a range…and that range gets narrower and narrower. Our knowledge is getting better. It’s just that we’ll never fix on a single exact set of numbers that describes the history of weather.

Climatology is not alone in this perpetual unpacking of infrastructure. Economists seem to know all about what’s going on today with their indexes, gross domestic products, inflation rates, and money supply numbers. That’s like meteorology. But to put together an accurate history of the economy, they have to do a huge amount of modeling and historical research to piece together incongruous sources from different countries.

There is a thing called national income accounting that has been standardized by the United Nations. It wasn’t really applied very universally until after the Cold War. … Just to find out the GDP of nations you have to compare apples and oranges and find out what the differences are.

And to go back as recently as the 1930s?

You would have to do the same things the climate scientists have to do…invert the infrastructure.

Beware the Wrath of a Presidential Storm

The Philippine Atmospheric, Geophysical and Astronomical Services Administration (PAGASA) may indeed have underestimated the danger that Typhoon Conson posed to Manila in July. But it seems even more likely that PAGASA director Prisco Nilo underestimated the political storm that ensued.
At an emergency disaster coordination meeting after the storm (known locally as Typhoon Basyang), President Benigno Aquino scolded Nilo because PAGASA had led Manilans to believe the capital would be spared the brunt of the rain and winds:

That [storm] information is sorely lacking and we have had this problem for quite a long time. … You do what you are supposed to do… this is not acceptable. I hope this is the last time that we are all brought to areas different from where we should be.

“He really was not angry,” Nilo commented at a press conference. “It was just a comment made by a President, he wanted things to improve, that was his point.”  Yet it seems the president was indeed angry; angry enough to fire Nilo a few weeks later.
It was only the second week of Aquino’s term when Basyang hit metropolitan Manila on 13 July, initially as a weak Category 1 tropical cyclone. Heavy rains and flooding led to at least 100 deaths (at least 70 people were initially reported as missing). The 95 kph

Typhoon Conson approaching the Philippines in July. Portents of political trouble.

winds also caused power and communications outages that paralyzed the city for days. PAGASA’s last advisory that night, at 11:00 p.m., said that the typhoon had weakened and was headed farther north of Manila. Yet around midnight the eye of the storm passed through the area.
Nilo’s explanation to President Aquino was that the bureau’s equipment limited the frequency of storm updates and that the system needed upgrading:

We update the bulletin every six hours to take into account possible changes that were not earlier indicated by the mathematical models we are using as guidance in coming up with our forecast.

According to the Philippine news service GMA News, others have spoken up about similar constraints on PAGASA:

PAGASA officials have repeatedly said lack of modern equipment is hampering them from doing their jobs more effectively.

President Aquino responded that the bureau should have consulted

Read more

They Still Make Them Like They Used To

This summer, the Catlin Arctic Survey team became the first explorers ever to take ocean water samples at the North Pole. The three-person team covered 500 miles over 2 1/2 months in their expedition across sea ice off the coast of Greenland. Along the way they met numerous obstacles: a persistent southerly drift that regularly pushed them backward, strong headwinds, ice cracks opening under their tent, dangerously thin ice, and areas of open water they had to swim across.

The Catlin camp

They persisted through it all, measuring ice thickness, drilling ice cores, and collecting water samples (see the video below) and plankton data. They hope their research will provide insight into the effects of carbon dioxide on local marine life and Arctic Ocean acidification.
The hardiness of the Catlin team reminds us of the rich history of polar exploration in the name of meteorology. Historian Roger Turner of the University of Pennsylvania gave a fascinating presentation at the AMS Annual Meeting in Atlanta about the origins of the tradition, spotlighting the group of young Scandinavian meteorologists who studied under Vilhelm Bjerknes in Bergen, Norway. They were vital contributors to numerous Arctic expeditions in the 1920s.
This first wave of Bergen School meteorologists was well suited to polar exploration, where they contributed their familiarity with Far North conditions as well as their new understanding of upper-air dynamics. But Turner argues that their affinity for outdoor activities–particularly in the harsh conditions of the Arctic–also set them apart from others in their generation and, by implication, from the desk-bound meteorologists of today.
We think those hardy meteorological pioneers of yesteryear would appreciate the intrepid scientific spirit of the Catlin team.

New Director for the Storm Prediction Center

From the National Weather Service Facebook page:

Dr. Russell Schneider named as new Storm Prediction Center director. Schneider spent his entire career at the National Weather Service. He began at NCEP’s Environmental Modeling Center before becoming the first science and operations officer at the Hydrometeorological Prediction Center. He has been the science support …branch chief at SPC in Norman, Okla., since 1997. Author and co-author of numerous professional publications, Schneider also served as an associate editor of the American Meteorological Society Journal “Weather and Forecasting” for more than a decade.

In June 2004, BAMS published a brief interview with Russ Schneider by Ashton Robinson-Cook. Schneider recalled that his interest in meteorology was ignited as a youngster–a situation surely familiar to many AMS members. In particular, in 1965, when he was seven years old, the weather brought two pivotal storms to Chicago: an inch-thick icing in the winter, followed by the Palm Sunday outbreak in which an F4 struck within 10 miles of Schneider’s home.
That was a pretty intense start, which led to a Ph.D. at the University of Wisconsin-Madison and a long career with the Weather Service at the intersection of research and operations. He told Robinson-Cook,

I think the ability to use science to warn people of the threat of severe weather hazards is what my commitment (and many others in atmospheric science) is all about.

Congratulations on taking that attitude to the next step.

Shoes for Showers

Finally… incontrovertible proof that precip distribution is a step function.
From Regina Regis, in Italy. 69 Euros.

New Partnership

Recognizing the need for greater cooperation in understanding and predicting weather and climate, the U.K. Met Office and NCAR recently signed an agreement to conduct joint research. While many scientists from the two organizations already work together individually, the new arrangement encourages the merging of observing and field programs and facilitates the sharing of insights related to modeling, software engineering, and instrument creation. Specific research that the agreement promotes includes fundamental atmospheric processes, particularly related to the planetary boundary layer; upper-ocean processes and short-term upper ocean-atmosphere interactions in weather and climate prediction; next-generation modeling for massively parallel computing architectures; predictability of the climate system on seasonal-to-decadal timescales; and real-time attribution of hazardous weather and climate extremes to human influences.

Representatives of NCAR and the U.K. Met Office sign the partnership agreement at NCAR's Mesa Laboratory. From left to right, standing: NCAR's William Skamarock, Maura Hagan, and Peter Backlund, May Akrawi of the Office of the British Consulate-General, and NCAR's Rachel Hauser; seated: UCAR's Richard Anthes, Julia Slingo of the U.K. Met Office, and NCAR's Roger Wakimoto.

Knowing What the Earth Will Do Next? Priceless.

by William Hooke, AMS Policy Program Director

from the AMS Project, Living on the Real World
When it comes to valuation…what is a product or service worth?…maybe the MasterCard folks say it best. Let’s paraphrase them. Cost of the Earth’s observing systems, satellite- and surface-based (ocean and land)? Some very few tens of billions of dollars per year[1]. Cost of the engineers and scientists to handle the communication of the data, build the models, and convert the results into decision support, knowledge and a predictive understanding? No more than another ten billion dollars or so annually. Knowing what the Earth will do next – that is, adding to our knowledge base of the resources such as food and fiber, water, and energy, on which we depend; distinguishing between habitats, environments, and ecosystems that are robust and those which are threatened and need our immediate help; and foreseeing the onset of natural extremes such as flood and drought, heat and cold, and the approach of storms? Priceless.
At last week’s AMS Summer Community meeting, one of the senior participants suggested that those of us in Earth-science-based services and supporting research ought to do a better job explaining the value of our work. In the context of that discussion, he was absolutely right, but his idea presupposed that we had the quantitative foundation to make our case. The fact is that what little we know about the value of Earth observations and their use is rudimentary, fragmented, anecdotal, and therefore, somewhat suspect. This is a nascent field.
Happily, the Value of Information (VOI) workshop held at Resources for the Future in late June sowed seeds for changing this picture [2]. The summary report, released yesterday, is itself worth reading. It contains a nice framing of

Read more

Putting Out Fires with Meteorology

San Diego Gas and Electric has embarked on an ambitious weather-monitoring effort that should warm the hearts of meteorologists–whose help the utility may still need to solve a larger wildfire safety controversy.
SDG&E recently installed 94 solar-powered weather monitoring systems on utility poles scattered in rugged rural San Diego County, where few weather observations are currently available. The purpose is to help prevent and control forest fires during Santa Ana winds. The plan has won plaudits from local fire chiefs and meteorologists alike, since the data will be available to National Weather Service forecasters and models as well as the utility’s own decision makers.
“That makes San Diego the most heavily weather instrumented place on Planet Earth,” says broadcast meteorologist John Coleman in his report on the story for KUSI News.
SDG&E’s intensified interest in meteorological monitoring is precipitated by the hot water the company got into due to its role in forest fires in 2007: Electricity arcing from power lines is blamed for three fires that year that killed two people and destroyed 1,300 homes in rural areas around San Diego. While not acknowledging fault, the company has compensated insurance companies to the tune of over $700 million.
To improve safety, the company came up with plans last year to shut off the grid for up to 120,000 people in rural areas if dry weather turns windy–classic Santa Ana conditions. The shutoff would be triggered by 56 m.p.h. winds, the design standard for much of the power system, and power would be restored once sustained winds stayed below 40 m.p.h., assuming the lines proved reliable. Southern California Edison used a similar cut-off tactic in 2003 with relatively positive reaction from customers, but that company later aggressively cleared areas around its power lines and has not utilized the plan since.
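The reported thresholds amount to a simple hysteresis rule: cut power at one wind speed and restore it only at a lower one, so the grid doesn’t flap on and off in gusty conditions. A minimal sketch of that logic–purely illustrative, not SDG&E’s actual decision criteria, which would also involve line inspections and human judgment–might look like:

```python
# Illustrative sketch only -- not SDG&E's actual logic. It captures the
# hysteresis implied by the reported thresholds: de-energize at 56 mph,
# re-energize only once sustained winds stay below 40 mph.
SHUTOFF_MPH = 56.0   # design wind standard for much of the power system
RESTORE_MPH = 40.0   # lower threshold prevents rapid on/off cycling

def grid_state(prev_on: bool, sustained_wind_mph: float) -> bool:
    """Return True if the line should be energized this interval."""
    if prev_on:
        return sustained_wind_mph < SHUTOFF_MPH
    # Once off, the wind must drop below the stricter restore threshold.
    return sustained_wind_mph < RESTORE_MPH

# A gusty afternoon: calm, a 58 mph blow, easing to 45, then to 35.
on = True
states = []
for wind in [30, 58, 45, 35]:
    on = grid_state(on, wind)
    states.append(on)
# states -> [True, False, False, True]: at 45 mph the line stays dark,
# because restoration waits for sustained winds below 40 mph.
```

The gap between the two thresholds is the design choice: without it, winds hovering near a single cutoff would cycle the grid off and on repeatedly.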
SDG&E, by contrast, had been in federal mediation for months with customers angry about the shut-off plan. One of the main gripes about the plan has been that the power company didn’t expect to warn customers about the outages. The company said it couldn’t predict the winds on a sufficiently localized basis.
Clearly, the enhanced meteorology made possible by the newly established weather stations could help alleviate that controversy. Brian D’Agostino, the local meteorologist who helped SDG&E design the weather monitoring strategy, told KGTV Channel 10 News:

We’re taking a lot of areas where we always just figured the winds were at a certain speed and now we’re going to know for sure….Right now, the National Weather Service gets its information once every hour. Now, we’re able to provide it with data every 10 minutes.