Storm chasing is sometimes as much a gripping challenge of driving through nasty weather as it is a calculated pursuit of meteorological bounties.
So perhaps it’s not so surprising that it took a storm chaser…Dan Robinson by name…to start a web site about the fatal hazard of ice and snow on our roads. Over half of the weather-related deaths on American roads each year occur in wintry conditions.
Robinson took the liberty of tacking road statistics onto the preliminary NOAA numbers for weather hazards (recently released for 2009 here).
The effect is striking, indeed, and a good lead-in to Bill Hooke’s report from a Federal Highway Administration workshop today on road weather and the future of intelligent transportation systems.
Clearly we’ve got a lot of work to do and a lot of lives to save…Hooke, the AMS Policy Program Director, makes the case and points out some of the bumps in the road to better weather safety in your car.
Jeff
Vortex Delight
This Monday at the AMS Conference on Mountain Meteorology, Rieke Heinze of the Institut für Meteorologie und Klimatologie at the Leibniz Universität Hannover presented this very cool-looking simulation of von Kármán vortex streets, which sometimes show up in satellite images of clouds in the lee of isolated mountain islands. The nifty thing about Heinze’s simulation project is that it shows the vortices retaining a warm core from bottom to top in the flow (cross section not shown here).
On her project web site (where you can download the video), Heinze writes:
Atmospheric vortex streets consist of two rows of counterrotating mesoscale eddies with vertical axis in the wake of large islands. They resemble classical Kármán vortex streets which occur in laboratory experiments behind a cylinder. Usually, atmospheric vortex streets can be found in the stratocumulus capped mixed layer over the ocean when there is a strong elevated inversion well below the island top.
In the animations the island consists of a single Gaussian shaped mountain with a height of about 1.3 km and a base diameter of about 12km. Particles are released in one layer and act as passive tracers. Their vertical motion is disabled. The colour of the particles reflects the difference between the temperature at the respective particle position and the mean temperature, horizontally averaged over the total domain. Blue/red colours represent a relatively low/high temperature. The animation shows that the cores of the eddies are warmer than the environment. The length of the animation corresponds to about 14h real time.
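For readers curious how that colouring works, here is a minimal sketch of the rule Heinze describes: each passive tracer is coloured by the temperature at its position minus the horizontal mean over the whole domain. The grid size, field values, and particle count below are illustrative assumptions, not numbers from her model.

```python
import numpy as np

# Sketch of the tracer-colouring rule: colour each passive particle by
# the local temperature minus the horizontal domain mean. The field,
# grid, and particle count are made up for illustration.

rng = np.random.default_rng(42)
temperature = 285.0 + rng.normal(0.0, 0.5, size=(64, 64))  # K, one model layer
particles = rng.integers(0, 64, size=(100, 2))             # tracer (i, j) grid indices

domain_mean = temperature.mean()  # horizontal average over the total domain
anomaly = temperature[particles[:, 0], particles[:, 1]] - domain_mean

# Blue for tracers colder than the mean, red for warmer ones:
colours = np.where(anomaly < 0.0, "blue", "red")
```

In the animation the warm-cored eddies then show up as red clusters of tracers against a bluer background.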
A Dose of Reality: The Social Side of Disasters
by William Hooke, AMS Policy Program Director,
from the AMS Project, Living on the Real World
Reality: Disasters – that is, disruptions of entire communities, persisting after an extreme has come and gone, and exceeding a community’s ability to recover on its own – are largely a social construct. Consider this simple example. Meteorologists call a tropical storm a hurricane when its winds exceed some 75 miles per hour. The strongest hurricanes ever observed show wind speeds about twice this level, 150 mph, say. Physics tells us that the forces on buildings and structures should vary as the square of the wind speed (getting a little technical …). The strongest hurricanes therefore pack a wallop about four times that of the weakest; the area suffering hurricane-force winds also tends to be a bit bigger. But the damages from these largest storms may be 200 times as great. One contributor to this big difference? Building codes. These are county-based, and so vary somewhat across the 3000-some counties in the United States, but in hurricane-prone areas usually require that brick-and-mortar residential construction withstand wind speeds of about 120 mph – right in the middle of the hurricane-force wind range. [Manufactured housing, by contrast, is governed by a less-rigorous federal standard, which requires that such structures only maintain their integrity at wind speeds up to some 70-90 mph.] Change building codes, and you change this loss profile.
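The square-law arithmetic above can be sketched in a few lines. The wind speeds are the round numbers from the text; note that the roughly 200-fold damage ratio is an observed figure, not something the wind-load formula alone predicts.

```python
# Wind load on a structure scales roughly with the square of the wind
# speed (dynamic pressure ~ 0.5 * rho * v**2), so doubling the speed
# quadruples the force.

def relative_wind_load(v_mph: float, v_ref_mph: float) -> float:
    """Load on a structure at speed v_mph, relative to the load at v_ref_mph."""
    return (v_mph / v_ref_mph) ** 2

weakest = 75.0     # mph: a minimal hurricane
strongest = 150.0  # mph: roughly the strongest observed

print(relative_wind_load(strongest, weakest))  # 4.0
print(relative_wind_load(120.0, weakest))      # a typical code design speed
```

The gap between the factor of four in wind load and the factor of 200 in damages is exactly the point: most of the difference is social (building codes, exposure), not meteorological.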
Alert! These are our choices, but they’re not necessarily bad choices. When it comes to building codes, we’re simply forced to set a realistic standard. Whenever we wish, we can elect to build homes that will withstand hurricane-force winds, or even the strongest tornadic winds, which might approach 300 mph. But these homes would be considerably more expensive. They might be built largely underground, looking more like World War II “pillboxes” than homes, with narrow slits for windows, etc. Remember, they have to bear up not just to the winds but also to the windborne debris: lumber, roofing, automobiles, etc. Most of us wouldn’t be able to afford them, and wouldn’t want to live in them even if we could. We prefer our views, the connection with the outdoors. So we build a safe room, or a storm cellar, and accept the remaining risk.
Building codes are but one example; here’s another – land use. Simply by choosing to build in the flood plain alongside rivers and on the coasts, we ensure that our future will be punctuated by repetitive loss. Build on an earthquake fault-line? Expect the same outcome. There’s much more to this topic. We’ll return to it down the road.
But, for now, let’s set it aside in order to introduce a more complicated notion. When we choose to urbanize – and over half the world’s population lives in cities now – we expose ourselves to other risks, and different kinds of risk at that. To live in cities requires critical infrastructure, ranging from the elevators that service high-rise buildings to networks of roads, sewage systems, water works, electrical grids, communications, financial services, health care, and so on. As a result, we’re now subject to outages of these systems, from whatever cause. An example from the Midwest flooding of 1993: the city of Des Moines, Iowa remained largely dry; only a small sliver of land flooded. Unfortunately, that sliver contained the city’s water works, and a quarter of a million residents went without running water for nearly two weeks.
Remembering Katrina and New Orleans
by William Hooke, AMS Policy Program Director, from the AMS project, Living on the Real World
“…we can not dedicate, we can not consecrate – we can not hallow – this ground. The brave men, living and dead, who struggled here, have consecrated it far above our poor power to add or detract.”– Abraham Lincoln, Gettysburg Address
In the last few posts, as we’ve started to think about disasters, we’ve asked: what’s it worth to see a disaster coming? Katrina shows vividly that it’s worth relatively little if we can’t or won’t act. People had been vocal about the growth of vulnerability in New Orleans for decades, even as the vulnerability and risks ratcheted up. The warnings didn’t seem to be enough.
Some salient features of this landscape? Well, for one, the 2002 series of articles by Mark Schleifstein and John McQuaid in the New Orleans Times-Picayune, “Washing Away: how south Louisiana is growing more vulnerable to a catastrophic hurricane.” For another, the 2004 article by Shirley Laska in The Natural Hazards Observer, “What if Hurricane Ivan Had Not Missed New Orleans?,” and her June 2005 talk as part of an AMS environmental science seminar on Capitol Hill, in the Senate Hart Office Building, on the topic of “New Orleans, hurricanes, and climate change: a question of resiliency.” That afternoon, one hundred policymakers were in the room, including U.S. Senator Mary Landrieu (D-LA), who to her credit had long been concerned about this threat and had worked hard, first to avert it, and since then to recover from it.
Dr. Laska, a sociologist at the University of New Orleans and then director of the Center for Hazard Assessment, Research, and Technology (CHART), laid out the whole scenario. She touched on the growth in Louisiana population, the development of the Port of New Orleans and the offshore oil extraction and the associated refineries. She recounted a century of bad engineering along the Mississippi, the degradation of Louisiana’s coastal wetlands and the subsequent loss of their natural protection. She discussed the risks in depending upon evacuation as a strategy: the vulnerabilities of the lone evacuation route over Lake Pontchartrain and the fact that at any given time, 100,000 people would be too poor to find a ride, and 2,000 people too sick to move. She estimated it would take 90 days to dry out the “bowl” (that portion of New Orleans below sea level and most vulnerable to flooding), and twenty years to recover. So far, as we say in our trade, that forecast seems to be verifying.
No Do-Overs for Plainfield, Please
If ever there was a day meteorologists might like to do over, it was exactly 20 years ago today, on 28 August 1990. Somehow, on an afternoon originally projected to have a mere moderate risk of severe weather, an F-5 tornado—the only tornado that strong ever recorded in August in U.S. history—struck northern Illinois, killing 29 people in and around the small town of Plainfield, just 30 miles southwest of Chicago.
The sky turned black, and few people knew what happened as the rain-wrapped tornado ripped through the landscape. Almost no one saw the funnel. (Paul Sirvatka, just then starting up his College of DuPage storm chasing program, was a rare exception.) Even though another tornado was spotted earlier in the afternoon in northern Illinois, no sirens wailed in Plainfield until too late. No tornado warnings were issued until too late.
Tom Skilling (WGN-TV) broadcast a report this week for the 20th anniversary of the tragic tornado, explaining why warnings would likely be much better should similar weather threaten the Chicago area now.
The gist of the story: back in 1990 Chicagoland didn’t have NEXRAD Doppler radar and other recent advances in observing and modeling. Also, the aftermath led to a reorganization of the overworked Weather Service meteorologists in Illinois, narrowing the purview of the Chicago office and adding more offices to help cover the state.
While most stories in the media (for example, also here) have been showing why 20 years have made a repeat of Plainfield’s helplessness less likely, Gino Izzo of the NWS Chicago office decided to have a do-over anyway–on the computer. At his presentation for the AMS Conference on Broadcast Meteorology in June, Izzo described how he reran the severe weather forecasts for 28 August 1990 using the North American Regional Reanalysis and the most up-to-date model of 2010, the Weather Research and Forecasting (WRF) model from NCAR.
With a nested 4-km grid at its most detailed, Izzo ran the WRF overnight–it took 10 hours on the available computer in the office–and found that, in fact, with the observational limits of 1990, the latest, greatest numerical forecasting doesn’t really improve the severe weather outlooks for the Plainfield disaster. The WRF moved the likely areas of severe weather (not tornadoes necessarily, but probably winds associated with a bow echo) too far eastward. Only when the model time horizon got within a few hours of the killer tornado did the severe weather outlook for northern Illinois start to look moderate, as the model began to slow the eastward progress of the cold front.
Audio and slides from Izzo’s striking presentation are available on the AMS meeting archive. The message is pretty clear: no matter how good the models get, Doppler radar, wind profilers, aircraft-based soundings, and satellites make a huge difference in our severe weather safety these days.
Of course, with or without better warnings, a repeat of the Plainfield disaster would be potentially catastrophic. The area has more than doubled its population since 1990. And 28 August just happened to be one day before school resumed for the fall—few people were at the high school that was totally destroyed that day. Even just a day’s difference, let alone two decades, could have been critical. Nobody wants a do-over.
The Aerographer's Advice
Ray Boylan, former chair of the AMS Broadcast Board, who died yesterday at age 76, was a Navy enlisted man who found his way into meteorology by a fluke. Maybe that’s why he never lost a homespun attitude toward celebrity and science that we ought to remember.
After training at airman’s prep in Norman, Oklahoma, in 1953, Boylan was casting about for his next assignment when he noticed that the Aerographer’s Mate school was in Lakehurst, New Jersey—near home and, best of all, near his girlfriend.
So it was off to New Jersey for a career in meteorology. That Navy service was his only formal training in weather, and it included some 2,000 hours as a hurricane hunter flight meteorologist. His first hurricane-hunting assignment—flying straight into Camille in 1969:
Back in those days the Navy had the low level mission and the Air Force had the high level mission. Whatever the lowest cloud level was, we went in below those so that I could see the sea surface to keep the wind just forward of the left wing. That’s how we navigated in. I remember a fellow at a Rotary meeting who asked,
‘How many times do you hit downdrafts?’
‘Just once.’
Realizing that viewers experience rain as a simple yes/no proposition, Boylan resisted using PoPs (probabilities of precipitation) on the air, according to today’s Charlotte Observer obituary:
“I’d rather say, ‘It’s going to be scattered like fleas on a bulldog’s back – and if you’re close, you’ll get bitten.’ Or, ‘like freckles scattered across a pretty girl’s face.'”
Lamenting the hype of local TV news these days, Boylan told the WeatherBrains on their 19 February 2008 podcast,
One of the things I see now, is that every weather system that approaches a tv market is a storm system. Not every weather system is a storm system, but that vernacular is there.
Sometimes the medium gets in the glow of the medium’s eye. It’s kind of a narcissistic thing. The media looks at itself as absolutely invaluable. And it can be invaluable, but not if the media thinks so.
The work of the on-air forecaster, he said, is not to impress; it lies in
Trying to get the forecast as right as you possibly can. Building the trust of the audience so that they’ll forgive you even when you are wrong. And there’s no one out there in our business who hasn’t been wrong, and won’t be again, including myself. …If you can build that confidence and trust base, they’ll forgive you some of the small ones if you get the big one.
Speaking of building trust, Lakehurst turned out pretty well for Boylan. Fifty-five years later he would say, “The science and the girl are still with me.”
(Click here to download the audio of the full 20-minute WeatherBrains interview with Boylan.)
Climatology: Inverting the Infrastructure
Atmospheric science may not seem like a particularly subversive job, but from an information science perspective, it involves continually dismantling the very infrastructure it requires to survive. At least that’s how Paul Edwards, Associate Professor of Information at the University of Michigan, described climatology, along with one of its sister sciences, in an interesting hour-long interview on the radio show “Against the Grain” last week. (Full audio is also available for download.)
In the interview Edwards describes how the weather observing and forecasting infrastructure works (skip to about the 29 minute mark if that’s familiar), then notes that climatology is the art of undoing all that:
To know anything about the climate of the world as a whole we have to look back at all those old [weather] records. …But then you need to know about how reliable those are. [Climate scientists] unpack all those old records and study them, scrutinize them and find out how they were made and what might be wrong with them–how they compare with each other, how they need to be adjusted, and all kinds of other things–in order to try to get a more precise and definitive record of the history of weather since records have been kept. That’s what I call infrastructural inversion. They take the weather infrastructure and they flip it on its head. They look at its guts.
In his book, A Vast Machine: Computer Models, Climate Data, and the Politics of Global Warming, Edwards points out that people don’t realize how much of this unpacking—and with it multiple layers of numerical modeling—is necessary to turn observations into usable, consistent data for analysis and (ultimately) numerical weather and climate prediction. The relationship between data and models is complicated:
In all data there are modeled aspects, and in all models, there are lots of data. Now that sounds like it might be something specific to [climate] science, but …in any study of anything large in scope, you’ll find the same thing.
In part because of this “complicated relationship” between observations and models, there’s a lot of misunderstanding about what scientists mean when they talk about “uncertainty” in technical terms rather than in the colloquial sense of “not knowing”. Says Edwards,
We will always keep on finding out more about how people learned about the weather in the past and will always find ways to fix it a little bit. It doesn’t mean [the climate record] will change randomly or in some really unexpected way. That’s very unlikely at this point. It means that it will bounce around within a range…and that range gets narrower and narrower. Our knowledge is getting better. It’s just that we’ll never fix on a single exact set of numbers that describes the history of weather.
Climatology is not alone in this perpetual unpacking of infrastructure. Economists seem like they know all about what’s going on today with their indexes, Gross Domestic Products, inflation rates, and money supply numbers. That’s like meteorology. But to put together an accurate history of the economy, they have to do a huge amount of modeling and historical research to piece together incongruous sources from different countries.
There is a thing called national income accounting that has been standardized by the United Nations. It wasn’t really applied very universally until after the Cold War….Just to find out the GDP of nations you have to compare apples and oranges and find out what the differences are.
And to go back as recently as the 1930s?
You would have to do the same things the climate scientists have to do…invert the infrastructure.
New Director for the Storm Prediction Center
From the National Weather Service Facebook page:
Dr. Russell Schneider named as new Storm Prediction Center director. Schneider spent his entire career at the National Weather Service. He began at NCEP’s Environmental Modeling Center before becoming the first science and operations officer at the Hydrometeorological Prediction Center. He has been the science support …branch chief at SPC in Norman, Okla., since 1997. Author and co-author of numerous professional publications, Schneider also served as an associate editor of the American Meteorological Society Journal “Weather and Forecasting” for more than a decade.
In June 2004, BAMS published a brief interview with Russ Schneider by Ashton Robinson-Cook. Schneider recalled that his interest in meteorology was ignited when he was a youngster–a situation surely familiar to many AMS members. In particular, in 1965, when he was seven years old, the weather brought two pivotal storms to Chicago: an inch-thick icing in the winter, followed by the Palm Sunday outbreak, in which an F4 struck within 10 miles of Schneider’s home.
That was a pretty intense start, one that led to a Ph.D. at the University of Wisconsin-Madison and a long career with the Weather Service at the intersection of research and operations. He told Robinson-Cook,
I think the ability to use science to warn people of the threat of severe weather hazards is what my commitment (and many others in atmospheric science) is all about.
Congratulations on taking that attitude to the next step.
Shoes for Showers
Finally… incontrovertible proof that precip distribution is a step function.
From Regina Regis, in Italy. 69 Euros.
Knowing What the Earth Will Do Next? Priceless.
by William Hooke, AMS Policy Program Director
from the AMS Project, Living on the Real World
When it comes to valuation…what is a product or service worth?…maybe the MasterCard folks say it best. Let’s paraphrase them. Cost of the Earth’s observing systems, satellite- and surface-based (ocean and land)? Some very few tens of billions of dollars per year[1]. Cost of the engineers and scientists to handle the communication of the data, build the models, and convert the results into decision support, knowledge, and a predictive understanding? No more than another ten billion dollars or so annually. Knowing what the Earth will do next – that is, adding to our knowledge base of the resources such as food and fiber, water, and energy, on which we depend; distinguishing between habitats, environments, and ecosystems that are robust and those that are threatened and need our immediate help; and foreseeing the onset of natural extremes such as flood and drought, heat and cold, and the approach of storms? Priceless.
At last week’s AMS Summer Community meeting, one of the senior participants suggested that those of us in Earth-science-based services and supporting research ought to do a better job explaining the value of our work. In the context of that discussion, he was absolutely right, but his idea presupposed that we had the quantitative foundation to make our case. The fact is that what little we know about the value of Earth observations and their use is rudimentary, fragmented, anecdotal, and therefore, somewhat suspect. This is a nascent field.
Happily, the Value of Information (VOI) workshop held at Resources for the Future in late June sowed seeds for changing this picture [2]. The summary report, released yesterday, is itself worth reading. It contains a nice framing of