(Weather) Ready, Set, Go…

After a year that brought record-setting severe weather, the need to effectively prepare society for whatever Mother Nature throws at us has never been more evident. Throughout the week of the Annual Meeting, the Seventh Symposium on Policy and Socio-Economic Research will explore how to create a more weather-savvy society, and how technology will help us reach that goal.

Jane Lubchenco

The Symposium’s keynote address, “Science for a Weather Ready Nation” (Tuesday, 9:00 a.m., Room 243), will be given by Jane Lubchenco, the Under Secretary of Commerce for Oceans and Atmosphere and the head of NOAA. This is a critical and fascinating time for NOAA, and through partnerships with scientists, the private sector, and other government agencies, its Weather Ready Nation initiative is pursuing a number of goals to help reduce the nation’s vulnerability to weather events:

  • Improved precision of weather and water forecasts and effective communication of risk to local authorities;
  • Improved weather decision support services with new initiatives such as the development of mobile-ready emergency response specialist teams;
  • Innovative science and technological solutions such as the nationwide implementation of Dual Pol radar technology, Integrated Water Resources Science and Services, and the Joint Polar Satellite System;
  • Strengthened joint partnerships to enhance community preparedness;
  • Closer collaboration with weather enterprise partners and the emergency management community to enhance safety and economic output and effectively manage environmental resources.

(A PDF of the entire Weather Ready Nation strategic plan can be downloaded here.)
The Symposium will consider a wide array of topics relating to this theme, including:

  • policy issues, particularly the use and influence of scientific information on climate policy;
  • communication, including the role of technology (such as social media) in communicating weather and climate information, as well as how diverse populations can receive information they can understand and use;
  • economic matters relating to weather and climate information;
  • New Orleans’s recovery from Katrina and adaptation to future weather events;
  • societal dimensions of weather, especially relating to climate change hazards.

Fly or Drive?…The Aesthetics of Emissions Reduction

Before heading to New Orleans for the AMS Annual Meeting in the next day or so, let’s take a moment for a few important travel considerations.
First of all, we wish you safe travels and look forward to seeing you soon. Second, remember that this year, as in the past several, AMS is making increasing efforts to ensure meetings are as environmentally friendly as possible. The biggest contributor to a meeting’s footprint is travel, and your flight to New Orleans will involve a substantial quantity of CO2 emissions. According to www.Atmosfair.de, the flight from Washington, D.C. to New Orleans emits the equivalent of 920 kg of CO2 per passenger, which is about half of an entire year’s output of a midsize family car.
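If you want to check that comparison yourself, here is a minimal back-of-the-envelope sketch in Python. The flight figure is the one quoted above from atmosfair.de; the car’s annual output is an assumed value implied by the “about half” comparison, not an authoritative number.

```python
# Back-of-the-envelope comparison of one flight's CO2 footprint with a year
# of driving. The flight figure is from atmosfair.de (quoted above); the
# car's annual total is an assumption implied by the "about half" comparison.
FLIGHT_CO2_KG = 920        # Washington, D.C. to New Orleans, per passenger
CAR_ANNUAL_CO2_KG = 1900   # assumed yearly output of a midsize family car

fraction = FLIGHT_CO2_KG / CAR_ANNUAL_CO2_KG
print(f"One flight is roughly {fraction:.0%} of a year of driving")  # ~48%
```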
So take a moment to consider offsetting these emissions through one of the websites recommended by the AMS (see web page here or information available at the registration desk).
Or, if it’s possible, consider carpooling. If you miss the idea of a few hours’ reverie while soaring through the clouds–and let’s face it, lots of meteorologists like to fly just because of the spectacular in situ experience–consider the impression you’d make arriving in the Big Easy in one of these:

Pulling up to the Convention Center, partly sunny, partly Green. Photo by Maria Cordell.

...or drive the car that makes your own personal cloud. Photo by Andrea Polli.


The Return of the Ozone Layer

It’s always nice to hear good news: The ozone layer is recovering, and by around 2032 the amount of ozone in the atmosphere should return to 1980 levels, according to the 2010 Scientific Assessment of Ozone Depletion. At last fall’s symposium on Stratospheric Ozone and Climate Change, co-sponsored by AMS, Paul Newman gave a talk about this progress–and what the world would have looked like had the landmark Montreal Protocol not been implemented in 1987.  Here’s his message, in a nutshell, courtesy of a NASA video:

(You can see Newman’s in-depth presentation on the Assessment from the Bjerknes Lecture at the AGU Fall Meeting as well).
Comprehensive data are available in the links, but a couple of highlights from Newman’s talk are that 1) amounts of chlorine and bromine in the lower atmosphere are in decline, and 2) if the Montreal Protocol had not been implemented in 1987, two-thirds of the ozone layer would have disappeared by 2065, while the UV index would have tripled. Not only would this have led to a marked increase in occurrences of skin cancer and other health problems, but it also would have caused crop yields across the world to decline by up to 30%, potentially leading to food shortages.
The technology used in ozone research will be the topic of a number of presentations at the upcoming AMS Annual Meeting in New Orleans. One device of particular interest is the Ozone Mapper Profiler Suite (OMPS), a state-of-the-art instrument onboard the recently launched NPOESS Preparatory Project (NPP) satellite.
Angela Li of NASA and colleagues will discuss the collection and evolution of OMPS data in a presentation titled “End-to-End Ozone Mapper Profiler Suite (OMPS) Mission Data Modeling and Simulation” (Tuesday, 1:45 p.m., Room 343/344).
Glen Jaross of Science Systems and Applications, Inc. will lead an examination of the calibration of instruments like OMPS in the discussion, “Evolution of Calibration Requirements and Techniques for Total Ozone Mappers” (Tuesday, 8:30 a.m., Room 257).
Lawrence Flynn of NOAA/NESDIS will lead a talk (Monday, 5:00 p.m., Room 245) on recent advances in ozone sensors, with a focus on those that make solar backscatter measurements in the ultraviolet–a list that includes not only OMPS but also the EUMETSAT Global Ozone Monitoring Experiment (GOME-2), the China Meteorological Administration (CMA) Solar Backscatter Ultraviolet Sounders (SBUS) and Total Ozone Units (TOU), and the NOAA Solar Backscatter Ultraviolet instruments (SBUV/2).
Early results from OMPS and other instruments on NPP will be the subject of a panel discussion (Monday, 12:15 p.m., Room 343/344) of NPP science team members and designers.

New Study Now Quantifies the "Huge" Seafloor Movement in 2011 Japanese Earthquake

At a magnitude of 9.0, the earthquake off the Japanese coast last March was already known as one of the most powerful ever recorded, killing (in large part due to the ensuing tsunami) almost 16,000 people and damaging or destroying more than 125,000 buildings. A recent study (available here; subscription required) now quantifies just how monumental the event was: the seafloor in the Japan Trench northeast of the mainland, where the quake originated, was jolted 50 meters horizontally and 10 meters vertically–movement that was “abnormally, extraordinarily huge,” according to Toshiya Fujiwara of the Japan Agency for Marine-Earth Science and Technology.
Fujiwara led the research that used multibeam bathymetric surveys to measure the depth of the water and the contours of the seafloor. He noted that the research team did not expect to be able to use such equipment to detect the crustal movement, which during most earthquakes occurs on scales of millimeters or centimeters. For example, the 2005 Miyagi earthquake, which had a magnitude of 7.2, registered a crustal shift of 10 centimeters at a geodetic station near the Japan Trench. The 2011 earthquake had a shift of 15 meters at the same station. The study also found a vertical shift of at least 4-6 meters in a slab of ocean crust between the Japan Trench and the Japanese coastline, which may have contributed to the pulsating pattern of the tsunami waves that eventually struck the country.
The researchers believe that the fault that caused the quake may extend as far as the axis of the Japan Trench.
“Previously, we thought the displacement stopped somewhere underground,” Fujiwara said, “but this earthquake destroyed the entire plate boundary.”
As we posted previously, a number of presentations at the AMS Annual Meeting in New Orleans will cover the community response to the earthquake and tsunami. Among them, Junichi Ishida of the Japan Meteorological Agency will discuss the earthquake’s impact, the JMA’s response to it, and lessons learned from the disaster in the keynote address for the 28th Conference on Interactive Information Processing Systems (Monday, 11:00 a.m., Room 356).

Weather Alerts Get More (and More) Mobile

The use of social media as a forecast tool seems to develop as rapidly as the devices themselves. In December, the NWS revealed it will soon be providing customized location-specific alerts through a user’s wireless carrier.
“We’re getting this weather, disaster, and other emergency information into your hand,” says David Green of the NWS. “The new service will use geo-location to target alerts to a person’s whereabouts. The goal is to give people greater insight into what’s going on with the weather so they can make the best decisions about how to respond.”
At the AMS Meeting in New Orleans next month, you can get a look at two more ways mobile devices are being used to aid in forecasts. In “Using Mobile Devices to Display, Overlay, and Animate Meteorological Data and Imagery,” David Santek of CIMSS/University of Wisconsin and colleagues will show their custom interfaces for smartphones that offer near-real-time weather alerts. For more on the details of their application and future plans for it, check out their presentation on Monday, 23 January, at 5:00 p.m. (Room 357).
Marcel Molendijk, of the Royal Netherlands Meteorological Institute, offers up a different use in “iWitness; Damage Assessment of Severe Weather by Mobile (phone) Observations.” Instead of sending weather alerts to cell phone users, Molendijk and colleagues collected severe weather damage reports through an Apple iOS application they developed, with information including a description of the event, the time and location (GPS-based), and an optional photo. To get more information on the KNMI system and the results collected to date, go to the talk on Tuesday, 24 January, at 2:30 p.m. (Room 356).
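To make the idea concrete, here is a minimal, hypothetical sketch in Python of the kind of record such a crowdsourced report might contain, based only on the fields mentioned above; the class and field names are illustrative, not the actual KNMI iWitness schema.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

# Hypothetical structure for a crowdsourced severe-weather damage report,
# based only on the fields described above (event description, time,
# GPS-based location, optional photo). Not the actual KNMI schema.
@dataclass
class DamageReport:
    description: str                  # free-text account of the damage
    observed_at: datetime             # time of the observation
    latitude: float                   # GPS-based position
    longitude: float
    photo_path: Optional[str] = None  # optional attached photo

report = DamageReport(
    description="Large hail; greenhouse panes broken",
    observed_at=datetime(2011, 8, 18, 16, 30),
    latitude=52.10,
    longitude=5.18,
)
print(report)
```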

New Release: Midlatitude Synoptic Meteorology

The newest title from AMS Books is now available: Midlatitude Synoptic Meteorology: Dynamics, Analysis & Forecasting, by Gary Lackmann of North Carolina State University. The book links theoretical concepts to modern technology and facilitates the meaningful application of concepts, theories, and techniques using real data. It is aimed at those planning careers in meteorological research and weather prediction, and it provides a template for the application of modern technology in the classroom. Among the topics it covers in depth are extratropical cyclones and fronts, topographically trapped flows, weather forecasting, and numerical weather prediction. The book is generously illustrated and contains study questions and problems at the end of each chapter.
Midlatitude Synoptic Meteorology–as well as other AMS publications and merchandise–can be purchased from the AMS bookstore.

Plane Has Combative Attitude toward Storms

Technological advancements don’t always involve brand-new applications; sometimes, progress can be made when older technology is utilized in new ways. Such is the case with aircraft used for scientific research. “Experienced” military aircraft have proven to be effective for many types of atmospheric studies, and with the news (subscription required) that a powerful combat plane used by the military for many years is to be reconfigured and given a new assignment, many are looking forward to even greater research capabilities. Originally developed in the 1970s, the Fairchild Republic A-10 Thunderbolt II, better known as the “Warthog” or just “Hog,” is a twin-engine jet designed for close air support of ground forces. Now it’s being prepared to take on powerful storms.
For many years, the military plane of choice for research inside thunderstorms was the T-28. But as early as 1985, scientists recognized that this aircraft lacked the altitude reach, endurance, and payload capacity to adequately address many of their questions. After a number of workshops to study other options, the A-10 Thunderbolt was identified as a prime candidate to become the Next Generation Storm-Penetrating Aircraft. A subsequent engineering evaluation confirmed the scientists’ view of the A-10 Thunderbolt, but the U.S. Air Force was resistant to authorizing the jet for civilian use. With the advent of the Center for Interdisciplinary Remotely Piloted Aircraft Studies (CIRPAS), a research center at the Naval Postgraduate School in Monterey, California, an opportunity opened to put an A-10 Thunderbolt into service for the civilian science community. In 2010, the U.S. Air Force agreed to transfer an A-10 Thunderbolt out of mothballs to the U.S. Navy and, with funding from the National Science Foundation (NSF), let CIRPAS (on behalf of the Naval Postgraduate School) operate it as it has operated a Twin Otter and other aircraft for the last 15 years. CIRPAS aircraft are equipped with basic meteorological, cloud, and aerosol sensors, and have ample capacity for additional instrumentation that collaborators from other universities or national laboratories may wish to use.

The A-10 Thunderbolt

The A-10 Thunderbolt must be completely reassembled to be prepared for atmospheric research. A main part of this effort is wing replacement; other work includes evaluating needs for structural reinforcement and engine protection. The jet will also have its nose-mounted, 30-millimeter cannon removed, opening up more space for scientific instruments. The aircraft is scheduled to be ready for flight in the fall of 2012 and for flying actual scientific missions by mid-2013.
So other than its name, what makes the A-10 Thunderbolt so qualified to fly into storms? Perhaps most importantly, its heavy armor, designed and built to withstand machine-gun and cannon fire. Most planes avoid cumulonimbus clouds and thunderstorms because the hazards that may be encountered inside such clouds–such as severe turbulence, severe icing, lightning, and hail–can be fatal. Encountering hail is particularly dangerous, as striking golf-ball-size hail at 200 mph can smash windshields and damage the airframe and engines. But the A-10 Thunderbolt is rugged enough to deal with such conditions. As Brad Smull of the NSF’s Division of Atmospheric and Geospace Sciences noted, “It turns out that being able to survive wartime flak has a lot in common with being able to handle a strong storm.”
Also valuable are the A-10 Thunderbolt’s flight capabilities. Much is still unknown about cumulonimbus and thunderstorms, and the A-10 Thunderbolt has the potential to reach parts of storms that were previously off-limits. While the T-28’s maximum flying altitude is about 4.5 miles (7 kilometers), the A-10 Thunderbolt can fly at altitudes of up to almost 7 miles (11 kilometers)–high enough to reach the icy heights of thunderheads and gather data on hail formation. It also has the ability to stay in storms for up to 3 hours, compared to about 1 hour for the T-28, and because the A-10 Thunderbolt flies relatively slowly–about 342 mph (550 kilometers per hour)–the data it collects should be of particularly high quality. It can also fly lower than the T-28, making it ideal for air-sea interaction studies, and its heavy payload will support lidar, radar, and other imaging systems.
Ultimately, the versatility of the A-10 Thunderbolt may prove to be its most attractive trait. For example, it might help meteorologists understand what governs the evolution of a storm and its eventual severity; atmospheric chemists study how storms generate chemical species, transport material through the depth of the troposphere, and modify those species in the process; atmospheric physicists investigate how clouds become electrified and how electrification may feed back to influence the microphysics and dynamics of storms; and scientists who observe storms using remote sensors (radars, lidars, satellite radiometers) and who try to predict storm evolution by use of models gather in situ measurements to validate their observations.
[Portions of this post contributed by Haf Jonsson of the Naval Postgraduate School]

iPhone Game Puts Satellite Data in Your Hands

The Los Angeles Times compares it to Tetris and calls it “the nerdiest game ever.” As far as we’re concerned, that’s a sure-fire journalistic badge of honor for Satellite Insight, the new iPhone game app from NASA and NOAA.
The object of the new game is to capture and store real-time Earth and space weather data. Colored blocks falling into columns on a grid represent small pieces of data. To save lives and protect expensive instruments, the GOES-R weather satellite must not lose any data. Players bundle like data types together before the grid overflows. Data blocks fall slowly at first, but arrive faster as the game continues. Each speed-up also brings a power-up tool you can use at any time to help clear the grid. Keep it going as long as you can and try to beat your best time (a simplified sketch of this bundling mechanic appears at the end of this post). Explains NASA’s web site:

No matter how thirsty you are, it’s not easy to drink from a fire hose. But that’s similar to the challenge of capturing and storing the huge blast of images and information that the new GOES-R weather satellite will gather.

And of course, as a NASA and NOAA product, the game has an educational mission too–the instructions include information about the upcoming real-life GOES-R satellite.
Satellite Insight is available free for iPhone and other iOS devices on iTunes. Check it out here.
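For the curious, here is a tiny sketch of the core mechanic described above: typed data blocks fall into columns, the player bundles blocks of a matching type, arrivals speed up over time, and the game ends when a column overflows. The class, block types, and the stand-in “player” are illustrative only, not the app’s actual code.

```python
import random

# Simplified illustration of the mechanic described above: typed data blocks
# fall into columns, the player "bundles" (clears) blocks of one type from a
# column, blocks arrive faster over time, and the game ends when any column
# overflows. This is only a sketch, not the actual Satellite Insight code.
class DataGrid:
    def __init__(self, n_columns=5, depth=8,
                 types=("vis", "ir", "lightning", "space_wx")):
        self.columns = [[] for _ in range(n_columns)]
        self.depth = depth
        self.types = types

    def drop_block(self):
        """A data block of a random type falls into a random column.
        Returns False if that column has overflowed (data would be lost)."""
        column = random.choice(self.columns)
        column.append(random.choice(self.types))
        return len(column) <= self.depth

    def bundle(self, column_index, data_type):
        """Remove all blocks of one type from a column (the player's move)."""
        self.columns[column_index] = [
            block for block in self.columns[column_index] if block != data_type
        ]

grid = DataGrid()
tick = 0
while True:
    tick += 1
    arrivals = 1 + tick // 10  # "speed-up": data arrives faster as time passes
    if not all(grid.drop_block() for _ in range(arrivals)):
        break                  # a column overflowed: game over
    # A trivial stand-in player: clear the most common type in the fullest column.
    fullest = max(range(len(grid.columns)), key=lambda i: len(grid.columns[i]))
    if grid.columns[fullest]:
        top_type = max(set(grid.columns[fullest]), key=grid.columns[fullest].count)
        grid.bundle(fullest, top_type)
print(f"The grid overflowed after {tick} rounds of incoming data")
```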

The Services Response to the Tōhoku Disaster a Focus of the 2012 AMS Meeting

The science ministry in Japan reported last week that more than 30,000 square km–eight percent of the country–is contaminated by radioactive caesium from the Fukushima nuclear plant disaster that stemmed from the Tōhoku earthquake and tsunami in March. The radiation was washed out of the skies by rain and snow. As much as four-fifths of the caesium ended up in the ocean–much of it having blown northeastward toward Alaska–and currents carried it to U.S. coastal waters within a week of the reactor releases. Within another week, some of the micron-sized particles had traveled around the world.
The geophysical dimensions of last March’s earthquake-tsunami-meltdown are evident in so many ways, and so are the demands it placed on scientific services–from warnings of giant waves to forecasts of tainted precipitation and groundwater to modeling of global ocean currents. Not surprisingly, the disaster effectively redefined the job of the Japan Meteorological Agency.
On the first day of full sessions at the upcoming 2012 AMS Annual Meeting in New Orleans, the epic Tōhoku cataclysm will be discussed from numerous angles, particularly the premium it put on enhanced operational response. “The earthquake and tsunami increased vulnerabilities to meteorological disasters such as sediment disasters, flood, and inundations, in the affected area, by shaking and loosening the soils and damaging the embankments and drainage facilities,” notes JMA’s Junichi Ishida.
Ishida’s presentation is the special keynote address of the Interactive Information Processing Systems (IIPS) conference (11 a.m. Monday, 23 January, Room 356). Ishida will talk about how JMA took these increased vulnerabilities into account by

  • changing criteria for heavy rain warnings to account for runoff and landslide vulnerabilities
  • lowering criteria for coastal inundation warnings (the earthquake actually lowered coastal ground levels, changing tidal configurations)
  • introducing extreme temperature warnings to account for reduced electricity capacity
  • enhancing aviation support (in particular for relief-flight traffic) because of flight dangers including radioactive clouds

11 March Tsunami sweeps through Sendai Airport, where waters reached the second level of buildings, destroying key operations equipment, scattering mud and debris, and stranding more than a thousand people for two days. The airport eventually reopened as a hub of relief work. Photos copyright Japan Meteorological Agency, with thanks to Junichi Ishida, who will deliver the IIPS conference keynote at the 2012 AMS Annual Meeting.

At the same time (11 a.m. Monday, in Room 338) Yukio Masumoto of the Japan Agency for Marine-Earth Science and Technology will kick off a session devoted to the March 2011 disaster as part of the Coastal Environment symposium. Masumoto will speak about ocean dispersion of radioactive caesium-137 and iodine-131 after the Fukushima releases, including relationships with tides, surface winds and, in one case study, atmospheric fallout. In his abstract, Masumoto reports, “In the near-shore region, the wind forcing is a dominant factor that controls the flow field, while large-scale currents and eddies advect the radionuclides in the off-shore region.”
Several other Monday morning presentations in the Coastal Environment session feature the rapid American response last spring to adapt and construct viable modeling systems depicting Japan’s waterborne radiation hazards–speakers include Ronald Meris of the Defense Threat Reduction Agency, William Samuels of Science Applications International Corp (SAIC), and Matthew Ward of Applied Science Associates.
After lunch, in the same session (2 p.m., Room 338) Gayle Sugiyama of Lawrence Livermore National Laboratory will talk about how the U.S. Department of Energy’s National Atmospheric Release Advisory Center provided analyses and predictions of the radioactive plume, estimating the exposure in both Japan and the United States. Guido Cervone of George Mason University (2:15 p.m., Room 338) will show how dispersion modeling helped reconstruct the otherwise unknown sequence of radioactive releases at the Fukushima nuclear plant. Masayuki Takigawa  (1:45 p.m., Room 338) will discuss results from regional transport modeling of the radioactivity dispersion on land and ocean, while Teddy R. Holt of the U.S. Naval Research Laboratory will show passive tracer modeling capabilities with the Fukushima events in a coupled ocean-atmosphere mesoscale modeling system (1:30 p.m., Room 338).
In a parallel session of the Coastal Environment Conference next door (1:45 p.m., Room 337), Nathan Becker of NOAA/NWS will discuss calculations of detection times for various configurations of the sensors for the Pacific tsunami warning system, concluding that “for global tsunami hazard mitigation the installation of about 100 additional carefully-selected coastal sea-level gauges could greatly improve the speed of tsunami detection and characterization.”
Interestingly, Monday’s Space Weather posters (2:30 p.m.-4 p.m., Hall E) include a presentation by Tak Cheung on the ionospheric disruptions caused by the great Japanese earthquake last March. Forecasts of ionospheric disturbances affect yet another service in the wake of the disaster: the communications provided by shortwave radio operators. And that will be a topic for Kent Tobiska (Utah State Univ.) in the Space Weather session at 5 p.m. (Room 252/253).

Gliders Do the Wave, in the Air and in the Ocean

One would think that the time when gliders were considered cutting-edge technology for science would have long passed. Yet this durable technology remains at the forefront of research, even today.
Where daredevil pilots once pushed the boundaries of engine-less flight into the upper reaches of the troposphere to study mountain waves, now the Perlan Project looks to send its pilots into the stratosphere–30,000 meters up–in the extreme reaches of mountain-perturbed winds. With a special glider that has a pressurized cabin, organizers of the Perlan Project hope to double the world’s sailplane altitude record that they set in 2006 with a different sailplane.
Elizabeth Austin of WeatherExtreme, Ltd. (of Fallbrook, California), the forecast provider for the Perlan Project, will speak at the AMS Annual Meeting (Monday 23 January at 5 p.m.) about the high-altitude sailplane flights. Tests of the new Phase 2 glider will begin in 2012 in California. Austin writes,

This two-seat sailplane is a one-of-a-kind, carbon fiber, pressurized sailplane that will utilize the polar night jet associated with the polar vortex to achieve an altitude of 90,000 feet (27.4 kilometers). The phase two glider has a wing span of 84 feet and will weigh 1,800 pounds loaded with two pilots and equipment. The windows are polycarbonate and do not get brittle at low temperatures. A special drogue chute is being designed that will not degrade rapidly with high levels of ozone exposure.

While piloted sailplanes are basically an extension of the daredevil mountain-wave research that’s been going on since before World War II, robotic devices have also recently been extending the art of research gliding far into the oceans.
You may remember that the cover of the August issue of BAMS featured an underwater glider as part of the article on the Alaska Ocean Observing System. The upcoming Annual Meeting will include several oceanographic presentations involving the use of ocean gliders–for example, here for P. Chu and C.W. Fan on thermocline measurements (Monday, 11:30 a.m.) and here for Phelps et al. on conditions for Arctic ice concentrations (Tuesday, 9:45 a.m. poster session).
Thanks to an open-source contest by Liquid Robotics, Inc., you don’t have to wait for the Annual Meeting to find out what it’s like to use the latest robotic gliders in oceanographic and meteorological observing. As a demonstration of robotic gliders powered by wave action, the Sunnyvale, California, company is launching four of its remote controlled craft in San Francisco on 17 November. Their goal: to cross the Pacific Ocean while collecting a variety of oceanic and atmospheric parameters.
The company is calling this record-breaking robotic crossing the PacX Challenge, and it involves a prize for the scientist–that could be you!–who comes up with the best use of the data streaming back from the robots as they make their way westward and, hopefully, avoid shark bites (something that has happened to one of the company’s gliders in the past).
The gliders (featured in today’s New York Times) move at only about one knot and will split into pairs in Hawaii. In about 300 days, one pair is expected to reach Japan; the other pair, Australia.
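For a rough sense of what those figures imply, here is a trivial, purely illustrative conversion of the numbers above (about one knot sustained for roughly 300 days); the actual route distances are not stated in this post.

```python
# Purely illustrative arithmetic from the figures above: about one knot of
# speed sustained for roughly 300 days of travel.
KNOT_TO_KMH = 1.852
speed_kmh = 1.0 * KNOT_TO_KMH
distance_km = speed_kmh * 24 * 300   # ~13,300 km of through-water travel
print(f"~{distance_km:,.0f} km covered in 300 days at one knot")
```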

While at sea, the Wave Gliders will be routed across regions never before remotely surveyed and will continuously transmit valuable data on salinity and water temperature, waves, weather, fluorescence, and dissolved oxygen. This data will be made available in near real-time to all registered individuals.
Oceanographic organizations already planning to use the data gathered during the Pacific crossing include Scripps Institution of Oceanography, Woods Hole Oceanographic Institution, and the Monterey Naval Post Graduate School.

If you submit an abstract by 23 April 2012, you can design a scientific mission for the gliders and hope for this:

The grand prize winner will receive six months of free Wave Glider data services and will work with Liquid Robotics to chart the course and mission for the six month deployment, including configuration of onboard sensors.

Not a bad way to let robots do the work for you.