(Weather) Ready, Set, Go…

After a year that brought record-setting severe weather, the need to effectively prepare society for whatever Mother Nature throws at us has never been more evident. Throughout the week of the Annual Meeting, the Seventh Symposium on Policy and Socio-Economic Research will explore how to create a more weather-savvy society, and how technology will help us reach that goal.

Jane Lubchenco

The Symposium’s keynote address, “Science for a Weather Ready Nation” (Tuesday, 9:00 a.m., Room 243), will be given by Jane Lubchenco, the Under Secretary of Commerce for Oceans and Atmosphere and the head of NOAA. This is a critical and fascinating time for NOAA, and through partnerships with scientists, the private sector, and other government agencies, its Weather Ready Nation initiative is pursuing a number of goals to help reduce the nation’s vulnerability to weather events:

  • Improved precision of weather and water forecasts and effective communication of risk to local authorities;
  • Improved weather decision support services with new initiatives such as the development of mobile-ready emergency response specialist teams;
  • Innovative science and technological solutions such as the nationwide implementation of Dual Pol radar technology, Integrated Water Resources Science and Services, and the Joint Polar Satellite System;
  • Strengthening joint partnerships to enhance community preparedness;
  • Working with weather enterprise partners and the emergency management community to enhance safety and economic output and effectively manage environmental resources.

(A PDF of the entire Weather Ready Nation strategic plan can be downloaded here.)
The Symposium will consider a wide array of topics relating to this theme, including:

  • policy issues, particularly the use and influence of scientific information on climate policy;
  • communication, including the role of technology (such as social media) in communicating weather and climate information, as well as how diverse populations can receive information they can understand and use;
  • economic matters relating to weather and climate information;
  • New Orleans’s recovery from Katrina and adaptation to future weather events;
  • societal dimensions of weather, especially relating to climate change hazards.

Goodbye, Greenhouse Gases…Hello, Tyndall Gases!

If there were a Hall of Fame for the atmospheric sciences, John Tyndall would have been one of its first inductees. A truly versatile and inventive scientist, Tyndall discovered that blue light is scattered by dust and other tiny particles (now known as the Tyndall Effect), a finding that led to an answer to that ever-popular question, “Why is the sky blue?” (Lord Rayleigh gave Tyndall’s discoveries a more formal expression a few years later.)
Tyndall’s imaginative and inquisitive mind ranged far, especially into the chemistry of gases. His study that compared “optically pure” air to regular air found that food remained fresh in the pure air, reinforcing Louis Pasteur’s work on the growth of microorganisms. He studied the flow of glaciers and became an avid mountaineer (there are two mountains and a glacier named after him). He invented the fireman’s respirator and the light pipe (which later led to the development of fiber optics).
But Tyndall is best known for being the person who proved the greenhouse effect of the atmosphere.
Oops!…bad habit, according to Texas A&M’s John Nielsen-Gammon. As part of the Seventh Symposium on Policy and Socio-Economic Research, Nielsen-Gammon will argue (Monday, 2:30 p.m.-4:00 p.m., Hall E) that we should change the term “greenhouse gases” to “Tyndall gases.”

Climate change is quite complicated for the layman to understand. The matter is made worse by the use of a term, the “greenhouse effect”, that refers to a physical system quite unlike the climate system. Communication is not well served by the use of a term that means something different from what it seems to mean.

John Tyndall

I propose that the term “greenhouse gases” be avoided entirely, since such gases are either not found in a greenhouse in special abundance or do not serve to warm the greenhouse to an appreciable extent. Instead, with respect to the scientist, John Tyndall, who first demonstrated that many trace atmospheric gases have powerful infrared absorption properties and thus may play an important role in Earth’s climate, I propose that gases with strong infrared absorptive/emissive properties be dubbed “Tyndall gases”.

We’ll let you attend the poster session to get the details on Nielsen-Gammon’s reasoning, but it sounds like an appropriate way to remember one of the founding fathers of climate science. Not only that, but it honors the fact that Tyndall was an impassioned advocate of science and scientists: clear communication was a specialty of his. He wrote numerous books and contributed articles to popular periodicals, but it was as an orator that he most persuasively brought science to the people. A newspaper of the day noted that “Professor Tyndall has succeeded not only in original investigation and in teaching science soundly and accurately, but in making it attractive. . . .When he lectures at the Royal Institution the theatre is crowded.”  Tyndall was a gifted speaker who regularly gave talks to the general public and effectively explained abstruse scientific concepts. His 1874 Belfast Address famously championed scientific reasoning over religious or nonrational interpretations.
To get to know Tyndall even better, check out the presentation on Tuesday (3:30 p.m., Room 335/336) by Richard Somerville of the Scripps Institution of Oceanography. He will explore Tyndall’s scientific career and his contributions to the atmospheric sciences. Somerville was on the scientific advisory committee of last year’s Tyndall Conference, which celebrated the 150th anniversary of Tyndall’s paper on the greenhouse effect.

The Next Steps for the USGCRP

This week, the National Research Council issued a blue-ribbon panel report arguing that the United States Global Change Research Program (USGCRP) may not be able to meet the new decadal goals it’s setting for itself. In less than two weeks, in New Orleans, you’ll get a chance to have your say, too.
USGCRP guides research and disseminates information about climate change, and comprises 13 governmental agencies ranging from the Department of Defense to NASA. The USGCRP assists policymakers; federal, state, and local decision makers; and the public in understanding and adapting to global change.
The program’s new 10-year plan (see draft here; a final version is due next month) broadens USGCRP’s scope from climate to include “climate-related global changes,” building “from core USGCRP capabilities in global climate observation, process understanding, and modeling to strengthen and expand our fundamental scientific understanding of climate change and its interactions with the other critical drivers of global change, such as land-use change, alteration of key biogeochemical cycles, and biodiversity loss.”
The new strategic plan was created to help promote four primary goals of the Program:

  • advance scientific knowledge of the integrated natural and human components of the Earth system;
  • provide the scientific basis for timely adaptation and mitigation;
  • build sustained assessment capacity that improves our understanding, anticipation, and response to global change; and,
  • broaden public understanding of global change.

At the AMS Annual Meeting in New Orleans, a Town Hall Meeting (Tuesday, 12:15 p.m., Room 239) will discuss the new strategic plan and examine forthcoming USGCRP initiatives, including integrated modeling and observations, an interagency global change information system, adaptation research, and the National Climate Assessment. The meeting will also discuss how attendees can become involved in USGCRP activities, and will review current and future products, tools, and services that might be useful to both scientists and decision makers.
Implementation of the decadal strategy won’t be without its challenges, however. The recent National Research Council report praises the USGCRP’s ambition in expanding its scope, but  it also points out that the Program needs greater expertise in certain areas to sufficiently undertake its new plans.

The USGCRP and its member agencies and programs are lacking in capacity to achieve the proposed broadening of the Program, perhaps most seriously with regard to integrating the social and ecological sciences within research and observational programs, and developing the scientific base and organizational capacity for decision support related to mitigation and adaptation choices. Member agencies and programs have insufficient expertise in these domains and lack clear mandates to develop the needed science.

Additionally, the NRC report notes the lack of overarching governance in the USGCRP, which makes it difficult to build a cohesive research agenda across the Program’s 13 contributing agencies. As a result, those agencies tend to focus on their own pet projects.
“We were hoping there would be a way to coordinate better, especially on the congressional side,” says NCAR’s Warren Washington, who chaired the NRC committee that prepared the report.
Ultimately, the NRC report notes that “a draft federal plan to coordinate research into how to respond to climate change is unlikely to succeed without added resources and new ways to manage the Program.”
“We do recognize there are some gaps in our capacity,” says the NSF’s Timothy Killeen, the USGCRP vice chair who helped develop the new strategic plan. Program officials welcomed the recommendations outlined in the report and have already made plans to bring in more expertise from academia and other agencies to augment research areas that are lacking, as well as form interagency working groups that could help unify the Program.

The Return of the Ozone Layer

It’s always nice to hear good news: The ozone layer is recovering, and by around 2032 the amount of ozone in the atmosphere should return to 1980 levels, according to the 2010 Scientific Assessment of Ozone Depletion. At last fall’s symposium on Stratospheric Ozone and Climate Change, co-sponsored by AMS, Paul Newman gave a talk about this progress–and what the world would have looked like had the landmark Montreal Protocol not been implemented in 1987.  Here’s his message, in a nutshell, courtesy of a NASA video:

(You can see Newman’s in-depth presentation on the Assessment from the Bjerknes Lecture at the AGU Fall Meeting as well).
Comprehensive data are available in the links, but a couple of highlights from Newman’s talk are that 1) amounts of chlorine and bromine in the lower atmosphere are in decline, and 2) if the Montreal Protocol had not been implemented in 1987, two-thirds of the ozone layer would have disappeared by 2065, while the UV index would have tripled. Not only would this have led to a marked increase in occurrences of skin cancer and other health problems, but it also would have caused crop yields across the world to decline by up to 30%, potentially leading to food shortages.
The technology used in ozone research will be the topic of a number of presentations at the upcoming AMS Annual Meeting in New Orleans. One device of particular interest is the Ozone Mapper Profiler Suite (OMPS), a state-of-the-art instrument onboard the recently launched NPOESS Preparatory Project (NPP) satellite.
Angela Li of NASA and colleagues will discuss the collection and evolution of OMPS data in a presentation titled “End-to-End Ozone Mapper Profiler Suite (OMPS) Mission Data Modeling and Simulation” (Tuesday, 1:45 p.m., Room 343/344).
Glen Jaross of Science Systems and Applications, Inc. will lead an examination of the calibration of instruments like OMPS in the discussion, “Evolution of Calibration Requirements and Techniques for Total Ozone Mappers” (Tuesday, 8:30 a.m., Room 257).
Lawrence Flynn of NOAA/NESDIS will lead a talk (Monday, 5:00 p.m., Room 245) on recent advances in ozone sensors, with a focus on those that make solar backscatter measurements in the ultraviolet–a list that includes not only OMPS but also the EUMETSAT Global Ozone Monitoring Experiment (GOME-2), the China Meteorological Administration (CMA) Solar Backscatter Ultraviolet Sounders (SBUS) and Total Ozone Units (TOU), and the NOAA Solar Backscatter Ultraviolet instruments (SBUV/2).
Early results from OMPS and other instruments on NPP will be the subject of a panel discussion (Monday, 12:15 p.m., Room 343/344) of NPP science team members and designers.

New Study Now Quantifies the “Huge” Seafloor Movement in 2011 Japanese Earthquake

At a magnitude of 9.0, the earthquake off the Japanese coast last March was already known as one of the most powerful ever recorded, killing (in large part due to the ensuing tsunami) almost 16,000 people and damaging or destroying more than 125,000 buildings. A recent study (available here; subscription required) now quantifies just how monumental the event was: the seafloor in the Japan Trench northeast of the mainland, where the quake originated, was jolted 50 meters horizontally and 10 meters vertically–movement that was “abnormally, extraordinarily huge,” according to Toshiya Fujiwara of the Japan Agency for Marine-Earth Science and Technology.
Fujiwara led the research that used multibeam bathymetric surveys to measure the depth of the water and the contours of the seafloor. He noted that the research team did not expect to be able to use such equipment to detect the crustal movement, which during most earthquakes occurs on scales of millimeters or centimeters. For example, the 2005 Miyagi earthquake, which had a magnitude of 7.2, registered a crustal shift of 10 centimeters at a geodetic station near the Japan Trench; the 2011 earthquake produced a shift of 15 meters at the same station. The study also found a vertical shift of at least 4-6 meters in a slab of ocean crust between the Japan Trench and the Japanese coastline, which may have contributed to the pulsating pattern of the tsunami waves that eventually struck the country.
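For a rough sense of what that jump in magnitude means, the standard Gutenberg–Richter energy relation (log10 E ≈ 1.5M + const.) can be used to compare the two events. The back-of-the-envelope sketch below is purely illustrative and is not drawn from the Fujiwara study:

    # Back-of-the-envelope comparison of radiated seismic energy, using the
    # standard Gutenberg-Richter relation log10(E) = 1.5*M + const.
    # Illustrative only; this calculation is not from the Fujiwara et al. study.

    def energy_ratio(m_large, m_small):
        """Ratio of radiated seismic energy between two moment magnitudes."""
        return 10 ** (1.5 * (m_large - m_small))

    print(f"M9.0 vs. M7.2: roughly {energy_ratio(9.0, 7.2):.0f}x the energy")
    # -> M9.0 vs. M7.2: roughly 501x the energy

In other words, the 2011 event released on the order of 500 times the energy of the 2005 Miyagi earthquake, which puts the 150-fold difference in crustal shift at the same geodetic station in context.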
The researchers believe that the fault that caused the quake may extend as far as the axis of the Japan Trench.
“Previously, we thought the displacement stopped somewhere underground,” Fujiwara said, “but this earthquake destroyed the entire plate boundary.”
As we posted previously, a number of presentations at the AMS Annual Meeting in New Orleans will cover the community response to the earthquake and tsunami. Among them, Junichi Ishida of the Japan Meteorological Agency will discuss the earthquake’s impact, the JMA’s response to it, and lessons learned from the disaster in the keynote address for the 28th Conference on Interactive Information Processing Systems (Monday, 11:00 a.m., Room 356).

New Release: Midlatitude Synoptic Meteorology

The newest title from AMS Books is now available: Midlatitude Synoptic Meteorology: Dynamics, Analysis & Forecasting, by Gary Lackmann of North Carolina State University. The book links theoretical concepts to modern technology and facilitates the meaningful application of concepts, theories, and techniques using real data. It is aimed at those planning careers in meteorological research and weather prediction, and it provides a template for the application of modern technology in the classroom. Among the topics it covers in depth are extratropical cyclones and fronts, topographically trapped flows, weather forecasting, and numerical weather prediction. The book is generously illustrated and contains study questions and problems at the end of each chapter.
Midlatitude Synoptic Meteorology–as well as other AMS publications and merchandise–can be purchased from the AMS bookstore.

Plane Has Combative Attitude toward Storms

Technological advancements don’t always involve brand-new applications; sometimes, progress can be made when older technology is utilized in new ways. Such is the case with aircraft used for scientific research. “Experienced” military aircraft have proven to be effective for many types of atmospheric studies, and with the news (subscription required) that a powerful combat plane used by the military for many years is to be reconfigured and given a new assignment, many are looking forward to even greater research capabilities. Originally developed in the 1970s, the Fairchild Republic A-10 Thunderbolt II, better known as the “Warthog” or just “Hog,” is a twin-engine jet designed for close air support of ground forces. Now it’s being prepared to take on powerful storms.
For many years, the military plane of choice for research inside thunderstorms was the T-28. But as early as 1985, scientists recognized that this aircraft lacked the altitude reach, endurance, and payload capacity to adequately address many of their questions. After a number of workshops to study other options, the A-10 Thunderbolt was identified as a prime candidate to become the Next Generation Storm-Penetrating Aircraft. A subsequent engineering evaluation confirmed the scientists’ view of the A-10 Thunderbolt, but the U.S. Air Force was resistant to authorizing the jet for civilian use. With the advent of the Center for Interdisciplinary Remotely Piloted Aircraft Studies (CIRPAS), a research center at the Naval Postgraduate School in Monterey, California, an opportunity opened to put an A-10 Thunderbolt into service of the civilian science community. In 2010, the U.S. Air Force agreed to transfer an A-10 Thunderbolt out of mothballs to the U.S. Navy and, with funding from the National Science Foundation (NSF), let CIRPAS (on behalf of the Naval Postgraduate School) operate it as it has operated a Twin Otter and other aircraft for the last 15 years. CIRPAS aircraft are equipped with basic meteorological, cloud, and aerosol sensors, and have ample capacity for additional instrumentation that collaborators from other universities or national laboratories may wish to use.

The A-10 Thunderbolt

The A-10 Thunderbolt must be completely reassembled to be prepared for atmospheric research. A main part of this effort is wing replacement, but other activity includes evaluation of reinforcement and engine protection needs. The jet will also have its nose-mounted, 30-millimeter cannon removed, opening up more space for scientific instruments. The aircraft is scheduled to be ready for flight in the fall of 2012 and for flying actual scientific missions by mid-2013.
So other than its name, what makes the A-10 Thunderbolt so qualified to fly into storms? Perhaps most importantly, its heavy armor, designed and built to withstand machine-gun and cannon fire. Most planes avoid cumulonimbus clouds and thunderstorms because the hazards that may be encountered inside such clouds–such as severe turbulence, severe icing, lightning, and hail–can be fatal. Encountering hail is particularly dangerous, as striking golf-ball-size hail at 200 mph can smash windshields and damage the airframe and engines. But the A-10 Thunderbolt is rugged enough to deal with such conditions. As Brad Smull of the NSF’s Division of Atmospheric and Geospace Sciences noted, “It turns out that being able to survive wartime flak has a lot in common with being able to handle a strong storm.”
Also valuable are the A-10 Thunderbolt’s flight capabilities. Much is still unknown about cumulonimbus and thunderstorms, and the A-10 Thunderbolt has the potential to reach parts of storms that were previously off-limits. While the T-28’s maximum flying altitude is about 4.5 miles (7 kilometers), the A-10 Thunderbolt can fly at altitudes of up to almost 7 miles (11 kilometers)–high enough to reach the icy heights of thunderheads and gather data on hail formation. It also has the ability to stay in storms for up to 3 hours, compared to about 1 hour for the T-28, and because the A-10 Thunderbolt flies relatively slowly–about 342 mph (550 kilometers per hour)–the data it collects should be of particularly high quality. It can also fly lower than the T-28, making it ideal for air-sea interaction studies, and its heavy payload will support lidar, radar, and other imaging systems.
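For readers keeping track of the numbers, here is a small sketch that tabulates the figures quoted above and checks the unit conversions; the values are simply those cited in this post, not official aircraft specifications:

    # Storm-penetration figures quoted in this post (not official specs).
    MILES_TO_KM = 1.609344  # the same factor converts mph to km/h

    aircraft = {
        # name: (ceiling in miles, in-storm endurance in hours)
        "T-28": (4.5, 1.0),
        "A-10 Thunderbolt": (7.0, 3.0),
    }

    for name, (ceiling_mi, endurance_hr) in aircraft.items():
        print(f"{name:16s} ceiling ~{ceiling_mi:.1f} mi "
              f"(~{ceiling_mi * MILES_TO_KM:.0f} km), "
              f"endurance ~{endurance_hr:.0f} h in storm")

    print(f"A-10 cruise speed: 342 mph = {342 * MILES_TO_KM:.0f} km/h")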
Ultimately, the versatility of the A-10 Thunderbolt may prove to be its most attractive trait. For example, it might help meteorologists understand what governs the evolution of a storm and its eventual severity; atmospheric chemists study how storms generate chemical species, transport material through the depth of the troposphere, and modify them in the process; atmospheric physicists investigate how clouds become electrified and how electrification may feed back to influence the microphysics and dynamics of storms; and scientists who observe storms using remote sensors (radars, lidars, satellite radiometers) and who try to predict storm evolution by use of models gather in-situ measurements to validate their observations.
[Portions of this post contributed by Haf Jonsson of the Naval Postgraduate School]

What Does Climate Sound Like?

Paul Miller is a musician, artist, and author better known by his performing name, DJ Spooky. His most recent project, called Terra Nova, is an artistic interpretation of climate and climate change based on both science and his own imagination. The project grew out of Spooky’s visits to both the Arctic and Antarctica, which inspired him to share his vision of climate change through music, words, and pictures. His recently released Book of Ice combines photographs, his own artwork, and commentary on the relationship between art, science, and humanity, with a focus on Antarctica.
But Spooky is best known as a musician, and he has recently toured with a small ensemble of instrumentalists to perform music that he says is intended to make people think and talk about the environment and, specifically, climate change. The pieces for Terra Nova are unique blends of science and art; in some, he uses the music to interpret scientific data (such as the long-held idea that every snowflake has unique qualities). Combining orchestral arrangements with his own electronic contributions, the music creates what Spooky calls “acoustic portraits of the landscape.” His live shows are accompanied by background images related to climate, ice, Antarctica, and similar themes; there have also been postperformance discussions of climate and environmental issues.
A snippet of the sonic portion of Terra Nova can be seen in the video below. A full presentation of his piece titled “Sinfonia Antarctica,” performed earlier this year in Savannah, Georgia, can be found here.
 

DJ Spooky’s Sinfonia Antarctica is not the first musical extravaganza with that title. The English composer Ralph Vaughan Williams reworked his score for the 1948 movie Scott of the Antarctic into a sprawling, five-movement work for orchestra (including a wind machine in the percussion) and called it his Symphony No. 7, “Sinfonia Antartica.” You can listen to it here and decide what advantages Spooky had by virtue of visiting Antarctica in person. (In 2000, after a trip to Antarctica, British composer Sir Peter Maxwell Davies wrote his 8th symphony, nicknamed “Antarctic”; like DJ Spooky’s music, the Maxwell Davies symphony is more an abstract depiction of the loneliness and desolation of an icy expanse than a lush, dramatic portrait in the manner of the Vaughan Williams symphony.)
There’s a long history of music depicting climate and specific atmospheric phenomena. Karen Aplin and Paul Williams made a methodical study of some famous classical orchestral works depicting weather and climate (including Vaughan Williams’s “Sinfonia Antartica”) in the November issue of Weather magazine (published by the Royal Meteorological Society in the U.K.). A press release about the article says that British composers are “twice as likely to have written music about climate themes” as composers from elsewhere. However, a closer examination of the article shows that the limited sample size precludes such conclusions. What is interesting, though, is that Aplin and Williams take a highly analytic approach to the topic, which might eventually lead to interesting conclusions about musical methods (instrumentation, keys, etc.) or relations between composers’ nationalities and the type of weather that interests them.
Meanwhile, music has moved in radically different directions from, say, Vivaldi’s violin concerti about “The Four Seasons.” The technology and world-awareness exploited by DJ Spooky and his musical/video performance concoctions are just one avenue. For example, composer Nathalie Miebach has lately been turning actual meteorological data into sound–and into sculpture made of woven reeds. She recently took numerical observations–temperature, humidity, pressure, and so forth–from 2007’s Hurricane Noel, charted them graphically, and then translated the chart into musical notation. The sculpture then depicts the charts three-dimensionally. Miebach says:

I think there are a lot of us out there who need the kinesthetic, who need the touch to understand something. By bringing the complexity of meteorology back into the physical space, either through touch or through sound, I’m trying to find alternative venues or access points into that complexity….I am getting more interested in using data as a literary tool, to tell a story
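The basic mapping Miebach describes, turning a numerical weather series into pitches, is simple to sketch. The example below is a minimal illustration of the general idea (our construction, with invented pressure values; it is not Miebach’s actual procedure): rescale the observations onto a musical scale so that, for instance, falling pressure walks the melody downward.

    # Minimal data-sonification sketch: map weather observations to pitches.
    # Illustrative only; the pressure values are invented and this is not
    # Nathalie Miebach's actual method.

    # Hypothetical hourly sea-level pressures (hPa) as a storm passes.
    pressures = [1008, 1004, 999, 993, 988, 985, 990, 997, 1003, 1007]

    # Two octaves of a C-major scale as MIDI note numbers (60 = middle C).
    SCALE = [60, 62, 64, 65, 67, 69, 71, 72, 74, 76, 77, 79, 81, 83, 84]

    def to_notes(values, scale=SCALE):
        """Linearly rescale the data onto the scale (low value -> low pitch)."""
        lo, hi = min(values), max(values)
        span = (hi - lo) or 1
        return [scale[round((v - lo) / span * (len(scale) - 1))] for v in values]

    print(to_notes(pressures))
    # The resulting note list can be handed to any MIDI library to render
    # an audible melody; Miebach goes further and builds the chart into sculpture.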

Just as technology has allowed us to experience and visualize the atmosphere, so too it has allowed us to see–and hear–it differently.

Looking at the Sun in a New Light

With orbiting observatories and solar probes now available to scientists, it might seem that studying the Sun (and its effects on weather and climate) has largely shifted to space-based technology. But in fact, ground-based monitoring of the Sun provides significant opportunities that aren’t possible from space. And the future of looking at the Sun from Earth is primed to become brighter with the recent announcement that the National Solar Observatory (NSO) will be moving to the University of Colorado at Boulder.
The NSO currently operates in two locations: Kitt Peak National Observatory in Arizona and Sacramento Peak Observatory in New Mexico. The consolidation of these two locations and the move to Colorado will be a multiyear process, with the actual physical relocation to begin around 2016.
The move will include the deactivation of older telescopes–some of which date to the 1950s–and will coincide with the construction of the Advanced Technology Solar Telescope (ATST), which when completed will be the largest optical solar telescope in the world. The ATST will be located in Hawaii, but the new NSO in Boulder will be the ATST’s science, instrument development, and data analysis center.
The dual projects should result in major advancements for solar exploration from the ground. The ATST will provide “unprecedented resolution and accuracy in studying the finescale magnetic field and solar activity that controls what happens in the solar atmosphere and beyond,” according to the NSO’s Stephen Keil.
Jeff Kuhn of the University of Hawaii explains how the ATST will be valuable in studying the Sun’s magnetic field, which drives much of the Sun’s activity:

Most of the changes that happen on the Sun are caused by changes in magnetic fields,  and the ATST is a very specialized instrument that allows us to see those changes, and in fact has a sensitivity to measure changes in the magnetic field at the same kind of magnetic field strength as the . . . magnetic field that exists on the Earth that makes your compass needle work.

The high-resolution images needed to study the Sun’s magnetic field require very large telescopes that are too expensive to send into space. With the development of adaptive optics technology, ground-based observations are now much sharper than in the past, allowing for the study of “extremely small, violently active magnetic fields that control the temperature of the corona, and the solar wind, that produce flares [and] x-ray emission,” according to Eugene Parker of the University of Chicago.
Additionally, ground-based observatories have the capability of not just creating images, but also of making movies that track solar changes on time scales of minutes or even seconds.
The NSO has created a video (available on this page) that explains more about the atmospheric effects of solar activity and other advantages of ground-based solar research.

Mapping Ice Flow in Antarctica

A recently released map of the speed and direction of ice flows across Antarctica not only reveals some previously undiscovered geographical features, but also suggests a new explanation for how ice moves across the continent. Researchers constructed the map after studying billions of data points taken from a number of polar-orbiting satellites. After accounting for cloud cover, solar glare, and various land features, the scientists were able to determine the shape and speed of glacial formations across Antarctica. They found that some formations moved as much as 800 feet per year, and they also discovered a previously unknown ridge that runs east-to-west across the continent. The NASA animation below shows the ice flow patterns.

“This is like seeing a map of all the oceans’ currents for the first time,” says Eric Rignot of the University of California, Irvine, who led the study (subscription required for access to the full article). “It’s a game changer for glaciology.”

The observations also showed that the ice moves by slipping and sliding along the land, and not by being crushed and broken down by ice above it, as had previously been theorized by many glaciologists. That difference is critical to forecasting sea level rise in decades to come, since a loss of ice at the water’s edge means “we open the tap to massive amounts of ice in the interior,” according to Thomas Wagner of NASA’s cryospheric program.