A National Network of Networks: The Discussion Continues

by James Stalker, CEO, RESPR, Inc., and Chair, R&D/Testbeds Working Group for the AMS Ad Hoc Committee on a Nationwide Network of Networks
The National Research Council (NRC) report, titled “Observing the Weather and Climate From the Ground Up: A Nationwide Network of Networks” (2009), provided the vision and inspiration for building a team of volunteers from across all three sectors (government, academia, and private) to investigate the report’s recommendations. This team, comprising six working groups (Organization and Business Models, Architecture, Measurements and Infrastructure, Metadata Policy, R&D and Testbeds, and Human Dimension), spent more than two years considering how to refine the recommendations and tackle the challenges identified in the original NRC report. It also identified other challenges in shaping this type of Nationwide Network of Networks (NNoN), which will be of critical importance to our country’s weather-ready future.
This volunteer team has published additional recommendations, compiled into a draft report available on the American Meteorological Society website. Several drafts of the team’s report were made available to the larger weather and climate enterprise community for comment over many months, and the final version reflects that community input. Readers of this blog are encouraged to read the final report and provide their comments to the Committee Chair and/or any of the Working Group Chairs of the Ad Hoc Committee on Network of Networks.
These volunteer efforts have certainly helped solidify stakeholder interest in a network of this magnitude and national importance, but much work remains. Many challenges are unresolved. For example, widespread support has not been secured for the idea of a central authority serving as the organizing body of the NNoN. Despite the volunteer team’s best efforts, an appealing organization and business model for such a central body has not been settled on. Other challenges include establishing how to:

  1. make this organizing body an autonomous body that is not unduly influenced by any one sector,
  2. make this body a financially sustainable entity in the long run,
  3. reach all the major stakeholders and get them to support this idea and contribute to its success.

With respect to the third challenge listed above, many of the sought-after stakeholders may not be actively engaged in weather and climate enterprise community activities, so finding effective ways to reach them is an even bigger challenge.
On a positive note, however, the NNoN effort will be discussed again, and support sought, at the AMS Washington Forum in April 2012 and at the Summer Community Meeting in August 2012 in Norman, Oklahoma. These two venues should prove quite useful for any Weather and Climate Enterprise participant or other stakeholder interested in the overarching effort to build a national asset that current and future generations will nurture and benefit from.

Plane Has Combative Attitude toward Storms

Technological advancements don’t always involve brand-new applications; sometimes, progress can be made when older technology is utilized in new ways. Such is the case with aircraft used for scientific research. “Experienced” military aircraft have proven to be effective for many types of atmospheric studies, and with the news (subscription required) that a powerful combat plane used by the military for many years is to be reconfigured and given a new assignment, many are looking forward to even greater research capabilities. Originally developed in the 1970s, the Fairchild Republic A-10 Thunderbolt II, better known as the “Warthog” or just “Hog,” is a twin-engine jet designed for close air support of ground forces. Now it’s being prepared to take on powerful storms.
For many years, the military plane of choice for research inside thunderstorms was the T-28. But as early as 1985, scientists recognized that this aircraft lacked the altitude reach, endurance, and payload capacity to adequately address many of their questions. After a number of workshops to study other options, the A-10 Thunderbolt was identified as a prime candidate to become the Next Generation Storm-Penetrating Aircraft. A subsequent engineering evaluation confirmed the scientists’ view of the A-10 Thunderbolt, but the U.S. Air Force was resistant to authorizing the jet for civilian use. With the advent of the Center for Interdisciplinary Remotely Piloted Aircraft Studies (CIRPAS), a research center at the Naval Postgraduate School in Monterey, California, an opportunity opened to put an A-10 Thunderbolt into service of the civilian science community. In 2010, the U.S. Air Force agreed to transfer an A-10 Thunderbolt out of mothballs to the U.S. Navy and, with funding from the National Science Foundation (NSF), to let CIRPAS (on behalf of the Naval Postgraduate School) operate it as it has operated a Twin Otter and other aircraft for the last 15 years. CIRPAS aircraft are equipped with basic meteorological, cloud, and aerosol sensors, and have ample capacity for additional instrumentation that collaborators from other universities or national laboratories may wish to use.

The A-10 Thunderbolt

The A-10 Thunderbolt must be completely reassembled to be prepared for atmospheric research. A main part of this effort is wing replacement, but other activity includes evaluation of reinforcement and engine protection needs. The jet will also have its nose-mounted, 30-millimeter cannon removed, opening up more space for scientific instruments. The aircraft is scheduled to be ready for flight in the fall of 2012 and for flying actual scientific missions by mid-2013.
So other than its name, what makes the A-10 Thunderbolt so qualified to fly into storms? Perhaps most importantly, its heavy armor, designed and built to withstand machine-gun and cannon fire. Most planes avoid cumulonimbus clouds and thunderstorms because the hazards that may be encountered inside such clouds–such as severe turbulence, severe icing, lightning, and hail–can be fatal. Encountering hail is particularly dangerous, as striking golf-ball-size hail at 200 mph can smash windshields and damage the airframe and engines. But the A-10 Thunderbolt is rugged enough to deal with such conditions. As Brad Smull of the NSF’s Division of Atmospheric and Geospace Sciences noted, “It turns out that being able to survive wartime flak has a lot in common with being able to handle a strong storm.”
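A rough back-of-the-envelope calculation gives a feel for the forces involved. The numbers below (solid-ice density, golf-ball diameter) are illustrative assumptions, not engineering data for the aircraft:

```python
import math

# Rough, illustrative estimate of the impact energy of a golf-ball-size
# hailstone met at 200 mph. Density and diameter are assumed values;
# real hailstones are often less dense than solid ice.
ICE_DENSITY = 917.0            # kg/m^3, solid ice
DIAMETER = 0.043               # m, roughly golf-ball size
SPEED = 200 * 0.44704          # 200 mph expressed in m/s (~89 m/s)

radius = DIAMETER / 2
mass = ICE_DENSITY * (4.0 / 3.0) * math.pi * radius**3   # ~0.04 kg
energy = 0.5 * mass * SPEED**2                           # joules

print(f"mass: {mass * 1000:.0f} g, impact energy: {energy:.0f} J")
```

That works out to roughly 150 joules per stone, on the order of a small-caliber bullet’s muzzle energy, and a storm can deliver many such impacts in quick succession, which is why unarmored airframes stay well away.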
Also valuable are the A-10 Thunderbolt’s flight capabilities. Much is still unknown about cumulonimbus and thunderstorms, and the A-10 Thunderbolt has the potential to reach parts of storms that were previously off-limits. While the T-28’s maximum flying altitude is about 4.5 miles (7 kilometers), the A-10 Thunderbolt can fly at altitudes of up to almost 7 miles (11 kilometers)–high enough to reach the icy heights of thunderheads and gather data on hail formation. It also has the ability to stay in storms for up to 3 hours, compared to about 1 hour for the T-28, and because the A-10 Thunderbolt flies relatively slowly–about 342 mph (550 kilometers per hour)–the data it collects should be of particularly high quality. It can also fly lower than the T-28, making it ideal for air-sea interaction studies, and its heavy payload will support lidar, radar, and other imaging systems.
Ultimately, the versatility of the A-10 Thunderbolt may prove to be its most attractive trait. For example, it might help  meteorologists understand what governs the evolution of a storm and its eventual severity; atmospheric chemists study how storms generate chemical species, transport material through the depth of the troposphere, and modify them in the process; atmospheric physicists investigate how clouds become electrified and how electrification may feed back to influence the microphysics and dynamics of storms; and scientists who observe storms using remote sensors (radars, lidars, satellite radiometers) and who try to predict storm evolution by use of models gather in-situ measurements to validate their observations.
[Portions of this post contributed by Haf Jonsson of the Naval Postgraduate School]

Dealing With a Challenging Science Policy Environment

by William Hooke, AMS Policy Program Director. Adapted from posts (here, here, here, and here) on the AMS blog
Living on the Real World, discussing this week’s AMS workshop in Washington, D.C.

Our community suddenly finds the larger host society fiscally constrained and bitterly divided politically. And this seems to be true not just for America but for much of the world. The sources of funding that have fueled the progress in Earth observations, science and services in recent decades are not drying up – but they are looking to be intermittent, unreliable. And reductions – perhaps deep cuts – may well lie ahead. Historic bipartisan support for our work is fraying a bit; here and there we experience criticism, some of it harsh.
We face a twofold challenge. The work we do has never been more urgent…but the underpinnings for that work are in jeopardy. And – this is sobering – it seems this conjunction may not be accidental. Instead, these twin trials are related; they stem from the same cause. A population of seven billion people, on its way to nine, is straining both the Earth’s resources and its own intrinsic innovative capacity. And all of us are getting nervous and snippy with one another. If we’re not careful, worse lies ahead.
Discussions this past week at the AMS workshop on Earth Observations, Science, and Services for the 21st Century showed two divergent approaches to this challenging societal context. What was striking, without going into the details, was the contrast between work underway to (1) augment networks of surface meteorological sensors and (2) deploy sensors in space. Both have had their recent successes. Shortly we’ll enjoy a substantial augmentation of surface carbon dioxide measurements – far sooner than most people had thought possible. And the successful NPP launch clears a huge hurdle for the world of aerospace and remote sensing of the Earth from space.
The distinction lies in what happens next. Those working on the surface networks see each sensor as seeding further sensors. They make comments like “…put this out in one state, and pretty soon other communities in that state will want their own sensor, and over time the network will build…” They’re looking to probe above the surface, characterizing not just conditions adjacent to the ground but conditions throughout the depth of the boundary layer (think of the inversion layer that traps pollutants, or the layer just beneath cloud formation).
The folks at the satellite end find themselves by contrast on autopilot settings that don’t look as if they’ll change significantly until around 2025. The JPSS missions that will succeed NPP are scheduled to follow a script that’s relatively cut-and-dried. In the meantime, everything else in the host society that wants these space-based Earth observations will be morphing constantly, rapidly – if anything, at an accelerating rate. And this rigidity brings costs.
A big key? Being able to change direction…to recognize, acknowledge, and correct mistakes. How to accomplish this? Still up in the air.

Gliders Do the Wave, in the Air and in the Ocean

One would think that the time when gliders were considered cutting-edge technology for science would have long passed. Yet this durable technology remains at the forefront of research, even today.
Where daredevil pilots once pushed the boundaries of engine-less flight into the upper reaches of the troposphere to study mountain waves, the Perlan Project now looks to send its pilots into the stratosphere–some 27,000 meters up–in the extreme reaches of mountain-perturbed winds. With a special glider that has a pressurized cabin, organizers of the Perlan Project hope to nearly double the world’s sailplane altitude record that they set in 2006 with a different sailplane.
Elizabeth Austin of WeatherExtreme, Ltd. (of Fallbrook, California), the forecast provider for the Perlan Project, will speak at the AMS Annual Meeting (Monday 23 January at 5 p.m.) about the high-altitude sailplane flights. Tests of the new, Phase 2 glider will begin in 2012 in California. Austin writes,

This two-seat sailplane is a one-of-a-kind, carbon fiber, pressurized sailplane that will utilize the polar night jet associated with the polar vortex to achieve an altitude of 90,000 feet (27.4 kilometers). The phase two glider has a wing span of 84 feet and will weigh 1,800 pounds loaded with two pilots and equipment. The windows are polycarbonate and do not get brittle at low temperatures. A special drogue chute is being designed that will not degrade rapidly with high levels of ozone exposure.

While piloted sailplanes are basically an extension of the daredevil mountain-wave research that’s been going on since before World War II, robotic devices have also recently been extending the art of research gliding far into the oceans.
You may remember that the cover of the August issue of BAMS featured an underwater glider as part of the article on the Alaska Ocean Observing System. At the upcoming Annual Meeting will be several oceanographic presentations involving the use of ocean gliders–for example here for P. Chu and C.W. Fan on thermocline measurements (Monday, 11:30 a.m.) and here for Phelps et al. on conditions for Arctic ice concentrations (Tuesday, 9:45 a.m. poster session).
Thanks to an open-source contest by Liquid Robotics, Inc., you don’t have to wait for the Annual Meeting to find out what it’s like to use the latest robotic gliders in oceanographic and meteorological observing. As a demonstration of robotic gliders powered by wave action, the Sunnyvale, California, company is launching four of its remote controlled craft in San Francisco on 17 November. Their goal: to cross the Pacific Ocean while collecting a variety of oceanic and atmospheric parameters.
The company is calling this record-breaking robotic journey the PacX Challenge, and it involves a prize for the scientist–that could be you!–who comes up with the best use of the data streaming back from the robots as they make their way westward and, hopefully, avoid shark bite (which has happened to one of the company’s gliders in the past).
The gliders (featured in today’s New York Times) move at only about one knot and will split into pairs in Hawaii. In about 300 days, one pair is expected to reach Japan; the other pair, Australia.
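Those figures are self-consistent, as a quick round-number calculation (my own sanity check, not the company’s route planning) shows:

```python
# Sanity check on the quoted pace: distance covered at about one knot
# over roughly 300 days. All figures are round-number assumptions.
KNOT_IN_KMH = 1.852                      # one international knot in km/h
days = 300

distance_km = KNOT_IN_KMH * 24 * days    # ~13,300 km
print(f"{distance_km:,.0f} km")
```

That comfortably exceeds the roughly 8,000-km great-circle distance from San Francisco to Japan, leaving margin for currents and detours along the way.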

While at sea, the Wave Gliders will be routed across regions never before remotely surveyed and will continuously transmit valuable data on salinity and water temperature, waves, weather, fluorescence, and dissolved oxygen. This data will be made available in near real-time to all registered individuals.
Oceanographic organizations already planning to use the data gathered during the Pacific crossing include the Scripps Institution of Oceanography, the Woods Hole Oceanographic Institution, and the Naval Postgraduate School in Monterey.

If you submit an abstract by 23 April 2012, you can design a scientific mission for the gliders and hope for this:

The grand prize winner will receive six months of free Wave Glider data services and will work with Liquid Robotics to chart the course and mission for the six month deployment, including configuration of onboard sensors.

Not a bad way to let robots do the work for you.

Looking at the Sun in a New Light

With orbiting observatories and solar probes now available to scientists, it might seem that studying the Sun (and its effects on weather and climate) has largely shifted to space-based technology. But in fact, ground-based monitoring of the Sun provides significant opportunities that aren’t possible from space. And the future of looking at the Sun from Earth is primed to become brighter with the recent announcement that the National Solar Observatory (NSO) will be moving to the University of Colorado at Boulder.
The NSO currently operates in two locations: Kitt Peak National Observatory in Arizona and Sacramento Peak Observatory in New Mexico. The consolidation of these two locations and the move to Colorado will be a multiyear process, with the actual physical relocation to begin around 2016.
The move will include the deactivation of older telescopes–some of which date to the 1950s–and will coincide with the construction of the Advanced Technology Solar Telescope (ATST), which when completed will be the largest optical solar telescope in the world. The ATST will be located in Hawaii, but the new NSO in Boulder will be the ATST’s science, instrument development, and data analysis center.
The dual projects should result in major advancements for solar exploration from the ground. The ATST will provide “unprecedented resolution and accuracy in studying the finescale magnetic field and solar activity that controls what happens in the solar atmosphere and beyond,” according to the NSO’s Stephen Keil.
Jeff Kuhn of the University of Hawaii explains how the ATST will be valuable in studying the Sun’s magnetic field, which drives much of the Sun’s activity:

Most of the changes that happen on the Sun are caused by changes in magnetic fields,  and the ATST is a very specialized instrument that allows us to see those changes, and in fact has a sensitivity to measure changes in the magnetic field at the same kind of magnetic field strength as the . . . magnetic field that exists on the Earth that makes your compass needle work.

The high-resolution images needed to study the Sun’s magnetic field require very large telescopes that are too expensive to send into space. With the development of adaptive optics technology, ground-based observations are now much sharper than in the past, allowing for the study of “extremely small, violently active magnetic fields that control the temperature of the corona, and the solar wind, that produce flares [and] x-ray emission,” according to Eugene Parker of the University of Chicago.
Additionally, ground-based observatories have the capability of not just creating images, but also of making movies that track solar changes on time scales of minutes or even seconds.
The NSO has created a video (available on this page) that explains more about the atmospheric effects of solar activity and other advantages of ground-based solar research.

Oklahoma Mesonet Station Stands Tall in EF-4 Tornado

The morning after the tornado: still standing tall.

by Chris Fiebrich, Oklahoma Climatological Survey
It was bound to happen eventually. The Oklahoma Mesonet has 120 weather stations across the state, about one every 30 km. Since 1994, we’ve had a lot of close calls with severe weather, but the highest wind speed ever recorded had been 113 m.p.h. at our Lahoma station during a thunderstorm in August 1994. That all changed on May 24, 2011, when a strong tornado clipped our El Reno station. The graph below shows that winds gusted to 151 m.p.h. shortly after 4:20 PM. Along with the wind gust, the station recorded a strong pressure drop.

At this time, the tornado has been rated as “at least EF4” (see http://www.srh.noaa.gov/oun/?n=events-20110524-pns1 for the latest on the tornado ratings). The tornado was on the ground for 75 miles. Its center was likely several hundred yards north of our station as it blew through.
A piece of flying debris sheared off the station’s 2 m anemometer just after it reported a wind gust of 126 mph. The station’s temperature aspirator was also damaged, one of the tower’s guy wires was snapped, and a piece of metal debris was found wrapped around the tower. Despite the minor damage, the tower stood tall and the official 10 m anemometer survived in perfect condition. A large nearby tree was uprooted and thrown across the roadway.
More pictures can be found on the Mesonet Facebook page at http://www.facebook.com/mesonet.

Nationwide Network of Networks–Now Is the Time for Your Input

by George Frederick, Chair, AMS Ad Hoc Committee on Network of Networks
Today’s Town Hall (WSCC 606, 12:15-1:15 pm) on the Nationwide Network of Networks (NNoN) coincides with the availability of a draft report by our committee, available online for comment and review.
The report is a result of the AMS’s intensive response to the 2009 National Research Council (NRC) report entitled Observing the Weather and Climate From the Ground Up: A Nationwide Network of Networks. That report summarized the work of a committee of the NRC’s Board on Atmospheric Sciences and Climate charged with developing “…an overarching vision for an integrated, flexible, adaptive, and multi-purpose mesoscale meteorological observation network….” In “…identifying specific steps…that meet(s) multiple national needs…” the committee was given five guidelines:

  • Characterize the current state of mesoscale observations and purposes;
  • Compare the US mesoscale atmospheric observing system to other observing system benchmarks;
  • Describe desirable attributes of an integrated national mesoscale observing system;
  • Identify steps to enhance and extend mesoscale meteorological observing capabilities so they meet multiple national needs; and
  • Recommend practical steps to transform and modernize current, limited mesoscale meteorological observing capabilities to better meet the needs of a broad range of users and improve cost effectiveness.

The committee focused on the planetary boundary layer over the United States, including coastal zones, extending from 2 meters below the surface to 2-3 kilometers above it. Forecast time scales ranged up to 48 hours. It considered the roles of federal, state, and local governments as well as the private sector. The goal was to guide development of “an integrated, multipurpose national mesoscale observation network.”
In reaction to the NRC report the AMS formed an ad hoc committee under its Commission on the Weather and Climate Enterprise to address the report’s recommendations and provide venues for community discussion and response.  The committee launched its effort at the AMS Community Meeting in Norman, Oklahoma, in August 2009.  Subsequently, six working groups have been busy addressing the recommendations in the NRC report.
The committee shares the vision of the NRC study, in which, ultimately, a “central authority” is required for the success of any nationwide network of networks. Traditional public-private-academic relationships will need to adjust to this new way of doing business—this will be a challenge for the entire community.
Other key recommendations include

  • A stakeholders’ summit should be convened at an early date to promote the NNoN initiative and continue the momentum achieved to date. Implementation plans should be a follow-on result of this summit.
  • As funding for a NNoN will be a challenge, an implementation strategy should be developed that prioritizes systems based on their economic benefits; e.g., it was evident that systems to improve observations of the earth’s boundary layer would benefit multiple users (wind energy, aviation, forecasting onset of convective activity) and should be given a high priority.
  • Ongoing R&D and treating all networks (new and old) as perennial testbeds will be essential to success in constantly assessing and improving the member networks of the NNoN and developing new and innovative methods for observing earth’s boundary layer.
  • The NNoN should adopt the Unidata Local Data Manager (LDM) to provide its communications backbone.
  • Metadata will be mandatory for applying data from the NNoN, and a combination of ISO 19115-2 and SensorML is recommended as the NNoN’s adopted metadata standards. Minimal and recommended sets of metadata elements should be adopted and well documented by the NNoN.
  • The human dimension must be considered when developing the NNoN and is key to engaging stakeholders and network operators as the market is developed.  User assessments and education will be key parts of this effort.

Air Quality Monitoring Gets Smart

Smartphones continue to get more popular and, well, smarter, making them ideal for large-scale data-gathering projects. The concept is called crowdsourcing, and there is a growing number of benefits that a large group of people with smartphones and other mobile devices could provide.
Case in point: Researchers at the University of Southern California have created an application for Android phones (an iPhone app is in development) that they hope will enhance air pollution monitoring. The “Visibility” app allows the public to send pictures they take of the sky to a central database, where the pollution levels for the pictures’ locations can be estimated and recorded. 
As long as the picture is predominantly of the sky and taken on a sunny day, the app can compare it to accepted models of the luminance of the sky for that location. This provides an estimate of visibility, which in turn helps in calculating the amount of certain types of aerosols in the atmosphere. With the help of some of the phone’s features–such as its accelerometer, compass, and GPS–the app can calculate the orientation of the camera and the sun and the time the picture was taken and send that information and the actual picture to a computer, which then estimates the pollution level for the area shown in the picture. The application sends this data back to the picture-taker while simultaneously recording it in a database. (A paper on the research is available here.)
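The paper linked above describes the full model, but the core comparison can be sketched in a few lines. The sketch below is a loose illustration of the idea rather than the USC team’s actual algorithm; the function names and the clear-sky constant are assumptions:

```python
# Illustrative sketch (not the Visibility app's actual algorithm):
# estimate a crude haze index by comparing a photo's mean sky luminance
# to an idealized clear-sky value. All names and constants are assumed.

def luminance(r, g, b):
    """Relative luminance of an RGB pixel (Rec. 709 weights)."""
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def haze_index(sky_pixels, clear_sky_luminance=200.0):
    """Ratio of observed mean sky luminance to a modeled clear-sky value.

    Values near 1.0 or above suggest that scattering by aerosols has
    brightened and whitened the sky; lower values suggest a clearer sky.
    """
    mean_lum = sum(luminance(*p) for p in sky_pixels) / len(sky_pixels)
    return mean_lum / clear_sky_luminance

# Example: a saturated blue clear-sky patch vs. a washed-out hazy patch.
clear_patch = [(60, 120, 220)] * 4
hazy_patch = [(200, 205, 215)] * 4
print(haze_index(clear_patch) < haze_index(hazy_patch))  # True
```

In the real app, the phone’s orientation, GPS position, and timestamp pin down where in the modeled sky each pixel sits, so the comparison can be made pixel by pixel against a proper sky-luminance model rather than against a single constant.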
More than 250 people downloaded the app in the first three days it was available. The app has already shown promising results in both Phoenix and the Los Angeles basin when the photo-derived observations are compared to air quality data collected by the EPA. With air pollution monitoring currently limited to a sparse distribution of monitoring stations, the potential exists for this new app and its successors to change the way we monitor the sky.
The app’s developers hope that its popularity continues to increase, which would help them to refine and update its performance. So if you’re reading this on your smartphone, here’s an opportunity to collect some air quality data of your own.

NPOESS Imager Delivered

This week, while the National Polar-orbiting Operational Environmental Satellite System was a hot topic at the AMS Annual Meeting, Northrop Grumman delivered a critical NPOESS sensor, the Visible Infrared Imager Radiometer Suite (VIIRS).
The VIIRS aboard NPOESS will provide highly detailed imagery of clouds, vegetation, snow cover, dust storms, and other environmental phenomena.
“The delivery of VIIRS enables us to move ahead on an advanced system consisting of spacecraft, sensors, and a ground segment that is already well underway,” said Dave Vandervoet, NPOESS program manager for Northrop Grumman Aerospace Systems. “This program made terrific progress last year, and the vast majority of the development risk is behind us now. The sensor that was delivered will be integrated onto the NPOESS Preparatory Project spacecraft, which will be launched next year.”
Raytheon built the instrument under contract to NPOESS prime contractor, Northrop Grumman.
A second VIIRS flight unit scheduled for deployment on the first NPOESS spacecraft, known as C1, is progressing as well.
For samples of next-generation satellite imagery from NPOESS, check out the NexSat web page from the Naval Research Laboratory in Monterey, which was described in the April 2006 issue of BAMS.

Image from NexSat.

Making a MoPED with Big Rigs

Speaking of the future of weather information along transportation corridors, one of the presenters in the Weather and Transportation sessions Monday (1:30 p.m.) is Brian Bell of Global Science & Technology, an exhibitor at the upcoming Annual Meeting. His topic is a project that will show NOAA how well the commercial trucking fleet can serve as an automated system to gather and report weather information on the road, just as airliners do in the air.
Last year at the AMS meeting GS&T demonstrated a novel mobile weather station with an inflatable satellite dish for easy deployment. But this fall the West Virginia company won a 9-month contract from NOAA to build the Mobile Platform Environmental Data observation network (MoPED). To learn more about the project before the Annual Meeting, see this article from the Times West Virginian.