by Paul Higgins, AMS Policy Program Director

In June, the U.S. House of Representatives voted to cut funding for Earth system science and services. One bill cuts roughly 5% each from NASA’s Earth Sciences and the total NOAA budget. That same bill also cuts more than 16% from the total funding for NSF’s Geoscience and Social, Behavioral, and Economics (SBE) directorates. In a separate bill, the House voted to cut more than 9% from Biological and Environmental Research (BER) in the Department of Energy’s Office of Science (though the name can be misleading to some, BER houses the research most relevant to our community).

Both bills passed with support from nearly all House Republicans and opposition from nearly all House Democrats. So the cuts suggest that House Republicans do not think as favorably of the Earth sciences as the AMS community might like.

To become law, these cuts would need sign-off from the Senate and the President. That doesn’t appear likely at this time, but the paths to agreement for any funding bill—which must be approved every year—are far too complicated to predict. The good news is that Republicans in the Senate (and Democrats in both chambers) appear more predisposed to fund weather, water, and climate research, and the President pushed for increases in weather and climate research in his budget proposal earlier this year. The bad news is that the House, Senate, and President must all ultimately agree on funding decisions, and even a compromise does not look like good news for our community.

Two contributing factors to the House funding bills are particularly noteworthy. First, the funding for Earth sciences at least partially reflects differing views on how best to deal with the larger budget situation.

The Federal budget consists of two types of spending: 1) mandatory programs (e.g., Medicare and social security), and 2) discretionary programs. Discretionary spending is often further divided into defense and non-defense spending. Much of the funding for science (e.g., NSF, NASA, NOAA, DoE, and USGS) is in the non-defense discretionary (NDD) category.

The President’s proposed FY 2016 budget for NDD spending, $526 billion, exceeds by $33 billion the $493 billion proposed by Republican leadership in the House and Senate. Note, however, that even the President’s proposed budget remains roughly $15 billion (2.8 percent) below FY 2010 levels (assuming a rate of inflation of 1.7% per year). So the Federal budget for research (along with everything else) is under pressure even under the President’s higher numbers.
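The inflation-adjusted comparison above is simple compound-growth arithmetic and can be checked in a few lines. In this sketch, the nominal FY 2010 NDD level of about $489 billion is an assumption back-solved from the figures quoted here, not a number from the article itself; the 1.7% inflation rate is the one stated above.

```python
# Hedged sketch of the budget comparison. The ~$489B FY 2010 NDD level is an
# assumed value implied by this article's figures; 1.7% is the inflation rate
# the article states.
fy2010_ndd = 489.0        # billions of dollars (assumed nominal FY 2010 level)
inflation = 0.017         # annual inflation rate from the article
years = 6                 # FY 2010 -> FY 2016

# FY 2010 spending expressed in FY 2016 dollars (compound growth)
fy2010_in_2016_dollars = fy2010_ndd * (1 + inflation) ** years

president_fy2016 = 526.0  # President's proposed NDD budget, billions
congress_fy2016 = 493.0   # House/Senate Republican leadership level, billions

shortfall = fy2010_in_2016_dollars - president_fy2016
print(f"FY 2010 NDD in FY 2016 dollars: ${fy2010_in_2016_dollars:.0f}B")
print(f"President's proposal is ${shortfall:.0f}B "
      f"({shortfall / fy2010_in_2016_dollars:.1%}) below FY 2010 in real terms")
print(f"Gap between President and Congress: ${president_fy2016 - congress_fy2016:.0f}B")
```

Running this reproduces the roughly $15 billion (2.8 percent) real-terms gap and the $33 billion difference between the proposals.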

Some of this ties back to the Budget Control Act (BCA) of 2011, which made direct cuts to discretionary spending (e.g., funding for science) along with even deeper spending cuts through “sequestration”—automatic across-the-board cuts to both defense and NDD that took hold because Congress failed to agree to a more comprehensive deficit reduction plan (which would have involved a combination of tax increases and more targeted spending cuts). The sequestration cut to NDD is an additional reduction of about six percent.

This overarching budget situation (or conflict) is both a justification for cutting programs and politically expedient cover for those who want to make funding cuts for other reasons, which brings us to the second factor.

The second contributing factor to the House funding bills is the President’s aggressive efforts at climate change risk management. These efforts, which have increased over the last year or so, appear to have angered some, particularly in the House. That anger appears to be finding expression in funding decisions for all of the Earth sciences. At first look, that may not seem to make sense, because climate science is a tiny fraction of the Earth sciences and climate change risk management is only tangentially related to climate science. However, the Earth sciences are somewhat easier politically and procedurally for members of Congress to target than climate science would be alone.

No matter the cause, our community has a strong interest in helping Congress better understand the value of the Earth sciences to the nation and the world. AMS sent a letter to all members of Congress to raise awareness of our contribution (http://ametsoc.org/sss/letters_geosciences_support_may_2015.pdf) but similar efforts from individual scientists throughout the country will likely be needed if policy makers are to view the Earth sciences in a more favorable light.

Strong positive messages, such as those in the AMS letter, are most likely to convey effectively the importance of our science and services to the nation.

Weather and climate information helps society manage risks and realize opportunities associated with existing weather patterns and changes to the climate system (natural and human caused). The services that result can include weather forecasts and warnings, flood and drought prediction and monitoring, natural hazard preparedness and response, public health monitoring, disease prevention and control, assessment and management of fire risk, and decision support for water resources, agriculture, transportation, and other key economic sectors.

Thoughtful engagement with the policy process has the potential to help shift the focus in Congress to the critical role the Earth sciences play in advancing the national agenda. That would help secure the support and resources that our community needs to make critical information and services available.


The 43rd Conference on Broadcast Meteorology and the Third Conference on Weather Warnings and Communication kicked off on Tuesday in the Raleigh Convention Center, attracting about 230 attendees. This annual meeting of meteorologists, social scientists, and other practitioners produced some exciting content and conference firsts.

Some of the AMS Communications Department staff sat down with presenters to talk about their research and presentations.

More videos with experts can be viewed on the AMS YouTube channel.

For the first time at an AMS conference, a panel discussion was streamed live. Marking the 10th anniversary of Hurricane Katrina, a panel of experts took part in a conversation about the deadly storm and what we can learn from it going forward.

If you missed it live, you can watch the video here. We did experience visual technical difficulties twenty minutes in, but the video recovers at minute thirty.


Last week’s Google hangout on extreme precipitation touched on a number of different topics related to preparing for extreme weather events and the larger goal of building a Weather-Ready Nation. Notably, one of the key themes that recurred throughout the hangout was “communication,” and a healthy discussion was evident on Twitter during the event. We’ve captured some of the highlights here, just below the full video of the hangout.



By the AMS Committee on Satellite Meteorology, Oceanography, and Climatology

Accurate forecasting and creation of weather products require large amounts of input data. Satellite data and imagery provide a large percentage of that time-critical information, including the basis of timely warnings of tornadoes and hurricanes, solar storm-induced electric currents, and the spread and concentration of volcanic ash clouds.

But the role of satellites in saving lives and preventing havoc from atmospheric events is not limited to originating essential data and imagery. Satellites make possible reliable and continuous transmission of data to the meteorologists who issue warnings, watches, and forecasts. For example, warning and water-management data from remotely located, geographically diverse terrestrial sensors in streams, rivers, lakes, and coastal areas are transmitted via the GOES Data Collection System. Thanks to satellites, these data get to first-responders and disaster managers anywhere in the country via the Emergency Managers Weather Information Network (EMWIN).

Many government agencies and private-sector partners have joined an NWS initiative called “StormReady®,” which requires multiple methods—including satellite transmissions—for receiving NWS warnings and hydrometeorological monitoring data. Rapid and reliable communications leading to life- and property-saving responses have never been better.

Unfortunately, the improvements made by the NWS StormReady® initiative may be threatened by recent and future radio‐frequency spectrum auctions prompted by the growing demand to share federal spectrum. Sharing between commercial broadband and sensitive satellite ground stations may be a source of radio frequency interference, which will disrupt weather product dissemination. For the first time, there is a real threat of these warnings not being received by first-responders because of potential interference caused by commercial broadband providers who will now share the same bands as StormReady® participants.

Private-sector and federal users receive imagery and science data directly from GOES/GOES-R satellites, which guarantees data availability with rapid receipt times. If terrestrial infrastructure is degraded, the direct broadcast ensures that data keep flowing.

As AMS Fellow Michael Steinburg put it at a recent webinar (see link at the end of this post): “On the one hand . . . we recognize the continued need to evaluate and optimize federal radio spectrum assignments and allocations as consumer electronics, mobile technology, and the Internet of Things experience explosive growth–sector growth that in fact results in significant growth for America’s weather industry, as new devices and platforms arise all over the world. On the other hand . . . this growth cannot put in jeopardy the core delivery methods that are used by governments and America’s weather industry to reliably collect, aggregate, and deliver foundational weather data because what those do is they provide mission-critical, lifesaving weather products. We cannot–as a Weather Enterprise united in our common goals of saving lives and improving the quality of lives for the world’s citizens–allow this to occur.”

The products developed from these satellites help answer questions such as:

  • “How many miles of coastal population should we evacuate ahead of landfall for a tropical storm or hurricane?”
  • “When does a severe storm forecast need to alter operations for the energy production or generation industry in a region under imminent threat for severe weather?”
  • “How does a mariner obtain the best possible data to enable ocean freight to safely arrive at our ports?”
  • “At what point do volcanic ash clouds, severe turbulence, or near-Earth radiation demand changes in the heading, altitude, and direction of a commercial or private aircraft to protect the safety of passengers and crew?”

Based on the results of the recent auction, which generated more than $40 billion in revenue, the temptation for government officials to focus more exclusively on the enormous revenue these auctions can create will be great. We, who provide the American people with reliable and accurate weather forecasts and warnings, along with the state and local disaster managers who rely on this information, must make our voices heard.

We urge you to be vigilant as recommendations are made for auctions of radio spectrum that may be shared between the nation’s weather satellites and commercial users. In the next few months, your input to the Federal Communications Commission will be needed to communicate the role this spectrum plays in weather forecasting and the importance of meteorological products across industry segments. Comments to the FCC Office of Engineering and Technology can be directed to Julius.Knapp@fcc.gov.

Two recent AMS-sponsored events discussed this situation in considerable detail. See https://ams.confex.com/ams/95Annual/webprogram/Session37898.html and http://swfound.org/events/2015/challenges-in-sharing-weather-satellite-spectrum-with-terrestrial-networks/.


Today at her keynote address to the AMS Washington Forum, U.S. Secretary of Commerce Penny Pritzker announced that NOAA is forming five new alliances to help bring its vast data resources to the public. The partnerships with Amazon Web Services, Microsoft Azure, IBM, Google, and the Open Cloud Consortium address the growing need for access to NOAA’s huge—and rapidly growing—environmental data resource.

That Secretary Pritzker’s announcement came at the opening of this year’s Forum is a testament to the sustained focus of these annual AMS gatherings in Washington, D.C. The Forum revisits recurring themes to build year-to-year unity—and progress—into the discussions. Last year, for example, the AMS Washington Forum participants focused on how data integration across disciplines and sectors drives the effectiveness of the weather, water, and climate enterprise. The Forum found that

Working across agencies and across sectors (e.g., health, energy) is becoming a new “normal” for solving problems. All agree the needs and demands for data, information and forecasts are continuing to change, so our enterprise must remain flexible and agile.

Though the context last year was more about the use of commercially provided data, this continuing Forum theme resonates with Secretary Pritzker’s announcement today. The new government–private sector partnerships are part of the overall movement toward “open government” (accessible, consistent data practices) that should enhance the flexibility and agility emphasized at the AMS Forum last year.

Forum participants also generally agreed last year that “while the private sector needs to take on a bigger role in the provision of weather data, the public and private sectors need more time to jointly determine the best path forward.” And indeed at that time NOAA was in an information-gathering phase preparing for the partnerships announced today. The agency issued a Request for Information (RFI) in February 2014 to see who might be able to help move NOAA data onto the cloud. Commercial partnerships would, according to the RFI, help pull together disparate NOAA sources and web sites and help people “find and integrate data from these sources for cross-domain analysis and decision-making.”

Data integration was not the only motivation. Serving as the sole distributors of their own data saddles government agencies with burgeoning information technology needs.

In a separate email newsletter today, NOAA Administrator Kathryn Sullivan elaborated on the scope of the Big Data need:

Of the 20 terabytes of data NOAA gathers each day — twice the data of the entire printed collection of the United States Library of Congress — only a small percentage is easily accessible to the public.

The cloud was a way to alleviate this situation, as the RFI stated:

NOAA anticipates these partnerships will have the ability to rapidly scale and surge; thus, removing government infrastructure as a bottleneck to the pace of American innovation and enabling new value-added services and unimaginable integration into our daily lives.

Private sector cloud services have a history of meeting such challenges. The cloud services are able not only to store the huge quantities of data NOAA produces each day but also to provide opportunities for cloud-based applications. This means information processing is possible remotely so that each user does not need to have his or her own advanced infrastructure to move and manipulate vast troves of data. Thus, working in parallel with traditional NOAA data distribution channels, cloud services are expected to enable widespread use of Big Data and to drive private-sector development of applications.

The continued AMS discussions here in D.C. over Wednesday and Thursday will further amplify such continuing themes as Big Data, providing an especially rewarding venue for participants who return year after year to the Forum. For example, tomorrow’s sessions on “Rail and Trucking” and “Information Needs for Water Related Extremes” hinge in part on data dissemination. Surface transportation was one of the panel topics last year as well, so repeat participants will have an opportunity to update their earlier impressions and find out how opportunities in that field are progressing.

By reaching out to the innovators of the cloud, NOAA stated it was

looking for partners to incite creative uses and innovative approaches that will tap the full potential of its data, spur economic growth, help more entrepreneurs launch businesses, and to create new jobs.

That’s pretty much the same reason leaders of the weather, water, and climate enterprise return year after year to the AMS Washington Forum.


Today’s Google hangout on “Women in Weather,” cohosted by AMS and the American Astronautical Society and presented by Northrop Grumman, featured an insightful and wide-ranging discussion about what it means to be a woman in the atmospheric sciences. But the conversation wasn’t just among the panelists–it was also active on Twitter. If you missed the live broadcast of the hangout, you can watch it below, and check out the sampling of tweets underneath the video.


At 10:36 a.m. on 22 March 2014, near Oso, Washington, the earth began to move. At first the lower section of the slope rising from the North Fork Stillaguamish River slipped. Then the rise above that collapsed, ultimately sliding so fast that nothing could stand in its way. An eyewitness near the river saw water tossed aside and turn black. A 30 m high wall of turbulent earth roared across and along the valley. About 8 million cubic meters of dirt and rock buried the village of Steelhead Haven and killed 43 people. The slide ultimately dammed the river as it raced at 60 km/h along a 1 km wide, 1 km long swath.

The Oso landslide (aftermath photo above, Mark Reid/USGS) was a scientific mystery. There was no obvious geological trigger, like an earthquake. And the slope itself, while prone to slides, was not precariously steep. Meteorologically, it was a rain-free day in a week of no precipitation. However, two new studies—one of them forthcoming soon in the Journal of Hydrometeorology—show why Steelhead Haven was in the wrong place at the wrong time, both geologically and meteorologically.

An overview paper this January in Earth and Planetary Science Letters showed how the Oso landslide underwent two stages of motion. The lower slope slipped slowly for about 50 seconds until the more radical collapse from above led to a highly mobile, liquefied state called a “debris avalanche.” As the landslide spread across the river, the debris picked up more moisture. The flow of dirt and rock spread the damage far beyond the initial slip of earth. The gushing mud and rock actually splashed against the opposite slope across the river and spread back upslope on top of itself.

Previous landslides in the Oso area had never attained that extremely mobile second stage. The slope of the 180 m high rise above the river is less than 20 degrees, and scientists have found that highly mobile landslides usually start on slopes greater than 20 degrees—typically more than 30 degrees. What made this one different?

The paper’s authors, Iverson et al., say one reason was the porous geology of the local sediments and silt. This porosity may have increased suddenly as the base of the slope started to slip. Then, as the ground slid, the pores contracted, raising water pressure and increasing the liquefaction that greased the skids for faster movement and more contraction. Furthermore, as rock and dirt overran the river, the slide picked up another 50,000 cubic meters of water and scoured the riverbed for more debris.

But if a critical sensitivity to initial geological conditions existed, why did the land give way on a sunny day like 22 March 2014 instead of during an earlier, rainier part of the season?

The analysis by Brian Henn et al. in the Journal of Hydrometeorology shows that the precipitation in the three weeks before the landslide was unexceptional (similar periods are expected every two years or so) compared to the soaking that the area can get during the rainy season. But the rain was exceptional (an 88-year expected return period) when compared to the same March window in past years, and that is a bad time to get wet.
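The return periods quoted here have a simple relationship to annual probabilities: an event with an N-year expected return period has roughly a 1-in-N chance of being equaled or exceeded in any given year. A minimal sketch of that relationship (the probabilities are generic illustrations, not values from Henn et al.):

```python
def return_period(annual_exceedance_prob):
    """Average recurrence interval (years) for an event with the given
    chance of being equaled or exceeded in any one year."""
    return 1.0 / annual_exceedance_prob

def prob_in_n_years(annual_exceedance_prob, n):
    """Chance the event occurs at least once in n years."""
    return 1.0 - (1.0 - annual_exceedance_prob) ** n

print(return_period(0.5))      # a "two-year" rain total: 50% chance each year
print(return_period(1 / 88))   # an "88-year" total: ~1.1% chance each year

# Even a rare event becomes likely over a long enough horizon:
print(round(prob_in_n_years(1 / 88, 30), 2))  # chance in any 30-year stretch
```

So the 88-year figure does not mean such rain arrives only once in a lifetime; it means roughly a 1% chance of a late-March window that wet in any given year.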

Since March is late for the rainy season, this meant additional water charged deep soils that were already wet. Heavy rains earlier in the year encountered soils that contained less moisture. The late rains came on top of an already wet season as well as four wet years before that.


As a result, soil moisture for the water year peaked six days before the landslide, at a level expected at that date only about once every 40 years. The soil moisture had surged beyond median levels in just a few weeks. [See figure above from Henn et al. 2015]

In other words, Oso was primed for a landslide, even on a dry day, partly because some of the rain had fallen late in the season—poor meteorological timing for the village of Steelhead Haven.

{ 0 comments }

The Florida Center for Investigative Reporting published allegations this week that the terms “climate change” and “global warming” were banned from state government communications in Florida, including state-agency sponsored research studies and educational programs. The Washington Post followed with reports that, for example, a researcher was required by state officials to strike such words from a manuscript about an epidemiological study before submitting it for publication.

No evidence of a written policy or rule has been reported, and state officials have denied any policy of the sort. Meanwhile, the media are hunting through Florida websites trying to find state documents produced during the administration of Gov. Rick Scott with contents that would contradict the charges of an unwritten policy, imperfectly enforced.

The controversy is one in a string of recent events reminding us how much scientists rely on their freedom of expression. Most often the problem has been the freedom of government scientists to speak about their work with the public. Lately this has caused a media blizzard in Canada.

Science ethicists may argue one way or another about where the limits of public expression lie for government scientists when they contradict policy goals. And certainly—as was seen most obviously in the Cold War—such goals can include national security concerns. But the AMS stance on filtering or tampering with science for nonscientific purposes is quite clear in its Statement on Freedom of Expression:

The ability of scientists to present their findings to the scientific community, policy makers, the media, and the public without censorship, intimidation, or political interference is imperative.

Freedom of expression is essential to scientific progress. Open debate is a necessary part of science and takes place largely through the publication of credible studies vetted in peer review. Publication is thus founded on the need for freedom of expression, and it is in turn a manifestation of freedom of expression.

One might think the job of journals is to screen out unwanted science, but it’s quite the opposite. Papers are published not because they are validated as “right” so much as they are considered “worthy” of further scientific consideration. In addition, the publication process itself—which AMS knows well in its 11 scientific journals—is not just for authors to report and interpret their work. It relies on free discussion. The peer review process usually allows reviewers maximum protection of anonymity to preserve the ability to speak freely about the manuscripts being scrutinized. The papers that pass review are then the starting point for documenting objections, alternative interpretations, and confirmation, among other expressions that only matter if made accessible to other scientists through peer reviewed journals.

The AMS Statement recognizes that such freedom implies responsibility:

It is incumbent upon scientists to communicate their findings in ways that portray their results and the results of others, objectively, professionally, and without sensationalizing or politicizing the associated impacts.

Scientists are not the only ones to treasure such freedoms, of course. Society benefits from the progress of science every day. This only happens when scientists freely, promptly, and prolifically report what they find—and that means exactly what they find, not what they are told to find. The alternative is to compromise the pursuit of truth and the very foundations of our health and prosperity.

We all become victims when science is not shared and cannot flourish. The fact that climate change has deep social, economic, and political implications today means it is even more important to recognize that with increasing value of climate change science comes the increasing temptation for policy makers to co-opt and alter that science. As the AMS Statement warns, the principles of free expression “matter most—and at the same time are most vulnerable to violation—precisely when science has its greatest bearing on society.”


On February 25, the AMS released its new policy on citations for data sources in journal articles. We were all set to tell authors about it when sadly, far bigger news stole the attention of scientists everywhere. The great creator of Spock, actor Leonard Nimoy, had died. Within two days, the story of data policy had become the story of Star Trek.

“That’s not logical,” you say.

OK, we’re not Vulcan, but even a human can see this. Data. Spock. Now is the time to bring them together.

Nimoy made an improbable—some would say illogically great—impact on society masquerading as a half-Vulcan, half-human creature named Spock hurtling through space on both the small and big screens. The tributes following Nimoy’s death last week have spoken of his ability to transcend the seeming limitations of such a curious role. Nimoy embodied racial ambiguity in a time of prejudice, ennobled diplomacy and rationality in an age of war, and gave voice to those who feel alien in their own neighborhoods and schools.

Of all the dualities in Spock’s character—so brilliantly portrayed by an immigrant’s son who skipped college—arguably the most explicit was his role as science officer on the bridge of the Enterprise. His struggle to remain true to the Vulcan creed of logic without emotion was a perfect expression of science in its time. For nerds of the 1960s and ‘70s, Spock’s reliance on logic echoed the haughty aloofness with which popular culture characterized scientists of the Cold War. But through his formidable devotion to knowledge, truth, and teamwork—working through all the pointy-eared social awkwardness he faced among his crewmates—Spock somehow made science a new kind of “cool” long before geeks made billions of bucks with computers.

The thing is, scientists are a duality, much as Spock and Captain Kirk were two sides of a coin. They get emotional about two things. One is logic. Scientists, like mathematicians, get dewy-eyed about beautiful theories, elegant proofs, and ingenious solutions. The other is data. Unlike Spock, they work themselves into a frenzy over data. The best way to make scientists swoon is to produce data that reveal secrets.

For science to live long and prosper, those data need to be treasured like a home planet. For a long time, most scientific publishers thought it was good enough that journal authors would casually mention data archives in their Acknowledgments. In this age of computer models and constantly updating technology, that’s no longer good enough. Now authors must use carefully sourced and dated formal citations and references that in turn lead to safeguarded, easily accessible repositories. The author’s guide online gives some helpful examples.

The new citation policy is just one step of many advancing data archive practices that were recommended in the AMS Statement on Full and Open Exchange of Data adopted in December 2013. That statement also calls on funding agencies to recognize the costs of managing data. It recognizes that data preservation and stewardship should be emphasized and discussed at meetings. It says AMS should promote conventions and standards for metadata to increase interoperability and usage, and that the Society should foster ways of deciding what data should be kept to improve preservation practices in the future.

AMS is not alone in this shift. There are others in the chain of research, publication, and archiving trying to do for data what Spock did for logic. Our Society is one of the original members of a year-old team of publishers, data facilities, and consortia called the Coalition on Publishing Data in the Earth and Space Sciences. COPDESS is working to ensure that data are preserved through proper, secure funding, and that careful decisions are made about what should be saved.

Most importantly, this international movement toward protecting and providing data is meant to preserve the scientific process. Science needs published studies to lead to more studies that can confirm or reject findings. According to the AMS Statement,

AMS should strongly encourage an environment in which scholarly papers published in scientific journals contain sufficient detail and references to data and methodology to permit others to test each paper’s scientific conclusions.

All that depends on data being available in the review process as well as in perpetuity, with published results closely aligned with open archives.

Logic and Data: the duality of the scientific spirit. It is easy to celebrate one without the other, but it would not be proper. Spock would understand.


A recent article in the New Yorker tried in vain to dissect and understand the term “wintry mix,” only to grimly report that it is a vile and disgusting weather phenomenon and that forecasters invoke it to cover their backsides when a variety of winter precipitation is about to descend upon man.

Far from vile and disgusting, a wintry mix is just that: a mixture of winter precipitation—snow, sleet, freezing rain—falling from the sky. No more, no less. Its mention will return to forecasts this weekend as a moisture-laden storm in the nation’s midsection plows into Arctic air and treks across the inland South and into the East next week. Rest assured: research and new technology are allowing forecasters to view wintry mix in amazing detail, better than ever before, improving predictions of the phenomenon by leaps and bounds.

Recently published research in a handful of AMS journals is shining a spotlight on the capability of dual polarization (dual pol) weather radar to determine different types of precipitation falling at the same time, including the once-dreaded wintry mix. Instead of shying away from such forecasts, meteorologists using the nation’s network of Doppler radars, upgraded in recent years with polarimetric technology, are beginning to get really good at chronicling the wintry mix in their forecasts.
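The way dual pol narrows down precipitation type can be illustrated with a toy classifier. Operational hydrometeor classification is a fuzzy-logic algorithm over many radar variables; the thresholds below are illustrative assumptions only, meant to show how reflectivity (ZH), differential reflectivity (ZDR), and correlation coefficient (CC) jointly hint at what is falling:

```python
def classify_echo(zh_dbz, zdr_db, cc):
    """Toy precipitation-type classifier from three dual pol variables.

    zh_dbz: horizontal reflectivity (dBZ), overall echo intensity
    zdr_db: differential reflectivity (dB), near 0 for tumbling ice,
            positive for oblate, water-coated particles
    cc:     co-polar correlation coefficient, near 1.0 for a uniform
            particle population, lower where shapes and phases are mixed

    Thresholds are illustrative, not operational values.
    """
    if cc < 0.95:
        return "mixed phase (wintry mix)"  # diverse particle types lower CC
    if zdr_db > 0.5:
        return "rain or wet snow"          # oblate, liquid-coated particles
    if zh_dbz > 35:
        return "heavy dry snow"            # intense, ice-dominated band
    return "dry snow"

# Hypothetical radar gates (ZH, ZDR, CC) during a winter storm
for gate in [(30, 0.2, 0.99), (38, 0.1, 0.98), (32, 1.1, 0.97), (28, 0.6, 0.90)]:
    print(gate, "->", classify_echo(*gate))
```

A drop in CC, for example, is a telltale of a diverse mix of particle shapes and phases in the radar volume: exactly the wintry-mix signature the studies discussed here key on.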

While the New Yorker implied that meteorologists disdain the term, wintry mix actually is looking more beautiful than ever to scientists. So nice, in fact, that we put the words on the cover of the latest BAMS: “Snow Globe: Dual Pol Deciphers Wintry Mix.”

This cover article in BAMS, by Picca et al., looks at New England’s monster blizzard of 9 February 2013, which unloaded more than 3 feet of snow on much of central Connecticut and Long Island. Dual pol radar’s unique modes deciphered the wintry mix inside an intense snowband producing lightning and snowfall rates of 3-6 inches per hour.

A composite of products from the dual pol radar on Long Island, New York (KOKX), shows reflectivity (ZH; top), differential reflectivity (ZDR; middle), and correlation coefficient (CC; bottom) of a heavy band of snow and ice in the Northeast blizzard of 9 February 2013 (from Picca et al., BAMS). Reports of precipitation types around the time of the radar products (top) provide ground truth for the radar signatures. The speckled areas of reduced CC in southern Connecticut and around KOKX are a result of ground clutter. The black dot indicates the location of KOKX, and the star represents the location of the Stony Brook University surface observations. The dashed and dotted outlines indicate the two areas (1 and 2) of mixed-phase precipitation. The underlined “LS” is the location of a “large sleet” report.

 

A similar article in Weather and Forecasting, by Griffin et al., documents for the first time the polarimetric radar signatures of the same intense convective snowband. The transition zone from subfreezing to above-freezing air (the 0°C isotherm) was exceptionally distinct in the radar signatures.

PPI displays of the polarimetric (i.e., dual pol) variables at (a)–(c) 2216 UTC 8 Feb and (d)–(f) 0236 UTC 9 Feb 2013, at 0.5° elevation, during the Northeast blizzard (from Griffin et al., WAF). The 0°C RAP model surface wet-bulb temperature is overlaid (bold, dashed). At 2216 UTC, pure dry snow was falling within colder temperatures north of the model-indicated 0°C isotherm, while wet snow and mixed-phase hydrometeors occurred within warmer temperatures south of the 0°C isotherm in (a)–(c). The solid black line indicates the location of the 144° azimuth RHI. At 0236 UTC, dry snow was predominant, while wet snow and ice pellets were also observed within the max ZH region, within below-freezing surface temperatures north of the 0°C isotherm in (d)–(f).

In the Journal of Applied Meteorology and Climatology (JAMC), an article by Kumjian et al. discusses the use of intensive radar measurements to study the finescale structure of more than a dozen Colorado Front Range snowstorms. And in Monthly Weather Review, Geerts et al. explain how a dual-Doppler synthesis technique on an airborne radar platform was able to directly measure hydrometeor vertical motion, improving the accuracy of the radar measurements.

Vertical slices (an RHI along the 181.99° azimuth at 0852 UTC 9 Apr 2013) through a Colorado snowstorm from Colorado State University’s CSU-CHILL dual pol radar show (a) reflectivity (ZH) as well as (b) differential reflectivity (ZDR), which indicates particle shape and size (from Kumjian et al., JAMC). Arrows show the locations of generating cells.

In Kumjian et al.’s JAMC article, a conceptual model of a vertical slice through a snow generating cell with a shroud echo shows example particle types. The shroud of large ZDR and low ZH values (yellow) indicates the presence of pristine anisotropic snow crystals with platelike or dendritic habits. The core of the generating cell (blue) is characterized more by snow aggregates or rimed crystals, the larger of which are descending (blue dashed lines). The core is also where the strongest updraft speeds (black vertical arrow), and thus the largest supersaturations with respect to ice, are located.
