New Study Now Quantifies the "Huge" Seafloor Movement in 2011 Japanese Earthquake

At a magnitude of 9.0, the earthquake off the Japanese coast last March was already known as one of the most powerful ever recorded, killing (in large part due to the ensuing tsunami) almost 16,000 people and damaging or destroying more than 125,000 buildings. A recent study (available here; subscription required) now quantifies just how monumental the event was: the seafloor in the Japan Trench northeast of the mainland, where the quake originated, was jolted 50 meters horizontally and 10 meters vertically–movement that was “abnormally, extraordinarily huge,” according to Toshiya Fujiwara of the Japan Agency for Marine-Earth Science and Technology.
Fujiwara led the research, which used multibeam bathymetric surveys to measure the depth of the water and the contours of the seafloor. He noted that the research team did not expect to be able to use such equipment to detect the crustal movement, which during most earthquakes occurs on scales of millimeters or centimeters. For example, the 2005 Miyagi earthquake, which had a magnitude of 7.2, registered a crustal shift of 10 centimeters at a geodetic station near the Japan Trench; the 2011 earthquake produced a shift of 15 meters at the same station. The study also found a vertical shift of at least 4-6 meters in a slab of ocean crust between the Japan Trench and the Japanese coastline, which may have contributed to the pulsating pattern of the tsunami waves that eventually struck the country.
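For readers curious about the mechanics, the general idea of differencing repeat bathymetric surveys can be sketched in a few lines. The one-dimensional profile, numbers, and processing below are invented purely for illustration; they are not Fujiwara's data or method, which relied on full multibeam swath surveys.

```python
import numpy as np

# Hypothetical 1-D bathymetric profiles (depth in meters, positive down),
# sampled every 10 m along a track crossing the trench slope.
dx = 10.0                                        # along-track spacing (m)
x = np.arange(0.0, 20000.0, dx)                  # 20 km track
pre = 6000 + 800 * np.tanh((x - 10000) / 2000)   # made-up pre-quake slope

true_horiz = 50.0                                # assumed trenchward slip (m)
true_vert = -10.0                                # assumed uplift (m, shallower)
post = np.interp(x - true_horiz, x, pre) + true_vert

# Estimate the horizontal offset by cross-correlating the slope gradients...
pre_g = np.gradient(pre, dx)
post_g = np.gradient(post, dx)
lags = np.arange(-20, 21)                        # search +/- 200 m
scores = [np.corrcoef(pre_g[200:-200],
                      np.roll(post_g, -k)[200:-200])[0, 1] for k in lags]
horiz_est = lags[int(np.argmax(scores))] * dx

# ...then the vertical offset as the mean depth change after re-alignment.
realigned = np.interp(x + horiz_est, x, post)
vert_est = float(np.mean((realigned - pre)[200:-200]))

print(f"recovered offsets: {horiz_est:.0f} m horizontal, "
      f"{vert_est:.1f} m vertical (negative = seafloor rose)")
```

Run on these synthetic profiles, the sketch recovers the 50-meter horizontal and 10-meter vertical offsets that were built in, which is the same basic before-and-after comparison, at vastly greater scale and care, that let the team quantify the real seafloor displacement.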
The researchers believe that the fault that caused the quake may extend as far as the axis of the Japan Trench.
“Previously, we thought the displacement stopped somewhere underground,” Fujiwara said, “but this earthquake destroyed the entire plate boundary.”
As we posted previously, a number of presentations at the AMS Annual Meeting in New Orleans will cover the community response to the earthquake and tsunami, including the keynote address for the 28th Conference on Interactive Information Processing Systems (Monday, 11:00 a.m., Room 356), in which Junichi Ishida of the Japan Meteorological Agency will discuss the earthquake’s impact, the JMA’s response to it, and lessons learned from the disaster.

The Services Response to the Tōhoku Disaster a Focus of the 2012 AMS Meeting

The science ministry in Japan reported last week that more than 30,000 square kilometers–eight percent of the country–is contaminated by radioactive caesium from the Fukushima nuclear plant disaster that stemmed from the Tōhoku earthquake and tsunami in March. The radiation was washed out of the skies by rain and snow. As much as four-fifths of the caesium ended up in the ocean–much of it having blown northeastward toward Alaska–and currents carried it to U.S. coastal waters within a week of the reactor releases. Within another week, some of the micron-sized particles had traveled around the world.
The geophysical dimensions of the earthquake-tsunami-meltdown last March are evident in so many ways, and so are the demands it placed on scientific services–from warnings of giant waves to forecasts of tainted precipitation and groundwater to modeling of global ocean currents. Not surprisingly, the disaster redefined the job of the Japan Meteorological Agency.
On the first day of full sessions at the upcoming 2012 AMS Annual Meeting in New Orleans, the epic Tōhoku cataclysm will be discussed from numerous angles, particularly the premium it put on enhanced operational response. “The earthquake and tsunami increased vulnerabilities to meteorological disasters such as sediment disasters, flood, and inundations, in the affected area, by shaking and loosening the soils and damaging the embankments and drainage facilities,” notes JMA’s Junichi Ishida.
Ishida’s presentation is the special keynote address of the Interactive Information Processing Systems (IIPS) conference (11 a.m. Monday, 23 January, Room 356). Ishida will talk about how JMA took these increased vulnerabilities into account by

  • changing criteria for heavy rain warnings to account for runoff and landslide vulnerabilities
  • lowering criteria for coastal inundation warnings (the earthquake actually lowered coastal ground levels, changing tidal configurations)
  • introducing extreme temperature warnings to account for reduced electricity capacity
  • enhancing aviation support (in particular for relief-flight traffic) because of flight dangers including radioactive clouds

The 11 March tsunami sweeps through Sendai Airport, where waters reached the second level of buildings, destroying key operations equipment, scattering mud and debris, and stranding more than a thousand people for two days. The airport eventually reopened as a hub of relief work. Photos copyright Japan Meteorological Agency, with thanks to Junichi Ishida, who will deliver the IIPS conference keynote at the 2012 AMS Annual Meeting.

At the same time (11 a.m. Monday, in Room 338), Yukio Masumoto of the Japan Agency for Marine-Earth Science and Technology will kick off a session devoted to the March 2011 disaster as part of the Coastal Environment symposium. Masumoto will speak about ocean dispersion of radioactive caesium-137 and iodine-131 after the Fukushima releases, including relationships with tides, surface winds, and, in one case study, atmospheric fallout. In his abstract, Masumoto reports, “In the near-shore region, the wind forcing is a dominant factor that controls the flow field, while large-scale currents and eddies advect the radionuclides in the off-shore region.”
Several other Monday morning presentations in the Coastal Environment session feature rapid American responses last spring to adapt and construct viable modeling systems for depicting Japan’s waterborne radiation hazards–speakers include Ronald Meris of the Defense Threat Reduction Agency, William Samuels of Science Applications International Corp. (SAIC), and Matthew Ward of Applied Science Associates.
After lunch, in the same session (2 p.m., Room 338), Gayle Sugiyama of Lawrence Livermore National Laboratory will talk about how the U.S. Department of Energy’s National Atmospheric Release Advisory Center provided analyses and predictions of the radioactive plume, estimating the exposure in both Japan and the United States. Guido Cervone of George Mason University (2:15 p.m., Room 338) will show how dispersion modeling helped reconstruct the otherwise unknown sequence of radioactive releases at the Fukushima nuclear plant. Masayuki Takigawa (1:45 p.m., Room 338) will discuss results from regional transport modeling of the radioactivity dispersion on land and ocean, while Teddy R. Holt of the U.S. Naval Research Laboratory will show passive tracer modeling capabilities with the Fukushima events in a coupled ocean-atmosphere mesoscale modeling system (1:30 p.m., Room 338).
In a parallel session of the Coastal Environment Conference next door (1:45 p.m., Room 337), Nathan Becker of NOAA/NWS will discuss calculations of detection times for various configurations of the sensors in the Pacific tsunami warning system, concluding that “for global tsunami hazard mitigation the installation of about 100 additional carefully-selected coastal sea-level gauges could greatly improve the speed of tsunami detection and characterization.”
Interestingly, Monday’s Space Weather posters (2:30 p.m.-4 p.m., Hall E) include a presentation by Tak Cheung on the ionospheric disruptions caused by the great Japanese earthquake last March. Forecasts of ionospheric disturbances affect yet another service in the wake of the disaster: the communications provided by shortwave radio operators. And that will be a topic for Kent Tobiska (Utah State Univ.) in the Space Weather session at 5 p.m. (Room 252/253).

New Tools for Predicting Tsunamis

The SWASH (Simulating Waves till Shore) model sounds like something that would have been useful in predicting the tsunami in Japan. According to its developer, Marcel Zijlema of Delft University of Technology, it quickly calculates how tall a wave is, how fast it’s moving, and how much energy it holds. Yet Zijlema admits that, unfortunately, it wouldn’t have helped in this case. “The quake was 130 kilometers away, too close to the coast, and the wave was moving at 800 kilometers per hour. There was no way to help. But at a greater distance the system could literally save lives.”
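Zijlema’s figures make the time pressure easy to check. A rough back-of-the-envelope calculation, assuming an open-ocean depth of about 5 kilometers (a value not given in the article) and the standard shallow-water wave speed of the square root of g times depth, goes like this:

```python
import math

g = 9.81                 # gravitational acceleration (m/s^2)
depth = 5000.0           # assumed open-ocean depth (m); not from the article
distance_km = 130.0      # distance from quake to coast, per Zijlema

# Shallow-water (long-wave) phase speed: c = sqrt(g * h)
speed_ms = math.sqrt(g * depth)      # ~221 m/s
speed_kmh = speed_ms * 3.6           # ~800 km/h, matching the quoted figure

travel_time_min = distance_km / speed_kmh * 60
print(f"wave speed ~{speed_kmh:.0f} km/h, arrival in ~{travel_time_min:.0f} minutes")
```

Roughly ten minutes from rupture to landfall leaves essentially no time to run a model, issue a warning, and evacuate, which is why Zijlema says the system could not have helped here but could save lives when the source is farther away.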
SWASH grew out of SWAN (Simulating Waves Nearshore), which has been around since 1993 and is used by more than 1,000 institutions around the world. SWAN calculates wave heights and wave speeds generated by wind and can also analyze waves generated elsewhere by a distant storm. The program can be run on an ordinary computer, and the software is free.
According to Zijlema, SWASH works differently than SWAN. Because the model directly simulates the ocean surface, film clips can be generated that help explain the underlying physics of currents near the shore and of how waves break on shore. This makes the model not only extremely valuable in an emergency but also a tool for designing effective protection against a tsunami.

Like SWAN, SWASH will be available as a public domain program.
Another tool recently developed by seismologists uses multiple seismographic readings from different locations to match earthquakes to the attributes of past tsunami-causing earthquakes. For instance, the algorithm looks for undersea quakes that rupture more slowly, last longer, and are less efficient at radiating energy. These tend to cause bigger ocean waves than fast-slipping subduction quakes that dissipate energy horizontally and deep underground.
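The article does not describe RTerg’s actual algorithm, but the screening logic it hints at, flagging shallow undersea quakes that rupture slowly, last a long time, and radiate energy inefficiently, can be caricatured in a short sketch. All field names and thresholds below are hypothetical placeholders, not RTerg’s real criteria:

```python
from dataclasses import dataclass

@dataclass
class QuakeEstimate:
    """Rapidly estimated source parameters (all values hypothetical)."""
    magnitude: float           # moment magnitude
    depth_km: float            # hypocentral depth
    undersea: bool             # epicenter beneath the ocean
    rupture_duration_s: float  # estimated rupture duration
    energy_to_moment: float    # radiated energy / seismic moment ratio

def looks_like_tsunami_earthquake(q: QuakeEstimate) -> bool:
    """Toy screen in the spirit of the attributes described above: shallow
    undersea events with unusually long rupture for their size and unusually
    low radiated energy get flagged for closer scrutiny."""
    if not q.undersea or q.depth_km > 50 or q.magnitude < 7.0:
        return False
    long_rupture = q.rupture_duration_s > 100      # slow, drawn-out slip
    energy_deficient = q.energy_to_moment < 1e-5   # weak shaking for its size
    return long_rupture and energy_deficient

# Example: parameters loosely evocative of a slow "tsunami earthquake"
print(looks_like_tsunami_earthquake(
    QuakeEstimate(magnitude=7.8, depth_km=20.0, undersea=True,
                  rupture_duration_s=125.0, energy_to_moment=3e-6)))
```

The real system, of course, derives these source properties in real time from seismograms at many stations, which is where the hard work lies.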
The system, known as RTerg, sends an alert within four minutes of a match to NOAA’s Pacific Tsunami Warning Center as well as the United States Geological Survey’s National Earthquake Information Center. “We developed a system that, in real time, successfully identified the magnitude 7.8 2010 Sumatran earthquake as a rare and destructive tsunami earthquake,” says Andrew Newman, assistant professor in Georgia Tech’s School of Earth and Atmospheric Sciences. “Using this system, we could in the future warn local populations, thus minimizing the death toll from tsunamis.”
Newman and his team are working on ways to improve RTerg so that warnings can be issued critical minutes sooner after an earthquake. They’re also planning to rewrite the algorithm to broaden its use to all U.S. and international warning centers.

Lessons of Sendai: The Need for Community Resilience

by William Hooke, AMS Policy Program Director, adapted from two posts (here and there) for the AMS Project, Living on the Real World
Events unfolding in and around Sendai – indeed, across the whole of Japan – are tragic beyond describing. More than 10,000 are thought to be dead, and the toll continues to rise. Economists estimate the losses at some $180B, or more than 3% of GDP. This figure is climbing as well. The images are profoundly moving. Most of us can only guess at the magnitude of the suffering on the scene. Dozens of aftershocks, each as strong as the recent Christchurch earthquake or stronger, have pounded the region. At least one volcanic eruption is underway nearby.
What are the lessons in Sendai for the rest of us? Many will emerge over the days and weeks ahead. Most of these will deal with particulars: for example, a big piece of the concern is for the nuclear plants we have here. Are they located on or near fault zones or coastlines? Well, yes, in some instances. Are the containment vessels weak or is the facility aging, just as in Japan? Again, yes. So they’re coming under scrutiny. But the effect of the tsunami itself on coastal communities? We’re shrugging our shoulders.
It’s reminiscent of those nature films. You know the ones I’m talking about. We watch, fascinated, as the wildebeests cross the rivers where the crocodiles lie in wait to bring down one of the aging or weak. A few minutes of commotion, and then the gnus who’ve made it with their calves to the other side return to business as usual. They’ll cross that same river en masse next year, same time, playing Russian roulette with the crocs.
It should be obvious from Sendai, or Katrina, or this past summer’s flooding in Pakistan, or the recent earthquakes in Haiti or Chile, that what we often call recovery isn’t really that at all. Often the people in the directly affected area don’t recover, do they? The dead aren’t revived. The injured don’t always fully mend. Those who suffer loss aren’t really made whole. When we talk about “resilience” we instead must talk at the larger scale of a community that has been struck a glancing blow. Think of resilience as “healing.” A soldier loses a limb in combat. He’s resilient, and recovers. A cancer patient loses one or more organs. She’s resilient, and recovers.
What happens is that the rest of us–the rest of the herd–eventually are able to move on as if nothing has happened. Nonetheless, if we spent as much energy focusing on the lessons from Sendai as we spend on repressing that sense of identification or foreboding, we’d be demonstrably better off.
The reality is that resilience to hazards is at its core a community matter, not a global one. The risks often tend to be locally specific. It’s the local residents who know best the risks and vulnerabilities, who see the fragile state of their regional economy and remember what happened the last time drought destroyed their crops, and on and on.
Similarly, the benefits of building and maintaining resilience are largely local as well, so let’s get real about protecting our communities against future threats. Leaders and residents of every community in the United States, after watching the news coverage of Sendai in the evenings, might be motivated to spend a few hours the following morning building community disaster resilience through private-public collaboration.
What a coincidence! There’s actually a National Academies’ National Research Council report by that same name. It gives a framework for private-public collaborations, and some guidelines for how to make such collaborations effective.
Some years ago, Fran Norris and her colleagues at Dartmouth Medical School wrote a paper that has become something of a classic in the hazards literature. The reason? They introduced the notion of community resilience, defining it largely by building upon the value of collaboration:

Community resilience emerges from four primary sets of adaptive capacities–Economic Development, Social Capital, Information and Communication, and Community Competence–that together provide a strategy for disaster readiness. To build collective resilience, communities must reduce risk and resource inequities, engage local people in mitigation, create organizational linkages, boost and protect social supports, and plan for not having a plan, which requires flexibility, decision-making skills, and trusted sources of information that function in the face of unknowns.

Here’s some more material on the same general idea, taken from a website called learningforsustainability.net:

Resilient communities are capable of bouncing back from adverse situations. They can do this by actively influencing and preparing for economic, social and environmental change. When times are bad they can call upon the myriad of resouces [sic] that make them a healthy community. A high level of social capital means that they have access to good information and communication networks in times of difficulty, and can call upon a wide range of resources.

Taking the texts pretty much at face value, as opposed to a more professional evaluation, do you recognize “resilience” in the events of the past week in this framing?
Maybe yes-and-no. No…if you zoom in and look at the individual small towns and neighborhoods entirely obliterated by the tsunami, or if you look at the Fukushima nuclear plant in isolation. They’re through. Finished. Other communities, and other electrical generating plants may come in and take their place. They may take the same names. But they’ll really be entirely different, won’t they? To call that recovery won’t really honor or fully respect those who lost their lives in the flood and its aftermath.
To see the resilience in community terms, you have to zoom out, step back quite a ways, don’t you? The smallest community you might consider? That might have to be the nation of Japan in its entirety. And even at that national scale the picture is mixed. Marcus Noland wrote a nice analytical piece on this in the Washington Post. He notes that after a period of economic ascendancy in the 1980s, Japan has been struggling for two decades with a stagnating economy, an aging demographic, and dysfunctional political leadership. He notes the opportunity to jump-start the country into a much more vigorous 21st-century role. We’re not weeks or months from seeing how things play out; it’ll take weeks just to stabilize the nuclear reactors, and decades to sort out the longer-term implications.
In a sense, even with this event, you might have to zoom out still further. Certainly the global financial sector, that same sector that suffered its own version of a reactor meltdown in 2008, is still nervously jangled. A globalized economy is trying to sort out just which bits are sensitive to the disruption of the Japanese supply chain, and how those sensitivities will ripple across the world. Just as the tsunami reached our shores, so have the economic impacts.
This is happening more frequently these days. The most recent Eyjafjallajokull volcanic eruption, unlike its predecessors, disrupted much of the commerce of Europe and Africa. In prior centuries, news of the eruption would have made its way around the world at the speed of sailing ships, and the impacts would have been confined to Iceland proper. Hurricane Katrina caused gasoline prices to spike throughout the United States, not just the Louisiana region. And international grain markets were unsettled for some time as well, until it was clear that the Port of New Orleans was fully functional. The “recovery” of New Orleans? That’s a twenty-year work-in-progress.
And go back just a little further, to September 11, 2001. In the decade since, would you say that the United States functioned as a resilient community, according to the above criteria? Have we really bounced back? Or have we instead struggled mightily with “build(ing) collective resilience, communities … reduc(ing) risk and resource inequities, engag(ing) local people in mitigation, creat(ing) organizational linkages, boost(ing) and protect(ing) social supports, and plan(ning) for not having a plan, which requires flexibility, decision-making skills, and trusted sources of information that function in the face of unknowns.”
Sometimes it seems that 9-11 either made us brittle, or revealed a pre-existing brittleness we hadn’t yet noticed…and that we’re still, as a nation, undergoing a painful rehab.
All this matters because such events seem to be on the rise – in terms of impact, and in terms of frequency. They’re occurring on nature’s schedule, not ours. They’re not waiting until we’ve recovered from some previous horror, but rather are piling one on top of another. The hazards community used to refer to these as “cascading disasters.”
Somehow the term seems a little tame today.