Workshop: Mapping Unstable Ground

I spent Friday at the Mapping Unstable Ground workshop, an experiment in getting local professionals into one place to talk about landslide hazards in British Columbia. The day was split into a series of talks, with plenty of unstructured time for the always-invaluable hallway chatter.

The purpose of the workshop was to discuss how effectively current processes (both in theory and in practice) let scientists create knowledge about hazards and pass that knowledge along (through hazard maps, risk assessments, disaster mitigation planning, and so on) to eventually provide a basis for decisions that reduce deaths.

One recurring theme that requires more consideration from me (and thus will be glossed over here) is rephrasing hazard assessments as probabilities within a given timespan instead of return periods. This allows for, in ways I’ll get to another day, analogies to lottery tickets (Someone has to win. Someone has to lose.).
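
Even glossing over the details, the basic arithmetic of the reframing is simple. A minimal sketch of my own (assuming events follow a memoryless Poisson process, which is itself debatable for earthquakes):

```python
import math

def probability_within(return_period_years: float, window_years: float) -> float:
    """Probability of at least one event in the window, assuming a Poisson
    process with an average rate of 1 / return_period_years."""
    rate = 1.0 / return_period_years
    return 1.0 - math.exp(-rate * window_years)

# Example: a "1-in-500-year" megaquake over a 50-year mortgage...
print(f"{probability_within(500, 50):.0%}")   # ~10%
# ...and over the next year, lottery-ticket style.
print(f"{probability_within(500, 1):.2%}")    # ~0.20%
```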

Current Events

Talk Summary

John Clague started the day with his version of a current events in catastrophes talk.

With earthquakes, location is everything. Vancouver is not close to any identified active fault, but this may be due more to a lack of research than to the actual geology. Victoria and Seattle both have identified active faults in close proximity (check Figure 4). A megaquake (M8-9) would produce shaking over a hundred kilometers from the source, lasting for several minutes, while a smaller, shallower earthquake (M6-7) would produce shaking over a much smaller area (radius of less than ~25 km) for a much shorter time. Megaquakes do happen in the Pacific Northwest (on average every 300 to 500 years, but that can be as little as 100 years apart and as long as 1,000 years apart), but smaller earthquakes are much more likely (deep events happen roughly every 30 to 50 years; shallow events on average every 10 to 20 years). Parallels for megaquakes are Chile and Japan; parallels for smaller earthquakes are the 2010 and 2011 New Zealand events.

The speaker started by drawing parallels between the February 2010 Chile earthquake and possible Victoria earthquakes, as the distances between epicenter and major population center are roughly comparable. In Chile, the buildings survived the 2-3 minutes of strong shaking, although they did suffer significant structural damage. Few landslides were observed, probably because the earthquake happened during the dry season (an unfamiliar concept in British Columbia).

The more recent Sendai earthquake and tsunami came next, with the speaker noting that significant subsidence after the earthquake further reduced the freeboard of seawalls engineered to protect against tsunami. Vancouver is fairly well protected from tsunami due to a very large offshore island (and a whole bunch of smaller islands) that should attenuate wave energy.

The two New Zealand events emphasize the importance of location. The 2010 earthquake happened in a sparsely populated area and caused some damage, but by sheer chance the 2011 earthquake (which was smaller than the 2010 event) was extremely poorly located, such that the most intense shaking coincided with the highest population density. As previously discussed, the suburban landslides, liquefaction, and disruption to communications and transportation infrastructure could all happen here with an equally badly placed earthquake.

Discussion

A Sequence of Unfortunate Events.
Hazards are often treated as independent events, yet in reality they are often interlinked in a sequence of disasters: an earthquake triggers a submarine landslide, which triggers a local tsunami (local application). We also briefly touched on current research on earthquake triggering (30 years of data suggest M7 events do not trigger M5+ events; there is potential for temporal clusters of megaquakes).

InSAR Subsidence Mapping

Talk Summary

InSAR is a technique that uses satellite radar images to detect changes in ground height. By using stacks of images instead of pairs, Bernhard Rabus and his team have created algorithms for separating out different forms of error (using spatial filters to separate long-range and short-range error, then temporal filters to separate atmospheric, topographic, and motion signatures). The filters can be enhanced by grouping regions instead of specific target points (using local knowledge to identify regions that move together, smoothing the data). This permits automated processing of images with nonlinear motion using relatively thin stacks (15 images), and will be quite cool when the data move from preliminary results to something that can be discussed more publicly. For now, using SRTM data as the underlying topography leads to resolution errors (averaging across building heights), and for high-resolution images a trade-off exists between noise and highly dynamic effects.
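
I don’t have access to the team’s algorithms, so the following is only a toy numpy sketch with synthetic numbers of my own invention. It illustrates why a thin stack beats a single pair: spatially smooth atmospheric error is uncorrelated in time, so a crude per-pixel fit against acquisition time (a stand-in for the real temporal filtering) pulls the motion signal out of the noise, while a spatial low-pass split separates long-range and short-range components.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(0)
n_images, ny, nx = 15, 64, 64          # a "thin" stack, as in the talk
t = np.linspace(0.0, 3.0, n_images)    # acquisition times in years (made up)

# Synthetic truth: a subsidence bowl sinking at up to 20 mm/yr.
yy, xx = np.mgrid[0:ny, 0:nx]
velocity = -20.0 * np.exp(-((yy - 32) ** 2 + (xx - 32) ** 2) / 200.0)

# Build the stack: motion + spatially smooth "atmosphere" + white noise (mm).
stack = np.empty((n_images, ny, nx))
for i, ti in enumerate(t):
    atmosphere = gaussian_filter(rng.normal(0, 5, (ny, nx)), sigma=10)
    stack[i] = velocity * ti + atmosphere + rng.normal(0, 1, (ny, nx))

# Spatial split: long-wavelength component vs. short-wavelength residual.
long_wave = gaussian_filter(stack, sigma=(0, 8, 8))
short_wave = stack - long_wave
print("short-wavelength residual RMS (mm):",
      round(float(np.sqrt((short_wave ** 2).mean())), 2))

# Temporal step: atmosphere is uncorrelated in time, motion is not, so a
# per-pixel least-squares fit against time recovers the velocity field.
design = np.vstack([t, np.ones_like(t)]).T                 # (n_images, 2)
coeffs, *_ = np.linalg.lstsq(design, stack.reshape(n_images, -1), rcond=None)
velocity_est = coeffs[0].reshape(ny, nx)

print("peak subsidence, true vs estimated (mm/yr):",
      round(float(velocity.min()), 1), round(float(velocity_est.min()), 1))
```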

Discussion

Where on Earth…
The discussion resembled a round of the ongoing Where on (Google) Earth…? game, in that members of the audience jumped in at each case-study zoom to identify which underlying geoscience features were driving the subsidence. It was a nice affirmation that cross-disciplinary teams are highly effective, with the speaker providing technical skills and data, and the audience providing local knowledge and interpretation.

Field Mapping Debris Flows

Talk Summary

Matthias Busslinger summarized how to field map debris flows (page 8-20) in a systematic manner. He recommends starting at the toe of the event to confirm the deposit wasn’t remobilized, eroded, or otherwise made unsuitable as a research case, and mapping in segments of similar characteristics.

The speaker went into detail on the difficulty of accurately estimating volume. The tendency to overestimate volume comes from its sensitivity to depth errors (Volume = length x width x depth, but because depth is much, much smaller than length or width, a small absolute error in depth measurements produces a big relative change in volume). One way to avoid this is to record average depths, not maximum depths, for each segment, and to dig or probe to confirm depth estimates where possible. He referenced Wise (1997), Hungr et al. (2005), and his thesis for further discussion of volume error estimates and of bulking between initial failure volume and final deposit volume.
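
To put some illustrative numbers on that sensitivity (mine, not the speaker’s):

```python
# Hypothetical segment dimensions in metres; not from the talk.
length, width = 200.0, 10.0
avg_depth, max_depth = 1.0, 2.5    # averaging vs. grabbing the deepest point

v_avg = length * width * avg_depth
v_max = length * width * max_depth
print(v_avg, v_max)                # 2000.0 vs 5000.0 cubic metres

# A 0.5 m error in depth is tiny against the 200 m length,
# but it is a 50% error in the estimated volume.
print((length * width * (avg_depth + 0.5)) / v_avg - 1)   # 0.5
```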

The talk ended with a plea I can echo: when mapping, continue beyond the termination point in order to provide full topography for use in dynamic modelling of events.

Discussion

Technology is wonderful.
With the wide availability of satellite imagery, it is usually simple to procure an image to overlay with a grid and use as a basemap for mapping. Use GPS to precisely geotag observations and photographs on the map.

Statistics: length-volume ratios.
Performing regional statistical analysis of the volumes versus areas (or lengths) of debris-flow deposits has utility in initial hazard assessment. Depending on the local geology, a debris flow may run out long and thin, or shorter but thicker; this analysis could be done even without field time, as area can be extracted entirely from satellite images.
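
A minimal sketch of what such a regional regression could look like, fitting the commonly used power-law form V = k·A^b in log space. The inventory values below are placeholders for illustration, not real mapping data:

```python
import numpy as np

# Hypothetical deposit inventory (placeholder values, not real mapping data):
# planimetric areas (m^2) from imagery, volumes (m^3) from field estimates.
areas = np.array([1.2e3, 3.5e3, 8.0e3, 2.1e4, 6.5e4, 1.8e5])
volumes = np.array([4.0e2, 1.5e3, 4.2e3, 1.4e4, 5.5e4, 1.9e5])

# Fit V = k * A**b by linear regression in log space.
b, log_k = np.polyfit(np.log(areas), np.log(volumes), 1)
k = np.exp(log_k)
print(f"V ~ {k:.3g} * A^{b:.2f}")

# First-pass hazard screening: predict a volume from area alone.
print(f"A = 5.0e4 m^2  ->  V ~ {k * 5.0e4 ** b:.3g} m^3")
```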

2007 Chehalis Lake

Talk Summary

This topic was split between two speakers across lunch, with Martin Lawrence discussing remote mapping and Nick Roberts covering the field mapping of the 2007 landslide into Chehalis Lake. The research is a multi-group project, loosely coordinated by DSIG @ CEATI.

Remote mapping required blending LiDAR and bathymetric data into an overall digital elevation model covering both above and below the lake surface. Mapping was complicated by the large amount of floating wood debris on the lake, which impeded marine navigation.
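
Conceptually, the blending step amounts to stitching two surfaces that already share a grid and vertical datum. A minimal numpy sketch of the general idea (my assumption of the approach, not the project’s actual workflow):

```python
import numpy as np

# Assume both datasets have been resampled to a common grid and datum.
lidar = np.array([[35.0, 22.0, np.nan, np.nan],
                  [30.0, 18.0, np.nan, np.nan]])        # subaerial surface (m)
bathymetry = np.array([[np.nan, np.nan, -5.0, -12.0],
                       [np.nan, np.nan, -7.0, -15.0]])  # lake bottom (m)

# Take LiDAR where it exists; fall back to bathymetry below the lake surface.
dem = np.where(np.isnan(lidar), bathymetry, lidar)
print(dem)
```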

Field mapping was complicated by extreme access issues, requiring the generation of detailed safety plans that covered extreme events while overlooking more mundane considerations. Observations were made by teams of observers taking GPS-controlled traverses, ferried to different points around the lake. Additional photographs were collected by trawling popular photography sites (Facebook, Flickr, and Picasa), a neat trick for getting pre-event photography at popular tourist destinations.

Discussion

Rules are rules are rules.
Most discussion focused on why Chehalis Lake is currently closed to public access (even to research teams), and on the practicality of this decision in the context of the probable ongoing hazard at the location (both in isolation and compared to other locations within the province).

Mount Meager

Talk Summary

Rick Guthrie described the Mt. Meager landslide near Pemberton in August 2010. The paper I want to link to is still being written, but something over 40 million cubic meters of material raced down the valley in one of the largest landslides in Canadian history. Smaller events have previously occurred in the same location, with hazard assessments identifying the possibility of large events like this one. Even in a remote location with no casualties, the landslide caused approximately $10 million in damage just through the destruction of roads, a bridge, and forestry-licensed trees.

Discussion

Realistic hazard tolerance.
The speaker pointed out that British Columbia has more hazardous area than most countries have area at all, making hazard management a different proposition than in other locations. The situation is compounded by low population density: fewer people are at risk, yet there is a smaller tax base for mitigating hazards. Discussion turned to what a realistic tolerance limit on hazards is, and how to accurately and meaningfully communicate what hazards exist. Risk changes in relation to the assumed benefits versus the consequences of a catastrophe; science has a responsibility to communicate the hazard so societies can make informed responses in determining acceptable risk. A balance exists between what federal, provincial, and municipal governments could reasonably do to mitigate risk, and the need for personal responsibility in assuming the residual risk of a given circumstance.

River Channels & Landslides

Talk Summary

Terry Rollerson spoke on air photo interpretation and field observations of the relationship between mass wasting and fluvial morphology in the Anderson River between 1961 and 2005.

Cheekye Fan

Talk Summary

Pierre Friele summarized changes in risk perception on the Cheekye Fan between 1959 and the present, covering various hazard maps (including research in 1995, 2000, 2003, and 2010) and later multi-hazard approaches (2005, and the ongoing Smart Growth BC assessment).

Discussion

Frequency and cost.
Adding cost to the ever-present balance of mitigating high-magnitude/low-frequency events (catastrophes) versus low-magnitude/high-frequency events (nuisance hazards) shifts the discussion of hazard mitigation to one of balancing annual socioeconomic losses, infrequent long-term socioeconomic losses, and infrequent long-term loss of life. This is a social choice, but one that can be informed by clear multi-hazard assessments to aid decision-making about how to allocate disaster mitigation funding to the hazards most important to the community.
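
A toy annualized-loss comparison (hypothetical numbers, purely for illustration) shows how this framing puts nuisance hazards and catastrophes into the same units:

```python
# Hypothetical annualized-loss comparison; the figures are illustrative only.
nuisance = {"annual_probability": 0.5, "loss_per_event": 2e5}         # frequent debris on roads
catastrophe = {"annual_probability": 1 / 500, "loss_per_event": 5e8}  # rare, very large slide

for name, hazard in [("nuisance", nuisance), ("catastrophe", catastrophe)]:
    eal = hazard["annual_probability"] * hazard["loss_per_event"]
    print(f"{name}: expected annual loss ~ ${eal:,.0f}")

# The catastrophe dominates the annualized ledger in this example, but the
# comparison still omits loss of life, which the discussion treated separately.
```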

Related Reading

Inevitability of the Improbable – risk on geologic time scales.
