The demise of the Maya civilization: Water shortage can destroy cultures

Something truly drastic must have happened to end the Classic Maya Period in the 9th century. Within a short period of time, this advanced civilisation in Central America went from flourishing to collapsing: the population dwindled rapidly, and monumental stone structures, like the ones built at Yucatán, were no longer being constructed. The reason for this demise remains the subject of debate even today. Model calculations by TU Wien may have found the explanation: the very irrigation technology that served the Mayans well during periods of drought may have made their society more vulnerable to major catastrophes.

The lessons learnt may also help us to draw important conclusions for our own future. We need to be careful with our natural resources — if technical measures simply deal with the shortage of resources on a superficial level and we do not adjust our own behaviour, society is left vulnerable.

Socio-hydrology

“Water influences society and society influences water,” says Linda Kuil, a PhD student of Prof. Günter Blöschl in the Vienna Doctoral Programme on Water Resource Systems at TU Wien, funded by the Austrian Science Fund. “The water supply determines how much food is available, and so in turn affects the growth of the population. Conversely, population increases may interfere with the natural water cycle, through the construction of reservoirs, for example.”

Since water and society have such a direct influence on each other, it will not suffice to describe them by separate models. This is why researchers at TU Wien explore the interactions between sociology and hydrology and represent them by coupled mathematical models. The emerging field of socio-hydrology establishes mathematical interrelationships, e.g., between food availability and birth rate, or between recent water shortages that are still fresh in our memories and society’s plans for building water reservoirs. These kinds of interrelationships, combined with a large amount of historical and current data, ultimately yield a complex system that produces different scenarios of human-nature interactions.

The water reservoir: a blessing and a curse

“It’s well-known that the Mayans built water reservoirs in preparation for dry spells,” Linda Kuil says. “With our model, we can now analyse the effects of the Mayans’ water engineering on their society. It is also possible to simulate scenarios with and without water reservoirs and compare the consequences of such decisions.”

As it turns out, water reservoirs can provide substantial relief during short periods of drought. In the simulations without reservoirs, the Mayan population declines after a drought, whereas it continues to grow if reservoirs provide extra water. However, reservoirs may also make the population more vulnerable during prolonged dry spells: if water management behaviour stays the same and water demand per person does not decrease, the population keeps growing, and the next drought then causes a decline in population far more dramatic than it would have been without reservoirs.
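The feedback Kuil describes can be illustrated with a deliberately simple toy model. This is an illustration only, not the TU Wien model; the growth rate, rainfall values, and reservoir capacity below are all invented for the sketch:

```python
# Toy socio-hydrological sketch: population grows while per-capita water
# is sufficient and declines in proportion to any shortfall. A reservoir
# banks surplus water, buffering short droughts but allowing the
# population to overshoot what rainfall alone can sustain.

def simulate(years, rainfall_series, reservoir_capacity=0.0, demand=1.0):
    """Return the population trajectory for a given annual rainfall series."""
    population = 10.0
    storage = 0.0
    history = []
    for rain in rainfall_series[:years]:
        supply = rain + storage
        need = demand * population
        if supply >= need:
            storage = min(supply - need, reservoir_capacity)  # bank surplus
            population *= 1.05                   # enough water: 5% growth
        else:
            storage = 0.0
            population *= supply / need          # shortfall: decline
        history.append(population)
    return history

# 20 normal years followed by a 3-year drought.
rainfall = [15.0] * 20 + [5.0] * 3
without = simulate(23, rainfall, reservoir_capacity=0.0)
with_res = simulate(23, rainfall, reservoir_capacity=10.0)

# Without a reservoir the population levels off near the rainfall-set
# carrying capacity; with one it grows well past that level, so the
# drought-year crash is much deeper relative to the peak.
```

Relative to its peak, the reservoir scenario loses a larger share of its population in the drought, mirroring the qualitative finding described above.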

Sustainable use of resources

We will probably never know all the reasons for the decline of the Mayans. After all, wars or epidemics may have played their part too. The socio-hydrological model developed by the Günter Blöschl-led team of researchers at TU Wien does, however, tell us that droughts and water issues are one possible explanation for their demise and shows us just how vulnerable an engineered society can be. “When it comes to scarce resources, the simplest solutions on the surface are not always the best ones,” Linda Kuil believes. “You have to change people’s behaviour, reassess society’s dependency on this resource and reduce consumption — otherwise society may in fact be more vulnerable to catastrophes rather than safer, despite clever technical solutions.”

Story Source:

The above post is reprinted from materials provided by Vienna University of Technology. Note: Content may be edited for style and length.

Nanofur for oil spill cleanup

Some water ferns can absorb large volumes of oil within a short time, because their leaves are strongly water-repellent and, at the same time, highly oil-absorbing. Researchers at KIT, together with colleagues at Bonn University, have found that the oil-binding capacity of the water plant results from the hairy microstructure of its leaves. That structure now serves as a model for further developing the new Nanofur material for the environmentally friendly cleanup of oil spills.

Damaged pipelines, oil tanker disasters, and accidents on oil drilling and production platforms can pollute water with crude or mineral oil. Conventional methods for cleaning up oil spills each have drawbacks: burning the oil or using chemicals to accelerate its decomposition causes secondary environmental pollution, while many natural sorbents, such as sawdust or plant fibers, are hardly effective because they also absorb large amounts of water. In their search for an environmentally friendly alternative, the researchers compared various species of aquatic ferns. “We already knew that the leaves of these plants repel water, but for the first time now, we have studied their capacity to absorb oil,” Claudia Zeiger says. She conducted the project at KIT’s Institute of Microstructure Technology.

Aquatic ferns, originally native to tropical and subtropical regions, can now also be found in parts of Europe. Because they reproduce rapidly, they are often considered weeds. However, they have considerable potential as low-cost, rapid, and environmentally friendly oil absorbers, as a short video at http://www.kit.edu/kit/english/pi_2016_115_nanofur-for-oil-spill-cleanup.php demonstrates.

“The plants might be used in lakes to absorb accidental oil spills,” Zeiger says. After less than 30 seconds, the leaves reach maximum absorption and can be skimmed off together with the absorbed oil. The water plant, of the genus Salvinia, has trichomes on the leaf surface — hairy extensions of 0.3 to 2.5 mm in length. Comparison of different Salvinia species revealed that the leaves with the longest hairs did not absorb the largest amounts of oil. “Oil-absorbing capacity is determined by the shape of the hair ends,” Zeiger emphasizes. The largest quantity of oil was absorbed by leaves of the water fern Salvinia molesta, whose hair ends are shaped like an eggbeater.

Based on this new knowledge of the relationship between the surface structure of leaves and their oil-absorbing capacity, the researchers improved the ‘Nanofur’ material developed at their institute. This plastic nanofur mimics the water-repellent and oil-absorbing effect of Salvinia to separate oil and water. “We study nanostructures and microstructures in nature for potential technical developments,” says Hendrik Hölscher, Head of the Biomimetic Surfaces Group at KIT’s Institute of Microstructure Technology. He points out that different properties of plants made of the same material frequently result from differences in their finest structures.

Story Source:

The above post is reprinted from materials provided by Karlsruhe Institute of Technology. Note: Content may be edited for style and length.

EPCOR Water acquires 130 Pipeline Project in Texas


EPCOR’s wholesale water business expands into the US market. Photo: EPCOR Water.

PHOENIX, AUGUST 22, 2016 — EPCOR Water (USA) Inc. (EPCOR USA), a wholly owned subsidiary of EPCOR Utilities Inc. (EPCOR), today announced that it has acquired the 130 Pipeline Project (130 Pipeline), a 53-mile wholesale water supply pipeline that delivers groundwater from Burleson County to eastern Travis County in the northeastern Austin, Texas metropolitan area.

“This acquisition is a natural extension of our U.S. business platform and builds off of our expertise in developing and providing water solutions in some of North America’s most challenging arid environments,” said Joe Gysel, President of EPCOR USA. “We are extremely pleased to be doing business in the state of Texas and look forward to developing and acquiring other similar businesses in the future.”

EPCOR USA purchased the 130 Pipeline infrastructure and associated contracts and debt for total consideration of up to approximately US$71 million, including future capacity-related payments.

The acquisition of the 130 Pipeline adds the state of Texas and wholesale water services to EPCOR USA’s business platform.

EPCOR owns and operates more than 3,200 miles of water distribution and sanitary collection mains and transmission pipe infrastructure across its 20 regulated water and wastewater districts in the United States. In Canada, the company is responsible for more than 2,420 miles of distribution and transmission water mains in the City of Edmonton, located in Alberta’s capital region. EPCOR also supplies wholesale water from its system to a vast regional network that serves over 290,000 people in more than 60 communities surrounding Edmonton.

“EPCOR has deep experience in water distribution and transmission development and operations. Leveraging that expertise for our U.S. business platform is a natural next step,” Gysel noted.

Designed to deliver nearly 18 million gallons of water daily, the 130 Pipeline supplies private groundwater to municipal customers in Travis County (Texas) under long-term contracts. Additional wholesale water supply customers can be supported by the 130 Pipeline, which delivers water from the Carrizo-Wilcox Aquifer – one of the largest and most prolific aquifer systems in Texas – with a high degree of quality, reliability and resistance to drought.

The 130 Pipeline is the company’s seventh acquisition since entering the United States, bringing EPCOR USA’s total investment since 2011 to US$736 million.

In 2011, the company acquired Chaparral City Water Company, followed by the 2012 acquisition of American Water’s Arizona and New Mexico assets and operations. In 2013, EPCOR USA acquired North Mohave Valley Corporation in Arizona and Thunder Mountain Water Company in New Mexico, as well as existing agreements and master-planning responsibilities to provide wastewater and recycled water services to a 7,000-acre development corridor in Glendale, Arizona. In 2016, it acquired Willow Valley Water Company in Arizona.

Today, EPCOR USA is among the largest private water utilities in the Southwest. In addition to providing wholesale water services in Texas, EPCOR is the largest regulated water utility in Arizona and New Mexico, providing water and wastewater services to more than 350,000 customers across 22 communities and seven counties.

WaterWorld Weekly Newscast, August 22, 2016

The following is a transcript of the WaterWorld Weekly Newscast for August 22, 2016. 

Hi, I’m Angela Godwin for WaterWorld magazine, bringing you water and wastewater news headlines for the week of August 22. Coming up…

Harvard study finds toxic chemicals in 33 U.S. water systems

Tank malfunction triggers sewage overflow at Texas plant

Tap water sickens thousands in New Zealand

Failed sensor causes wastewater spill in Michigan

Horrific side effect to Louisiana flooding

Polyfluoroalkyl and perfluoroalkyl substances (PFAS) were the focus of a recent Harvard study that looked at US EPA drinking water data from 36,000 water samples across the nation, as well as industrial sites and wastewater treatment plants.

The researchers identified 194 water supplies in 33 states where PFAS was detectable at the minimum reporting level.

About a third of those — 66 — indicated levels of PFOA and PFOS above the EPA’s recently established safety limit of 70 ppt.

Fluorinated compounds are used in the manufacture of commercial and industrial products and have been linked to a number of illnesses including cancer.

You can read more about the Harvard study in the journal Environmental Science & Technology Letters.

Last week, a malfunction at an Austin-area wastewater plant caused about 25,000 gallons of raw sewage to overflow an equalization basin.

Workers reportedly recovered the waste, and none of the sewage made it into a nearby waterway.

The Lost Creek plant, operated by Austin Water, has had a number of operational issues over the past few years and residents have filed numerous odor complaints.

The city says it is planning improvements, including to the equalization basin, by the end of 2016.

Meantime, the utility is planning to replace the tank’s damaged cover.

Last week, an outbreak of campylobacter bacteria in the water supply of Havelock North, New Zealand, sickened around 4,000 residents, causing a variety of gastric illnesses.

At least 17 people were hospitalized. The autopsy of an elderly woman who died last week showed the presence of the bacteria but health officials could not say for certain whether it was the cause of death.

Officials don’t know how the bacteria — which is usually spread by animal feces — got into the water supply but a full investigation is underway.

Meanwhile, the water system is being treated with chlorine and a boil water advisory is in effect.

More than 500,000 gallons of partially treated wastewater spilled from a treatment plant in Kalamazoo, Michigan, last week — reportedly caused by a failed sensor.

Excessive amounts of rain had almost doubled the plant’s daily flow, from 26 MGD to 47 MGD. Apparently, a third pump should have kicked on but didn’t because of a failed sensor.

Workers caught the error during a routine inspection and started the pump.

Plant engineers have since fixed the faulty sensor.

As if recent flooding in Louisiana weren’t bad enough, families in Denham Springs are discovering an unexpected consequence: the caskets of loved ones have been washed away by floodwaters. Josh Replogle has the full story.

For WaterWorld magazine, I’m Angela Godwin. Thanks for watching.

Poseidon working on interagency agreement to streamline permitting for Huntington Beach desalination project

HUNTINGTON BEACH, CA, AUGUST 22, 2016 — Poseidon Water has announced it is working on an agreement with state permitting agencies to streamline the approval process for the proposed Huntington Beach Desalination Project. The Coastal Commission originally planned to consider the Project’s Coastal Development Permit on September 9; however, Poseidon and Commission staff agreed to defer consideration of the Project’s CDP in order for an interagency agreement clearly defining the remaining permitting process to be finalized.

The Huntington Beach Project will produce 56,000 acre feet per year (50 million gallons per day) of locally controlled, drought-proof drinking water that will reduce Orange County’s need to import water from Northern California and the Colorado River. The Huntington Beach Project is the single largest source of new, local drinking water supply available to the region and is identified in County water planning documents as a planned future water supply. In May 2015, Poseidon and the Orange County Water District reached agreement on the terms for the District to purchase the facility’s full 50 million gallons-per-day capacity.  
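The two capacity figures quoted, 56,000 acre-feet per year and 50 million gallons per day, are mutually consistent, as a quick unit check using the standard conversion of about 325,851 gallons per acre-foot shows:

```python
# Cross-check: 56,000 acre-feet/year expressed in million gallons/day (MGD).
GALLONS_PER_ACRE_FOOT = 325_851  # standard US conversion factor

acre_feet_per_year = 56_000
mgd = acre_feet_per_year * GALLONS_PER_ACRE_FOOT / 365 / 1e6
print(round(mgd, 1))  # 50.0
```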

“California continues to suffer from the effects of the worst recorded drought in state history,” said Poseidon Vice President Scott Maloni. “Consistent with Governor Brown’s directive to ‘help local water agencies reduce the time required to comply with state-required environmental reviews,’ we are grateful to the staffs of the Coastal Commission, State and Regional Water Boards and State Lands Commission for working with us on the most efficient, orderly and timely permitting of the Huntington Beach Project.”

In September 2015, Poseidon resubmitted a new application to the Coastal Commission and proposed technological enhancements to the Project’s seawater intake and discharge facilities in order to comply with the State Water Board’s seawater desalination policy, adopted in May 2015. Poseidon first applied for the Coastal Development Permit in 2006. Earlier this month Coastal Commission staff requested that Poseidon postpone the planned September permit hearing and reconsider the sequence of Project permits and approvals. 

In June, Poseidon submitted an application to the Santa Ana Regional Water Quality Control Board to renew and amend its existing 5-year operating permit, which was last issued in 2012 and is due to expire in February 2017.  In July, Poseidon submitted an application to the California State Lands Commission to amend a lease agreement for the Project’s seawater intake and discharge facilities, which was first approved in 2010.  Poseidon’s Coastal Commission, State Lands Commission and Regional Board applications all include a common description of the Project’s seawater intake and discharge technological enhancements in compliance with the State Water Board’s Desalination Amendment.

Poseidon’s proposed Huntington Beach Project will be the first desalination facility in the world to include 1mm (1/25th inch, approximately the thickness of a credit card) slot width seawater intake screens and through-screen water velocity of less than 0.5 feet per second in an open-ocean setting. The plant will also include state-of-the-art diffuser technology that will ensure that the salinity level in the plant’s seawater discharge meets the State Water Board’s stringent new receiving water quality requirements.  These technologies will minimize the intake and mortality of all forms of marine life.  Earlier this month Poseidon announced the Huntington Beach plant will be the first large-scale water treatment plant in California to be 100% carbon neutral. 

About Poseidon
Poseidon Water specializes in developing and financing water infrastructure projects, primarily seawater desalination and water treatment plants in an environmentally sensitive manner. These projects are implemented through innovative public-private partnerships in which private enterprise assumes the developmental and financial risks. For more information on Poseidon Water and the Huntington Beach desalination facility, visit http://HBfreshwater.com.

Urban water pumping raises arsenic risk in Southeast Asia

Large-scale groundwater pumping is opening doors for dangerously high levels of arsenic to enter some of Southeast Asia’s aquifers, with water now seeping in through riverbeds with arsenic concentrations more than 100 times the limits of safety, according to a new study from scientists at Columbia University’s Lamont-Doherty Earth Observatory, MIT, and Hanoi University of Science.

Normally, groundwater levels in this monsoon region are higher than the rivers, so water flows from aquifers into adjacent waterways. A few years ago, however, scientists began noticing that large-scale groundwater pumping around cities like Hanoi was lowering the groundwater level, so much so that the flow had reversed in some areas and river water was making its way into the aquifers instead.

The scientists have since tested the water and riverbed along the Red River near Hanoi and discovered dangerously high concentrations of dissolved arsenic, far higher than expected. But they also found clear patterns of contamination that may help farmers and communities locate lower-risk sites for wells.

The findings, appearing in the American Geophysical Union journal Water Resources Research, carry important lessons for groundwater management in a region that has long struggled with health effects of arsenic contamination. Arsenic in groundwater is a problem in many countries, including parts of the United States, but it is widespread in Southeast Asia, where its impact on poor communities has been described as the largest mass poisoning in history. Long-term exposure can cause liver and kidney damage and skin cancers that cause sores on the hands and feet. Arsenic can also affect crop yields.

“We have this perception of groundwater as this giant underwater lake, a nearly infinite resource. But even in places where the water is rapidly recharged, using it a lot can move water around in ways that affect the location and extent of contamination,” said coauthor Ben Bostick, a geochemist at Lamont, who with coauthor Alexander van Geen of Lamont has been working for over a decade with communities in Vietnam, Cambodia and Bangladesh to avoid arsenic contamination and locate safer water sources.

In Hanoi, groundwater pumping doubled during the 2000s to an estimated 240 million gallons a day by 2010, and groundwater levels there have been dropping by about 1 meter per year. The effects have become evident in the village of Van Phuc, about 10 kilometers downstream, where arsenic from riverbed sediments has started to contaminate an older aquifer that had long been considered clean.

For the new study, the scientists traveled along the Red River using a device that looks like a syringe with a very long needle to take samples of riverbed sediment and water at a depth of 1 meter. They found the highest arsenic levels in areas where the river flow was slow and new sediment was being deposited, typically next to land inside a river bend. Young sediments can be highly reactive and susceptible to releasing arsenic as water flows through them. The sediments in these slow-water areas were less than 10 years old in places, and they were releasing arsenic into groundwater at concentrations exceeding 1,000 micrograms per liter, 100 times higher than the World Health Organization considers safe.
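The factor of 100 quoted above follows directly from the WHO drinking-water guideline for arsenic, which is 10 micrograms per liter:

```python
# Measured release from young sediments vs. the WHO guideline of
# 10 micrograms of arsenic per liter of drinking water.
WHO_GUIDELINE_UG_PER_L = 10
measured_ug_per_l = 1_000
print(measured_ug_per_l // WHO_GUIDELINE_UG_PER_L)  # 100
```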

In contrast, almost all of the wells near faster-flowing water — where little new sediment was accumulating — tested below the WHO limits.

“Prior to intensive pumping, the water would not have been flowing through these sediments, and the aquifer would not have been drawing in this much arsenic,” said Mason Stahl, lead author of the study and a recent Ph.D. graduate of MIT. “If groundwater pumping continues — and it will probably intensify — the contamination will continue to migrate.”

South and Southeast Asia are especially susceptible to arsenic poisoning because their low-lying deltas are largely made up of young sediments and have plenty of organic matter that contributes to the release of arsenic into water.

The Red River’s arsenic generally comes from iron oxides carried downstream from the mountains. When iron oxides are deposited along the riverbed, they land in an environment that is low in oxygen and high in organic matter from sources such as plant matter and sewage. Bacteria reduce the iron oxides, and that natural process releases the arsenic into the water. The process is fast: within a few months, the scientists measured concentrations of up to 1,500-2,000 micrograms per liter from new sediment.

The results suggest that for pumping, communities should still target the oldest aquifers, where sediments are no longer being deposited and most of the arsenic has leached out. But they also need to think about other potential arsenic sources, such as river bends with fresh sediment.

“The good news is that rivers normally meander, and cities very seldom are the size of one meander,” Bostick said. Cities can put their well fields in areas where the aquifers aren’t being recharged through young river sediment, he said. Damming a river would also keep sediment back and control the river’s height, but dams can pose other challenges by changing sedimentation, ecology and the water budget.

Water treatment and filtration are also becoming more common, and Bostick believes this may be the most effective solution. The arsenic is not going to go away overnight, and pumping for irrigation will have to continue so farms can feed the region’s large population. “Although groundwater has a lot of advantages, you also have to really think about how much you want to use it. With technology, it’s pretty easy to clean surface water to use it now,” Bostick said.

Michael Puma, an expert on water and food security at NASA’s Goddard Institute for Space Studies who was not involved in the study, noted that “Arsenic contamination of groundwater threatens well over 100 million people worldwide. These important findings will help us as we strive to manage and improve the quality of drinking water, especially for those living in extreme poverty around the world.”

The other coauthors of the new study are Charles F. Harvey of MIT; Jing Sun of Lamont; and Pham Thi Kim Trang, Vi Mai Lan, Thao Mai Phuong and Pham Hung Viet of Hanoi University of Science.

Does owning a well foster environmental citizenship? A new study provides evidence

Kansans who own water wells show more awareness of state water policy issues than those who rely on municipal water supplies, according to a study that could have implications for groundwater management and environmental policies.

Brock Ternes, a University of Kansas doctoral student in sociology, found that well owners prioritized issues related to the depletion of the High Plains Aquifer — which is the underground reservoir of freshwater beneath much of the western half of the state.

Based on a survey he conducted of 864 Kansans, Ternes discovered that well owners were significantly more aware of water supplies and water-related policies and agencies, including the Kansas Water Office, Groundwater Management Districts, and the Governor’s Long-Term Vision for the Future of Water Supply in Kansas.

“The people who use private wells for water are more likely to hear about water-related policy issues and pay attention to them,” said Ternes, who will present his study at the 111th Annual Meeting of the American Sociological Association (ASA).

For example, he found that well owners also tended to be more aware of the Kansas Aqueduct proposal, an $18 billion undertaking that would divert water from the Missouri River to western Kansas. Non-well owners were less familiar with this enormous waterworks project.

Like so many regions suffering from recent droughts, rural Kansas has been particularly hard-hit by the scarcity of water. The High Plains Aquifer has been over-pumped for its valuable irrigation water, and researchers estimate that unless pumping is curtailed, the aquifer will no longer support irrigation wells in portions of southwestern Kansas within 25 years, Ternes said.

“Sociological studies are imperative for understanding the mindsets of well owners, who are a distinct group of Kansans who will continue to influence the availability of groundwater,” Ternes said.

As part of his survey, Ternes found that private well owners highly prioritize conserving water for the future.

“Most well owners believe securing water is one of the top political challenges facing Kansas, and water policies are more likely to influence their vote in local and state elections than Kansans who don’t own wells,” said Ternes. In this study, he coins the term “groundwater citizenship,” which emphasizes the stewardship of aquifers and deliberate water conservation in order to conserve supplies of groundwater.

“My data suggest that well owners have different political priorities than non-well owners and conserve water with the hopes of extending their supply, which makes them a unique type of citizen,” he said.

This research could be valuable for policymakers and water officials in Kansas as they examine possible solutions for protecting the High Plains Aquifer. Engaging well owners who are passionate about these issues could help bring water conservation to the public forefront.

“Water supply infrastructure is clearly connected to how in-tune people are with their natural resources, which is profoundly important for environmental policymaking and survival in the Anthropocene,” Ternes said.

The study has broader implications for environmental stewardship as many states grapple with vulnerability to drought, he added.

“Technologies might grant us access to natural resources and make them seem more readily available when they are in reality much more scarce,” Ternes said. “This is why we need to analyze the systems that provide access to finite resources like water.”

Story Source:

The above post is reprinted from materials provided by American Sociological Association. Note: Content may be edited for style and length.

EPA and U. S. Steel to discuss proposed Great Lakes Legacy Act cleanup plan for Spirit Lake site


08/22/2016


DULUTH, MINN. – (Aug. 22, 2016) On Thursday, Aug. 25, the U.S. Environmental Protection Agency and U. S. Steel will hold a public information meeting about a proposed cleanup plan for the U. S. Steel Duluth Works/Spirit Lake site.


The public meeting will be held at Denfeld High School, 401 N. 44th Ave. W. An open house session with posters explaining the project will begin at 3 p.m. and will be immediately followed by a formal presentation of the cleanup plan at 6 p.m. Residents will have the opportunity to ask questions after the presentations.


EPA is accepting comments until Sept. 25. Comments should be submitted in writing to: Scott Cieniawski, Great Lakes National Program Office (G-17J) – EPA Region 5, 77 W. Jackson Blvd., Chicago, Illinois 60604-3590; or email to cieniawski.scott@epa.gov. Documents related to the proposed plan are available at West Duluth Public Library, 5830 Grand Ave.


Sediment in Spirit Lake near the former U. S. Steel Duluth Works on the St. Louis River was polluted with polynuclear aromatic hydrocarbons and heavy metals left over from the area’s industrial past. EPA and U. S. Steel are working together through the Great Lakes Legacy Act to complete the sediment cleanup project. EPA is also coordinating with the Minnesota Pollution Control Agency to ensure that the proposed remedy meets both state and federal requirements for the protection of human health and the environment.


For more information about the proposed cleanup plan, visit http://www.epa.gov/st-louis-river-bay-aoc/spirit-lake-legacy-act-cleanup.


For more information about the Great Lakes Legacy Act, visit http://www.epa.gov/great-lakes-legacy-act.

2014 Napa earthquake continued to creep, weeks after main shock

Nearly two years ago, on August 24, 2014, just south of Napa, California, a fault in the Earth suddenly slipped, violently shifting and splitting huge blocks of solid rock 6 miles below the surface. The underground upheaval generated severe shaking at the surface, lasting 10 to 20 seconds. When the shaking subsided, the magnitude 6.0 earthquake — the largest in the San Francisco Bay Area since 1989 — left in its wake crumpled building facades, ruptured water mains, and fractured roadways.

But the earthquake wasn’t quite done. In a new report, scientists from MIT and elsewhere detail how, even after the earthquake’s main tremors and aftershocks died down, earth beneath the surface was still actively shifting and creeping — albeit much more slowly — for at least four weeks after the main event. This postquake activity, which is known to geologists as “afterslip,” caused certain sections of the main fault to shift by as much as 40 centimeters in the month following the main earthquake.

This seismic creep, the scientists say, may have posed additional infrastructure hazards to the region and changed the seismic picture of surrounding faults, easing stress along some faults while increasing pressure along others.

The scientists, led by Michael Floyd, a research scientist in MIT’s Department of Earth, Atmospheric and Planetary Sciences, found that sections of the main West Napa Fault continued to slip after the primary earthquake, depending on the lithology, or rock type, surrounding the fault. In places where the fault ran through solid rock, such as mountains and hills, it tended to shift only during the main earthquake; in places with looser sediments, like mud and sand, it continued to creep slowly, for at least four weeks, at a rate of a few centimeters per day.

“We found that after the earthquake, there was a lot of slip that happened at the surface,” Floyd says. “One of the most fascinating things about this phenomenon is it shows you how much hazard remains after the shaking has stopped. If you have infrastructure running across these faults — water pipelines, gas lines, roads, underground electric cables — and if there’s this significant afterslip, those kinds of things could be damaged even after the shaking has stopped.”

Floyd and his colleagues, including researchers from the University of California at Riverside, the U.S. Geological Survey, the University of Leeds, Durham University, Oxford University, and elsewhere, have published their results in the journal Geophysical Research Letters.

Right time, right place

Floyd and co-author Gareth Funning, of UC Riverside, have been studying fault motions in northern California for the past seven years. When the earthquake struck, at about 3:20 a.m. local time, they just happened to be stationed 75 miles north of the epicenter.

“At the time, I did stir, thinking, ‘C’mon, go back to sleep!’” Floyd says. “When we woke up, we turned on the news, figured out what happened, and immediately got back in our cars, picked up the instruments we had in the field, drove down the freeway to American Canyon, and started to put out instruments at sites we had measured just a few weeks before.”

Those instruments made up a network of about a dozen GPS receivers, which the team placed on either side of the fault line, as close to the earthquake’s epicenter as they could. They left most of the instruments out in the field, where they recorded data every 30 seconds, continuously, for three weeks, to observe the distance the ground moved.

“The key difference between this study and other studies of this earthquake is that we had the additional GPS data very close to the epicenter, whereas other groups have only been able to access data from sites farther away,” Floyd says. “We even had one point that was 750 meters from the surface rupture.”

Creeping faults, silent shadows

The team combined its GPS data with satellite measurements of the region to reconstruct the ground movements along the fault and near the epicenter in the weeks following the main earthquake. They found that the fault continued to slip — one side of the fault sliding past the other, like sandpaper across wood — at a steady rate of several centimeters per day, for at least four weeks.

“The widespread and rapid afterslip along the West Napa Fault posed an infrastructure hazard in its own right,” the authors write in the paper. “Repeated repairs of major roads crosscut by the rupture were required, and in some areas, water pipes that survived the [main earthquake] were subsequently broken by the afterslip.”

The earthquake and the afterslip took many scientists by surprise, as seismic data from the area showed no signs of movement along the fault prior to the main shock.

Regarding the afterslip’s possible effects on surrounding faults, the researchers found that it likely redistributed the stresses in the region, lessening the pressure on some faults. However, the researchers note that the afterslip may have put more stress on one particular region near the Rodgers Creek Fault, which runs through the city of Santa Rosa.

“Right now, we don’t think there’s any significantly heightened risk of quakes happening on other nearby faults, although the risk always exists,” Floyd says.

Curiously, the scientists identified a large region beneath the West Napa Fault, just northwest of Napa, which they’ve dubbed the “slip and aftershock shadow” — a zone that was strangely devoid of any motion during both the earthquake and afterslip. Floyd says this shadow may indicate a buildup in seismic pressure.

“The fact that nothing happened there is almost more cause for concern for us than where things actually happened,” Floyd says. “It would produce a fairly small quake if that area was to rupture, but there’s just no knowing if it would continue on to start something more.”

Floyd says that in developing seismic hazard assessments, it’s important to consider afterslip and slowly creeping faults, which occur frequently and can continue over long periods following the more obvious earthquake.

“There are some earthquakes where we think we might be seeing some activity even 15 years after the main quake,” Floyd says. “So the more examples of an earthquake happening followed by afterslip that we can study, the better we can understand the entire process.”

NASA monitors the “new normal” of sea ice

This year’s melt season in the Arctic Ocean and surrounding seas started with a bang, with a record low maximum extent in March and relatively rapid ice loss through May. The melt slowed down in June, however, making it highly unlikely that this year’s summertime sea ice minimum extent will set a new record.

“Even when it’s likely that we won’t have a record low, the sea ice is not showing any kind of recovery. It’s still in a continued decline over the long term,” said Walt Meier, a sea ice scientist at NASA’s Goddard Space Flight Center in Greenbelt, Maryland. “It’s just not going to be as extreme as other years because the weather conditions in the Arctic were not as extreme as in other years.”

“A decade ago, this year’s sea ice extent would have set a new record low and by a fair amount. Now, we’re kind of used to these low levels of sea ice — it’s the new normal.”

The sea ice cover of the Barents and Kara seas north of Russia opened up early this year, in April, exposing the surface ocean waters to the sun’s energy weeks ahead of schedule. By May 31, the extent of the Arctic sea ice cover was comparable to average end-of-June levels. But the Arctic weather changed in June and slowed the sea ice loss: a persistent area of low atmospheric pressure, accompanied by cloudiness, winds that dispersed the ice, and lower-than-average temperatures, didn’t favor melt.

The rate of ice loss picked up again during the first two weeks of August, and is now greater than average for this time of the year. A strong cyclone is moving through the Arctic, similar to one that occurred in early August 2012. Four years ago, the storm caused an accelerated loss of ice during a period when the decline in sea ice is normally slowing because the sun is setting in the Arctic. However, the current storm doesn’t appear to be as strong as the 2012 cyclone and ice conditions are less vulnerable than four years ago, Meier said.

“This year is a great case study in showing how important the weather conditions are during the summer, especially in June and July, when you have 24 hours of sunlight and the sun is high in the sky in the Arctic,” Meier said. “If you get the right atmospheric conditions during those two months, they can really accelerate the ice loss. If you don’t, they can slow down any melting momentum you had. So our predictive ability in May of the September minimum is limited, because the sea ice cover is so sensitive to the early-to-mid-summer atmospheric conditions, and you can’t foresee summer weather.”

As scientists are keeping an eye on the Arctic sea ice cover, NASA is also preparing for a new method to measure the thickness of sea ice — a difficult but key characteristic to track from orbit.

“We have a good handle on the sea ice area change,” said Thorsten Markus, Goddard’s cryosphere lab chief. “We have very limited knowledge of how thick it is.”

Research vessels or submarines can measure ice thickness directly, and some airborne instruments have taken readings that can be used to calculate thickness. But satellites haven’t been able to provide a complete picture of sea ice thickness, particularly during melting conditions, Markus said. The radar instruments that penetrate the snow during winter to measure thickness don’t work once the salty water of melting sea ice is present, since the salinity interferes with the radar.

The Ice, Cloud and land Elevation Satellite-2, or ICESat-2, will use lasers to try to get more complete measurements of sea ice thickness. The satellite, slated to launch by 2018, will use a laser altimeter to measure the heights of Earth’s surface.

In the Arctic, it will measure the elevation of the ice floes, compared to the water level. However, only about one-tenth of sea ice is above the water surface; the other nine-tenths lie below.

To estimate the entire thickness of the ice floe, researchers will need to go beyond the above-water height measurements, and perform calculations to account for factors like the snow on top of the ice and the densities of the frozen layers. Scientists are eager to see the measurements turned into data on sea ice thickness, Markus said.

“If we want to estimate mass changes of sea ice, or increased melting, we need the sea ice thickness,” he said. “It’s critically important to understanding the changes in the Arctic.”

Story Source:

The above post is reprinted from materials provided by NASA/Goddard Space Flight Center. The original item was written by Maria-José Viñas and Kate Ramsayer. Note: Content may be edited for style and length.