Thursday, June 30, 2011

The Quantico Watershed Study and the Chesapeake Bay TMDL

On June 23, 2011, I attended the Prince William Department of Public Works meeting to hear the results of the recently completed Quantico Watershed Study. Warren High from the County’s engineering consultants, MACTEC, presented the results. The study was used to assess current stream conditions, examine existing stormwater management facilities, and identify future Capital Improvement Projects. The consultants used RSAT, a standardized system for scoring streams and stormwater basins, detention ponds, and retention ponds. The meeting was attended mostly by representatives of community groups and elected and appointed officials of the Quantico Bay area who have fought long and hard to restore Quantico Bay, which suffers from excessive sedimentation and hydrilla, an invasive aquatic plant.

Prince William County, at the edge of the greater Washington DC metropolitan area, is part of the greater Chesapeake Bay watershed and has 10 major watersheds of its own covering 360 square miles, which are in turn subdivided into 222 sub-watersheds. The Quantico Watershed Study is the fourth watershed study to be completed. The watershed studies evaluate CIP planning and stormwater management facilities, and regulatory compliance with federal and state requirements, including those of the EPA, the Clean Water Act, VA DCR and DEQ, and the Army Corps of Engineers. The studies typically examine a small number of sub-watersheds to characterize the watershed as a whole. The Quantico study examined six sub-watersheds, including Quantico Creek, South Fork, Dewey’s Creek, and Swans Creek. The sub-watersheds range from sea level to 450 feet in elevation and cover an area stretching from Prince William Forest Park at the northwesternmost portion of the watershed to Quantico Marine Corps Base and Dumfries, as well as several residential neighborhoods.

MACTEC found culverts impeding fish passage and supercharging the water flow, scouring the soil during rain events. There were mid-channel bars caused by woody debris, lawn cuttings, and trash. At utility corridors and low-head dams, corridor encroachment and the lack of a riparian buffer have led to severe bank erosion of 2-3 feet per year. Swans Creek has 30-foot cuts, and the extreme erosion has almost buried the stream. Every time there is a storm event, all that soil is mobilized. The eroded soil is deposited downstream, and the sedimentation is filling in the estuary.

MACTEC recommends several steps to increase stormwater retention volume and detention time to prevent further sedimentation in Quantico Bay. Legacy stormwater retention ponds were designed to discharge at a “2-year storm rate”; unfortunately, streams are shaped by 1.5-year storm rates, so all the existing retention ponds are reshaping our creeks and streams. The most basic recommendation is that on-site stormwater capture and retention needs to be beefed up to slow the discharge rate and capture the first-flush pollutants. The older holding basins do not do that, but they are nonetheless in compliance with the design standards that were in effect when they were built. Larger basins could slow the flow, provide wildlife habitat, and increase groundwater infiltration and recharge.

Open channel recommendations were the other major area that MACTEC felt needs to be addressed. These include infrastructure repair, debris removal, channel restoration or enhancement, riparian buffer restoration, and finally preservation and monitoring of the enhancements. Overall, MACTEC identified 30 open channel problems and 16 stormwater basin maintenance problems in the 6 sub-watersheds and estimates that these improvements and repairs will cost $15.7 million. Stabilizing the stream beds in the area of Prince William Estates and Dewey’s Creek, to slow the catastrophic erosion rate of 2-3 feet per year to a more natural rate, is the only way to ameliorate the rate of sedimentation in Quantico Bay. The only way to restore the bay would be first to stabilize the upstream stream beds and then dredge the bay. While a certain amount of erosion is entirely natural and part of the natural cycle, development, pavement, and destruction of the riparian buffers have resulted in extreme erosion.

Most of the repairs recommended by MACTEC address the open channel problems and will have to be maintained and monitored. These are the costs to address just one of the ten watershed basins in Prince William County, Virginia; extrapolating total costs for the entire county from that number yields hundreds of millions of dollars. In thinking about how to finance and maintain these improvements, credit trading and sale of credits under the EPA-mandated TMDL seem one source (beyond a direct surcharge tax on property) for funding them. If these types of improvements could be quantified within the Chesapeake Bay Model modules for compliance with the TMDL targets, there might be a way to fund some of these activities without resorting to the command-and-control regulatory model so popular to the north of us. Unlike MS4 (municipal separate storm sewer system) and wastewater treatment plant permits, which have measured results that can be traded, these would have to be given “model credit” for open channel work, on-site stormwater capture and detention, etc. if they are to be traded, but like farm BMPs they are probably low-hanging fruit. It might be possible to use the Scenario Builder within the Chesapeake Bay Model to generate simulations to quantify watershed restoration.

Sunday, June 26, 2011

A New Round of Rebate Funding for the Virginia Department of Mines, Minerals and Energy

The Virginia Energy Efficiency Rebate Program was originally launched in late October 2009 by Governor Kaine, utilizing a portion of the stimulus dollars that Virginia received to support the purchase of energy efficient products and upgrades for Virginia homeowners and commercial businesses. Energy efficiency improvements under the program included upgrading heating and air conditioning equipment, adding insulation, replacing leaky windows, and other improvements to reduce energy consumption. Homeowners were eligible for rebates of up to 20% of the costs of qualifying products and projects, up to a maximum of $2,000.

The first round of funding totaling about $10 million was reserved in less than three weeks when the program opened. In late March 2010, Governor Bob McDonnell announced that approximately $6.5 million was available for a second round of the rebate program to make existing homes and businesses more energy efficient. Funds for the second and final round of the Energy Efficiency Rebate Program were exhausted on March 26, 2010. Over 3,000 applicants were wait-listed, and eventually approved for rebates when many of the rebate reservations went unused. The Energy Efficiency Rebate Program was closed out on April 29, 2011 after paying out $10.4 million to Virginia homeowners and businesses.

Once more, unclaimed funds remain and are now being made available to other homeowners. Approximately $5 million will be available for a new Virginia Home Efficiency Rebate Program to make existing homes more energy efficient. Energy efficiency improvements include upgrading heating equipment, adding insulation, replacing windows, and making other improvements to existing homes that reduce energy consumption and utility costs. Under this new program, homeowners will be eligible to reserve funds for rebates of 20% of the costs of qualifying products or services or $595, whichever is less. A rebate is also available for energy audits, covering the cost of the audit or $250, whichever is less. The Virginia Department of Mines, Minerals and Energy opened the reservation process at noon on June 20, 2011, and the money is likely to be gone quickly, but sign up on the waiting list; any money not used within the time limit will become available to those on the wait list. The $5 million is simply unclaimed funds from earlier rounds.

Qualified energy efficient items and improvements purchased and installed on or after March 26, 2010 are eligible for the rebate if they meet all the other eligibility requirements. So if you were shut out of the last round of rebates, have the required documentation, and used a Virginia business to purchase or perform the work, you can apply. Items covered under the program are: oil furnaces; gas, propane, or oil hot water boilers; insulation and air sealing; replacement windows and exterior doors; and storm doors. Funding is also available for homeowners to reserve funds for geothermal heat pump systems under the new Geothermal Heat Pump Rebate Program. The geothermal rebate is 20% of the cost or $2,000, whichever is less.
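All of these rebates reduce to the same arithmetic: the lesser of a percentage of the cost and a fixed dollar cap. A minimal sketch of that calculation, using the program figures quoted above (the function name is my own, not part of any program documentation):

```python
def rebate(cost, rate=0.20, cap=595.00):
    """Rebate is the lesser of rate * cost and the program's dollar cap."""
    return round(min(rate * cost, cap), 2)

# Home efficiency rebate: 20% of cost, capped at $595
rebate(2000)                 # 20% of $2,000 = $400, under the cap
rebate(5000)                 # 20% of $5,000 = $1,000, so the $595 cap applies
# Geothermal heat pump rebate: 20% of cost, capped at $2,000
rebate(15000, cap=2000.00)   # capped at $2,000
```

The energy audit rebate fits the same pattern with a 100% rate and a $250 cap.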

These rebates are for Virginia homeowners only, not commercial facilities, and the energy efficiency products and systems must be purchased from a Virginia company. In addition, these items qualify under the Federal Home Energy Efficiency Improvement Tax Credit Program. The tax credit amount was reduced to 10% of cost up to $500 on qualifying items installed in 2011, and additional restrictions were added. More information on the federal tax credits, which were extended until December 31, 2011, is available at the Department of Energy web site.

One of my most successful home improvement projects was my home insulation project. Following the recommendations of the Oak Ridge National Laboratory, the attic, crawl spaces, eaves, ductwork, and the underside of a large portion of the main level floor were insulated with cellulose. The pipes, end caps, knee wall, sump pumps, and all identified areas were sealed; the garage was insulated and an insulated garage door installed. My total electricity bills for the following 12 months were 27% less than I paid in the 12 months before I added the additional insulation, and the winter liquid propane usage (as measured in volume used December through March for both years) was reduced by 25%. I was very surprised at the energy savings for what was already a well insulated home. The payback on this project was under 4 years, and I did not get any rebates because I completed the project in 2007.
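For readers weighing a similar project, the payback arithmetic is simple: divide the project cost by the annual savings. The dollar figures below are hypothetical illustrations only (the percentage reductions come from my experience above, but my actual bills and project cost are not stated):

```python
def simple_payback(project_cost, annual_savings):
    """Years needed for annual energy savings to recoup the project cost."""
    return project_cost / annual_savings

# Hypothetical example: a $3,000 insulation project, a $2,200 annual
# electric bill cut by 27%, and a $1,200 propane bill cut by 25%.
annual_savings = 0.27 * 2200 + 0.25 * 1200    # $594 + $300 = $894 per year
years = simple_payback(3000, annual_savings)  # roughly 3.4 years
```

Numbers in that range are consistent with a payback of under 4 years.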

Thursday, June 23, 2011

The Supreme Court and Carbon Dioxide

On Monday, June 20, 2011, the Supreme Court unanimously (8-0, with Justice Sotomayor recused) rejected a lawsuit that had sought to force major electric utilities to reduce their greenhouse gas emissions without waiting for federal regulators to act. The case was originally filed in July 2004, when eight states (California, Connecticut, Iowa, New Jersey, New York, Rhode Island, Vermont, and Wisconsin) and New York City filed suit in federal court. New Jersey and Wisconsin later dropped out. The six remaining states were joined by several land trusts and legal foundations dedicated to litigating their way to their envisioned better future. The suit was filed against AEP, Xcel Energy Inc., Duke Energy Corp., Southern Co., and the Tennessee Valley Authority, who were ultimately joined in their fight by legal foundations with the opposite or differing world view.

The states claimed that the utilities contribute to global warming by pumping 650 million tons of carbon dioxide into the atmosphere each year, representing about 25% of emissions from U.S. power plants and 10% of emissions from all U.S. sources. The suit sought to have the courts force cuts in emissions from these plants.

The utilities had questioned the states’ legal right, or (in legal speak) standing, to sue, because the states couldn’t show that they were harmed by anything the utilities did or that they would benefit from a ruling against the power companies. On the standing issue the court split 4-4, with Justice Sotomayor recused (she had heard the case on appeal in New York), so the court made no ruling. It is truly difficult to see how the plaintiffs could have shown harm from the utilities or benefit from the reduction of emissions, even if the utilities shut all their plants down. First, the states would have had to demonstrate that global warming is occurring and is caused by the utilities’ operations, then demonstrate how the states were harmed by global warming, and finally demonstrate how the states would benefit from a reduction in carbon dioxide released by the plants. Nonetheless, the court did not rule on standing. The court should stick to questions of law and leave interpretation of science and scientific speculation to agencies.

In the opinion written by Justice Ruth Bader Ginsburg, the court held that the states and other plaintiffs can’t use federal public-nuisance law to seek court-imposed limits on carbon dioxide emissions. Federal common law is displaced, and no nuisance claim is within the powers of the court to decide, because Congress authorized EPA to regulate greenhouse gas emissions under the federal Clean Air Act (CAA). In a previous decision, Massachusetts v. EPA (2007), the Supreme Court had ruled that the Clean Air Act did authorize federal regulation of greenhouse gas emissions, and that the agency was required to issue regulations unless it had a scientific basis for its refusal.

Justice Ginsburg said the plaintiffs were making their case in the wrong forum; Clean Air Act authority precludes federal common law claims even when the agency has not exercised its statutory authority. Justice Ginsburg emphasized EPA’s plans to regulate utility greenhouse gas emissions under Section 111 of the Clean Air Act, which governs the establishment of New Source Performance Standards (NSPS).

The decision noted that Clean Air Act Section 111(d) confers authority on the EPA to set NSPS for existing sources as well as new sources. However, EPA has infrequently used its authority under Section 111(d). Though it appears that EPA intends to regulate greenhouse gases under the Clean Air Act, EPA does have another option that it has used more frequently in the past: it can choose to regulate greenhouse gases and carbon dioxide under the National Ambient Air Quality Standards (NAAQS), where the costs of regulation cannot be considered. Under Section 111(d) of the Clean Air Act, costs of regulation must be considered.

When EPA regulates using the NAAQS provisions, the result is more stringent regulation because costs cannot be considered under NAAQS. However, Justice Ginsburg seemed to identify the Clean Air Act as the source of the authority to regulate carbon dioxide in her decision. The current scientific, political, and economic environment is one where costs must be considered. The science of climate is still beyond our full understanding, and the methods of regulation as well as their costs must be considered if our nation is to maintain anything close to our standard of living and the financial ability to respond to natural disasters, severe weather, and changes in climate. We are poorer than we once were, and no other nation will race to our assistance.

Monday, June 20, 2011

Sun Spots, Solar Minimum and Climate Change

After a period of increased activity during the 20th century, the sun now appears to be in an extended solar minimum. Three teams of scientists presented results on June 14, 2011 at the meeting of the American Astronomical Society’s Solar Physics Division in New Mexico, all indicating an extended period of low solar activity.

The evidence “all indicates that the next solar cycle will be delayed and the sun is headed into a quiet period,” said Frank Hill, associate director of the National Solar Observatory. According to space weather scientist Bruce Tsurutani at NASA’s Jet Propulsion Laboratory, a solar minimum is defined by sunspot number, and 2008 was identified as the period of solar minimum. However, the solar geomagnetic effects on Earth did not reach their minimum until 2009. It was expected that solar activity would begin to build again by this time in preparation for the next cycle and the next solar maximum, but the sun remains clear of sunspots. The sun has been unusually quiet for four years, and scientists do not know how long the decline in solar activity may last.

The sun's magnetic field reverses polarity roughly every 11 years, completing a full magnetic cycle every 22 years and driving the 11-year sunspot cycle. Jet streams on the sun's surface and below are early indicators of solar storm activity, and the jet streams have not yet formed for the cycle expected to peak around 2020. According to Dr. Hill, that indicates there will be little or delayed activity in that cycle; the length of the delay is not known. In recorded history there have been three episodes where the regular 11-year solar cycle has not occurred, and these correlated with cool periods on Earth around 1650, 1770, and 1850.

Now solar physicists warn of possible global cooling if solar activity falls. No sunspots were visible for a 70-year period starting in 1645, known as the Maunder Minimum. The Little Ice Age, a period of global cooling, occurred along with the Maunder Minimum, after the Medieval Climate Optimum, a global warming period. Several causes have been proposed for the Little Ice Age: an extended period of low solar activity, heightened volcanic activity, changes in the circulation patterns of the oceans, an inherent variability in global climate, or decreases in the human population. The earth is such a complicated system, with so many inputs and dependent cycles, that an accurate model to test these theories has not been developed and is unlikely to be.

Climate scientists, on the other hand, believe that an extended solar minimum will have an imperceptible impact on their global warming forecasts. Nigel Calder, co-author of “The Chilling Stars: A New Theory of Climate Change,” is a believer in the solar irradiance theory and feels that the effects of greenhouse gases are likely to be a good deal less than advertised. Mr. Calder believes that climate is driven largely by solar radiation and that climate models do not adequately account for solar impact.

The best measurements of global air temperatures come from our weather satellites, and they reportedly show fluctuations, but no overall change in air temperature since 1999. (Though record warm and cold years have been recorded since that time, the average seems to be the same.) The leveling off of global warming as measured by air temperature, while CO2 levels continue to increase, is heralded by the school of thought which says that the sun drives climate changes more emphatically than greenhouse gases do.

Solar activity affects the intensity of cosmic rays reaching Earth, which in turn affects the production rate of radiocarbon, C14. This is the substance used by archaeologists to date objects: the original C14 content reflects the amount present in the air at the time of death or encapsulation, and the atoms gradually decay back to nitrogen, providing a method for dating materials. It was discovered in 1958 by Hessel de Vries of Groningen that the rate of C14 production varies. The C14 variations allowed the variation in solar modulation of cosmic rays to be measured. Roger Bray of the New Zealand Department of Scientific and Industrial Research, and later Jack Eddy of the High Altitude Observatory in Colorado, documented the correlation between these cosmic ray variations and the earth’s climate changes.
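The dating method itself rests on first-order radioactive decay: the fraction of C14 remaining falls by half every half-life (about 5,730 years, the commonly cited figure). A sketch of the basic, uncalibrated age calculation; the production-rate variations described above are precisely why raw radiocarbon ages must then be calibrated:

```python
import math

C14_HALF_LIFE = 5730.0  # years, the commonly cited half-life of carbon-14

def radiocarbon_age(fraction_remaining):
    """Uncalibrated age in years, from N/N0 = (1/2)**(t / half_life)."""
    return -C14_HALF_LIFE * math.log(fraction_remaining) / math.log(2)

radiocarbon_age(0.5)    # one half-life: 5,730 years
radiocarbon_age(0.25)   # two half-lives: 11,460 years
```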

There is more than one theory of climate change and I certainly do not know the true importance of the factors that can impact global temperature change, but I am unwilling to dismiss any theory or area of investigation at this stage. The world is not black and white and our knowledge of the interlacing systems that make up the ecosystems of the earth is truly limited. The real world is the one that is subtly interconnected where we have limited knowledge and understanding. If this extended solar minimum occurs it will likely be possible to determine the impact of solar radiation and sun spots on climate and global temperatures.

Thursday, June 16, 2011

California Struggles with Implementing Their Cap and Trade Program

AB 32, the Global Warming Solutions Act of 2006, is a California law that establishes a wide-reaching program of regulatory and market mechanisms to achieve quantifiable reductions of greenhouse gases (GHG) that were intended to be cost effective. The law establishes a statewide GHG emissions cap for 2020, based on 1990 emissions. Though cap and trade was not part of the actual law, it was added by the California Air Resources Board in its regulations. California sees itself as leading the way in cap and trade legislation and serves as an example to the nation of the concerns and problems with this particular approach to attempting to prevent climate change by controlling CO2 emissions.

The cap and trade program as outlined by the Air Resources Board represents only about 20% of the greenhouse gas emissions reductions required by AB 32, and will go into effect in 2012. Almost 80% of the required reductions in the state will be achieved through higher fuel-efficiency standards for vehicles, increased energy efficiency and conservation, renewable-energy mandates, and other measures throughout the California economy.

The Air Resources Board had been sued by environmental justice groups concerned that poor communities located near California’s largest emitters could actually face increased exposure to pollution under cap and trade. The lawsuit contends that the board failed to do an adequate analysis of possible alternatives to the cap-and-trade program, as required by the California Environmental Quality Act.

The judge in the case ruled that the Air Resources Board had given only cursory consideration to alternatives, seriously considering only cap and trade as part of the state’s plan to reduce greenhouse gas emissions, and ordered the board to stop work on the regulations and consider other options. The Air Resources Board appealed that decision and continued setting up cap and trade while it produced an expanded, more thorough analysis of the alternatives considered in the original report. The appeals court has allowed the board to continue implementing the cap and trade program.

Both Sierra Club California and the environmental justice organizations said they could support alternatives that include requiring large emitters to cut back on carbon dioxide at the site of their plants. The cap and trade outline creates credits for California emitters that exceed emission limits but fund emissions-reduction programs, even if those programs are in other states or countries. Instead, some are pushing for consideration of alternatives to cap and trade that could include increased energy efficiency in buildings, significantly increasing renewable-energy sources in the state, and a traditional tax placed on carbon emissions.

A carbon tax is straightforward, honest, not subject to the same manipulation that can distort the allocation of carbon credits, and it raises funds directly for the state of California, which is so in need of revenue. The problem with a straightforward carbon tax, which on the surface appears to have merit as a way to reduce carbon dioxide release while raising state revenue, is that California can only create or raise a tax by a supermajority. In 1978, Proposition 13, the "People's Initiative to Limit Property Taxation," passed and became Article 13A of the California state constitution. The California form of direct democracy allows initiatives that obtain enough signatures to be placed on the ballot and voted on directly by the residents. The proposition decreased property taxes by restricting annual increases in the assessed value of real property to an inflation factor not to exceed 2% per year, using either the 1975 value or the sale price, whichever was later. It also prohibited reassessment to a new base year value except upon a change in ownership. To protect themselves from property taxes rising with rapidly increasing property values, the people essentially eliminated the most stable source of revenue and made the imposition of any new tax all but impossible.
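The assessment mechanics Proposition 13 put in place can be sketched directly: assessed value grows by the lesser of inflation and 2% each year, and only a change in ownership resets the base year value. A minimal illustration (the dollar amounts are hypothetical):

```python
def assessed_value(base_value, annual_inflation_rates):
    """Prop 13-style assessment: annual growth capped at 2%,
    with no reassessment absent a change in ownership."""
    value = base_value
    for rate in annual_inflation_rates:
        value *= 1 + min(rate, 0.02)
    return value

# A home with a $300,000 base value through three years of 5% inflation:
# assessed value grows only 2% per year, to about $318,362.
assessed_value(300000, [0.05, 0.05, 0.05])
```

The cap is why assessed values, and the revenue built on them, lag market values so badly during booms.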

So, as the California Air Resources Board points out, in the case of a carbon tax the uses of the revenues are restricted by state law and thus could not be used to offset increases in energy costs to low income consumers or to encourage approved industries. “The most challenging constraint for a tax approach owes to the requirement that taxes must be approved either by legislative supermajority or voter initiative. Such measures would require time and potentially substantial resources to pass, and may be politically infeasible.”

Cap and trade is an immensely controversial proposal that at the national level has split along the traditional lines of environmentalists versus industry. It was the centerpiece of a plan to reduce greenhouse gases that passed the House in 2009 but subsequently failed in the Senate. It was always reported that supporters of cap and trade were pro-environment and those opposing it were polluters wanting to trash the world. There were always problems with cap and trade and similar programs, and there were always environmentalists who supported a direct tax on carbon; various European Union members have used carbon taxes as part of their strategies. At this time, supporters of AB 32’s cap-and-trade program tend to be clean-energy businesses, investment firms, and other business groups in California with a vested financial interest in profiting from the green economy, along with regulators. That pretty much tells you who will benefit most from such a program; however, the alternatives are limited by Article 13A. Small business interests in California were very alarmed by a Center for Small Business at California State University report titled “Cost of AB 32 on California Small Business-Summary Report of Findings” that outlines the financial impacts of implementing AB 32 on the economy and people of California. The report concluded that when the program is fully implemented, the average annual loss in gross state output from small businesses would be $182.6 billion, approximately a 10% loss in total gross state output.

Even when fully implemented, AB 32 is not going to stop climate change, though it may contribute to an amelioration of the increase in carbon dioxide. Even this modest reduction in CO2 emissions will not be met if the production of CO2 just moves out of state along with the economic activity that produces the emissions. Greenhouse gases absorb the sun’s energy, allowing less heat to escape back to space and 'trapping' it in the lower atmosphere. Many greenhouse gases occur naturally in the atmosphere, such as carbon dioxide, methane, water vapor, and nitrous oxide, while others are synthetic. Water vapor is the most pervasive of the greenhouse gases and is subject to weather and changes in temperature. The cap and trade program is probably not going to have any discernible impact on total greenhouse gas concentrations on earth. However, creating a cap and trade system does create a market where “clean-energy” businesses, investment firms, and other business groups can profit from the marketing of these credits, and regulators can increase their authority and funding. It would be much better to directly tax carbon and use that money to prepare for the earth of our future.

Monday, June 13, 2011

Food, Water and the Environment

Worldwide, there seems to be a lot of extreme weather lately. This may be the natural variability in weather that has been widely reported, the result of changing climate, migrating magnetic fields, or something else. Trends in weather are very difficult to see while they are happening because of the natural variability of weather. No matter what the cause, the growing population of the planet needs to survive each devastating storm, earthquake, tsunami, drought, and volcano that impacts our countries. As the population of the planet grows, our resource reserves and flexibility to respond to crisis shrink. Farmers need to withstand whatever weather and natural disasters come their way while continuing to increase the amount of food they produce to meet rising demand. As the population of the earth has grown and the “developing world” has grown richer, the demand for food has increased markedly. Richer nations add more dairy, meat, and fruit to their diets, which require more water and cultivated feed to produce than a subsistence diet of grain. Millions of people in Asia have added meat and dairy products to their diets, requiring considerable amounts of grain as feed and vast amounts of water. While this was going on, US energy policy resulted in the conversion of much of the American corn crop into ethanol.

Russia has been hit with its worst drought in a half century. Australia has suffered years of drought only to be hit by torrential flooding, so the lack of water has been replaced by too much water. India’s falling water table and water shortages have been well documented in the world news. Even U.S. grain forecasts have been reduced as much of the Mississippi plain has been flooded, and Texas is mired in a drought. On a bright note, California has received a reprieve from its multi-year drought, though water deliveries to the farmers are at 75% of allocations. California has also seen unusual weather: unseasonably cold weather over the past month has frozen the Sierra snowpack in place long after it would normally have melted. This is the deepest snowpack at Donner Pass in June since record keeping began in 1946. A summer heat wave could cause the melting snow in the Sierra to cascade down from the mountains all at once, but the Department of Water Resources believes that the Yuba, Feather, and Sacramento rivers would be able to handle higher flows if it became necessary to dump water out of the big reservoirs during a mass melt-off to prevent flooding. So the California agricultural crops should not be impacted.

With appropriate planning and reaction to weather disruptions, the United States has enough agricultural productive capacity and a large enough continent to survive most regional weather extremes. Not everyone does. The agricultural output of the earth needs to increase while the increasing population takes a larger share of the available fresh water, and while the environmental damage caused by the business of agriculture is reduced by maintaining river flows and groundwater tables and limiting chemical use and nutrient contamination. We cannot treat farming as if it were a dangerous polluter to be driven out of our geographic regions. Certainly, all farms should have nutrient management plans and utilize agricultural BMPs; it will take education, money, and work to improve the environmental performance of agriculture. These costs loom large to the farmers, but ultimately will be borne by all of us.

In 1970, a third of the population in the developing world was undernourished. By the mid-1990s, the share had fallen below 20 percent, and the absolute number of hungry people dipped below 800 million for the first time in modern history. However, growth in food production has since fallen behind population growth and the increasing demand for meat and dairy. The World Bank estimates the number of hungry people this year at 940 million. World hunger is back and appears likely to continue to grow. The increased demand for food must somehow be met on a planet where little new land is available for farming, where water supplies are tightening, where the temperature is believed to be rising, or at least where the weather is erratic, and where the food system is already showing serious signs of instability. There is a world food crisis building. Virginia has been blessed with a moderate climate and adequate rainfall, but agriculture is under assault by environmentalist popular opinion. Eliminating agriculture from the Chesapeake Bay Watershed is short-sighted and, quite frankly, a really bad idea on so many levels. Nonetheless, farm practices and land management need to change and improve. The TMDL’s required reduction in total nutrient load in the Chesapeake Bay Watershed is not just about maintaining the beauty of the Bay; it is about our water and our life.

Like all estuaries, the Chesapeake Bay is an incredibly complex ecosystem that we are only beginning to understand. Estuaries are productive ecosystems and habitats. The Chesapeake Bay serves as a nursery ground for the fish and shellfish industry, protects the coast from storm surges, and filters pollution. The estuary filters water carrying nutrients and contaminants from the surrounding watershed. Nutrients in proper balance bring fertility, but excess nutrient contamination of the Chesapeake Bay has degraded the habitat. Excess nutrients and sediment from sewage treatment plants, farm fields and animal pastures, and urban and suburban runoff from roads and landscaping can cause eutrophication. As the ecosystem of an estuary declines, species die out and coastlines experience excessive erosion by wind, tidal action and ice. To restore the damaged portions of the Bay, reductions in nutrient contamination will have to take place, and the only available sources of those reductions are wastewater treatment plants, agricultural nutrient management plans and BMPs, stormwater management, and reductions in population and economic activity. Agriculture is generally considered the least-cost method of reducing sediment, nitrogen and phosphorus, and implementing these changes will reportedly allow us to feed more people with the same land resources, bringing agriculture to the next level. We will carry this cost either in the increased cost of food or hidden in a nutrient trading program as an overall tax on economic activity. Nutrient contamination is about population head count: our waste, our food, our roadways, our landscaping.

Thursday, June 9, 2011

The American Way to a Better World, Litigation

A coalition of environmental groups has announced that it has filed a motion in federal court to oppose the efforts of the American Farm Bureau Federation and its allied groups. The environmental coalition includes the Chesapeake Bay Foundation (CBF), Citizens for Pennsylvania's Future, Defenders of Wildlife, the Jefferson County Public Service District, the Midshore Riverkeeper Conservancy, and the National Wildlife Federation. These groups are seeking to intervene in a lawsuit filed earlier this year by the American Farm Bureau and other agricultural groups. Intervening is a legal tactic that, if successful, would make the environmentalists a party in the case.

The American Farm Bureau Federation and the Pennsylvania Farm Bureau went to federal court in Pennsylvania and have since been joined by the Fertilizer Institute, the National Pork Producers Council, the National Corn Growers Association, the National Chicken Council, the U.S. Poultry and Egg Association, and the National Turkey Federation. The American Farm Bureau and its co-plaintiffs argue that the EPA's "allocation" of pollutant loads among sources in a TMDL exceeds EPA's authority under the Clean Water Act, and that the assigned TMDLs are based on erroneous information that was input into computer models that would be unsuitable for determining such loads even if accurate information had been used. Finally, the Farm Bureau contends that during the comment period the public did not have access to the information it needed to comment effectively on the modeling results and the assumptions in the Final TMDL.

The Chesapeake Bay Model is really made up of several models that are combined to create the whole: the Watershed Model, the Estuary Model, the Scenario Builder, the Airshed Model, the Land Change Model and the Land Use Models. The Watershed Model incorporates information about land use, fertilizer applications, wastewater plant discharges, septic systems, air pollution, farm animal populations, weather and other variables to estimate the amount of nutrients and sediment reaching the Chesapeake Bay and which of the major land uses produce these pollutants. This is the most robust portion of the model sequence because it is calibrated and validated at the major tributary basin level, where decades of measured water quality data are available. The Watershed Model divides the 64,000-square-mile Chesapeake Bay watershed into more than 2,000 segments. According to the 2010 versions of the EPA models that were used to derive the TMDLs, cropland accounts for 25% of the sediment reaching the Bay, 32% of the nitrogen and 27.5% of the phosphorus, while accounting for only 10% of the Chesapeake Bay watershed acreage.

The waste load allocations in the TMDL are based to a large extent on land use data, particularly the amount of impervious area. The EPA used satellite photographs to derive the amount of impervious surface. An analysis of Geographic Information System (GIS) land use data sampled in the Hampton Roads area of Virginia showed that the satellite imagery used by EPA for its land use inputs to the watershed model had underestimated the amount of paved surfaces in the region by an average of 48% compared to the local GIS information. Neither EPA nor Hampton Roads provided an explanation of why these numbers are so different. Mike Rolband of Wetland Studies and Solutions, Inc. reported that his organization found that the 2010 version of the model had used approximately 675,917 acres of impervious surface and 1,885,915 acres of pervious surface in the Virginia segments of the model. His organization reviewed the EPA's own data from other sources and found that there were 1,569,377 impervious acres and 3,442,346 pervious acres in the urban areas in the Virginia segments of the model. This aligns with the Hampton Roads data.
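As a back-of-envelope illustration, the acreage figures quoted above imply the model accounted for well under half of Virginia's urban impervious area. This is my own arithmetic on the numbers in the paragraph, not a calculation from the model documentation:

```python
# Acres used in the 2010 Bay model for the Virginia urban segments
model_impervious = 675_917
model_pervious = 1_885_915

# Acres from EPA's own other data sources, per Wetland Studies and Solutions
other_impervious = 1_569_377
other_pervious = 3_442_346

# Fraction of the impervious area the model actually captured
captured = model_impervious / other_impervious
print(f"Model captured {captured:.0%} of impervious acres")   # prints "43%"
print(f"Impervious area understated by {1 - captured:.0%}")   # prints "57%"
```

The statewide understatement (about 57%) is not the same dataset as the Hampton Roads 48% figure, but the two point in the same direction.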

Pollution loads for nitrogen, phosphorus and sediment in the urban areas are calculated using a constant pounds/acre/year for impervious acres as a fixed input, while the pervious load is based on total fertilizer sales data. Thus, if the EPA used its own data instead of the satellite data, the total current load for the urban areas would increase by 2,238,449 pounds of nitrogen per year, 636,097 pounds of phosphorus per year and 137,680 pounds of sediment per year in Virginia. However, the total watershed loads for the overall model would remain the same, since they are based on sampling results. So if the urban area loads increase, other area loads will have to decrease to keep the model's output consistent with sampling data. The wastewater treatment plant numbers are based on the constant sampling required by their permits, so their overall total contaminant load will not change. The forest land number is also believed to be a "good" number, so that leaves the agricultural sector.
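Because the impervious load is a fixed pounds/acre/year, dividing the stated load increases by the added impervious acreage gives a rough sense of the implied per-acre rates. This sketch is my own simplification, treating the entire increase as impervious-driven (in practice some of it would flow through the pervious, fertilizer-based calculation); the input figures are the ones quoted above:

```python
# Added impervious acres if EPA's own data replaced the satellite-derived inputs
added_impervious = 1_569_377 - 675_917  # = 893,460 acres

# Stated annual load increases for Virginia's urban areas
deltas = {
    "nitrogen":   2_238_449,  # lb/yr
    "phosphorus":   636_097,  # lb/yr
    "sediment":     137_680,  # lb/yr, as reported in the source
}

for pollutant, delta in deltas.items():
    # Implied constant loading rate per added impervious acre
    print(f"{pollutant}: ~{delta / added_impervious:.2f} lb/acre/yr")
```

The implied nitrogen rate works out to roughly 2.5 lb/acre/yr under this simplification.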

In the regulatory world the model is reality. If the EPA chooses not to fully correct this error and instead stays with the under-reported amount of impervious surface, the result would be MS4 permits calculated on only a fraction of the total paved area; permit holders would have to reduce their urban runoff loads based on modeling data that assumes less impervious area than they actually have. In other words, the urban land area that will have to be treated in order to attain the mandated waste limits would be almost twice the land area assumed in the TMDL. In addition, the model would require a more extensive implementation of agricultural BMPs to meet the required reduction in nitrogen, phosphorus and sediment than if the urban/suburban segment had correctly reflected the amount of pavement. This will unnecessarily increase the states' cost of compliance with the TMDL. The Chesapeake Bay Phase 5.3.2 Model is due from the EPA on July 1, 2011.

The Farm Bureau claims that its lawsuit challenges a specific, unlawful EPA regulatory action. It is about federal government overreach into the states' right to self-govern across seven jurisdictions. The states within the watershed have estimated that implementation will cost billions of dollars, making this a very high stakes argument for the states, cities and farmers. At their news conference the Chesapeake Bay Foundation described the Farm Bureau and its coalition as "big ag" and described the lawsuit as an attempt to derail the latest Bay cleanup program for profit. Earning a profit is not necessarily evil.

The EPA model's allocation of pollution origination is one of the sources of the current "green community" anti-agriculture stance. The agricultural sector is being viewed as an excessive polluter, though farm management practices have improved over the years as output has increased to feed the ever-growing population. Farmers (both family and corporate) feel they have not been given full credit for that improvement. The Chesapeake Bay Watershed Model is a good tool for understanding how nitrogen, sediment and phosphorus loads from different sources are delivered to the Bay. On a major tributary basis, real-world data has been used to calibrate and validate the watershed portion of the model, so it can predict the results of implementing best management practices, a useful tool for making decisions about tradeoffs in controlling the loads of nutrients and sediment in the Chesapeake Bay Watershed. Implementing and maintaining best management practices and conservation plans on farms is difficult, because it involves changing often long-established practices in the way farmers manage their land and operations, and it requires a management plan for each operation no matter the size; still, it is probably the most cost-effective method of meeting the TMDL. Instead of paying for lawyers, maybe both of these groups should consider funding agricultural wells for animal operations so that waterway exclusion fencing (which has cost share dollars available) can be built.

Monday, June 6, 2011

Military Brownfields in CA

A few weeks back I saw an article in the San Francisco Chronicle titled “Development Progress Hard to See at Former Navy Bases.” For a decade spanning the end of the 20th century and the beginning of the 21st century a significant portion of my work was Brownfield redevelopment, so I was very interested in the article. The entire gist of the article can be summed up in a single quote from Pat Keliher, Vice President of SunCal, one of the largest privately owned developers of mixed use and master planned communities in California. "I can't think of one base reuse project in California that's gone well, in all honesty,…The issue is that when the bases closed, everyone came up with reuse plans and set all these expectations, but then no one sat down and said, 'Can we really build this?' "

I went back to look at some of the military projects I worked on in California. One of my favorite military installations was the former Marine Corps Air Station Tustin in southern California. The installation was approximately 40 miles south of Los Angeles and approximately 100 miles north of the Mexican border. MCAS Tustin encompassed about 1,600 acres of land, mostly within the city of Tustin, with approximately 80 acres in the southeast corner of the station within the city of Irvine.

MCAS Tustin was one of my favorite sites because of its history. It was commissioned from agricultural land as a Department of the Navy "lighter-than-air" base in 1942. The installation was used for blimps performing antisubmarine patrols off the coast of southern California during World War II; blimp patrols were used to protect coastal convoys. Between the end of World War II and 1949, Navy blimps continued to use the station. In 1950, when the Korean War began, the station was converted to helicopter use and support. The major pollution at the site was due to the routine maintenance of helicopters, ground support equipment and vehicle engines. Maintenance operations generated wastes such as engine oil, hydraulic fluid, jet fuel, and solvents from cleaning and degreasing operations. A main fuel line traversed the base, running over 7,000 feet from the tank farm to the fueling mats, and was left in place after the base was closed. In addition, there was a history of spills off the edge of the aircraft parking aprons and temporary storage aprons. Excavation and soil removal took place in the areas adjacent to the aprons, but contaminated soil was left in place under the aprons, and no testing was done there before the military sought to transfer ownership of the property to the city.

MCAS Tustin was not a Superfund site, so its remediation and site closure could be handled by California regulators and the Navy. For twenty-six years the BRAC (base realignment and closure) Cleanup Team coordinated cleanup and closure activities at the base. The BRAC Cleanup Team consisted of representatives from the Department of the Navy, the US EPA, the Santa Ana Regional Water Quality Control Board, and the California Department of Toxic Substances Control (DTSC). These agencies reviewed and commented on the documents required for closure of the individual areas of concern and contamination. The Department of the Navy acted as the lead federal agency for environmental restoration, and DTSC was the lead regulatory agency providing oversight, except where the Water Quality Control Board was the lead agency. The first problem with the military cleanups was the tension between the Department of Defense's obligation to minimize costs for the taxpayer and DTSC's effort to maximize the quality of the cleanup. The second problem was the dueling agencies: DTSC and the Water Quality Control Board did not share the same regulatory viewpoint and mission. The groundwater contamination resulting from the solvent and jet fuel releases was left in place and remains to this day, which has resulted in continuing remediation, monitoring and restrictions on use.

The reality is that when a military base is assessed, the history of the base is studied so that potential problem areas can be identified. Contaminated soil and groundwater are identified after testing in areas likely to be impacted. Many areas received regulatory closure after assessment if it was determined that historical use had no significant impact on the area. Most soil (except that with PCB and lead contamination) was excavated and treated with the on-site thermal desorption unit (cooked to drive off the solvents and oil), then returned to the excavations. The soil was tested to confirm that the remediation had removed enough of the contaminants for residential use. The groundwater extraction system began pumping on January 3, 2002 and has treated millions of gallons of groundwater. However, the Navy did not have much success in eliminating or even reducing the contaminated groundwater plumes; restrictions on land use and containment of the plume have been used as solutions.

The city of Tustin still envisions an 820-acre master-planned urban activity center, a place (according to its web site) to "Live, Work, Shop and Play." But the timeline envisioned by the community and the city itself was far too aggressive and optimistic; the remediation and regulatory process has taken years longer than anticipated. In addition, the recession and housing crash forced the master developer, Tustin Legacy Community Partners, a company made up of Shea Homes and Shea Properties II, to walk away from the master-planned development. Most of the land was turned over to the city of Tustin in 2002; the Navy carved out the most contaminated parcels and sources of contamination on 100 acres it still holds, along with a 14-acre parcel held by the Reserves. This allowed Tustin to begin building housing while the regional housing market and economy were still booming. The housing downturn and the severe economic impact of the recession on city finances and on various developers, including SunCal, have slowed the project. Future plans still include an additional 2,105 new homes, 6.7 million square feet of non-residential commercial space, new roadways and infrastructure, and significant parkland and open space, including a 2-mile-long linear park, but Tustin still has not named a new master developer.

So far, $130 million of infrastructure has been completed, including storm drains, water and sewer facilities, dry utilities, traffic signals, and a number of roadways. A 1-million-square-foot shopping center, "The District at Tustin Legacy," has been open since 2007. The South Orange County Community College District has developed the initial phase of its Advanced Technology Education campus. Orange County Social Services has completed the Tustin Family Campus on site, as well as the Village of Hope, a transitional facility for the homeless. Over 1,700 homes of various types have been built in four distinct neighborhoods. For now, the former MCAS Tustin is a mix of developed shopping center, housing, fields filled with weeds, some old rundown military buildings, the Village of Hope and two massive blimp hangars. Your perspective determines whether this is a success or a failure of Brownfield redevelopment. Certainly it is not the shiny new master development that was envisioned, but it has a good start, and with time it might become a thriving community. More of a concern is what will happen over time with the long-term containment of the contaminated plume and the restricted use of the contaminated parcels.

Thursday, June 2, 2011

Water Sustainability and Charles Fishman’s “The Big Thirst”

“The Earth's surface is 71 percent covered in water, and water is the primary force shaping every element of the character of the planet — the geology, the weather, the range and variety of life, the planet's gleaming profile in space…”
"…The total water on the surface of Earth (the oceans, the ice caps, the atmospheric water) makes up 0.025 percent of the mass of the planet — 25/100,000ths of the stuff of Earth."
“…Scientists don't agree on the precise age of the water on Earth, but it's certainly 4.3 or 4.4 or 4.5 billion years old. It's one of the more astonishing things about water — all the water on Earth was delivered here when Earth was formed, or shortly thereafter…in the first 100 million years or so. There is, in fact, no mechanism on Earth for creating or destroying large quantities of water. What we've got is what's been here, literally, forever…”

The quotes above are from Charles Fishman's book, The Big Thirst: The Secret Life and Turbulent Future of Water. It is an elegant and well-researched story of how water is used throughout our economies and how it is the basis of all life and wealth. However, his discussion of water does not clearly focus on the sustainability of our water supply. Mr. Fishman clearly identifies that the water infrastructure is not being adequately maintained in the United States and does not adequately exist in much of the rest of the world. Vast amounts of water leak from our delivery systems, but that water is not necessarily lost from the water cycle. The real problem is that we are not only mining our water reserves, we are destroying the ways nature stores fresh water, which allow us to have a predictable and reliable supply. We as a nation, and mankind as a whole, need to address both problems. The need for water is constant; it does not come and go with the weather. The need for water grows with population and wealth. All the ways that water supports our lives are discussed in the book, making it well worth reading. There is adequate fresh water in the United States, but it is not delivered uniformly or when we need it: the Mississippi has flooded vast portions of the Midwest while Texas has been having a drought.

Water is our most valuable resource, and how we manage its use or allow its abuse may determine the fate of our country and mankind. According to the US Geological Survey, about 26% of the freshwater used in the United States in 2000 came from groundwater sources; the other 74% came from surface water. Groundwater is an important natural resource, especially in those parts of the country that don't have ample surface-water sources, such as the arid West, and in times of drought. Groundwater is a renewable resource, but not in the way that sunlight is: it recharges at varying rates from precipitation, and the actions of man can affect that recharge rate. Changing land use and increasing the amount of impermeable area by paving or building can reduce groundwater recharge.

When you withdraw groundwater from fine-grained, compressible confining beds of sediment and do not replace it, the land subsides. In the pursuit of wealth, the groundwater in the incredibly fertile Central Valley was pumped to such an extent that the ground subsided more than 75 feet in some places; the research of Joseph Poland identified the area as the location of maximum subsidence in the United States due to groundwater mining. Once the land subsides, it loses its water-holding capacity and will never recover as an aquifer. Groundwater mining in the Central Valley was believed to have slowed in the past few decades, but it continues, as documented by recent data from the Gravity Recovery and Climate Experiment (GRACE) analyzed by the University of California's Center for Hydrologic Modeling.

The twin satellites of the GRACE program monitor each other while orbiting the Earth and produce some of the most precise data ever collected on the planet's gravitational variations. This information is used to determine the changes in ice, snow, groundwater basins and surface water from season to season and over time. Though the amount of water on Earth is static, the location of the water and its availability for use by man do change. The GRACE program reports that from October 2003 to March 2010, the aquifers under the state's Central Valley were drawn down by 25 million acre-feet, almost enough to fill Lake Mead, the nation's largest reservoir. The GRACE program has also identified several other areas of the earth where groundwater levels have fallen, including northern India, North Africa, and northeastern China.
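To put 25 million acre-feet in more familiar units, it can be converted to gallons with the standard factor of about 325,851 gallons per acre-foot. The drawdown figure is the GRACE result quoted above; the conversion is my own arithmetic:

```python
GALLONS_PER_ACRE_FOOT = 325_851  # standard US conversion factor

drawdown_acre_feet = 25_000_000  # Central Valley, Oct 2003 - Mar 2010 (GRACE)
drawdown_gallons = drawdown_acre_feet * GALLONS_PER_ACRE_FOOT
print(f"~{drawdown_gallons / 1e12:.1f} trillion gallons")  # prints "~8.1 trillion gallons"
```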

California is my usual canary in the coal mine for water resource management and mismanagement. It has all the resources of knowledge and wealth available to mankind, and yet it struggles with the politics of addressing its impending water crisis. California's local water agencies have invested in water recycling, conservation, groundwater storage and other strategies to stretch supplies, but the demand for cheap water exceeds supply, as evidenced by the unsustainable groundwater usage. Year-round agriculture that supplies food to the nation (grapes, almonds, avocados, lemons, melons, peaches, plums, strawberries, oranges, apricots, dates, figs, kiwi fruit, nectarines, olives, pistachios, prunes, walnuts, garlic, tomatoes, lettuce, and cattle and calves) has been made possible by the ample supply of water used for irrigation. The limit to California's agricultural bounty and the wealth of the ranch owners is water availability.

The water available is a combination of surface water diversions and groundwater pumping. In 2006, before the beginning of the last drought, California used almost 31 billion gallons of water a day for irrigation. That works out to roughly 351 gallons of water for each agricultural dollar earned each year, and it represents almost 80% of the water used in the state each year (excluding non-consumptive power usage). Yet all attempts to reduce water usage have been directed at California's residential communities, which are to reduce their per capita water use 20% by 2020; the water allocated to agriculture remains cheap. Food needs to reflect the real cost of the water and the permanent loss of groundwater. There is not enough water to support the total level of agriculture in the state. Even as per capita water usage falls, the total water used will grow with the population, but there will be no growth in the water supply for the state, and if climate projections are at all true, there will be less water delivered by snowfall and rain. As documented by GRACE, California has continued to make up the shortfall by using more groundwater than recharges, and the groundwater table continues to fall. Water is wealth and life. California is spending its wealth on agriculture in the Central Valley, growing cheap walnuts for China and grapes and strawberries for me, and when the water is gone they will leave behind a desert with water pipes running south to Los Angeles.
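The 351-gallons-per-dollar figure can be sanity-checked against annual totals. This sketch assumes the statistic means total annual irrigation water divided by annual farm revenue, which is my reading of it, not a definition from a cited source:

```python
irrigation_gpd = 31e9            # gallons per day for CA irrigation, 2006
annual_gallons = irrigation_gpd * 365

gallons_per_dollar = 351         # as quoted
implied_revenue = annual_gallons / gallons_per_dollar
print(f"Implied annual farm revenue: ${implied_revenue / 1e9:.0f} billion")  # prints "$32 billion"
```

An implied farm revenue in the low tens of billions of dollars is at least the right order of magnitude for California agriculture in that period, which suggests this reading of the statistic is the intended one.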