Wednesday 15 January 2014

ACID RAIN


WHAT IS ACID RAIN?


Acid rain is the common name for acidic deposits that fall to Earth from the atmosphere. The term was coined in 1872 by the Scottish chemist Robert Angus Smith (1817–1884) to describe the acidic precipitation in Manchester, England. In the twenty-first century scientists study both wet and dry acidic deposits. Even though there are natural sources of acid in the atmosphere, acid rain is primarily caused by emissions of sulfur dioxide (SO2) and nitrogen oxides (NOx) from electric utilities burning fossil fuels, especially coal. These chemicals are converted to sulfuric acid and nitric acid in the atmosphere and can be carried by the winds for many miles from where the original emissions took place. (See Figure 5.1.) Other chemicals contributing to acid rain include volatile organic compounds (VOCs). These are carbon-containing chemicals that easily become vapors or gases. VOC sources include paint thinners, degreasers, and other solvents, as well as the burning of fuels such as coal, natural gas, gasoline, and wood.
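In simplified form, the conversion to acids can be sketched with net equations (the actual atmospheric chemistry proceeds through several intermediate steps, so these are summaries only):

2 SO2 + O2 + 2 H2O → 2 H2SO4 (sulfuric acid)
4 NO2 + O2 + 2 H2O → 4 HNO3 (nitric acid)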

Wet deposition occurs when the acid falls in rain, snow, or ice. Dry deposition is caused by tiny particles (or particulates) in combustion emissions. They may stay dry as they fall or pollute cloud water and precipitation. Moist deposition occurs when the acid is trapped in cloud or fog droplets. This is most common at high altitudes and in coastal areas. Whatever its form, acid rain can create dangerously high levels of acidic impurities in water, soil, and plants.

Measuring Acid Rain
The acidity of any solution is measured on a potential hydrogen (pH) scale numbered from zero to fourteen, with a pH value of seven considered neutral. (See Figure 5.2.) Values higher than seven are considered more alkaline or basic (the pH of baking soda is eight); values lower than seven are considered acidic (the pH of lemon juice is two). The pH scale is logarithmic: each change of one pH unit represents a tenfold change in acid content. Therefore, a decrease from pH seven to pH six is a tenfold increase in acidity; a drop from pH seven to pH five is a one hundredfold increase; and a drop from pH seven to pH four is a one thousandfold increase.
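Because the scale is logarithmic, the relative acidity of two solutions follows directly from their pH difference. A minimal sketch in Python (hydrogen ion concentration in moles per liter):

```python
# The pH scale: [H+] = 10^(-pH), so a drop of d pH units
# multiplies the hydrogen ion concentration by 10^d.

def hydrogen_ion_concentration(ph: float) -> float:
    """Hydrogen ion concentration in moles per liter."""
    return 10 ** (-ph)

def acidity_ratio(ph_from: float, ph_to: float) -> float:
    """How many times more acidic ph_to is than ph_from."""
    return hydrogen_ion_concentration(ph_to) / hydrogen_ion_concentration(ph_from)

print(acidity_ratio(7, 6))  # ~10    (tenfold)
print(acidity_ratio(7, 5))  # ~100   (one hundredfold)
print(acidity_ratio(7, 4))  # ~1000  (one thousandfold)
```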

Pure, distilled water has a neutral pH of seven. Normal rainfall has a pH value of about 5.6. It is slightly acidic because it absorbs naturally occurring carbon dioxide, which forms weak carbonic acid, along with traces of sulfur oxides (SOx) and nitrogen oxides (NOx), as it passes through the atmosphere. Acid rain has a pH of less than 5.6.

Figure 5.3 shows the average rainfall pH measured during 2005 at various locations around the country by the National Atmospheric Deposition Program (NADP), a cooperative project between many state and federal government agencies and private entities. Rainfall was most acidic in the mid-Atlantic region and upper Southeast, particularly Ohio, Pennsylvania, West Virginia, Maryland, Delaware, Virginia, eastern Tennessee, and Kentucky. The areas with the lowest rainfall pH contain some of the country's most sensitive natural resources, such as the Appalachian Mountains, the Adirondack Mountains, Chesapeake Bay, and Great Smoky Mountains National Park. Overall, precipitation is much more acidic in the eastern United States than in the western United States because of a variety of natural and anthropogenic (human-caused) factors that are discussed below.

SOURCES OF SULFATE AND NITRATE IN THE ATMOSPHERE
Natural Sources
Natural sources of sulfate in the atmosphere include ocean spray, volcanic emissions, and readily oxidized hydrogen sulfide, which is released by the decomposition of organic matter in the Earth. Natural sources of nitrogen or nitrates include NOx produced by microorganisms in soils, by lightning during thunderstorms, and by forest fires. Scientists generally estimate that one-third of the sulfur and nitrogen emissions in the United States comes from these natural sources. (This is a rough estimate because there is no way to measure natural emissions separately from man-made ones.)

FIGURE 5.1

Sources Caused by Human Activity
According to the U.S. Environmental Protection Agency (EPA), in "What Is Acid Rain?" (June 8, 2007, http://www.epa.gov/acidrain/what/index.html), the primary anthropogenic contributors to acid rain are SO2 and NOx, resulting from the burning of fossil fuels, such as coal, oil, and natural gas.

The EPA notes in "Clearinghouse for Inventories and Emissions Factors" (http://www.epa.gov/ttn/chief/trends/trends06/nationaltier1upto2006basedon2002finalv2.1.xls) that approximately 70% of SO2 emissions produced in 2006 were because of fuel combustion by fossil-fueled electric utilities. Fuel combustion at industrial facilities contributed another 13%. Lesser sources included transportation vehicles and industrial processes. Highway vehicles were the primary source of NOx emissions, accounting for 36% of the total in 2006. Off-highway vehicles (such as bulldozers) contributed 22%. Fuel combustion in power plants was another major source, accounting for 20% of the total. Lesser sources included industrial processes and waste disposal and recycling.

FIGURE 5.2

NATURAL FACTORS THAT AFFECT ACID RAIN DEPOSITION
Major natural factors contributing to the impact of acid rain on an area include air movement, climate, and topography and geology. Transport systems—primarily the movement of air—distribute acid emissions in definite patterns around the planet. The movement of air masses transports emitted pollutants many miles, during which the pollutants are transformed into sulfuric and nitric acid by mixing with clouds of water vapor.

FIGURE 5.3

In drier climates, such as those of the western United States, windblown alkaline dust moves more freely through the air and tends to neutralize atmospheric acidity. The effects of acid rain can be greatly reduced by the presence of basic (also called alkaline) substances, such as compounds of sodium, potassium, and calcium. When a base and an acid come into contact, they react chemically and neutralize each other. By contrast, in more humid climates where there is less dust, such as along the eastern seaboard, precipitation is more acidic.
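For example, calcium carbonate (limestone, a common component of windblown alkaline dust) neutralizes sulfuric acid in a reaction that can be sketched as:

CaCO3 + H2SO4 → CaSO4 + H2O + CO2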

Areas most sensitive to acid rain contain hard, crystalline bedrock and thin surface soils. When no alkaline-buffering particles are in the soil, runoff from rainfall directly affects surface waters, such as mountain streams. In contrast, a thick soil covering or soil with a high buffering capacity, such as flat land, neutralizes acid rain better. Lakes tend to be most susceptible to acid rain because of low alkaline content in lake beds. A lake's depth, its watershed (the area draining into the lake), and the amount of time the water has been in the lake are also factors.

EFFECTS OF ACID RAIN ON THE ENVIRONMENT
In nature the combination of rain and oxides is part of a natural balance that nourishes plants and aquatic life. However, when the balance is upset by acid rain, the results to the environment can be harmful and destructive. (See Table 5.1.)

Aquatic Systems
Even though pH levels vary considerably from one body of water to another, a typical pH range for the lakes and rivers in the United States is six to eight. Low pH levels kill fish, their eggs, and fish food organisms. The degree of damage depends on several factors, one of which is the buffering capacity of the watershed soil—the higher the alkalinity, the more slowly the lakes and streams acidify. The exposure of fish to acidified freshwater lakes and streams has been intensely studied since the 1970s. Scientists distinguish between sudden shocks and chronic (long-term) exposure to low pH levels.

TABLE 5.1

Effects of acid rain on human health and selected ecosystems and anticipated recovery benefits

SOURCE: "Appendix I. Effect of Acid Rain on Human Health and Selected Ecosystems and Anticipated Recovery Benefits," in Acid Rain: Emissions Trends and Effects in the Eastern United States, U.S. General Accounting Office, March 2000, http://www.gao.gov/archive/2000/rc00047.pdf (accessed July 27, 2007)

Human health. Effects: In the atmosphere, sulfur dioxide and nitrogen oxides become sulfate and nitrate aerosols, which increase morbidity and mortality from lung disorders, such as asthma and bronchitis, and harm the cardiovascular system. Recovery benefits: Decreased emergency room visits, hospital admissions, and deaths.

Surface waters. Effects: Acidic surface waters decrease the survivability of animal life in lakes and streams and, in the more severe instances, eliminate some or all types of fish and other organisms. Recovery benefits: Reduced acidity of surface waters and restored animal life in the more severely damaged lakes and streams.

Forests. Effects: Acid deposition contributes to forest degradation by impairing trees' growth and increasing their susceptibility to winter injury, insect infestation, and drought. It also causes leaching and depletion of natural nutrients in forest soil. Recovery benefits: Reduced stress on trees, thereby reducing the effects of winter injury, insect infestation, and drought, and reduced leaching of soil nutrients, thereby improving overall forest health.

Materials. Effects: Acid deposition contributes to the corrosion and deterioration of buildings, cultural objects, and cars, which decreases their value and increases the costs of correcting and repairing damage. Recovery benefits: Reduced damage to buildings, cultural objects, and cars, and reduced costs of correcting and repairing future damage.

Visibility. Effects: In the atmosphere, sulfur dioxide and nitrogen oxides form sulfate and nitrate particles, which impair visibility and affect the enjoyment of national parks and other scenic views. Recovery benefits: Extended distance and increased clarity at which scenery can be viewed, thus reducing limited and hazy scenes and increasing the enjoyment of national parks and other vistas.

Sudden, short-term shifts in pH levels result from snowmelts, which release acidic materials accumulated during the winter, or sudden rainstorms that can wash residual acid into streams and lakes. The resulting acid shock can be devastating to fish and their ecosystems. At pH levels below 4.9, fish eggs are damaged. At acid levels below 4.5, some species of fish die. Below pH 3.5, most fish die within hours. (See Table 5.2.)

TABLE 5.2

Generalized short-term effects of acidity on fish
pH range Effect
SOURCE: "Generalized Short-Term Effects of Acidity on Fish," in National Water Quality Inventory: 1998 Report to Congress, U.S. Environmental Protection Agency, June 2000
6.5–9 No effect
6.0–6.4 Unlikely to be harmful except when carbon dioxide levels are very high (1,000 mg/L)
5.0–5.9 Not especially harmful except when carbon dioxide levels are high (20 mg/L) or ferric ions are present
4.5–4.9 Harmful to the eggs of salmon and trout species (salmonids) and to adult fish when levels of Ca2+, Na+, and Cl- are low
4.0–4.4 Harmful to adult fish of many types that have not been progressively acclimated to low pH
3.5–3.9 Lethal to salmonids, although acclimated roach can survive for longer
3.0–3.4 Most fish are killed within hours at these levels
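Table 5.2 is, in effect, a range lookup. A minimal sketch in Python of how its bands could be encoded (thresholds transcribed from the table; the function name is illustrative):

```python
# pH thresholds and short-term effects transcribed from Table 5.2,
# ordered from the least to the most acidic band.
ACIDITY_EFFECTS = [
    (6.5, "No effect"),
    (6.0, "Unlikely to be harmful except at very high CO2 levels"),
    (5.0, "Not especially harmful except at high CO2 or with ferric ions"),
    (4.5, "Harmful to salmonid eggs, and to adult fish when Ca2+, Na+, Cl- are low"),
    (4.0, "Harmful to many adult fish not acclimated to low pH"),
    (3.5, "Lethal to salmonids"),
    (3.0, "Most fish killed within hours"),
]

def short_term_effect(ph: float) -> str:
    """Return the Table 5.2 effect band for a pH between 3.0 and 9.0."""
    for lower_bound, effect in ACIDITY_EFFECTS:
        if ph >= lower_bound:
            return effect
    return "Below the range covered by Table 5.2"

print(short_term_effect(6.2))  # Unlikely to be harmful ...
print(short_term_effect(4.2))  # Harmful to many adult fish ...
```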
Because many species of fish hatch in the spring, even mild increases in acidity can harm or kill the new life. Temporary increases in acidity also affect insects and other invertebrates, such as snails and crayfish, on which the fish feed.

Gradual decreases in pH levels over time affect fish reproduction and spawning. Moderate levels of acidity in water can confuse a salmon's sense of smell, which it uses to find the stream from which it came; Atlantic salmon exposed to acidified waters may be unable to find their home streams and rivers. In addition, excessive acidity lowers calcium levels in female fish, preventing the production of eggs. Even when eggs are produced, their development is often abnormal.

Increased acidity can also cause the release of aluminum and manganese particles that are stored in a lake or river bottom. High concentrations of these metals are toxic to fish.

Soil and Vegetation
Acid rain is believed to harm vegetation by changing soil chemistry. Soils exposed to acid rain can gradually lose valuable nutrients, such as calcium, magnesium, and potassium, and accumulate toxic concentrations of dissolved inorganic aluminum, which is harmful to vegetation. Long-term changes in soil chemistry may have already affected sensitive soils, particularly in forests. Forest soils saturated in nitrogen cannot retain other nutrients required for healthy vegetation. Subsequently, these nutrients are washed away. Nutrient-poor trees are more vulnerable to climatic extremes, pest invasion, and the effects of other air pollutants, such as ozone.

FIGURE 5.4

Some researchers believe that acid rain disrupts soil regeneration, which is the recycling of chemical and mineral nutrients through plants and animals back to the Earth. They also believe acids suppress decay of organic matter, a natural process needed to enrich the soils. Valuable nutrients such as calcium and magnesium are normally bound to soil particles and are, therefore, protected from being rapidly washed into groundwater. Acid rain, however, may accelerate the process of breaking these bonds to rob the soil of these nutrients. This, in turn, decreases plant uptake of vital nutrients. (See Figure 5.4.)

Acid deposition can cause leafy plants such as lettuce to hold increased amounts of potentially toxic substances such as the mineral cadmium. Research also finds a decrease in carbohydrate production in the photosynthesis process of some plants exposed to acid conditions. Research is under way to determine whether acid rain could ultimately lead to a permanent reduction in tree growth, food crop production, and soil quality. Effects on soils, forests, and crops are difficult to measure because of the many species of plants and animals, the slow rate at which ecological changes occur, and the complex interrelationships between plants and their environment.

Trees
The effect of acid rain on trees is influenced by many factors. Some trees adapt to environmental stress better than others; the type of tree, its height, and its leaf structure (deciduous or evergreen) influence how well it will adapt to acid rain. Scientists believe that acid rain directly harms trees by leaching calcium from their foliage and indirectly harms them by lowering their tolerance to other stresses.

According to the EPA, acid rain has also been implicated in impairing the winter hardening process of some trees, making them more susceptible to cold-weather damage. In some trees the roots are prone to damage because the movement of acidic rain through the soil releases aluminum ions, which are toxic to plants.

Acid rain has also been linked to direct effects on trees through moist deposition via acidic fogs and clouds. The concentrations of acid and SOx in fog droplets are much greater than in rainfall. In areas of frequent fog, such as London, significant damage has occurred to trees and other vegetation because the fog condenses directly on the leaves.

Birds
Increased freshwater acidity harms some species of migratory birds. Experts attribute the dramatic decline of the North American black duck population since the 1950s to decreased food supplies in acidified wetlands. Acid rain leaches calcium out of the soil and robs snails of the calcium they need to form shells. Because titmice and other species of songbirds get most of their calcium from the shells of snails, the birds are also perishing. The eggs they lay are defective—thin and fragile. The chicks either do not hatch or have bone malformations and die.

In "Adverse Effects of Acid Rain on the Distribution of the Wood Thrush Hylocichla mustelina in North America" (Proceedings of the National Academy of Sciences, August 12, 2002), Ralph S. Hames et al. discuss the results of their large-scale study, which shows a clear link between acid rain and widespread population declines in the wood thrush, a type of songbird. Hames and his colleagues believe that calcium depletion has had a negative impact on this bird's food source, mainly snails, earthworms, and centipedes. The bird may also be ingesting high levels of metals that are more likely to leach out of overly acidic soils. Declining wood thrush populations are most pronounced in the higher elevations of the Adirondack, Great Smoky, and Appalachian mountains. Hames and his coauthors warn that acid rain may also be contributing to population declines in other songbird species.

Materials
Acid rain can also be harmful to materials, such as building stones, marble statues, metals, and paints. Elaine McGee of the U.S. Geological Survey reports in Acid Rain and Our Nation's Capital (1997, http://pubs.usgs.gov/gip/acidrain/contents.html) that limestone and marble are particularly vulnerable to acid rain. Historical monuments and buildings composed of these materials in the eastern United States have been hit hard by acid rain.

Human Health
Acid rain has several direct and indirect effects on humans. Particulates are extremely small pollutant particles that can threaten human health. Particulates related to acid rain include fine sulfate and nitrate particles. These particles can travel long distances and, when inhaled, penetrate deep into the lungs. Acid rain and the pollutants that cause it can contribute to the development of bronchitis and asthma in children. Acid rain is also believed to increase health risks for those over the age of sixty-five; those with asthma, chronic bronchitis, or emphysema; pregnant women; and those with histories of heart disease.

THE POLITICS OF ACID RAIN
Scientific research on acid rain was sporadic and largely focused on local problems until the late 1960s, when Scandinavian scientists began more systematic studies. Acid precipitation in North America was not identified until 1972, when scientists found that precipitation was acidic in eastern North America, especially in northeastern and eastern Canada. In 1975 the First International Symposium on Acid Precipitation and the Forest Ecosystem convened in Columbus, Ohio, to define the acid rain problem. Scientists used the meeting to propose a precipitation-monitoring network in the United States that would cooperate with the European and Scandinavian networks and to set up protocols for collecting and testing precipitation.

In 1977 the Council on Environmental Quality was asked to develop a national acid rain research program. Several scientists drafted a report that eventually became the basis for the National Acid Precipitation Assessment Program (NAPAP). This initiative eventually translated into legislative action with the Energy Security Act of 1980. Title VII (Acid Precipitation Act of 1980) of the act produced a formal proposal that created NAPAP and authorized federally financed support.

The first international treaty aimed at limiting air pollution was the United Nations Economic Commission for Europe (UNECE) Convention on Long-Range Transboundary Air Pollution, which went into effect in 1983. It was ratified by thirty-eight of the fifty-four UNECE members, which included not only European countries but also Canada and the United States. The treaty targeted sulfur emissions, requiring that countries reduce emissions 30% from 1980 levels—the so-called Thirty Percent Club.

The early acid rain debate centered almost exclusively on the eastern United States and Canada. The controversy was often defined as a problem of property rights. The highly valued production of electricity in coal-fired utilities in the Ohio River Valley caused acid rain to fall on land in the Northeast and Canada. An important part of the acid rain controversy in the 1980s was the adversarial relationship between U.S. and Canadian government officials over controls on emissions of SO2 and NOx. More of these pollutants crossed the border into Canada than the reverse. Canadian officials very quickly came to a consensus over the need for more stringent controls, whereas this consensus was lacking in the United States.

Throughout the 1980s the major lawsuits involving acid rain all came from eastern states, and the states that passed their own acid rain legislation were those in the eastern part of the United States.

Legislative attempts to restrict emissions of pollutants were often defeated after strong lobbying by the coal industry and utility companies. These industries advocated further research for pollution-control technology rather than placing restrictions on utility company emissions.

The NAPAP Controversy
In 1980 Congress established NAPAP to study the causes and effects of acid deposition and recommend policy approaches for controlling acid rain effects. About two thousand scientists worked on this unique interagency program, which ultimately cost more than $500 million. Even though its first report was due in 1985, the program was plagued by problems that resulted in numerous delays. In 1985 the first executive director, Christopher Bernabo, resigned and was replaced by Lawrence Kulp. In 1987 the study group released to Congress Interim Assessment: The Causes and Effects of Acidic Deposition, a massive four-volume preliminary report that caused a storm of controversy. The report contained detailed scientific information in its technical chapters about acid rain. The executive summary, written by Kulp, was released to the public and widely criticized for misrepresenting the scientific findings of the report and downplaying the negative effects of acid rain. Philip Shabecoff notes in "Acid Rain Report Unleashes a Torrent of Controversy" (New York Times, March 20, 1990) that critics claimed Kulp had slanted the summary to match the political agenda of the administration of President Ronald Reagan (1911–2004), which advocated minimum regulation of business and industry.

Some of the scientific findings in the 1987 report included:

Acid rain had adversely affected aquatic life in about 10% of eastern lakes and streams.
Acid rain had contributed to the decline of red spruce at high elevations by reducing this species' cold tolerance.
Acid rain had contributed to erosion and corrosion of buildings and materials.
Acid rain and related pollutants had reduced visibility throughout the Northeast and in parts of the West.
The report concluded, however, that the incidence of serious acidification was more limited than originally feared. At that time the Adirondacks area of New York was the only region showing widespread, significant damage from acid. Furthermore, results indicated that electricity-generating power plants were responsible for two-thirds of SO2 emissions and one-third of NOx emissions.

Controversy over Kulp's role led to his being replaced by James Mahoney. The new director ordered reassessments and revisions of the interim report, which were completed in 1991. However, by that time President George H. W. Bush (1924–) was in office, and he had made acid rain legislation a component of his election campaign. As a result, political forces, rather than NAPAP, largely drove the nation's emerging policy toward acid rain.

THE ACID RAIN PROGRAM—CLEAN AIR ACT AMENDMENTS, TITLE IV
Congress created the Acid Rain Program under Title IV (Acid Deposition Control) of the 1990 Clean Air Act Amendments. The goal of the program is to reduce annual emissions of SO2 and NOx from electric power plants nationwide. The program set a permanent cap on the total amount of SO2 that could be emitted by these power plants. According to the EPA, in Acid Rain Program: 2005 Progress Report (October 2006, http://www.epa.gov/airmarkets/progress/docs/2005report.pdf), this cap was set at 8.9 million tons (approximately half the number of tons of SO2 emitted by these plants during 1980). The program also established NOx emissions limitations for certain coal-fired electric utility plants. The objective of these limitations was to achieve and maintain a two-million-ton reduction in NOx emission levels by 2000 compared with the emissions that would have occurred in 2000 if the limitations had not been implemented.

In the 1999 Compliance Report: Acid Rain Program (July 2000, http://www.epa.gov/airmarkets/progress/docs/1999compreport.pdf), the EPA indicates that the reduction was implemented in two phases. Phase I began in 1995 and covered 263 units at 110 utility plants in 21 states with the highest levels of emissions. Most of these units were at coal-burning plants located in eastern and midwestern states. They were mandated to reduce their annual SO2 emissions by 3.5 million tons. An additional 182 units joined Phase I voluntarily, bringing the total of Phase I units to 445.

Phase II began in 2000. It tightened annual emission limits on the Phase I group and set new limits for more than two thousand cleaner and smaller units in all forty-eight contiguous states and the District of Columbia.

A New Flexibility in Meeting Regulations
Traditionally, environmental regulation has been achieved by the "command and control" approach, in which the regulator specifies how to reduce pollution, by what amount, and what technology to use. Title IV, however, gave utilities flexibility in choosing how to achieve these reductions. For example, utilities could reduce emissions by switching to low-sulfur coal, installing pollution-control devices called scrubbers, or shutting down plants.

Utilities took advantage of their flexibility under Title IV to choose less costly ways to reduce emissions—many switching from high- to low-sulfur coal—and as a result, they have been achieving sizable reductions in their SO2 emissions.

Allowance Trading
Title IV also allows electric utilities to trade allowances to emit SO2. Utilities that reduce their emissions below the required levels can sell their extra allowances to other utilities to help them meet their requirements.

Title IV allows companies to buy, sell, trade, and bank pollution rights. Utility units are allocated allowances based on their historic fuel consumption and a specific emissions rate. Each allowance permits a unit to emit one ton of SO2 during or after a specific year. For each ton of SO2 discharged in a given year, one allowance is retired and can no longer be used. Companies that pollute less than the set standards will have allowances left over. They can then sell the difference to companies that pollute more than they are allowed, bringing them into compliance with overall standards. Companies that clean up their pollution would recover some of their costs by selling their pollution rights to other companies.
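A minimal sketch in Python of the bookkeeping described above (the utility names and quantities are hypothetical; one allowance covers one ton of SO2):

```python
# Illustrative Title IV-style allowance accounting: each allowance
# permits one ton of SO2; allowances are retired as emissions occur,
# and unused allowances can be banked or sold to other utilities.

class Utility:
    def __init__(self, name: str, allocated: int):
        self.name = name
        self.bank = allocated  # allowances on hand

    def emit(self, tons: int) -> int:
        """Retire one allowance per ton emitted; return any uncovered tons."""
        covered = min(tons, self.bank)
        self.bank -= covered
        return tons - covered

    def sell(self, buyer: "Utility", quantity: int) -> None:
        """Transfer banked allowances to another utility."""
        assert self.bank >= quantity, "cannot sell more than is banked"
        self.bank -= quantity
        buyer.bank += quantity

clean = Utility("low-emission plant", allocated=100)
dirty = Utility("high-emission plant", allocated=100)

clean.emit(60)          # banks 40 spare allowances
dirty.emit(100)         # uses its full allocation
clean.sell(dirty, 30)   # sells spare allowances to the other plant
dirty.emit(30)          # the extra emissions are now covered
print(clean.bank, dirty.bank)  # 10 0
```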

The EPA also holds an allowance auction each year, at which a small share of the allowances is sold to bidders. This use of market-based incentives under Title IV is regarded by many as a major new method for controlling pollution.

From 1995 to 1998 there was considerable buying and selling of allowances among utilities. Because the utilities that participated in Phase I reduced their sulfur emissions more than the minimum required, they did not use as many allowances as they were allocated for the first four years of the program. Those unused allowances could be used to offset SO2 emissions in future years. In Acid Rain: Emissions Trends and Effects in the Eastern United States (March 2000, http://www.gao.gov/archive/2000/rc00047.pdf), the U.S. General Accounting Office (now the U.S. Government Accountability Office) notes that from 1995 to 1998 a total of 30.2 million allowances were allocated to utilities nationwide; almost 8.7 million, or 29%, of the allowances were not used but were carried over (banked) for subsequent years.

FIGURE 5.5

Figure 5.5 shows the status of the allowance bank from 1995 through 2005. Banked allowances increased dramatically in 2000 due to the addition of the Phase II sources to the Acid Rain Program. Over the next five years the allowance bank steadily decreased in size. The EPA reports in Acid Rain Program: 2005 Progress Report that in 2005 a total of 9.5 million allowances were allocated. Another 6.9 million banked allowances were carried over from previous years. The EPA expects that the allowance bank will eventually be depleted as SO2 emissions are further restricted by the implementation of the Clean Air Interstate Rule.

PERFORMANCE RESULTS OF THE ACID RAIN PROGRAM
There are three quantitative measures that environmental regulators use to gauge the performance of the Acid Rain Program: emissions, atmospheric concentrations, and deposition amounts.

U.S. Progress Report
The following information comes from the EPA's Acid Rain Program: 2005 Progress Report.

Sources and Emissions
The report notes that in 2005 there were 3,456 electric generating units subject to the SO2 provisions of the Acid Rain Program. Most emissions came from the approximately 1,100 coal-fired units among this total. In all, program sources emitted 10.2 million tons of SO2 into the air. (See Figure 5.6.) The EPA expects that the 8.9-million-ton annual cap on emissions will be achieved by 2010. SO2 emissions from sources covered by the program decreased by 41% between 1980 and 2005.

In 2005 the NOx portion of the Acid Rain Program applied to a subset of the 3,456 units mentioned earlier, specifically 982 operating coal-fired units generating at least 25 megawatts. Between 1990 and 2005 NOx emissions from power plants subject to the Acid Rain Program decreased from 5.5 million tons per year to 3.3 million tons per year. (See Figure 5.7.)

According to the report, in 2000 the program first achieved its goal of reducing emissions by at least 2 million tons; 8.1 million tons were originally predicted in 1990 to be emitted in 2000 without the program in place.

The report indicates that the SO2 and NOx emission reductions were achieved even though the amount of fuel used to produce electricity in the United States increased by more than 30% between 1990 and 2005. Coal was the single-largest fuel source for U.S. electric generating plants in 2005, accounting for 50% of the total.

FIGURE 5.6

FIGURE 5.7

Atmospheric Concentrations and Deposition Amounts
The EPA's Acid Rain Program uses two complementary monitoring networks to track trends in regional air quality and acid deposition: the Clean Air Status and Trends Network and the NADP's National Trends Network. Additional monitoring data are provided by national, state, and local ambient monitoring systems.

As shown in Figure 2.14 and Figure 2.6 in Chapter 2, atmospheric levels of SO2 and NO2 averaged nationwide since 1990 have been well below the national standards for these pollutants.

Table 5.3 shows trends in atmospheric concentrations and deposition for four key regions in the Acid Rain Program: mid-Atlantic, Midwest, Northeast, and Southeast. Overall, concentrations of ambient SO2 and wet sulfates averaged over the period 2003–05 declined in all these regions, compared with the period 1989–91. The most dramatic differences are evident in the Northeast, where ambient SO2 concentrations decreased by more than 50%. The results for nitrogen and nitrate compound concentrations are mixed, with decreases in some areas and increases in others. The same is true for wet inorganic nitrogen deposition, which decreased in the mid-Atlantic, Midwest, and Northeast, but increased slightly in the Southeast.

Canadian Progress Report
In November 2006 Environment Canada released a report on progress made by Canada and the United States on cross-border air pollution. The study, Canada–United States Air Quality Agreement: 2006 Progress Report (http://www.ec.gc.ca/cleanair-airpur/caol/canus/report/2006canus/toc_e.cfm), is the eighth biennial report related to the 1991 agreement between the two countries. The report states that Canada has been successful at reducing SO2 emissions below its national cap. Canada's total SO2 emissions were 2.3 million tonnes (metric tons) in 2004, which is 28% below the national cap of 3.2 million tonnes. However, Environment Canada notes that the reductions have not been sufficient to reduce acid deposition below the levels needed to ensure the recovery of ecosystems damaged by excess acidity in its eastern provinces.

ARE ECOSYSTEMS RECOVERING?
Monitoring data clearly indicate decreased emissions and atmospheric concentrations of SO2 and NOx and some reductions in deposition amounts. These improvements have not necessarily resulted in recovery of sensitive aquatic and terrestrial ecosystems. This is due, in part, to the long recovery times required to reverse damage done by acidification.

TABLE 5.3

Regional changes in air quality and deposition of sulfur and nitrogen, 1989–91 and 2003–05
Measurement Unit Region 1989–1991 average 2003–2005 average Percent change*
SOURCE: "Table 4. Regional Changes in Air Quality and Deposition of Sulfur and Nitrogen, 1989–1991 Versus 2003–2005," in Acid Rain Program: 2005 Progress Report, U.S. Environmental Protection Agency, October 2006, http://www.epa.gov/airmarkets/progress/docs/2005report.pdf (accessed June 19, 2007)
*Percent change is estimated from raw measurement data, not rounded; some of the measurement data used to calculate percentages may be at or below detection limits.
Notes: kg=kilogram. ha=hectare. mg=milligram. L=liter. µg=microgram. m3=cubic meter.
Wet sulfate deposition kg/ha Mid-Atlantic 27 20 -24
Wet sulfate concentration mg/L Mid-Atlantic 2.4 1.6 -33
Midwest 2.3 1.6 -30
Northeast 1.9 1.1 -40
Southeast 1.3 1.1 -21
Ambient sulfur dioxide concentration µg/m3 Mid-Atlantic 13 8.4 -34
Midwest 10 5.8 -44
Northeast 6.8 3.1 -54
Southeast 5.2 3.4 -35
Ambient sulfate concentration µg/m3 Mid-Atlantic 6.4 4.5 -30
Midwest 5.6 3.8 -33
Northeast 3.9 2.5 -36
Southeast 5.4 4.1 -24
Wet inorganic nitrogen deposition kg/ha Mid-Atlantic 5.9 5.5 -8
Midwest 6.0 5.5 -8
Northeast 5.3 4.1 -23
Southeast 4.3 4.4 +2
Wet nitrate concentration mg/L Mid-Atlantic 1.5 1.0 -29
Midwest 1.4 1.2 -14
Northeast 1.3 0.9 -33
Southeast 0.8 0.7 -9
Ambient nitrate concentration µg/m3 Mid-Atlantic 0.9 1.0 +5
Midwest 2.1 1.8 -14
Northeast 0.4 0.5 +20
Southeast 0.6 0.7 +17
Total ambient nitrate concentration (nitrate + nitric acid) µg/m3 Mid-Atlantic 3.5 3.0 -14
Midwest 4.0 3.5 -12
Northeast 2.0 1.7 -13
Southeast 2.2 2.1 -5
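The percent-change column follows the usual formula, (new − old) ÷ old × 100; note that recomputing from the rounded table entries can differ slightly from the published figures, which the footnote says are based on raw data. For example, using the Northeast ambient sulfur dioxide row:

```python
def percent_change(old: float, new: float) -> float:
    """Percent change from the 1989-1991 average to the 2003-2005 average."""
    return (new - old) / old * 100

# Northeast ambient SO2 concentration (Table 5.3): 6.8 -> 3.1 ug/m3
print(round(percent_change(6.8, 3.1)))  # -54
```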
The EPA reports that ecosystems harmed by acid rain deposition can take a long time to fully recover even after harmful emissions cease. The most chronic aquatic problems can take years to be resolved. Forest health is even slower to improve following decreases in emissions, taking decades to recover. Finally, soil nutrient reserves (such as calcium) can take centuries to replenish.

The most recent comprehensive analysis of acidified ecosystems was presented by NAPAP in the National Acid Precipitation Assessment Program Report to Congress: An Integrated Assessment (2003, http://www.cleartheair.org/documents/NAPAP_FINAL_print.pdf). The report presents a literature review summarizing findings from various government and academic studies. Overall, NAPAP finds that some ecosystems affected by acid deposition are showing limited signs of recovery. For example, one study shows that more than 25% of affected lakes and streams studied in the Adirondacks and northern Appalachians are no longer acidic. However, little to no improvement has been seen in examined water bodies in other regions, including New England and portions of Virginia. The report notes that even though chemical recovery has begun in some waterways, complete recovery for aquatic life forms, such as fish, is expected to take "significantly longer."

With regard to terrestrial ecosystems, NAPAP reports that forests are under many stresses besides acid rain, such as global warming, land use changes, and air pollution from urban, agricultural, and industrial sources. The combined effect of these stressors has greatly limited forest recovery from acidification. According to NAPAP, "There are as yet no forests in the U.S. where research indicates recovery from acid deposition is occurring." However, it is expected that reduced emissions under the Acid Rain Program will benefit forests in the long term.

The report acknowledges the future benefits of continued implementation of the Acid Rain Program, but it concludes that "the emission reductions achieved by Title IV are not sufficient to allow recovery of acid-sensitive ecosystems." Recent studies support the idea that additional emission cuts of 40% to 80% beyond those of the existing program will be needed to protect acid-sensitive ecosystems. NAPAP modeling indicates that even the virtual elimination of SO2 emissions from power plants would be insufficient to provide this protection. It is believed that emission reductions from other sources (such as the industrial and transport sectors) will also be necessary.

The Next Step: The Clean Air Interstate Rule
In 2005 the EPA issued the Clean Air Interstate Rule (CAIR; April 5, 2007, http://www.epa.gov/cair/) to address the transport of air pollutants across state lines in the eastern United States. CAIR puts permanent caps on emissions of SO2 and NOx in twenty-eight eastern states and the District of Columbia. It is expected to reduce SO2 emissions by more than 70% and reduce NOx emissions by more than 60% compared with 2003 levels. These measures should reduce the formation of acid rain and other pollutants, such as fine particulate matter and ground-level ozone.

The CAIR program will use a cap-and-trade system similar to that used in the SO2 portion of the Acid Rain Program. The EPA projects that complete implementation of CAIR in 2015 will result in up to $100 billion in annual health benefits and a substantial reduction in premature deaths from air pollution in the eastern United States. It should also improve visibility in southeastern national parks that have been plagued by smog in recent years.

PUBLIC OPINION ABOUT ACID RAIN
Every year the Gallup Organization polls Americans about their attitudes regarding environmental issues. The most recent poll to assess acid rain was conducted in March 2007. Participants were asked to express their level of personal concern about various environmental issues, including acid rain, water pollution, soil contamination, air pollution, plant and animal extinctions, loss of tropical rain forests, damage to the ozone layer, and global warming. The results showed that acid rain ranked last among these environmental problems.

Analysis of historical Gallup poll results shows a dramatic decline in concern about acid rain since the late 1980s. (See Table 5.4.) In 1989 Gallup found that 41% of respondents felt a great deal of concern about acid rain and 11% felt none at all. By 2007 only 25% of people polled were concerned a great deal about acid rain and 20% expressed no concern about the acid rain issue.

TABLE 5.4

Public concern about acid rain, 1989–2007
Great deal Fair amount Only a little Not at all No opinion
SOURCE: "I'm going to read you a list of environmental problems. As I read each one, please tell me if you personally worry about this problem a great deal, a fair amount, only a little, or not at all. First, how much do you personally worry about—Acid Rain," in Environment, The Gallup Organization, 2007, http://www.galluppoll.com/content/?ci=1615&pg=1 (accessed June 19, 2007). Copyright © 2007 by The Gallup Organization. Reproduced by permission of The Gallup Organization.

Quantum Theory

Quantum theory is the modern physical theory concerned with the emission and absorption of energy by matter and with the motion of material particles; the quantum theory and the theory of relativity together form the theoretical basis of modern physics. Just as the theory of relativity assumes importance in the special situation where very large speeds are involved, so the quantum theory is necessary for the special situation where very small quantities are involved, i.e., on the scale of molecules, atoms, and elementary particles. Aspects of the quantum theory have provoked vigorous philosophical debates concerning, for example, the uncertainty principle and the statistical nature of all the predictions of the theory.

Relationship of Energy and Matter

According to the older theories of classical physics, energy is treated solely as a continuous phenomenon, while matter is assumed to occupy a very specific region of space and to move in a continuous manner. According to the quantum theory, energy is held to be emitted and absorbed in tiny, discrete amounts. An individual bundle or packet of energy, called a quantum (pl. quanta), thus behaves in some situations much like particles of matter; particles are found to exhibit certain wavelike properties when in motion and are no longer viewed as localized in a given region but rather as spread out to some degree.


For example, the light or other radiation given off or absorbed by an atom has only certain frequencies (or wavelengths), as can be seen from the line spectrum associated with the chemical element represented by that atom. The quantum theory shows that those frequencies correspond to definite energies of the light quanta, or photons, and result from the fact that the electrons of the atom can have only certain allowed energy values, or levels; when an electron changes from one allowed level to another, a quantum of energy is emitted or absorbed whose frequency is directly proportional to the energy difference between the two levels.

Dual Nature of Waves and Particles

The restriction of the energy levels of the electrons is explained in terms of the wavelike properties of their motions: electrons occupy only those orbits for which their associated wave is a standing wave (i.e., the circumference of the orbit is exactly equal to a whole number of wavelengths) and thus can have only those energies that correspond to such orbits. Moreover, the electrons are no longer thought of as being at a particular point in the orbit but rather as being spread out over the entire orbit. Just as the results of relativity approximate those of Newtonian physics when ordinary speeds are involved, the results of the quantum theory agree with those of classical physics when very large "quantum numbers" are involved, i.e., on the ordinary large scale of events; this agreement in the classical limit is required by the correspondence principle of Niels Bohr. The quantum theory thus proposes a dual nature for both waves and particles, one aspect predominating in some situations, the other predominating in other situations.
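In symbols, the standing-wave condition described above reads

2πr = nλ, n = 1, 2, 3, …

where r is the orbit radius and λ the electron's wavelength; combined with de Broglie's relation λ = h/mv (introduced below), it yields the quantized angular momentum mvr = nh/2π of the Bohr model.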

Evolution of Quantum Theory

Early Developments

While the theory of relativity was largely the work of one man, Albert Einstein, the quantum theory was developed principally over a period of thirty years through the efforts of many scientists. The first contribution was the explanation of blackbody radiation in 1900 by Max Planck, who proposed that the energies of any harmonic oscillator (see harmonic motion), such as the atoms of a blackbody radiator, are restricted to certain values, each of which is an integral (whole number) multiple of a basic, minimum value. The energy E of this basic quantum is directly proportional to the frequency ν of the oscillator, or E = hν, where h is a constant, now called Planck's constant, having the value 6.63 × 10⁻³⁴ joule-second. In 1905, Einstein proposed that the radiation itself is also quantized according to this same formula, and he used the new theory to explain the photoelectric effect. Following the discovery of the nuclear atom by Rutherford (1911), Bohr used the quantum theory in 1913 to explain both atomic structure and atomic spectra, showing the connection between the electrons' energy levels and the frequencies of light given off and absorbed.
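A quick worked example of Planck's formula (the frequency is an illustrative value, roughly that of green visible light):

```python
# Photon energy from Planck's relation E = h * nu.
PLANCK_H = 6.63e-34  # Planck's constant, joule-seconds

def photon_energy(frequency_hz: float) -> float:
    """Energy in joules of one quantum at the given frequency."""
    return PLANCK_H * frequency_hz

print(photon_energy(6e14))  # ~4.0e-19 joules per photon
```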

Quantum Mechanics and Later Developments

Quantum mechanics, the final mathematical formulation of the quantum theory, was developed during the 1920s. In 1924, Louis de Broglie proposed that not only do light waves sometimes exhibit particlelike properties, as in the photoelectric effect and atomic spectra, but particles may also exhibit wavelike properties. This hypothesis was confirmed experimentally in 1927 by C. J. Davisson and L. H. Germer, who observed diffraction of a beam of electrons analogous to the diffraction of a beam of light. Two different formulations of quantum mechanics were presented following de Broglie's suggestion. The wave mechanics of Erwin Schrödinger (1926) involves the use of a mathematical entity, the wave function, which is related to the probability of finding a particle at a given point in space. The matrix mechanics of Werner Heisenberg (1925) makes no mention of wave functions or similar concepts but was shown to be mathematically equivalent to Schrödinger's theory.

Quantum mechanics was combined with the theory of relativity in the formulation of P. A. M. Dirac (1928), which, in addition, predicted the existence of antiparticles. A particularly important discovery of the quantum theory is the uncertainty principle, enunciated by Heisenberg in 1927, which places an absolute theoretical limit on the accuracy of certain measurements; as a result, the assumption by earlier scientists that the physical state of a system could be measured exactly and used to predict future states had to be abandoned. Other developments of the theory include quantum statistics, presented in one form by Einstein and S. N. Bose (the Bose-Einstein statistics) and in another by Dirac and Enrico Fermi (the Fermi-Dirac statistics); quantum electrodynamics, concerned with interactions between charged particles and electromagnetic fields; its generalization, quantum field theory; and quantum electronics.

Global warming

Understanding the causes of and responses to global warming requires interdisciplinary cooperation between social and natural scientists. The theory behind global warming has been understood by climatologists since at least the 1980s, but only in the new millennium, with an apparent tipping point in 2005, has the mounting empirical evidence convinced most doubters, politicians, and the general public as well as growing sections of business that global warming caused by human action is occurring.

DEFINITION OF GLOBAL WARMING
Global warming is understood to result from an overall, long-term increase in the retention of the sun’s heat around Earth due to blanketing by “greenhouse gases,” especially CO2 and methane. Emissions of CO2 have been rising at a speed unprecedented in human history, due to accelerating fossil fuel burning that began in the Industrial Revolution.

The effects of the resulting “climate change” are uneven and can even produce localized cooling (if warm currents change direction). The climate change may also initiate positive feedback in which the initial impact is further enhanced by its own effects, for example if melting ice reduces the reflective properties of white surfaces (the “albedo” effect) or if melting tundra releases frozen methane, leading to further warming. Debate continues about which manifestations are due to long-term climate change and which to normal climate variability.

SPEEDING UP THE PROCESS
Global warming involves an unprecedented speeding up of the rate of change in natural processes, which now converges with the (previously much faster) rate of change in human societies, leading to a crisis of adaptation. Most authoritative scientific bodies predict that on present trends a point of no return could come within ten years, and that the world needs to cut emissions by 50 percent by the middle of the twenty-first century.

It was natural scientists who first discovered and raised global warming as a political problem. This makes many of the global warming concerns unique. “Science becomes the author of issues that dominate the political agenda and become the sources of political conflict” (Stehr 2001, p. 85). Perhaps for this reason, many social scientists, particularly sociologists, wary of trusting the truth claims of natural science but knowing themselves lacking the expertise to judge their validity, have avoided saying much about global warming and its possible consequences. Even sociologists such as Ulrich Beck and Anthony Giddens, who see “risk” as a key attribute of advanced modernity, have said little about climate change.

For practical purposes, it can no longer be assumed that nature is a stable, well-understood background constant about which social scientists need no direct knowledge. Any discussion of likely social, economic, and political futures will have to heed what natural scientists say about the likely impacts of climate change.

GROWING EVIDENCE OF GLOBAL WARMING
Although the idea originally seemed eccentric, global warming was placed firmly on the agenda in 1985, at a conference in Austria of eighty-nine climate researchers participating as individuals from twenty-three countries. The researchers forecast substantial warming, unambiguously attributable to human activities.

Since that conference the researchers’ position has guided targeted empirical research, leading to supporting (and increasingly dire) evidence, resolving anomalies and winning near unanimous peer endorsement. Skeptics have been confounded and reduced to a handful, some discredited by revelations of dubious funding from fossil fuel industries.

Just before the end of the twentieth century, American researchers released ice-thickness data, gathered by nuclear submarines. The data showed that over the previous forty years the ice depth in all regions of the Arctic Ocean had declined by approximately 40 percent.

Aerial photographs taken at five-year intervals show the ice cover on the Arctic Ocean at a record low, with a loss of 50 cubic kilometers annually and glacier retreat doubling to 12 kilometers a year. In September 2005 the National Aeronautics and Space Administration (NASA) doubled its estimates of the volume of melted fresh water flowing into the North Atlantic, reducing salinity and thus potentially threatening the conveyor that drives the Gulf Stream. Temperate mussels have been found in Arctic waters, and news broadcasts in 2005 and 2006 repeatedly showed scenes of Inuit and polar bears (recently listed as endangered) cut off from their hunting grounds as the ice bridges melt.

In 2001 the Intergovernmental Panel on Climate Change (IPCC), the United Nation’s scientific panel on climate change, had predicted that Antarctica would not contribute significantly to sea level rise this century. The massive west Antarctic ice sheet was assumed to be stable. However, in June 2005 a British Antarctic survey reported measurements of the glaciers on this ice sheet shrinking. In October 2005 glaciologists reported that the edges of the Antarctic ice sheets were crumbling at an unprecedented rate and, in one area, glaciers were discharging ice three times faster than a decade earlier.

In 2005 an eight-year European study drilling Antarctic ice cores to measure the past composition of the atmosphere reported that CO2 levels were at least 30 percent higher than at any time in the last 650,000 years. The speed of the rise in CO2 was unprecedented, from 280 parts per million (ppm) before the Industrial Revolution to 388 ppm in 2006. Early in 2007 the Norwegian Polar Institute reported acceleration to a new level of 390 ppm. In January 2006 a British Antarctic survey, analyzing CO2 in crevasse ice in the Antarctic Peninsula, found levels of CO2 higher than at any time in the previous 800,000 years.

In April 2005 a NASA Goddard Institute oceanic study reported that the earth was holding on to more solar energy than it was emitting into space. The Institute’s director said: “This energy imbalance is the ‘smoking gun’ that we have been looking for” (Columbia 2005).

The second IPCC report, in 1996, had predicted a maximum temperature rise of 3.5 degrees Celsius by the end of the twenty-first century. The third report, in 2001, predicted a maximum rise of 5.8 degrees Celsius by the end of the twenty-first century. In October 2006 Austrian glaciologists reported in Geophysical Research Letters (Kaser et al.) that almost all the world’s glaciers had been shrinking since the 1940s and that the rate of shrinkage had increased since 2001. None of the glaciers (contrary to the skeptics) was growing. Melting glaciers could threaten the water supply of major South American cities, and the retreat is already manifest in the appearance of many new lakes in Bhutan.

In January 2007 global average land and sea temperatures were the highest ever recorded for this month; in February 2007 the IPCC Fourth Report, expressing greater certainty and worse fears than the previous one, made headlines around the world. In 1995 few scientists believed the effects of global warming were already manifest, but by 2005 few scientists doubted it and in 2007 few politicians were willing to appear skeptical.

Although rising temperatures; melting tundra, ice and glaciers; droughts; extreme storms; stressed coral reefs; changing geographical range of plants, animals, and diseases; and sinking atolls may conceivably all be results of many temporary climate variations, their cumulative impact is hard to refute.

ANOMALIES AND REFUTATIONS
The science of global warming has progressed through tackling anomalies cited by skeptics. Critics of global warming made attempts to discredit the methodology of climatologist Michael Mann’s famous “hockey stick” graph (first published in Nature in 1998). Mann’s graph showed average global temperatures over the last 1,000 years, with little variation for the first 900 and a sharp rise in the last century. After more than a dozen replication studies, some using different statistical techniques and different combinations of proxy records (indirect measures of past temperatures such as ice cores or tree rings), Mann’s results were vindicated. A report in 2006 by the U.S. National Academy of Sciences, National Research Council, supported much of Mann’s image of global warming history: “There is sufficient evidence from the tree rings, boreholes, retreating glaciers and other ‘proxies’ of past surface temperatures to say with a high level of confidence that the last few decades of the twentieth century were warmer than any comparable period for the last 400 years.” For periods before 1600 the report found there was not enough reliable data to be sure, but the committee found the “Mann team’s conclusion that warming in the last few decades of the twentieth century was unprecedented over the last 1,000 years to be plausible” (National Academy of Sciences press release, 2006).

Measurements from satellites and balloons in the lower troposphere have until recently indicated cooling, which contradicted measurements from the surface and the upper troposphere. In August 2005 a publication in Science of the findings of three independent studies described their measurements as “nails in the coffin” of the skeptics’ case. These showed that faulty data, which failed to allow for satellite drift, lay behind the apparent anomaly.

Another anomaly was that observed temperature rises were less than the modeling of CO2 impacts predicted. This is now explained by evidence of the temporary masking properties of aerosols, from rising pollution and a cyclical upward swing of volcanic eruptions since 1960.

Critics of global warming have been disarmed and discredited. Media investigations and social research have increasingly highlighted the industry funding of skeptics and their think tanks, and the political pressures on government scientists to keep silent. Estimates of the catastrophic costs of action on emissions have also been contradicted most dramatically by the British Stern Report in October 2006. Many companies have been abandoning the skeptical business coalitions. The Australian Business Round Table on Climate Change estimated in 2005 that the cost to gross domestic product of strong early action would be minimal and would create jobs.

SCIENTIFIC CONSENSUS
In May 2001 sixteen of the world’s national academies of science issued a statement, confirming that the IPCC should be seen as the world’s most reliable source of scientific information on climate change, endorsing its conclusions and stating that doubts about the conclusions were not justified.

In July 2005 the heads of eleven influential national science academies (from Brazil, Canada, China, France, Germany, India, Italy, Japan, Russia, the United Kingdom, and the United States) wrote to the G8 leaders warning that global climate change was “a clear and increasing threat” and that they must act immediately. They outlined strong and long-term evidence “from direct measurements of rising surface air temperatures and subsurface ocean temperatures and from phenomena such as increases in average global sea levels, retreating glaciers and changes to many physical and biological systems” (Joint Science Academies Statement 2005).

There are many unknowns regarding global warming, particularly those dependent on human choices; yet the consequences for society of either inadequate action or of any effective response (through reduced consumption or enforced and subsidized technological change) will be huge. It is unlikely, for example, that the practices and values of free markets, individualism, diversity, and choice will escape significant modification, whether through economic and political breakdowns or through the radical measures needed to preempt them.

INADEQUATE ACTION AND NEEDED TRANSFORMATIONS
Kyoto targets are at best a useful first step. However, even these targets, which seek to peg emissions back to 1990 levels by 2010, are unlikely to be met. World CO2 emissions in 2004 continued to rise in all regions of the world, by another 4.5 percent, to a level 26 percent higher than in 1990. A rise of over 2 degrees Celsius is considered inevitable if CO2 concentrations pass 400 ppm. At current rates of emissions growth, the concentration would reach 700 ppm by the end of the twenty-first century. The continuing industrialization of China, recently joined by India, points to the possibility of even faster rises than these projections indicate.
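
The arithmetic behind such a projection is simple compounding, as the following back-of-envelope sketch illustrates. The figures used here are assumptions chosen for illustration, not values drawn from the studies cited above: a starting concentration of roughly 377 ppm in 2004, an annual increment of about 2 ppm, and that increment itself growing by 1 percent a year as emissions rise.

    # Back-of-envelope projection of atmospheric CO2 concentration (Python).
    # Every input below is an illustrative assumption, not a measured value.
    def project_co2(start_year=2004, end_year=2100,
                    concentration=377.0,     # ppm; approximate 2004 level (assumed)
                    increment=2.0,           # ppm added per year (assumed)
                    increment_growth=0.01):  # annual growth of the increment (assumed)
        for _ in range(start_year, end_year):
            concentration += increment            # this year's addition to the stock
            increment *= 1 + increment_growth     # rising emissions compound the increment
        return concentration

    print(round(project_co2()))  # 697, i.e., roughly 700 ppm by 2100

The point of the sketch is only that modest compounding of today’s increments is enough to reach the 700 ppm figure; faster industrialization, as noted above, would push the number higher.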

If unpredictable, amplifying feedback loops are triggered, improbable catastrophes become more likely. The Gulf Stream flow could be halted, freezing Britain and Northern Europe. Droughts could wipe out the agriculture of Africa and Australia, as well as Asia, where millions depend on Himalayan meltwater and monsoon rains. If the ice caps melt completely over the next centuries, seas could rise by 7 meters, devastating all coastal cities. Will the human response to widespread ecological disasters give rise to solidarity and collective action, such as the aid that came after the 2004 Asian Tsunami, or to social breakdowns, as seen in New Orleans after Hurricane Katrina in 2005 and in the Rwandan genocide?

Social and technical changes of the scale and speed required are not unprecedented. The displacement of horsepower by automobiles, for example, was meteoric: production of vehicles in the United States increased from 8,000 in 1900 to nearly a million by 1912. Substantial regulation, or differential taxation and subsidies, would be indispensable to overcome short-term profit motives and free-riding dilemmas (where some evade their share of the cost of collective goods from which they benefit). Gains in auto efficiency in the 1980s, for example, were rapidly reversed by a new fashion for sport utility vehicles.

The debates that have emerged in the early twenty-first century concern responses, each with different winners and losers, costs, benefits, dangers, and time scales. Advocates of reduced energy consumption or increased efficiency, or of energy generation by solar, wind, tidal, hydro, biomass, geothermal, nuclear, or clean coal and geo-sequestration, often argue cacophonously. Yet it seems probable that all these options are needed.

It will be essential for social and natural scientists to learn to cooperate in understanding and preempting the potentially catastrophic collision of nature and society. In order to accomplish this, market mechanisms; technological innovation; international, national, and local regulations; and cultural change will all be needed. Agents of change include governments, nongovernmental organizations, and public opinion, but the most likely front-runner might be sectors of capital seeking profit by retooling the energy and transport systems, while able to mobilize political enforcement.

FRENCH REVOLUTION


The French Revolution invented modern revolution, the idea that humans can transform the world according to a plan, and so has a central place in the study of the social sciences. It ushered in modernity by destroying the foundations of the “Old Regime” (absolutist politics, legal inequality, a “feudal” economy characterized by guilds, manorialism, and even serfdom, and an alliance of church and state) and by creating a vision of a new moral universe: that sovereignty resides in nations; that a constitution and the rule of law govern politics; that people are equal and enjoy inalienable rights; and that church and state should be separate. That vision is enshrined in the Declaration of the Rights of Man and Citizen of 1789, whose proclamation of “natural, imprescriptible, and inalienable” rights served as the model for the 1948 United Nations Universal Declaration of Human Rights.

Eighteenth-century France experienced overlapping tensions that erupted in revolution in 1789. First, the Enlightenment contributed to an environment in which revolution was possible through its insistence on reforming institutions to comply with standards of reason and utility. Furthermore, it coincided with the rise of public opinion, which undermined the absolutist notion that political decisions required no consultation and tolerated no opposition. Second, the French state faced bankruptcy because of a regressive and inefficient tax system as well as participation in the Seven Years War (1756–1763) and the War of American Independence (1775–1783). Third, France witnessed endemic political strife in the eighteenth century. Although technically absolutist monarchs who ruled by divine right and exercised sovereignty without interference from representative institutions, French kings in reality met with opposition to their policies from the noble magistrates of the highest law courts (the Parlements), who resisted fiscal reforms in the name of protecting traditional rights from arbitrary authority. Finally, while class conflict did not cause the revolution, there existed stress zones in French society, as a growing population threatened many people with destitution and as talented commoners chafed at their exclusion from high offices in the church, state, and military. Economic problems intensified after bad weather doubled the price of bread in 1789.

These tensions reached a crisis point in the “prerevolution” from 1787 to 1789. To deal with impending fiscal insolvency, the government convened an Assembly of Notables in 1787 to propose a new tax levied on all land and the convocation of advisory provincial assemblies.

Repeated resistance to reform by the notables and Parlements forced Louis XVI (ruled 1774–1792) to convene the Estates-General, a representative body composed of clergy, nobles, and the Third Estate that had not met since 1614. The calling of the Estates-General in 1789 led to a debate over the leadership of reform, and France’s struggle against royal despotism soon became a struggle against noble and clerical privilege. In this context, Emmanuel Sieyès’s pamphlet “What Is the Third Estate?” galvanized patriot opinion by responding “Everything!” and by portraying the privileged groups as unproductive parasites on the body politic.

During a stalemate over whether the estates should vote by order or head, the Third Estate claimed on June 17 that it formed a National Assembly with the authority to write a constitution. This step transferred sovereignty from the king to the nation and constituted a legal revolution. The legal revolution was protected by a popular revolution on July 14 when the people of Paris stormed the Bastille fortress in search of weapons. Popular participation continued to radicalize the revolution. In the countryside, a peasant insurgency against manorial dues and church tithes prompted the National Assembly to decree the “abolition of feudalism” on August 4.

The revolution had three phases. The liberal phase found France under a constitutional monarchy during the National Assembly (1789–1791) and Legislative Assembly (1791–1792). After the destruction of absolutism and feudalism, legislation in this period guaranteed individual liberty, promoted secularism, and favored educated property owners. The aforementioned Declaration of Rights proclaimed freedom of thought, worship, and assembly as well as freedom from arbitrary arrest; it enshrined the principles of careers open to talent and equality before the law, and it hailed property as a sacred right (similarly, the National Assembly limited the vote to men with property). Other laws, enacted in conformity with reason, contributed to the “new regime.” They offered full rights to Protestants and Jews, thereby divorcing religion from citizenship; they abolished guilds and internal tolls and opened trades to all people, thereby creating the conditions for economic individualism; they rationalized France’s administration, creating departments in the place of provinces and giving them uniform and reformed institutions. Significantly, the National Assembly restructured the French Catholic Church, expropriating church lands, abolishing most monastic orders, and redrawing diocesan boundaries.

The revolution did not end despite the promulgation of the constitution of 1791. King Louis XVI had never reconciled himself to the revolution and as a devout Catholic was distressed after the pope condemned the restructuring of the church (known as the Civil Constitution of the Clergy). Ultimately, the king attempted to flee France on June 20, 1791, but was stopped at Varennes. Radicalism constituted another problem for the assembly, for Parisian artisans and shopkeepers (called sans-culottes) resented their formal exclusion from politics in the constitution and demanded legislation to deal with France’s economic crisis and the revolution’s enemies, particularly nobles and priests. After Varennes, radicals called increasingly for a republic. In addition, revolutionaries’ fears of foreign nations and counterrevolutionary émigrés led to a declaration of war against Austria in April 1792. France’s crusade against despotism began badly, and Louis XVI’s veto of wartime measures appeared treasonous. On August 10, 1792, a revolutionary crowd attacked the royal palace. This “second revolution” overthrew the monarchy and resulted in the convocation of a democratically elected National Convention, which declared France a republic on September 22, 1792, and subsequently tried and executed the king.

The revolution’s second, radical phase lasted from August 10, 1792, until the fall of Maximilien Robespierre (1758–1794) on July 27, 1794. The Convention’s new declaration of rights and constitution in 1793 captured the regime’s egalitarian social and political ideals and distinguished it from the liberal phase by proclaiming universal manhood suffrage, the right to education and subsistence, and the “common good” as the goal of society. The constitution, however, was never implemented amid the emergency situation resulting from civil war in the west (the Vendée), widespread revolts against the Convention, economic chaos, and foreign war against Austria, Prussia, Britain, Holland, and Spain. Faced with imminent collapse in the summer of 1793, the government had by spring 1794 “saved” the revolution and organized military victories on all fronts.

The stunning turn of events stemmed from the revolutionaries’ three-pronged strategy under the leadership of Robespierre and the Committee of Public Safety. First, they established a planned economy, including price controls and nationalized workshops, for a total war effort. The planned economy largely provided bread for the poor and matériel for the army. Second, the government forced unity and limited political opposition through a Reign of Terror. Under the Terror, the Revolutionary Tribunal tried “enemies of the nation,” some 40,000 of whom were executed—often by guillotine—or died in jail; another 300,000 people languished in prison under a vague “law of suspects.” The unleashing of terrorism to silence political opponents imposed order at the cost of freedom. It raised complex moral issues about means and ends and has led to vigorous historical debate: Was the Terror an understandable response to the emergency, one that saved the revolution from a return of the Old Regime, or was it a harbinger of totalitarianism that sacrificed individual life and liberty to an all-powerful state and the abstract goal of regenerating humankind? Finally, the revolutionary government harnessed the explosive force of nationalism. Unified by common institutions and a share of sovereign power, desirous of protecting the gains of revolution, and guided by a national mission to spread the gospel of freedom, patriotic French citizens treated the revolutionary wars as a secular crusade. The combination of a planned economy, the Reign of Terror, and revolutionary nationalism allowed for a full-scale mobilization of resources that drove foreign armies from French soil at the Battle of Fleurus on June 26, 1794.

The revolution’s third phase, the Thermidorian and Directory periods, commenced with the overthrow of Robespierre and the dismantling of the Terror on 9 Thermidor (July 27, 1794) and lasted until the coup d’état on November 9, 1799, that brought Napoléon Bonaparte (1769–1821) to power. A new constitution in 1795 rendered France a liberal republic under a five-man executive called the Directory. The reappearance of property qualifications for political office sought to guarantee the supremacy of the middle classes in politics and to avoid the anarchy that stemmed from popular participation. The seesaw politics of the Directory, which steered a middle course between left-wing radicalism and right-wing royalism, witnessed the annulment of electoral victories by royalists in 1797 and by radicals (Jacobins) in 1798 and undermined faith in the new constitution. Similarly, the regime won enemies with its attacks on Catholic worship while failing to rally educated and propertied elites in support of its policies. Initially, continued military victories by French armies (including those by Napoléon in Italy) buttressed the regime. But the reversal of military fortunes in 1799 and ten years of revolutionary upheaval prompted plotters to revise the constitution in a more authoritarian direction. In Napoléon, the plotters found their man as well as nearly continual warfare until 1815. “Citizens,” he announced, “the Revolution is established on the principles with which it began. It is over.”

The French Revolution is the quintessential revolution in modern history, its radicalism resting on a rejection of the French past and a vision of a new order based on universal rights and legal equality. The slogan “Liberty, Equality, Fraternity, or Death” embodies revolutionaries’ vision for a new world and their commitment to die for the cause. Both aspects of the slogan influenced subsequent struggles for freedom throughout the world, but one might look to the French slave colony of Saint-Domingue for an example. On Saint-Domingue the outbreak of revolution received acclaim from the lower classes among the 30,000 whites, while planters opposed talk of liberty and equality and the destruction of privileges. Revolutionary ideals also quickly spread among the island’s 30,000 free people of color (affranchis), who, despite owning property and indeed slaves, suffered racial discrimination. Free people of color demanded full civil and political rights after 1789, but the denial of these rights resulted in a rebellion of the affranchis that was brutally repressed. In 1791 Saint-Domingue’s 450,000 slaves commenced the most successful slave revolt in history. Tensions among whites, mixed-race people, and slaves were exacerbated by British and Spanish actions to weaken their French rival, creating chaos on the island. The Convention’s commitment to equality and desire to win the allegiance of rebels resulted in the abolition of slavery in 1794. A later attempt by Napoléon to reinstate bondage on Saint-Domingue failed despite the capture of the ex-slaves’ skilled leader, Toussaint Louverture (c. 1743–1803), and the slave uprising culminated in the creation of an independent Haiti in 1804. Revolutionary principles of liberty and equality had led to national liberation and racial equality.

One also sees the revolution’s significance in the fact that nineteenth-century ideologies traced their origins to the event. Conservatism rejected the revolution’s radical change and emphasis on reason, while liberalism reveled in the ideals of individual liberty and legal (but not social) equality of 1789. Nationalists treated the concept of national sovereignty as a call for peoples in divided states or multiethnic empires to awaken from their slumber. Democratic republicans celebrated the radical phase, finding in its democratic politics and concern for the poor a statement of egalitarianism and incipient social democracy. Socialists perceived in the sans-culotte phenomenon the rumblings of a working-class movement, while communists considered the Russian Revolution of 1917 the fulfillment of the aborted proletarian revolution of 1792–1794.

For much of the twentieth century Marxist historians understood the revolution as the triumph of a capitalist bourgeoisie and considered it a bloc (in other words, the radical phase of 1792–1794 was necessary to protect the gains of 1789–1791). Revisionists destroyed this view, treating the revolution as the triumph of a new political culture rather than of a new social class, one whose main outcome was the realization of the absolutist dream of a strong centralized state rather than a complete break with the past. The revisionists’ denial of social class as an important factor in the revolution opened the field to cultural studies and a focus on marginalized groups such as women and slaves. But the revisionist interpretation has failed to achieve consensus, and scholars continue to dispute the revolution’s legacy. According to the neo-democratic view, the declaration of universal human rights, the abolition of slavery, and the pattern of modern democratic politics give the revolution a foundational place in the struggle for a better world. For revisionists, the violence of the Terror, the destruction of the revolutionary wars, the silencing of dissidents and Catholic worshipers, and the formation of a powerful centralized state render the revolution a source of twentieth-century political horrors ranging from nationalist wars to totalitarian regimes.

Students frequently puzzle over the significance of the revolution when, after all, the Bourbons were restored to the French throne after Napoléon’s final exile in 1815. But the restoration never undid the major gains of the revolution, which included the destruction of absolutism, manorialism, legal inequality, and clerical privilege, as well as commitments to representative government, a constitution, and careers open to talent. Once the revolutionary genie announced the principles of national sovereignty, natural rights, freedom, and equality, history has shown that it could not be put back in the bottle.