Environmental Energy Technologies Division News

What Can Models Tell Us About Risk?

Some environmental pollutants pose known health risks to human beings. Manufacturing, energy generation, and other activities release these pollutants to the air, water, and soil. We may eat or drink something containing these substances, breathe them in, or absorb them through our skin.

How do we know which pollutants pose health risks and which need to be regulated to prevent health impacts? Scientists use computer models to find out how pollutants will move in the environment. But how far can we trust these models? How much can they tell us about health risks? What are the limits of models?

Measuring exposure to environmental pollutants and assessing health hazards is a complicated endeavor that poses interesting scientific and non-scientific challenges. Numerous scientists at Lawrence Berkeley National Laboratory (Berkeley Lab) investigate the scientific issues associated with environmental health. Thomas McKone is one of them. A senior staff scientist in the Environmental Energy Technologies Division (EETD), McKone leads the Environmental Chemistry, Exposure, and Risk Group and is an Adjunct Professor in UC Berkeley's School of Public Health. He and his colleagues study the physical processes by which pollutants migrate through the environment. McKone's own work focuses on developing computer models of pollutant migration to understand pollutant transport and to help policy makers decide whether and how to regulate chemicals. Other researchers conduct field studies on pollutants ranging from particulates and other airborne substances to hazardous chemical and radioactive wastes.

In the early 1990s, McKone's group developed the CalTOX model to help California's Department of Toxic Substances Control develop clean-up goals for contaminated soils and sediments, air, and surface and ground water. The model incorporates multimedia transport (movement of pollutants to or from ground, air, and water) and can estimate multiple-pathway exposures in humans. This model continues to be widely used not only for setting clean-up goals but also for comparative risk and life-cycle impact assessments. McKone has also worked with colleagues at Trent University in Canada and the Swiss Federal Institute of Technology to develop a multimedia fate and transport model called BETR (Berkeley-Trent). BETR North America addresses continental-scale transport and distribution of persistent pollutants, and BETR Global incorporates features of general circulation models of the atmosphere to study long-range pollutant transport.
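
For readers curious about the mechanics, models such as CalTOX and BETR are compartment ("box") models: each environmental medium is treated as a well-mixed box, and first-order rate constants move chemical mass between boxes. The Python sketch below illustrates only this core idea; the compartments, rate constants, and emission rates are hypothetical placeholders, not values or code from CalTOX or BETR.

    import numpy as np

    # Minimal steady-state multimedia "box" model. Illustrative only: the
    # real CalTOX and BETR models track many more compartments and
    # processes. All rate constants (1/day) and emission rates (kg/day)
    # below are hypothetical placeholders.
    k_deg = np.array([0.05, 0.01, 0.002])  # degradation in air, water, soil
    k_aw, k_wa = 0.10, 0.02                # air <-> water transfer
    k_as, k_sa = 0.08, 0.001               # air <-> soil transfer
    k_ws, k_sw = 0.005, 0.0005             # water <-> soil exchange

    # First-order transfer matrix A for dm/dt = A @ m + emissions, where
    # m is the vector of chemical mass in [air, water, soil].
    A = np.array([
        [-(k_deg[0] + k_aw + k_as), k_wa, k_sa],
        [k_aw, -(k_deg[1] + k_wa + k_ws), k_sw],
        [k_as, k_ws, -(k_deg[2] + k_sa + k_sw)],
    ])
    emissions = np.array([100.0, 10.0, 0.0])

    # Steady state: A @ m + emissions = 0, so m = -A^-1 @ emissions.
    mass = np.linalg.solve(A, -emissions)
    for name, m in zip(("air", "water", "soil"), mass):
        print(f"{name:5s}: {m:10.1f} kg at steady state")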

What is a Risk? A Hazard?

Understanding the field of risk assessment, which is fraught with collisions between science and value-laden issues outside the realm of science, means taking care with terminology. One example is the use of the terms "risk" and "hazard." A risk is the possibility of experiencing harm; a hazard is defined by its ability to cause harm. "Here's an example of a risk," McKone explains. "'What is the probability that a human will get cancer?' A hazard, however, involves the potential to cause harm. If you can show that a chemical causes cancer, then you have shown that it is a hazard." Everyone is at some risk of getting cancer. But exposure to a carcinogen (a cancer-causing chemical) is a hazard because it could increase a person's risk of developing disease. "Science can measure exposures and set up experiments to demonstrate hazard based on occupational or other exposed groups or based on animal studies, for example, in rats," he says, "but you cannot do a scientific experiment to assess human risk. Risk assessment is not a science, but it does have a foundation in toxicology and chemistry."

Participating on National Advisory Panels

His expertise frequently sought by U.S. science advisory bodies, McKone currently sits on two National Academy of Sciences panels: "Environmental Decision Making: Principles and Criteria for Models" and "Improving Risk Analysis Approaches Used By the U.S. EPA." The former will release its report early next year, and the latter has just started work; its reports are expected in 2008. In 2006, McKone sat on a National Academy panel that released a revised assessment of the health risks from exposures to dioxin and dioxin-like chemicals. The work of these committees is typically influential, helping guide risk assessment studies throughout the U.S.

The "Improving Risk" committee is one of a series of National Research Council committees formed during the past 25 years to issue guidelines about the application of risk analysis. These guidelines are for federal agencies such as the U.S. Environmental Protection Agency (U.S.EPA) that must assess the risks of environmental pollutants. The NRC first issued risk-assessment guidelines in 1986 and revised and extended them in 1996. Another review of risk analysis science is currently in progress.

Help from Computer Models

Scientists conduct field studies to understand how a pollutant disperses in the environment and travels from one medium to another: for example, how polychlorinated biphenyls (PCBs), which are classified as persistent organic pollutants (POPs), move into soil, water, sediments, and organisms. Raw field data hint at the mathematical relationships that characterize pollutant movement. For example, a particular POP is characterized by, among other things, its half-life in a medium such as air or water. Half-life is the time it takes for the concentration of the POP to decrease to one-half of its value at the time of measurement.
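
For a pollutant that decays by first-order kinetics, the half-life fixes the decay rate constant through k = ln(2)/t-half. The following minimal Python example, using a made-up half-life and starting concentration, shows the relationship:

    import math

    # First-order decay: C(t) = C0 * exp(-k * t), with k = ln(2) / t_half.
    t_half_days = 30.0             # hypothetical half-life of a POP in water
    k = math.log(2) / t_half_days  # decay rate constant (1/day)

    C0 = 8.0                       # made-up starting concentration (ng/L)
    for t in (0, 30, 60, 90):
        print(f"day {t:3d}: {C0 * math.exp(-k * t):.2f} ng/L")
    # day 30 prints 4.00 (one half), day 60 prints 2.00 (one quarter), ...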

But scientific field studies are always limited by time and location. To get the bigger environmental picture, scientists create computer models that estimate where chemicals will go; how much of a pollutant will get into the air, water, or ground ("partitioning"); and how long a chemical will persist before breaking down into simpler chemicals or combining with others. POPs attract considerable scientific and regulatory attention because they take so long to break down, which means they have time to diffuse all over the earth and for humans to build up exposure to these sometimes carcinogenic and mutagenic substances.
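
One simple way to sketch partitioning is with equilibrium partition coefficients, ratios of a chemical's concentration in two media. The Python snippet below is an illustrative back-of-the-envelope calculation, not any particular model's method; the coefficients and compartment volumes are invented:

    # Equilibrium partitioning of a hypothetical chemical among air,
    # water, and soil. K holds dimensionless partition coefficients
    # (concentration ratios relative to water); both the coefficients
    # and the compartment volumes are invented for illustration.
    K = {"air": 0.01, "water": 1.0, "soil": 50.0}
    V = {"air": 1e9, "water": 1e6, "soil": 1e4}  # volumes in m^3

    # At equilibrium the amount in each medium is K_i * C_water * V_i,
    # so the mass fractions depend only on the products K_i * V_i.
    capacity = {m: K[m] * V[m] for m in K}
    total = sum(capacity.values())
    for medium, cap in capacity.items():
        print(f"fraction in {medium:5s}: {cap / total:.1%}")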

Scientists can incorporate into a computer model everything they know about the chemical properties of a pollutant under investigation: the equations governing transport within and among different media, and the measurements or statistics regarding the substance's abundance in the environment. Running the model produces snapshots in space and time of what pollutant concentrations might look like (if the model is constructed accurately) and of potential human exposure. The model can also hint at which intervention might be most successful in reducing pollutant concentration and the potential for human exposure.

Models Do Not Predict

Models have limitations. "Models are not very useful if you don't have something with which to anchor them," says McKone. "As in the case of theory, you need observations to confirm the model and move it closer to a representation of reality."

What can a model do? "This is a big issue," he says. "A lot of people think models provide predictions, but they don't do this."

McKone sees a model as a description of the physical and chemical processes that govern the behavior of chemicals in the environment. "You can build relationships between factors that you couldn't otherwise relate without a model," he explains. "Because of complexity, you can relate things in a model that you couldn't in your mind, because there is too much to keep in your head. A model puts all these pieces together. But just because you understand how the pieces fit together, it doesn't mean that you get the correct results at the other end. You can still get results that don't correspond to the real thing. So models are both potentially powerful and potentially dangerous."

Too Much Information

What, then, is the proper way to interpret the results of a model? One caution is not to regard the model's results uncritically. "Policy makers don't like to make choices involving uncertainty," says McKone. "But a danger is that they may just use model results to tell them what to do." If a model's performance isn't properly evaluated (compared against real measurements in the field), it may not provide accurate information on which to base regulations.

He continues: "Adding more detail into a model doesn't necessarily get you a better result if you don't understand the basic science. Model development has to be paced with the science."

McKone cites an example: "Years ago, when I was a graduate student, I saw a regional pollutant model for radionuclide transfer from soil to vegetation. It was a square grid with lots of detail - airflow, crops growing in specific areas - and it was used to calculate crop uptake. But there was only one experiment done that provided data, and that didn't account for the uptake to plants by species type; seasonal uptake of some species could vary by a factor of 10 to 50.

"The spatial variation was less than a factor of 10, but the uptake by crops was uncertain by more than a factor of 10, so the added detail did not improve the model result. The reliability of the calculation depends on the reliability of the least-well-known element. If you don't know how uncertain this weak link is, then you are making the model results look more accurate then they really are."

This is why the performance evaluation of existing models, especially those currently in use by federal agencies to provide guidance in the regulation process, has become a hot research area.

Thanks to continuing support from the U.S. EPA, McKone conducts regular research and development to improve models for use in experimental studies and risk assessment. "Some of the questions we ask are: What are the critical uncertainties? What processes do you need in a model, and what can you do without?

"A important quality in a good model is called 'parsimony'; this is defined as making the model as complicated as needed to solve a problem, but not more so. You don't want to add details that make the model overly complicated."

A Model Model Study

Recently, McKone, his Berkeley Lab colleagues, and research teams at universities and government research laboratories throughout the world assessed whether existing models of the persistence and long-range transport of POPs were truly parsimonious and whether each produced output consistent with that of the others. The outcome of the study was the development of a consensus model that described the minimum set of model components needed.

Study participants were McKone, Matthew MacLeod, formerly of Berkeley Lab's Earth Sciences Division and now at the Swiss Federal Institute of Technology in Zurich, and researchers at nine universities and institutions in Switzerland, Germany, Canada, France, Italy, and Japan. "To my knowledge, no study of this kind has been undertaken before," says McKone.

The study was intended to make one type of risk assessment model more useful to the policy-making community.

The participants compared their models to see whether the models produced consistent results. "We ran all nine models on various persistent organic pollutants and compared the models' output for the same chemical against one another, to see how well each model characterized these chemicals.

"Then we realized we needed to create a 'surface of possible properties.' We found that there were four important properties that characterized these chemicals. We created 4,000 'chemicals' - not real ones, but imaginary chemicals with idealized properties - and we ran all the models through this 'space' of chemical properties. Then we compared the output of the various models against each other.

"We found that there was a lot of commonality in the model results, but also there were subtle differences. There were areas where the models had the same results, and areas where they diverged from one another. So, all nine teams proposed a model that included the elements of all nine models that led to common results. And we resolved elements that produced divergent results.

"This process seemed to be leading to the simplest model possible for solving the problem that nonetheless had enough detail and complexity to accurately model the result properly. " There were four chemical properties that determined the behavior of POPs in all models. Two were solubility ratios: the solubility of the POP in air divided by its solubility in water, and the octanol/water solubility ratio. The latter is one indicator of the POP's mobility based on how much it sticks to soils, sediments, and the lipids (fat) in biological organisms. A POP that accumulates rapidly in fat tissue is cause for concern because human tissues will build up high levels over a lifetime.

The two other properties involved are chemical half-life in air and chemical half-life in water. The latter is a good measure of persistence in surface waters, soils, and sediments.
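
The Python sketch below gives a rough sense of how such a "surface of possible properties" can be constructed. It builds a grid of hypothetical chemicals over the four properties and scores each with a toy persistence function; the ranges, the grid size (smaller than the study's 4,000 chemicals), and the scoring rule are all invented stand-ins for the process-based calculations the nine real models performed:

    import itertools
    import numpy as np

    # A grid of hypothetical chemicals spanning the four key properties.
    # Ranges are plausible but invented for illustration.
    log_Kaw = np.linspace(-6, 2, 5)  # log10 air/water solubility ratio
    log_Kow = np.linspace(1, 9, 5)   # log10 octanol/water solubility ratio
    t_air   = np.logspace(0, 3, 4)   # half-life in air (days)
    t_water = np.logspace(1, 4, 4)   # half-life in water (days)

    chemicals = list(itertools.product(log_Kaw, log_Kow, t_air, t_water))
    print(f"{len(chemicals)} hypothetical chemicals")  # 5 * 5 * 4 * 4 = 400

    def toy_persistence(kaw, kow, th_air, th_water):
        # Partition-weighted average of medium half-lives; higher K_ow
        # crudely shifts mass toward slower-degrading soils and sediments.
        f_air = 10**kaw / (1 + 10**kaw)
        return f_air * th_air + (1 - f_air) * th_water * (1 + kow / 10)

    scores = [toy_persistence(*c) for c in chemicals]
    print(f"overall persistence: {min(scores):.1f} to {max(scores):.1f} days")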

Figure 1. A page of chemical space plots from the CalTOX model. Each plot on the page shows how a chemical's overall persistence in the environment depends on its water-to-air solubility ratio (vertical axis) and its octanol-to-water solubility ratio (horizontal axis). From upper left to lower right, the modeled pollutants become increasingly persistent in the environment.

Figure 1 shows an example of the output of the study: a page of color chemical space plots run on output from McKone's own CalTOX model. Each individual plot shows how a chemical's overall persistence (environmental half-life), as reflected by the color range, depends on its water-to-air solubility ratio (vertical axis) and its octanol-to-water solubility ratio (horizontal axis). The outer horizontal and vertical axes reflect increasing half-life in air and water.

From the upper left of the page to the lower right, the POP becomes increasingly persistent in the environment. But some POPs, even if they are persistent, can be volatile, meaning that they will cycle through different media rapidly. A persistent, volatile POP needs to be treated differently from a persistent, stable one. Such classifications of hypothetical POPs helped to evaluate the performance of the nine models by showing that they produced results similar to one another. The study also showed that this classification scheme can help policy makers single out POPs with specific qualities because of how these pollutants behave in the environment.

Says McKone, "We are not just developing more models, or refining them, or improving the user interface. Our group's goal is to ask 'How do decision makers use models? What do they need to do their work effectively?' and then to determine what we can do to make the models more effective."

What Makes A Good Model Result?

In addition to parsimony, McKone mentions two other qualities that make a useful model: "One of the things decision makers want is transparency. They need to know how the model works; the method has to be transparent to the world.

"Another is 'fidelity,' the need to be true to reality by including elements in the model that answer the relevant science and policy questions. So in addition to making the model transparent and as simple as possible, you must incorporate all the processes that are important in linking the final result to the factors that, if changed, will alter that result.

"You are always walking a fine line between how much detail you need to get fidelity while not incorporating so much detail that it overwhelms the final users."

The problem of making models useful to non-scientist users is not specific to environmental pollutants. From his conversations with colleagues, McKone has learned that it comes up elsewhere, including in weather forecasting. He notes that there are plenty of sophisticated, supercomputer-based weather models, but many daily forecasts are based on simple plug-in PC-based models that incorporate rules of thumb and judgment. The bottom line is that no one wants to be overwhelmed with data; we all want just the few basic results that are useful to us.

— Allan Chen

For more information, contact:

  • Thomas McKone
  • (510) 486-6163; Fax (510) 486-6658

This research was funded by the U.S. Environmental Protection Agency.
