U.S. EPA Will Consider Broad Plan to Implement NAS Toxicology ‘Vision’
Risk Policy Report Vol 14 (43) October 23, 2007
Senior U.S. Environmental Protection Agency (U.S. EPA) scientists are weeks away from proposing a broad plan to senior management for implementing the National Academy of Sciences’ (NAS) far-reaching recommendations for transforming the risk assessment discipline of toxicology, including an expansion of cutting-edge technologies such as high-throughput chemical screening and research on substances’ interactions with genetic material.
Many experts say the effort to implement NAS’ recommendations will likely affect the direction of toxicological research and risk assessment over the next 20 years, given the academy’s embrace of technologies that do not have the confidence of the entire risk assessment community.
In an exclusive interview October 11 with Risk Policy Report, U.S. EPA National Health and Environmental Effects Research Laboratory Director Hal Zenick said the agency’s Future of Toxicity Testing Workgroup (FTTW) expects to finish a proposal within the next several weeks identifying the process by which the workgroup will develop a more detailed blueprint responding to the June 12 NAS report, “Toxicity Testing in the Twenty-First Century: A Vision and a Strategy.” The near-final proposal would mark a significant milestone in advancing an agencywide response to the report.
Robert Kavlock, Director of U.S. EPA’s National Center for Computational Toxicology (NCCT), also took part in the interview.
Zenick, who co-chairs the FTTW, said that once the workgroup completes the broad plan it will then present the proposal to the agency’s Science Policy Council for approval. The council is chaired by U.S. EPA Research Chief George Gray and includes the deputy assistant administrators of the agency’s program offices. Zenick said the proposal will address broad issues, such as the scope of U.S. EPA’s response and how the agency plans to work with other federal partners. Zenick also said the FTTW held its first face-to-face meeting to discuss how to respond to the NAS report earlier this month.
The NAS report says that emerging technologies—such as computational toxicology, which allows chemicals’ likely health impacts to be rapidly assessed according to their composition—should be further developed to better determine how chemicals interact with the body, rather than relying on slow and costly animal studies as is currently done. The report also says a new research institute with “intramural and highly targeted extramural activities [would] provide the nexus through which the new testing tools would be conceived, developed, validated, and incorporated into coherent testing schemes.”
U.S. EPA, state, academic, and industry researchers have said that the recommendations included in the NAS’ June report and a new related report on toxicogenomics will take years to implement, and will likely serve as the template for toxicological research for the next 20 years.
NAS issued its related “Applications of Toxicogenomic Technologies to Predictive Toxicology and Risk Assessment” Oct. 9, providing more detail on some of the technologies outlined in the June report. The technologies could ultimately lead to changes in risk assessment default assumptions and other key aspects of risk analysis. Toxicology provides the underlying data for hazard and dose-response assessments, currently two of the key steps in the overall risk analysis process.
Kavlock said one of the key challenges under the new toxicology will be to identify the specific pathways, or biological mechanisms, that are associated with health effects, or endpoints. Once the pathways are identified, U.S. EPA can then use high-throughput screening (HTS) techniques to see if chemicals “activate” those pathways. However, Zenick said it will be important to stress that a chemical should not be considered toxic based solely on the results of HTS tests. Zenick explained that, even with the new technologies at its disposal, U.S. EPA will still have to perform targeted animal studies to test the results of the HTS process. For example, if HTS data indicate a chemical is a neurotoxin, follow-up animal studies will be needed to “see if it really is a neurotoxin” or causes some other toxic effect indicated by the HTS results.
The use of HTS techniques to prioritize chemicals for further toxicity research is an area in which U.S. EPA expects to find early success as it implements the recommendations in the June NAS report, Kavlock and Zenick agreed. “Screening is where the first successes are likely to happen,” Zenick said.
HTS techniques allow researchers to determine how chemicals act on a given cell much more quickly than can be done using traditional methodologies, and NCCT has already launched a program to use those techniques to prioritize unregulated chemicals for future research by testing the biochemical effects of thousands of chemicals a day. HTS techniques rely on a variety of new technologies, including robotics, to test the chemicals over a short time.
The ToxCast™ program will primarily provide information on a chemical’s mode of action, which describes a compound’s biochemical impacts in the body. U.S. EPA sources have said in the past that prioritizing research for unregulated chemicals based on their mode of action would be a step forward because, historically, U.S. EPA has had to rely on analogies between the unregulated chemicals and other better-understood substances with similar chemical structures (Risk Policy Report, Aug. 7, p1).
A state toxicology expert warned earlier this year that toxicologists may disagree about how to interpret the results of tests performed on cell cultures rather than traditional lab animal data (Risk Policy Report, July 31, p1).
Other areas addressed by the NAS report will likely be more challenging, and take much longer to accomplish, than using high-throughput systems as a screening tool. “Just working through the pathways is a challenge,” Zenick said, but exploring developmental toxicity issues “gets much more complicated.”
Kavlock said that it is “very difficult” to identify pathways for specific endpoints in developmental toxicity, “because developing embryos are incredibly complex,” and are not fully understood.
In addition, Kavlock said the metabolism of chemicals in the human system will also be a challenge for researchers moving forward. Kavlock explained that chemicals are metabolized by the human body, but that the human cells used in HTS and other in vitro tests do not metabolize chemicals. In some instances, Kavlock said, a metabolite of a chemical may be toxic, even though the chemical itself is non-toxic. This creates several challenges for toxicity researchers. For example, if little is known about a chemical, and tests show that the chemical itself is non-toxic, researchers may be unaware that it can break down into a toxic metabolite.
However, Kavlock noted, U.S. EPA does have models that can extrapolate what metabolites may be created when a human body metabolizes a chemical. But, Kavlock said, each chemical can have 20, 30, or 40 metabolites—and synthesizing the metabolites for thousands of unregulated chemicals would be a huge challenge.