The goal of the In Vitro Toxicology Lecture series is to feature important research using in vitro and alternative techniques to study basic mechanisms and to develop test methods aimed at replacing animal use whenever feasible. Undergraduates, graduate students, postdoctoral scholars, and recipients of Colgate-Palmolive awards are among the guests at the In Vitro Toxicology Lecture and Luncheon at the SOT Annual Meeting. This event is supported by an educational grant from the Colgate-Palmolive Company.
Aaron B. Bowman
Purdue University, West Lafayette, IN
Dr. Bowman examined the benefits of using iPSC technology to personalize human risk assessment for suspected and known neurotoxicants. Induced pluripotent stem cells (iPSCs) can be generated from individual human subjects or representative vulnerable populations, and they can be differentiated into cell types of all three embryonic germ layers, including those of the brain and neurovascular unit. Examples from the recent literature illustrated opportunities and challenges for the field.
NIEHS, Research Triangle Park, NC
Dr. Kleinstreuer discussed the challenges in developing and gaining regulatory acceptance of non-animal testing approaches for skin sensitization. Skin sensitization, or allergic contact dermatitis, is a toxicity endpoint of widespread concern for which the mechanistic understanding and concurrent necessity for non-animal testing approaches have evolved to a critical juncture. Legislation in Europe and other regions prohibits the use of animals for testing cosmetic ingredients, and public pressure has inspired many companies to make their products “cruelty-free.” Further, retrospective analyses have shown that the traditional animal-based tests have relatively poor reproducibility and predictive performance when compared to the human endpoint. The biological processes underlying the skin sensitization adverse outcome pathway (AOP) have been well described, and many alternative options have been developed for predicting key events in the AOP, including human cell-based methods, cell-free assays, and computational approaches. However, none of these tests is yet acceptable as a stand-alone replacement for the murine local lymph node assay (LLNA).
Therefore, various non-animal defined approaches that combine these different data sources have been proposed for skin sensitization testing and assessment. Curated LLNA and human sensitization data can be used to evaluate the performance of eight non-animal testing strategies for both hazard and potency characterization. Defined approaches under examination include consensus methods, artificial neural networks, support vector machine models, Bayesian networks, and decision trees, most of which can be reproduced using open source software tools. Multiple non-animal testing strategies incorporating in vitro, in chemico, and in silico inputs have equivalent or superior performance to the LLNA when compared to both animal and human data for skin sensitization. The practical challenges and potential hurdles to regulatory application of these alternative approaches will be discussed, as well as the process of achieving international acceptance and implementation.
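The simplest class of defined approach is a consensus rule: for example, the widely discussed "2 out of 3" approach classifies a chemical as a sensitizer when at least two of three key-event assays (e.g., DPRA, KeratinoSens, h-CLAT) are positive. The sketch below is a minimal illustration of that rule only; the assay calls are hypothetical booleans, not real data.

```python
# "2 out of 3" consensus rule for skin sensitization hazard:
# classify as a sensitizer when >= 2 of 3 key-event assays are positive.
# Assay results below are hypothetical, for illustration only.

def two_out_of_three(dpra: bool, keratinosens: bool, hclat: bool) -> str:
    """Hazard call from three binary in chemico / in vitro assay results."""
    positives = sum([dpra, keratinosens, hclat])
    return "sensitizer" if positives >= 2 else "non-sensitizer"

# Hypothetical chemicals: two positives vs. one positive
call_a = two_out_of_three(dpra=True, keratinosens=True, hclat=False)
call_b = two_out_of_three(dpra=False, keratinosens=True, hclat=False)
```

The machine-learning defined approaches mentioned above (neural networks, support vector machines, Bayesian networks) replace this fixed rule with a model trained on curated LLNA and human reference data, but the input-to-hazard-call structure is the same.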
Anthony Bahinski, PhD
GlaxoSmithKline, King of Prussia, PA
Dr. Bahinski discussed the potential for human organs-on-chips to provide better predictive power than existing preclinical animal models, which often lead to failure of drug compounds late in their development. Organs-on-chips are microfluidic cell culture devices containing hollow, micrometer-sized chambers inhabited by living cells; they recreate the specialized multicellular architectures, tissue-tissue interfaces, physicochemical microenvironments, and vascular perfusion necessary to recapitulate organ-level physiology in vitro. These biomimetic devices provide a window on human physiology, as they enable real-time, high-resolution microscopic imaging as well as analysis of the biochemical, genetic, and metabolic activities of living cells positioned within the context of functional tissue and organ units. These microsystems could potentially further our understanding of disease etiology and fill the critical need for improved model systems to predict efficacy, safety, bioavailability, and toxicology outcomes for candidate compounds.
Dr. Bahinski presented an introduction and examples of organs-on-a-chip and research directions. Questions regarding applications of the chips in toxicology were discussed in groups at the tables; table hosts used the discussion guide. Participants reported responses via electronic audience polling following the discussion.
Norbert E. Kaminski, PhD
Michigan State University, East Lansing, MI
The development of in vitro alternative approaches to test chemical toxicity and reduce the need for in vivo rodent testing continues to be a key area of focus for toxicologists and the public in general. While traditional toxicology methods have relied heavily on animals, new high-throughput screening approaches to generate toxicological data are becoming increasingly available for the safety assessment of chemicals. The emergence of the nanotechnology revolution has made the demand for alternative testing more urgent than ever to address a rapidly expanding number and variety of engineered nanomaterials.
A critical component of toxicological research is use of an appropriate model that will provide insight into the effects and mechanisms by which xenobiotics alter physiological systems. Model selection depends on many factors, including the target tissue(s), whether a xenobiotic metabolite mediates the effects, and whether effects are direct or indirect. Also critical to model selection is consideration of ways to refine, reduce, and replace animal use when possible. In many cases, in vitro systems can be used exclusively to assess xenobiotic effects and mechanisms, especially since these models can be further developed to examine effects on several cell types simultaneously.
Dr. Kaminski presented an introduction and provided several examples of the use of in vitro multicellular model systems, emphasizing both the strengths and challenges of the models and the information that is obtained. Questions regarding data interpretation and the limitations of such systems were discussed in groups at the tables; table hosts used the discussion guide. Participants reported responses via electronic audience polling following the discussion.
James C. Bonner
North Carolina State University, Raleigh, NC
Dr. Bonner presented an introduction to the use of in vitro approaches for testing the safety of nanomaterials, and participants responded to polling questions and discussed a related case study.
Nanotechnology is anticipated to bring societal benefits in the areas of medicine, engineering, electronics, and energy. However, it is also inevitable that some nanomaterials will present risks of disease for humans exposed occupationally or through consumer products that incorporate nanomaterials. As the number of different types and modifications of nanomaterials in research, development, and commercialization continues to grow exponentially, a reliable and robust scientific approach to screening nanomaterial toxicity will require in vitro cell systems that can predict disease in mice and humans in vivo. A promising new toxicological paradigm for nanomaterials will be discussed, using carbon nanotubes as a case study, in which alternative test strategies reduce reliance on animal testing through the use of in vitro cell-based model systems. The most appropriate types of in vitro systems for predicting specific diseases (e.g., cancer, fibrosis, asthma) will be addressed for hazard assessment of nanomaterials at various stages of synthesis, product development, and overall life cycle.
MatTek Corp. & MatTek In Vitro Life Science Laboratories, Bratislava, Slovakia
Dr. Kandarova presented an introduction to the use of models to predict human toxicity, especially several related to skin and eye toxicity. The audience responded to questions related to the content presented.
Lord Kelvin is reputed to have said, “If you can’t make a model of it, you do not understand it.” This maxim applies perfectly to the search for replacement models (i.e., models where animals are not required) in toxicology. The more we know about the in vivo models used to study toxicity effects, and the more we know about the biological pathways and events that lead to their modulation and perturbation, the more precisely we can create reliable in vitro and in silico replacement systems to predict human toxicity. However, development of reliable and relevant replacement models is, in many cases, hindered by technical difficulties or lack of knowledge and, at later stages, by a lack of scientific and regulatory willingness to accept the novel systems.
Major progress in the development and broad acceptance of replacement models has been achieved in the area of topical toxicity. Since 2004, the Organization for Economic Cooperation and Development (OECD) Test Guidelines Program has adopted three methods for skin irritation and four methods for skin corrosion testing that are based on the use of in vitro reconstructed human skin models. Reconstructed cornea models are being validated for general prediction of eye irritation, and they are now accepted by the US Environmental Protection Agency (EPA) for antimicrobial pesticide toxicity testing. This success was achieved because these in vitro systems are able to mimic with great fidelity many responses of native human tissues to toxic stimuli. However, one key problem in establishing reliable and relevant replacement models and methods is linked to the in vivo animal models currently used in regulatory toxicology. There are questions about their prediction accuracy for human responses, despite acceptance of the animal models as the “gold standard” for human skin and eye toxicity; animal models correctly predict only 40–70% of human responses, depending on the toxicity endpoint. Therefore, in vitro assays calibrated against over-predictive or under-predictive in vivo animal assays may be challenged on their prediction accuracy. Ongoing scientific dialogue between the developers and users of these systems, together with involvement of regulators at early stages of the validation process, makes scientific as well as regulatory acceptance significantly easier.
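Prediction-accuracy figures of the kind quoted here are typically computed as Cooper statistics (sensitivity, specificity, and overall accuracy or concordance) from a 2x2 contingency table of test calls against a reference classification. The sketch below shows that arithmetic with illustrative counts only, not data from any actual validation study.

```python
# Cooper statistics from a 2x2 contingency table of test calls vs. a
# reference classification. The counts below are illustrative only.

def cooper_statistics(tp: int, fp: int, fn: int, tn: int) -> dict:
    """Sensitivity, specificity, and accuracy (concordance) from a 2x2 table."""
    total = tp + fp + fn + tn
    return {
        "sensitivity": tp / (tp + fn),   # true positive rate
        "specificity": tn / (tn + fp),   # true negative rate
        "accuracy": (tp + tn) / total,   # overall concordance
    }

# Hypothetical validation set of 100 chemicals
stats = cooper_statistics(tp=40, fp=15, fn=10, tn=35)
```

An assay can have high accuracy while being over-predictive (many false positives) or under-predictive (many false negatives), which is why validation exercises report all three statistics rather than concordance alone.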
A case study followed the presentation. Participants at each table took on the role of a member of an advocacy group, a government regulator, or a basic research scientist, reviewed data for a replacement test, and made a case for the validity of the new model.
Amgen, Inc., Thousand Oaks, California
Dr. Hamadeh presented an introduction to the use of models in hazard identification and risk assessment, challenged participants to discuss specific questions at their tables, and then led a discussion of the participant responses.
Gaining insight into a molecule’s potential to cause harm to humans is a major challenge for scientists working in a variety of industries and disciplines, including drug and chemical development as well as environmental protection.
The tools employed range from in silico approaches to in vitro and in vivo animal models that approximate human reactions to drugs and chemicals. In vitro models have the obvious advantage of not requiring large quantities of a given molecule for testing, and they contribute to a decrease in animal testing (the 3Rs). One example is the use of the bile salt export pump (BSEP) functional assay to inform the potential hazard of molecules causing hepatotoxicity in humans through interference with normal bile acid transport. In drug development, clinical liver injury might translate into potentially less competitive drugs. Molecules hypothesized to be associated with clinical liver injury via interference with BSEP function have often not produced liver injury signals in preclinical species. In vitro assays that can predict the potential clinical hazard in the absence of animal models are valuable not only for selecting higher-quality candidate drugs to advance, but also for reducing the number of animals consumed by advancing molecules with a high potential for liver effects that may eventually be removed from development, depending on the indication. Several confounders that underlie the interpretation of BSEP activity data will be discussed, and potential solutions that may improve the translation of the hazard signal to a risk assessment using this assay will be debated.
Luncheon participants discussed the following questions:
- Should we use in vitro methods like the BSEP functional assay to screen for hypothetical risks and prioritize drugs before going into animal testing? This will result in increased cost and delay in getting new drugs to market to address risks that aren’t even proven.
- If we proceed with this screening approach, what level of model complexity is appropriate: a simple, single-cell-type, enzyme-focused system vs. a complex, multi-cell system that attempts to replicate actual liver architecture? Your position should take into account the difficulty and cost of the models and, more importantly, the ability to interpret the results.
- Do you agree that NIH should reallocate funding to support studies of dynamic in vitro models for toxicity testing rather than static cell culture models? This would have the net effect of reduced funding for research involving traditional static cell cultures.
Timothy J. Shafer
US EPA, Research Triangle Park, NC
Dr. Shafer presented an introduction to the topic, after which participants discussed related questions and reported their responses.
Since publication of the National Academy of Sciences (NAS) report on Toxicity Testing in the 21st Century, there has been an increased emphasis on the development of in silico and in vitro approaches to toxicity testing. The NAS vision is to replace the current animal-based tests, which are low-throughput and often do not predict human responses well, with higher-throughput, toxicity pathway-based approaches that will allow testing of greater numbers of chemicals and be more predictive of toxicity to humans.
In some cases, there has been considerable progress with these approaches, such that in silico or in vitro data can be and are being used to make decisions regarding drug or chemical safety. Most cosmetics today tout the fact that they “were not tested in animals,” and European cosmetics legislation will no longer allow a compound that has been tested in animals to be used in cosmetics. In silico, in vitro, and ex vivo approaches are now widely accepted and utilized to predict ocular toxicity, such that the Draize Eye Test is now only rarely used.
Other attempts to develop in vitro approaches have been less successful. For example, the Ames Test successfully predicts only about 70% of rodent carcinogens and by itself has not replaced the 2-year cancer bioassay. Replacing in vivo tests with in silico or in vitro data is not a simple task, and doing so requires acceptance from regulators, the regulated community, and ultimately the public. Thus, a variety of factors may contribute to whether or not, and when, in silico/in vitro testing can replace in vivo testing. These may include considerations about the nature of the regulatory decision to be made, the ability of the test to predict in vivo and human responses, and public perceptions, among other factors. This talk will briefly summarize the current rationale and approach to in vitro testing and provide some examples where in vitro tests have, and have not, successfully replaced in vivo approaches.
This background will be used to stimulate a discussion on what is needed for in vitro data to be used in regulatory decision making, and whether or not in silico and in vitro approaches ever could (or should) entirely replace in vivo approaches.
Students and discussion leaders were asked to consider the following questions:
- Is it a realistic goal to replace all animal testing?
- What criteria must be fulfilled for an in vitro approach to replace an in vivo approach?
- Does the context of the decision to be made, or the level of information required, matter?
- What are the challenges to human risk assessment?
- Can an in vitro approach be useful if the toxicity pathway is not completely understood?
- If we replace, what are the scientific questions (uncertainties) about which we need to be concerned?
- Would the public accept and be comfortable with decisions made using in vitro data? What if the decision was made entirely on the basis of in vitro data?
Robert E. Chapin
Pfizer, Groton, Connecticut
The goal of the In Vitro Toxicology Lecture series is to feature important research using in vitro and alternative techniques to study basic mechanisms and to illustrate how these test methods benefit animal welfare by refining, reducing, and replacing animal use whenever it is feasible. Graduate students, undergraduates, postdoctoral scholars, and recipients of Colgate-Palmolive awards are among the guests at the In Vitro Toxicology Lecture and Luncheon.
The “Toxicity Testing in the 21st Century” vision promulgates an in vitro approach to safety assessment based heavily on knowing the pathways responding in a cell and then correctly relating that response to an in vivo exposure to predict the likely health outcome. But we are now much like Galileo with the Moon: seeing the goal is many, many times easier than actually getting there. However, given that animal models correctly predict only 40–70% of human responses, in vitro models won’t actually have to do that well to be better than the current in vivo models. Thus, for both animal-use issues and predictivity issues, an in vitro future is a worthy and achievable goal. Meanwhile, there is much trial and error to pursue. This talk reprised an in vitro testing vision and then put it into an industry perspective. It is clear that we’re a long way from where we want to be. After this stage-setting, the audience was asked to discuss and then present their answers to a set of related questions.
- What are the limitations (and “costs”) of the current approach using animals?
- What are the limitations of the proposed safety assessments using cell cultures and predictive models?
- What are some possible explanations for the less-than-hoped-for predictivity? Which is most likely, and why?
- How long will it take to implement this new paradigm? Why will it take longer than that?
- What role might stem cells play in this future? What are the assumptions (and thus, possible pitfalls) in their playing that role?
- List the benefits and drawbacks of having multiple cell types in the culture vs. having one cell type.
- What would be the motivations of industry to embrace this new toxicity testing vision? What conditions need to be met for that marriage to happen?
- Why might one solution to the predictivity problem be to use multiple predictive models? How would those models have to differ from each other to make that work?
Brown University, Providence, Rhode Island
The purpose of this event for postdoctoral scholars, graduate students, undergraduate students, and other invited guests is to focus on the importance of animal research to biomedical sciences and toxicology and the ethical obligations of the scientific community to follow the “3Rs” of animal testing (refine, reduce, replace) whenever it is feasible.
In the future, toxicity testing will utilize emerging technologies from the ongoing revolution in understanding biological processes to identify the effects of chemicals on toxicity pathways, using in vitro approaches. The interpretation of chemically-induced alterations in toxicity pathways will depend upon sophisticated modeling that extrapolates from the measured dose-response in cell-based systems to human exposure.
After providing an overview of the National Academy of Sciences report entitled “Toxicity Testing in the 21st Century: The Vision and Some Questions,” this presentation turned to a discussion of issues raised by this new approach. The audience was asked to think about and respond to the following questions:
- Is focus on environmental agents important for the design criteria?
- How long will it take to implement this new toxicity testing paradigm?
- Is the focus on “toxicity pathways” useful or distracting?
- Does a test for neurodevelopmental effects have to look at neurons?
- How do we distinguish adaptive versus adverse (toxic) responses?
- Is this a screening tool or a stand-alone system?
- How is the new paradigm validated?
- What about epigenetics and other new biology?
- How do regulators handle the transition in testing?
- How does the apical definition of adversity work in the new toxicity testing paradigm?
Courtney E. W. Sulentic
Wright State University, Dayton, Ohio
The purpose of this lecture is to discuss the importance of animal research to biomedical sciences and toxicology and the ethical obligations of the scientific community to follow the “3Rs” of animal testing (refine, reduce, replace) whenever it is feasible. Following this discussion, Dr. Sulentic described her current research utilizing an in vitro alternative to understand mechanisms of altered immune function. The immune system is critical to human survival but also plays a contributing role in various mechanisms of toxicity. Assessing alterations of immune function by potential immunotoxicants is complicated by the diffuse nature of the immune system, which is composed of various effector cells, each with differing functions. Current immunotoxicity testing relies heavily on animal studies, underscoring the need to develop and implement alternative approaches. Dr. Sulentic discussed a cell line model developed to provide an in vitro alternative to animal studies for identifying immunotoxicants that specifically target B-cell function (i.e., alteration of immunoglobulin expression and antibody secretion), as well as for elucidating the mechanisms of altered B-cell function.
Pfizer, Inc., San Diego, California
It is estimated that animal studies predict human efficacy, and more importantly human toxicity, only about 50% of the time. In addition, the use of animals should be minimized as much as possible for ethical reasons. Today, there are vigorous, ongoing national and international research and policy efforts to develop alternatives to animal testing. These efforts focus on both in vitro and in silico approaches and methods. For example, the National Toxicology Program (NTP) at the National Institute of Environmental Health Sciences (NIEHS) created the NTP Interagency Center for the Evaluation of Alternative Toxicological Methods (NICEATM) in 1998.
Mitochondrial dysfunction is a common mechanism of drug-induced toxicity for a variety of therapeutics, such as certain antiviral drugs, lipid-lowering drugs, NSAIDs, and certain cancer chemotherapeutics. Therefore, the early identification of drug candidates that potentially disrupt mitochondrial function is of significant importance in drug discovery. In the past few years we have developed organelle- and cell-based in vitro screens to detect potential mitochondrial toxicities. These include oxygen sensors to measure mitochondrial respiration in isolated mitochondria and cells, immunocapture of individual electron transport chain proteins to identify inhibitors of mitochondrial electron transport, and metabolic profiling using oxygen and pH measurements. The presentation discussed the strengths and limitations of these new high-throughput screens and provided recommendations on where to position the assays within the drug development process.
Cynthia A. Afshari
Amgen Inc., Thousand Oaks, California
The goal of the In Vitro Lecture series is to feature important research using in vitro and alternative techniques to study basic mechanisms, and to illustrate how these test methods benefit animal welfare by refining and reducing animal use.
The advent of microarray technology in nonclinical studies has had an impact in several areas. For example, whole-genome analysis of the transcriptional effects of drug exposures has allowed elucidation of mechanism of action, identification of potential off-target effects, and discovery of promising new biomarkers of pharmacological effect or possible adverse indications. For this latter goal, the integration of genomic, proteomic, and metabolomic-based technologies has led to a blossoming of the field known as “toxicogenomics.” One of the early promises of integrating these molecular approaches was that new genomics-based models would provide accurate prediction of toxicity. This talk focused on examples where analysis of in vivo gene expression data derived from early nonclinical screening studies led to the development of a new in vitro screen for aiding selection of molecules during lead optimization. The characterization of this model was presented; it serves as an additional filter to discriminate compounds selected for in vivo studies and should reduce the number of animals used in a screening paradigm. The potential challenges that still need to be overcome to allow progressive development of these models were also addressed.
Russell S. Thomas
The Hamner Institutes for Health Sciences, Research Triangle Park, North Carolina
The lecture reviewed an important application of in vitro toxicology to the study of basic mechanistic processes and provided examples of how new test methods have benefited animal welfare by refining experimental procedures and reducing animal use. Recent developments in genomics technology now allow comprehensive screening of the impact of chemicals and pharmaceuticals on complex cell signaling networks without the use of whole-animal systems. The high-throughput requirement of these approaches necessitates the use of in vitro cell culture systems, and the resulting screens provide enormous amounts of data in the context of mechanistic and predictive toxicology. The tools for this type of research include a combination of receptor-based reporter gene assays; gene expression analysis using genome-wide microarrays; and large-scale loss-of-function and gain-of-function studies using inhibitory RNA libraries and libraries of full-length genes, respectively. From results obtained with these tools, a cell signaling pathway can be constructed and a more comprehensive, mechanistic understanding of the impact of chemicals on biological systems can be developed. Elucidation of signaling pathways at the cellular level is not possible in intact animals, and identification of mode of action at the molecular level is often important in explaining disease states or toxicities identified in vivo.
William S. Stokes
National Institute of Environmental Health Sciences
The lecture discussed the application of in vitro toxicology to regulatory safety assessment and provided examples of how recently adopted in vitro test methods have benefited animal welfare by refining and reducing animal use while providing for the protection of human health. The process by which new technological methods evolve from development to regulatory acceptance was discussed, including the validation process necessary to determine their usefulness and limitations for a defined specific purpose. Dr. Stokes also discussed expected opportunities for an expanded role for in vitro toxicology in an integrated approach to safety assessment.
US Food and Drug Administration
The lecture addressed in vitro methods for skin corrosivity, skin sensitization, skin phototoxicity, and skin absorption that are widely used in the safety assessment of topical products. These alternative methods can reduce, and sometimes replace, the need for animals. Methods for skin corrosivity and skin sensitization have been validated by both the Interagency Coordinating Committee on the Validation of Alternative Methods (ICCVAM) and the European Centre for the Validation of Alternative Methods (ECVAM). ECVAM is ready to begin a validation study to assess the adequacy of three methods for measuring dermal irritation. Further efforts are being made to clarify controversial areas in skin absorption methodology with regard to the skin reservoir, skin metabolism, and other issues.
Rodger D. Curren
Institute for In Vitro Sciences
The lecture examined how in vitro methods have struggled to gain respectability within the toxicological community, and how companies now routinely use in vitro data as they make major product development and safety assessment decisions. Dr. Curren reviewed how several new in vitro test procedures have gained international regulatory approval, which he believes will make in vitro methods an “alternative” no longer.
Johns Hopkins University Center for Alternatives to Animal Testing
The lecture examined the ethical background to the concept of the 3Rs of alternatives and the concept that humane science is the best science, providing some specific examples. Included was a discussion of what each of us can do to practice humane science.
Syngenta Central Toxicology Laboratory
Dr. Kimber helped develop the local lymph node assay (LLNA), one of the two alternative tests currently approved by the Interagency Coordinating Committee on the Validation of Alternative Methods (ICCVAM). His presentation explored how an increasingly sophisticated understanding of the cellular and molecular mechanisms through which chemicals induce skin sensitization and allergic contact dermatitis has translated into new approaches for hazard identification and risk assessment. The evolution of the LLNA and its evaluation, validation, and application was described. The method was developed originally as an alternative approach to hazard identification. More recently, however, it has been found that, in modified form, the method can accurately determine the relative potency of skin-sensitizing chemicals, an important first step in the risk assessment process. Finally, recent approaches to the development of in vitro methods for the identification of skin sensitization hazard were described.
In Vitro Technologies, Inc
The presentation reviewed the history of in vitro techniques in toxicology and the rapid changes that have occurred over the last few years, leading us to the future use of these technologies.