The Continuing Education (CE) Program offers a wide range of courses that cover established knowledge and new developments in toxicology and related disciplines.
Taking place from 7:00 am to 7:45 am, this special Continuing Education minicourse includes breakfast and features two speakers.
This course will provide attendees with a new perspective on critical issues toxicologists face studying microplastics and their potential effects on human health.
Microplastics, once relatively unknown, have become the focus of local, national, and global interest. Microplastic particles are one subset of plastic debris, primarily characterized as ranging in size from less than five millimeters down to one micrometer; plastic particles smaller than this are typically termed nanoplastic particles. Together, these particles also may be called NMPs (nano- and microplastics) for short. Microplastic particles can result either from the discharge of plastic materials originally manufactured at that size (primary microplastics) or from the degradation of larger plastic debris (secondary microplastics). However, before researchers can begin to tackle questions about microplastic risk, they must understand how plastic is manufactured.
Plastic begins as polymers, and through the application of energy (e.g., heat) and incorporation of the desired additives, a plastic material is created. Additives are chemicals intentionally added to plastics to provide, improve, modify, or retain properties, such as fire resistance, flexibility, durability, or stability, during the plastic lifecycle. Additives often are included because without them, plastic materials would have limited applications, be brittle, potentially degrade, and have a very limited shelf life. It is this combination of particle characteristics (e.g., size, shape, polymer type) and the presence of chemical additives that presents toxicologists with a sizable issue.
Another challenge in understanding the potential risks of microplastics is the number of potential chemistries used as additives. A vast amount of information is available through existing regulatory programs; programs like the US Food and Drug Administration’s food contact notification and the Threshold of Toxicological Concern model, coupled with the European Chemicals Agency REACH registration, are sources of valuable exposure and toxicological information. Where no exposure or toxicological data exist, scientists can turn to frameworks to predict potential exposures and risks. To reduce the complexity of the issue, scientists might look at human exposure to screen out chemical additives that pose low risk due to low exposure potential.
In this course, the first presenter will focus on modeling probabilistic estimates of both direct exposure (e.g., from food packaging) and exposure estimated from modifications to an existing numerical bioaccumulation food web model. The second presenter will discuss how risk can be estimated with a newly developed framework when traditional exposure and toxicity data have not been developed but the molecular structure and chemical tonnage of a chemical are known. These presentations will provide attendees with a new perspective on critical issues toxicologists face studying microplastics and their potential effects on human health.
Microplastics and Chemical Additives: Migration Considerations for Human Exposure. Todd Gouin, TG Environmental Research, Sharnbrook, United Kingdom.
Modeling Chemical Risk without Traditional Exposure or Toxicological Data. Li Li, University of Nevada School of Public Health, Reno, NV.
These courses take place on Sunday, March 10. They are the only Scientific Sessions presented on Sunday and are available for an added fee. There are six courses in the morning, 8:15 am to 12:00 Noon, and six courses in the afternoon, 1:15 pm to 5:00 pm.
The course will be of interest to those engaged in wider aspects of metal toxicology, mechanisms of chemical toxicity, neurotoxicology, carcinogenesis, risk assessment, regulatory and safety evaluation, and occupational and public health.
Of the 118 elements in the Periodic Table, 95 are classified as metals, and 34 of these have been identified as hazardous to human health. Unlike organic chemicals, metals may change chemical form, but their basic unit is neither created nor destroyed. Thus, the persistent distribution of metals in the ecosystem, with no environmental half-life, renders the body susceptible to metal toxicity at every stage of human life. Exposure to metals occurs in daily life through one’s lifestyle, food intake, occupation, or medical treatment. Considering a worldwide growing geriatric population, the possibility of metal exposure accelerating the aging process has drawn public attention. For the past half century, assessment of total body metal burden and metal toxicity has depended largely on conventional techniques such as atomic absorption spectrophotometry and bioassays; more recently, however, real-time analyses and molecular approaches for gene-environment interactions have been implemented. This advanced course invites experts to address interconnected subjects in metal toxicology.
After a general introduction to metals, the first lecture introduces recently developed innovative technologies to determine levels of exposure, such as X-ray fluorescence and neutron activation analysis for real-time, noninvasive, nondestructive quantitation of metal levels in bone, nail, hair, and other tissues. The second lecture discusses recent advancements in ’omics-based technologies, such as whole-genome and CRISPR-based screens, in understanding the genetic susceptibility that contributes to metal toxicities. The third lecture treats the body as a whole system to address metal toxicity in an aging population, from the impact of aging itself on metal toxicity to the impact of metals on aging. The last lecture further discusses the role of the US Food and Drug Administration in the regulation and safety assessment of metals in food additives. Throughout the course, concerns related to human exposure to these metals and potential risk will be raised and discussed, with particular emphasis on metals of concern such as lead, arsenic, manganese, and mercury.
The course is well suited to those who desire advanced knowledge of metal toxicity across the lifespan, metal-gene interactions, risk assessment and regulation, and advanced technical approaches for metal quantification. The course will be of interest to those engaged in wider aspects of metal toxicology, mechanisms of chemical toxicity, neurotoxicology, carcinogenesis, risk assessment, regulatory and safety evaluation, and occupational and public health.
Introduction: Chemical Properties of Metals Determine Unique Metal Toxicology. Wei Zheng, Purdue University, West Lafayette, IN.
Advances in Metal Detection and Quantification in Human Subjects for Risk Assessment. Aaron Specht, Purdue University, West Lafayette, IN.
Using ’Omics to Advance Our Understanding of Metal-Induced Toxicity. Koren Mann, McGill University, Montreal, QC, Canada.
Intersection of Toxicology and Aging: Current Understanding of Metal Exposure and Aging. Johnny Wise, University of Louisville, Louisville, KY.
Safety Assessment and Regulation of Metals in Food Packaging. Laura Markley, US FDA/CFSAN, College Park, MD.
This course will focus on the basic biology of implantation, early embryonic development, and organogenesis, while discussing comparative cross-species timelines and critical periods during development.
During the pre-implantation period, there are major developmental events that occur, and knowledge of these events is critical for understanding and interpreting potential effects on fertility and early embryonic development. Embryo-fetal development is a complex process that initiates following implantation. The development of the major organ systems during gestation varies in timing across species and drives the design of nonclinical studies conducted to assess the potential human risks of drugs or chemicals. This course will focus on the basic biology of implantation, early embryonic development, and organogenesis, while discussing comparative cross-species timelines and critical periods during development.
In addition to reviewing the key developmental processes and events that occur during the pre-implantation and post-implantation periods, the first two speakers will provide examples of phenotypes that arise following exposures to chemicals or drugs during sensitive developmental periods. The information presented will provide attendees with a basis for interpreting the potential mode of action, as well as interspecies comparison to aid in human risk assessment.
Following this critical background information, which forms the biological basis for prenatal developmental toxicity testing, the succeeding presentations will discuss the design of nonclinical developmental toxicity studies for agrochemicals and pharmaceuticals. These speakers also will present key information on how study outcomes are interpreted and impact human risk assessment. Specific case studies and mode-of-action studies for determining human relevance will provide attendees with real-world examples of how to apply the information presented in this course. Novel approaches for developmental toxicity testing, including the use of comprehensive toxicogenomics data and the use of alternative assays as outlined in ICH S5(R3), also will be discussed, with the aim of providing attendees with a perspective on the evolving future of developmental toxicity testing.
Fertilization to Implantation: Self-Organization of the Mammalian Conceptus. Tristan Frum, University of Michigan Medical School, Ann Arbor, MI.
Comparative Embryo-Fetal Development. Sarah Campion, Pfizer Inc., Groton, CT.
Developmental Toxicity Assessment of Agrochemicals and Using Mode-of-Action Studies to Inform Human Relevance. Kamin Johnson, Corteva Agriscience, Indianapolis, IN.
Developmental Toxicity Testing in Pharma and How Understanding of Embryology Impacts Study Design and Risk Assessment. G. Cappon, ToxStrategies Inc., East Lyme, CT.
This course aims to (1) educate researchers to use high-throughput IVIVE to estimate toxicological points of departure (PODs) in their work and (2) allow decision-makers considering the use of new approach methodologies–based PODs to be better informed about the capabilities and limitations of high-throughput IVIVE.
Next-generation chemical risk assessment (NGRA) aims to replace and expand traditional toxicity testing via new approach methodologies (NAMs), including in vitro screening. Translating in vitro points of departure (PODs) to in vivo contexts requires in vitro–in vivo extrapolation (IVIVE) based on toxicokinetics. IVIVE methods for single chemicals were developed and vetted by the pharmaceutical industry. However, NGRA is intended to accelerate the pace of chemical risk assessment, potentially generating in vitro PODs for thousands of chemicals and endpoints. These data require chemical-specific IVIVE to be interpreted as in vivo PODs. To allow higher-throughput IVIVE, higher-throughput toxicokinetic (HTTK) methods are needed. Chemical prioritization efforts for public health risk based on HTTK are under consideration at the US Environmental Protection Agency, Health Canada, and the European Food Safety Authority.
This course, which focuses on high-throughput approaches for translating NAMs into PODs, complements but is distinct from a fellow 2024 SOT Continuing Education (CE) course, “Putting Theory into Practice: Using Computational New Approach Methodologies in Next-Generation Risk Assessment,” which focuses on integrating all NAMs needed for next-generation chemical risk assessment. Further, with its focus on high-throughput IVIVE, this course is distinct from previous years’ CE course offerings focused on other aspects of physiologically based kinetic modeling and toxicokinetics.
HTTK is playing an increasing role in creating more predictive toxicological methods by allowing (1) conversion of external dose metrics (such as mg/kg/day) to internal/tissue dose metrics and (2) relation of in vitro PODs to human-relevant doses. With confirmed speakers who include subject matter experts with many years of experience in the development and application of these tools, this course aims to (1) educate researchers to use high-throughput IVIVE to estimate toxicological PODs in their work and (2) allow decision-makers considering the use of NAM-based PODs to be better informed about the capabilities and limitations of high-throughput IVIVE. Attendees will become familiar with the types of data, models, and tools needed to create “bioactivity:exposure ratio” risk-based prioritizations, as well as other rapid IVIVE techniques. These tools will include SimCyp, httk, and WebICE. While single-chemical IVIVE has built a strong foundation, this course will focus on IVIVE to inform models of toxicity that can be applied to large numbers of chemicals. Each of the four main presentations will provide examples that can be easily adapted to the attendees’ research questions and risk assessments.
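To make the reverse dosimetry underlying a bioactivity:exposure ratio concrete, the sketch below converts a hypothetical in vitro POD into an administered equivalent dose using a simple steady-state toxicokinetic model. This is written in the spirit of tools such as httk but is not taken from any of them; all parameter values and the chemical itself are illustrative assumptions.

```python
# Hedged sketch of high-throughput reverse dosimetry (IVIVE) and a
# bioactivity:exposure ratio (BER), assuming a simple well-stirred-liver
# steady-state model. All parameter values are illustrative, not measured.

GFR = 6.7          # glomerular filtration rate, L/h (approximate adult human)
Q_LIVER = 90.0     # liver blood flow, L/h (approximate)
BW = 70.0          # body weight, kg

def css_per_unit_dose(fub, cl_int):
    """Plasma steady-state concentration (mg/L) for a 1 mg/kg/day oral dose.

    fub    : fraction of chemical unbound in plasma (in vitro measurement)
    cl_int : intrinsic hepatic clearance scaled to whole liver, L/h
    """
    # Well-stirred hepatic clearance, restricted to the unbound fraction
    cl_hep = (Q_LIVER * fub * cl_int) / (Q_LIVER + fub * cl_int)
    cl_renal = GFR * fub                     # passive renal clearance
    dose_rate = 1.0 * BW / 24.0              # mg/h for a 1 mg/kg/day dose
    return dose_rate / (cl_hep + cl_renal)   # mg/L at steady state

def bioactivity_exposure_ratio(pod_invitro_uM, mw, fub, cl_int, exposure_mgkgday):
    """BER = administered equivalent dose of the in vitro POD / exposure."""
    pod_mg_per_L = pod_invitro_uM * mw / 1000.0          # uM -> mg/L
    aed = pod_mg_per_L / css_per_unit_dose(fub, cl_int)  # mg/kg/day
    return aed / exposure_mgkgday

# Hypothetical chemical: a BER much greater than 1 suggests low priority
ber = bioactivity_exposure_ratio(
    pod_invitro_uM=10.0, mw=250.0, fub=0.1,
    cl_int=20.0, exposure_mgkgday=1e-4)
```

Real HTTK workflows add population variability, measurement uncertainty, and route-specific absorption, but the core arithmetic of scaling an in vitro concentration to an external dose is as simple as shown here.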
Introduction: Fifteen Years of High-Throughput Toxicokinetics. Barbara A. Wetmore, US EPA, Research Triangle Park, NC.
High-Throughput In Vitro Data and Tools for Toxicokinetics. Hiba Khalidi, Certara Inc., Sheffield, United Kingdom.
R Package httk for High-Throughput IVIVE. Caroline Ring, US EPA, Research Triangle Park, NC.
In Vitro–In Vivo Extrapolation for New Approach Methodologies. Xiaoqing Chang, Inotiv, Research Triangle Park, NC.
Moving toward Next-Generation Risk Assessment with High-Throughput IVIVE. Katie Paul Friedman, US EPA, Research Triangle Park, NC.
In this course, attendees will receive an overview of ongoing efforts and an up-to-date strategy to implement reduction and replacement of animals used for acute toxicity in health hazard and risk assessment of chemicals and end-user product formulations for chemical markets, including drugs, pesticides, and consumer products.
The acute toxicity of chemicals, mixtures, and formulations has traditionally been assessed with an animal-intensive in vivo battery of tests that in the context of pesticides is collectively referred to as the “six pack.” These tests include methods to address acute oral, dermal, and inhalation toxicity; primary eye irritation; and dermal irritation and sensitization. In light of recent retrospective reviews of in vivo test method performance, the ability of the individual components of the test battery to reliably predict human-relevant responses to chemicals has been in question. In recent years, international collaborations have led to partial and full replacements for several of these endpoints.
In this course, attendees will receive an overview of ongoing efforts and an up-to-date strategy to implement reduction and replacement of animals used for acute toxicity in health hazard and risk assessment of chemicals and end-user product formulations for chemical markets, including drugs, pesticides, and consumer products. The first portion of the course will provide strategies for predicting, waiving, and measuring oral, dermal, and inhalation lethal doses in order to minimize animal use. LD50 tests have been used for nearly a century and still guide regulatory decision-making, with acute oral lethal dose studies the most prevalent, despite poor reproducibility for Globally Harmonized System of Classification and Labelling of Chemicals categorization. The first talk will cover computational approaches to predict acute oral toxicity using the OPERA suite, a free and open-source/open-data suite of QSAR models providing predictions on physicochemical properties, environmental fate, and toxicity endpoints. The second speaker will outline strategies for waiving in vivo acute dermal toxicity tests for US Environmental Protection Agency requirements, including the implementation of bridging principles to leverage data already gathered and reduce additional testing. The third talk will cover human-relevant approaches to overcome the anatomical and physiological respiratory differences that make rodents poor predictors of human inhalation toxicity.
The second portion of the course will cover advanced in silico, in vitro, and ex vivo methods for non-lethal acute endpoint studies. The fourth speaker will cover in vitro methods showing improved reliability and human relevance relative to the Draize test, as well as proposed avenues for the development of a defined approach for dermal irritation. The fifth speaker will cover the methods in the defined approaches for serious eye damage and irritation, offering a full replacement for the Draize eye test in rabbits.
Finally, the last talk will round out the six-pack assessment by covering methods within and updates to the defined approach to skin sensitization, including presenting information on the first fully nonanimal defined approach as an effective replacement for the mouse Local Lymph Node Assay and guinea pig maximization tests for dermal sensitization.
In sum, this course will prepare agency and commercial risk and hazard assessment attendees to replace and reduce animal use in acute toxicity testing batteries wherever possible by offering up-to-date guidance on modern nonanimal methods.
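For readers unfamiliar with the GHS categorization mentioned above, the acute oral toxicity cutoffs can be sketched as a simple lookup. The cutoff values follow the UN GHS scheme; the LD50 values in the closing example are hypothetical and chosen only to show how borderline estimates flip categories.

```python
# GHS acute oral toxicity categories as a lookup on the rat oral LD50
# (mg/kg body weight). Cutoffs follow the UN GHS scheme for acute oral
# toxicity; Category 5 is an optional category not adopted in all regions.

def ghs_acute_oral_category(ld50_mg_per_kg: float) -> str:
    if ld50_mg_per_kg <= 5:
        return "Category 1"
    elif ld50_mg_per_kg <= 50:
        return "Category 2"
    elif ld50_mg_per_kg <= 300:
        return "Category 3"
    elif ld50_mg_per_kg <= 2000:
        return "Category 4"
    elif ld50_mg_per_kg <= 5000:
        return "Category 5"
    return "Not classified"

# Hypothetical replicate LD50 estimates straddling the 300 mg/kg cutoff
# land in different categories, illustrating why the poor reproducibility
# of the in vivo test matters for classification.
low_estimate = ghs_acute_oral_category(280.0)   # Category 3
high_estimate = ghs_acute_oral_category(320.0)  # Category 4
```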
Acute Oral Toxicity Predictions for Environmental Safety Assessment Using CATMoS Models. Kamel Mansouri, NIEHS/NICEATM, Research Triangle Park, NC.
Nonanimal Approaches for Dermal Toxicity. Monique Perron, US EPA, Washington, DC.
Nonanimal Methods for Acute Inhalation Toxicity. Clive Roper, Toxicology Consulting Limited, Edinburgh, United Kingdom.
New Approach Methodologies for Primary Dermal Irritation: Implementing Human-Relevant Testing Approaches. Hans Raabe, Institute for In Vitro Sciences, Gaithersburg, MD.
Nonanimal Methods for Eye Irritation. Nathalie Alépée, L’Oréal Research and Innovation, Paris, France.
Defined Approach for Skin Sensitization. Patience Browne, Organisation for Economic Co-operation and Development, Paris, France.
This course will examine the promise and the pitfalls associated with the rapidly evolving field of cell and gene therapies for the treatment of liquid and solid tumors in regenerative medicine and for the treatment of rare monogenic and acquired diseases.
Cell and gene therapies have emerged as an exciting breakthrough treatment for liquid and solid tumors in regenerative medicine and for the treatment of rare monogenic disorders as well as a wide range of acquired diseases with limited therapeutic options. While these therapies hold tremendous promise in treating complex diseases, they can be associated with significant immune safety concerns that should be carefully considered during preclinical and clinical development. Such risks may include off-target toxicities, integration-associated genomic toxicities, mutagenic transformation, organ/tissue damage, immunogenicity, and exaggerated activation of the immune system. These therapies also are accompanied by a unique set of challenges where standard safety assessments may not apply and additional testing is warranted, often involving novel de-risking approaches and post-marketing surveillance depending on the therapy in question. Recent advances in the field, along with the sustained momentum in developing safer and more effective next-generation cell and gene therapies, have encouraged a closer examination of the promise and the pitfalls associated with this rapidly evolving class of therapies.
In the first talk of this course, attendees will be guided through the history of these advanced therapies, from the first human gene and cell therapies to the present-day approved therapies, along with a description of current nonclinical and clinical strategies designed to overcome hurdles associated with immune-related events. The course will then take a deeper dive into immune system considerations for cell and gene therapies, with the second talk focusing on the immune barriers to in vivo gene therapies, such as immunogenicity to viral and nonviral (e.g., liposome, nanoparticle) therapies. This talk also will provide pointers to toxicologists for assessing immunotoxicity issues in gene therapy. The third talk will focus on operational aspects of immune system monitoring during the conduct of a nonclinical study from the contract research organization perspective. The fourth speaker will focus on nonclinical safety assessment for engineered CAR-T therapies, with a focus on immunosafety risks posed by engineered Teff and Treg cells, paving the way for the fifth speaker to discuss the next generation of cellular and gene therapies, including an overview of gene editing to avoid graft-versus-host disease.
Historical Overview of the Cellular and Gene Therapy Landscape: The Promise to Revolutionize Treatment of Difficult-to-Treat Diseases and Cancers. Ashwini Phadnis Moghe, Takeda Pharmaceutical Company Limited, Cambridge, MA.
Overcoming the Immune Barriers for In Vivo Gene Therapies. Basel Assaf, Sanofi, Waltham, MA.
Considerations for Immune System Monitoring during the Conduct of Nonclinical Studies for Novel Gene-Modifying Therapies. Brian McIntosh, LabCorp, Madison, WI.
Nonclinical Safety Assessment of Engineered CAR-T Therapies. Herve Lebrec, Sonoma Biotherapeutics, South San Francisco, CA.
Infinite Possibilities!?! The Next Generation of Cellular and Gene Therapies. Kathryn Fraser, Takeda Pharmaceutical Company Limited, Cambridge, MA.
This course will provide attendees with an understanding of the underlying concepts, principles, and techniques of weight of evidence analysis in the context of chemical risk assessment.
The depth and breadth of information used to characterize hazard and exposure have expanded beyond traditional in vivo studies to encompass ’omics, in vitro, and computational approaches. Interpreting various lines of scientific evidence is rarely unambiguous or straightforward, as the same information can support multiple legitimate interpretations and conclusions. The weight of evidence (WOE) approach is, therefore, essential in toxicology and risk assessment to support decision-making. Although some tools are available to enhance transparency and consistency in WOE, professional judgement remains necessary in almost all cases to evaluate the strengths and limitations of each data source. Subject experts also are expected to provide knowledge-based insights on how various factors, such as experimental design, can influence data comparability. WOE is particularly challenging due to the diverse regulatory and scientific contexts involved in the analysis. Therefore, WOE must be fit for purpose and framed in problem formulation. This course will provide attendees with an understanding of the underlying concepts, principles, and techniques of WOE analysis in the context of chemical risk assessment.
Since the WOE approach is flexible and adaptable (i.e., it can be tailored to fit specific risk assessment contexts or regulatory requirements), this course will use case examples to demonstrate the integration of various lines of scientific evidence generated from both conventional and new approach methods for different purposes across the industrial chemical, personal care, fragrance, and agrochemical sectors. The case examples will illustrate the concept of fitting WOE to a specific purpose, including optimizing the design of animal toxicity studies (where required), estimating points of departure using nonanimal data, and predicting toxicity of a structurally similar chemical with limited or no toxicity data, as well as merging monitoring data and exposure model predictions.
The introduction will describe the overarching, flexible, and adaptable principles of WOE and problem formulation that will be illustrated throughout the course. The first presentation will cover the fundamental principles of problem formulation, including defining resources and contexts, and the computational tools used to integrate and evaluate data for fit-for-purpose WOE. A WOE approach that uses multiple lines of evidence in contemporary study design for regulatory required animal studies, maximizing the use of computational modeling, in vitro, and pharmacokinetic data, will be shared in the second talk, while the third presenter will discuss a WOE approach for considering the appropriateness of new in vitro inhalation methods for the evaluation of fragrances in risk assessment. The fourth presentation will focus on interpreting multiple lines of evidence using in vitro and computational approaches for safety evaluation of cosmetics. The next presenter will show how WOE across in vitro, physiologically based kinetic, and short-term in vivo data can be used to consider the human relevance of particular hazards. The course’s final presentation will focus on exposure assessment and a WOE approach for evaluating and applying measurements and models together in exposure characterization, followed by an interactive session in which each presenter will pose a question to test attendee knowledge of fit-for-purpose WOE and problem formulation principles and application.
The learning goals for this course are to (1) understand the value and content of problem formulation; (2) gain knowledge in the concept of fit-for-purpose WOE and its connection to problem formulation; (3) learn how WOE analysis can be used to analyze data and evaluate risks for chemicals that have varying degrees and types of available data; and (4) learn about different types of risk assessments and decision contexts.
Problem Formulation: The Foundation That Supports Any Weight of Evidence Approach. Michelle Embry, HESI, Washington, DC.
Using a Weight of Evidence Approach to Optimize the Design of Animal Toxicity Studies. Cecilia Tan, US EPA, Durham, NC.
Using Weight of Evidence for Inhalation Exposure Safety Evaluation of Fragrances. Nikaeta Sadekar, Research Institute for Fragrance Materials, Mahwah, NJ.
Using Computational Models to Build a Weight of Evidence in Safety Assessments of Cosmetic Ingredients. Alistair Middleton, Unilever, Bedford, United Kingdom.
Use of Weight of Evidence and Uncertainty Analysis in Hazard Characterization and Risk Assessment of Agrochemicals. Marco Corvaro, Corteva Agriscience, Rome, Italy.
Merging Measurements and Models in a Weight of Evidence Approach for Exposure Estimation. Jon Arnot, ARC Arnot Research and Consulting and University of Toronto, Toronto, ON, Canada.
This course aims to provide participants with a fundamental understanding of the concepts and principles of benchmark dose modeling methodology and demonstrate its usefulness through a few applications and case studies, covering current practice, issues, and challenges.
The rapid expansion of benchmark dose (BMD) modeling methodology has brought it into the spotlight of chemical risk assessment in a variety of applications, including evaluating the safety of substances as diverse as metals, pesticides, nutrients, and pharmaceuticals. This course aims to provide participants with a fundamental understanding of the concepts and principles of BMD modeling methodology and demonstrate its usefulness through a few applications and case studies, covering current practice, issues, and challenges.
The first presentation will provide a general introduction to the BMD modeling methodology and demonstrate its utility for assessing dose-response relationships and estimating critical doses using multiple types of data in the recently developed Bayesian benchmark dose modeling (BBMD) system. An application of the BMD modeling approach in drug development for the purposes of safety evaluation, as well as a variety of its advantages over the No-Observed-Adverse-Effect-Level method, will be discussed in the second presentation. Next, a critical issue in BMD modeling, the definition of the benchmark response (BMR), will be described by the third presenter, alongside how to standardize, refine, and enrich the BMD approach through the selection of an adequate BMR to support prioritizations within food safety. The last presentation will demonstrate an alternative method to derive a human health–protective point-of-departure value from model organism exposure studies based upon a comprehensive analysis of the transcriptome. Throughout the course, BMD modeling strategies will be consistently highlighted and demonstrated through case studies.
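To make the BMD concept concrete, here is a minimal illustrative sketch, assuming a log-logistic model fit by least squares to hypothetical quantal data with a BMR of 10% extra risk. It is not drawn from any of the presentations; real assessments use dedicated maximum-likelihood tools such as BMDS, PROAST, or the BBMD system mentioned above.

```python
# Illustrative benchmark dose (BMD) estimation from quantal dose-response
# data, assuming a log-logistic model and a benchmark response (BMR) of
# 10% extra risk. The dataset is hypothetical.
import numpy as np
from scipy.optimize import curve_fit

doses = np.array([0.0, 10.0, 30.0, 100.0, 300.0])   # mg/kg/day
n_animals = np.array([50, 50, 50, 50, 50])
n_affected = np.array([1, 3, 8, 22, 41])
frac = n_affected / n_animals

def log_logistic(d, g, a, b):
    """Background incidence g plus a log-logistic increase with log-dose."""
    d = np.maximum(d, 1e-9)                  # avoid log(0) at the control
    return g + (1 - g) / (1 + np.exp(-(a + b * np.log(d))))

# Least-squares fit (a simplification; BMD software uses maximum likelihood)
popt, _ = curve_fit(log_logistic, doses, frac, p0=[0.02, -5.0, 1.0],
                    bounds=([0, -20, 0], [0.5, 20, 10]))
g, a, b = popt

# Solve extra risk (p(d) - g) / (1 - g) = BMR for the dose d = BMD
bmr = 0.10
bmd = np.exp((np.log(bmr / (1 - bmr)) - a) / b)
print(f"BMD10 = {bmd:.1f} mg/kg/day")
```

A real analysis would also report the lower confidence bound on the BMD (the BMDL), compare candidate models, and weigh model uncertainty, which is where Bayesian approaches such as BBMD are particularly useful.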
Benchmark Dose Modeling Strategies and Tools for Dose-Response Assessment Using Toxicological, Epidemiological, and Genomic Data. Kan Shao, Indiana University, Bloomington, IN.
Benchmark Dose-Response Modeling Is Advantageous for Multiple Endpoints Analysis for Drug Safety Evaluation Purposes. Antero Vieira da Silva, Karolinska Institutet, Stockholm, Sweden.
Standardization, Refinement, and Enrichment of the Benchmark Dose Approach to Support Prioritizations within Food Safety. Salomon Sand, Swedish National Food Agency, Uppsala, Sweden.
Application of a Benchmark Dose–Based Transcriptome Point of Departure in Chemical Safety Assessment. Kamin Johnson, Corteva Agriscience, Indianapolis, IN.
This course is designed to first give an overview of toxicology studies and the pathology parameters and assessments during the drug development process. Once that basic understanding is established, the next step is to work through what adversity means in the context of toxicology and some of the challenges and implications of adverse findings.
Although the design and results of a toxicity study follow general guidelines, the actual interpretation of a study can at times be complex. Such interpretation, which encompasses multiple functional areas with an array of different endpoints, ultimately needs to assess whether any findings are detrimental to the animal and whether such findings pose a risk to human health. One critical piece of the interpretation relies on gross, clinical, and anatomical pathology endpoints. Proper incorporation of these pathology data, and thus effective communication and exchange between the pathologist and toxicologist, is critical for this integrated analysis. Many subtleties are not often appreciated in the equation, including the phase of drug development, duration of the toxicity study, translational aspects, indication and risk/benefit of the therapeutic, and the weight of evidence supporting the interpretation.
To address these aspects, this course is designed to first give an overview of toxicology studies and the pathology parameters and assessments during the drug development process. Once that basic understanding is established, the next step is to work through what adversity means in the context of toxicology and some of the challenges and implications of adverse findings. For example, can a finding be harmful to the animal but irrelevant to human health—or vice versa? How and where is such information communicated? When and how do investigational studies or endpoints help? An experienced set of speakers will address some of the best practices in interpreting pathology findings and walk through some challenging scenarios, including weight of evidence approaches that were utilized to contextualize a pathology finding where adversity was unclear. The last portion of the course will feature live polling to allow the audience to experience unique scenarios and make their own interpretations.
Understanding the Aversity to Adversity: Pathologists’ Perspectives on Adversity in Nonclinical Toxicity Studies. Helen Booler, Novartis Institutes for BioMedical Research, Basel, Switzerland.
The Interconnectivity between Pathology and Toxicology. Satoko Kakiuchi-Kiyota, Genentech Inc., South San Francisco, CA.
Integrating Clinical Pathology into the Assessment of Adversity. Paula Katavolos, Bristol Myers Squibb, New Brunswick, NJ.
Pathology Evaluation in Adversity and Weight of Evidence Decisions for Neurotoxic Findings in Nonclinical Studies. Brad Bolon, GEMpath Inc., Longmont, CO.
Alternative Approaches to Routine Histopathology in Drug Development: How Can Alternative Technologies Inform on Overall Risk Profile? Mark Hoenerhoff, Inotiv, Kalamazoo, MI.
How to Efficiently Anchor Pathology Discussions to Toxicology. Marie Lemper, UCB S.A., Belgium, Cambridge, MA.
This first-of-its-kind SOT Continuing Education course on the placenta is taught by a diverse group of early-career, mid-career, and established investigators from academia and government with expertise in placental toxicology that spans from basic bench research to human studies that capitalize on placental tissue as the basis for epidemiological work.
Pregnancy is considered among the most vulnerable life stages, for both the mother and the child. Despite being a critical bridge between the maternal exposome and fetal development, the placenta is an overlooked and severely understudied organ in reproductive toxicology. An appreciation and understanding of the placenta are critical to study design, execution, and interpretation of effects on pregnancy and on fetal and maternal health.
This first-of-its-kind SOT Continuing Education course on the placenta is taught by a diverse group of early-career, mid-career, and established investigators from academia and government with expertise in placental toxicology that spans from basic bench research to human studies that capitalize on placental tissue as the basis for epidemiological work. The objectives for this course are to provide attendees with an overview of the basic biology of placental function and comparative biology across commonly used animal models. The course also will include placental toxicology approaches spanning cell-based techniques, including more traditional/routine approaches, to state-of-the-art approaches, such as placental chemical transfer and microfluidics, and ways to incorporate high-throughput analyses that can ultimately help inform regulatory decisions. The use of animal models in placental toxicology research, sample collection considerations, and study design of human cohort studies also will be covered.
Placental Biology Basics. Elana Elkin, San Diego State University, San Diego, CA.
Current Models for Studying Placental Toxicology. Sean Harris, University of Michigan, Ann Arbor, MI.
Novel Models of Placental Transfer for Toxicological Studies. Phoebe Stapleton, Rutgers, The State University of New Jersey, Piscataway, NJ.
Application of Molecular Epidemiological Approaches to Gain Mechanistic Understanding of How Prenatal Exposures Influence Fetal Development. Alison Paquette, Seattle Children’s Research Institute, Seattle, WA.
Use of High-Throughput Analyses in Placental Toxicological Studies. Bevin Blake, US EPA, Durham, NC.
Capitalizing on 3D and Microfluidic Technologies to Model the Placenta for Toxicological Studies. Almudena Veiga-Lopez, University of Illinois at Chicago, Chicago, IL.
To help researchers and chemical assessment practitioners prepare for a near future in which open science standards are being implemented, this course will provide a comprehensive primer on what it means for data to be FAIR, a summary of what open science and data policies look like and how they support better regulatory science and public health decision-making, and a practical introduction to the open science workflows that researchers should anticipate engaging with to produce FAIR data.
Open science and data transparency policies, particularly implementation of FAIR (Findable, Accessible, Interoperable, and Reusable) data principles in research, have become a major priority of US and international research governance and funding organizations. The purpose of these policies is to make it easier to find, validate, analyze, reproduce, and reuse scientific data in an era when evidence about the health effects of chemical exposures is being generated faster than it can be cataloged and processed.
To help researchers and chemical assessment practitioners prepare for a near future in which open science standards are being implemented, this course will provide a comprehensive primer on what it means for data to be FAIR, a summary of what open science and data policies look like and how they support better regulatory science and public health decision-making, and a practical introduction to the open science workflows that researchers should anticipate engaging with to produce FAIR data. These workflows will include best practices for increasing credibility when working with sensitive or proprietary datasets that cannot be made openly available.
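As a concrete illustration of what "FAIR" can look like at the level of a single dataset, the sketch below builds a minimal machine-readable metadata record. The field names are illustrative (loosely modeled on common schemas such as DataCite/schema.org) and the identifier and URL are placeholders, not real resources or requirements drawn from this course.

```python
import json

# Hedged sketch: a minimal machine-readable metadata record of the kind FAIR
# data workflows expect. Field names are illustrative (loosely modeled on
# common schemas such as DataCite/schema.org); the identifier and URL are
# placeholders, not real resources.
record = {
    "identifier": "doi:10.1234/placeholder",            # persistent ID -> Findable
    "title": "Example in vitro bioactivity dataset",
    "creators": ["A. Researcher"],
    "license": "CC-BY-4.0",                             # clear reuse terms -> Reusable
    "format": "text/csv",                               # open format -> Interoperable
    "access_url": "https://repository.example/record",  # resolvable access -> Accessible
}

# Serializing to JSON makes the record indexable by repositories and machines.
print(json.dumps(record, indent=2))
```

Even a record this small shows the pattern: each FAIR letter maps to at least one concrete metadata field, which is the kind of mapping the course's practical exercise walks through.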
To achieve this, the course will present five expert perspectives on FAIR data and open science practices, with senior figures in research and publishing providing actionable advice aimed at helping participants prepare strategies for benefiting from FAIR data policies rather than being disrupted by them. There also will be a practical exercise that demonstrates how to provide the necessary support materials to make research data, code, and materials FAIR and reproducible. Finally, specific information needs of participants will be addressed through an interactive question-and-answer panel with speakers.
The target audiences for this session include (1) early career researchers who will need to implement open science policies at the bench; (2) senior researchers running labs who need to know how to train their PhDs and postdocs in data standards compliance issues; (3) editors who, as the gatekeepers and publishers of research, will need to support scientists and the policy goals of funders and agencies; (4) contract research organizations and research consultancies that may be expected to comply with open data standards but also have to address proprietary concerns; and (5) funders who may want to support the policy goals of organizations such as the National Institute of Environmental Health Sciences through their independent grant programs.
Participants should note that any opinions expressed in the session reflect those of the individual presenters and not their employers or otherwise affiliated organizations.
Open Data Standards: What They Are and Why They Matter. Charles Schmitt, NIEHS, Durham, NC.
US EPA Plans for Developing, Supporting, and Using Open Data Standards. Michelle Angrish, US EPA, Durham, NC.
Practical Steps for Implementing FAIR Principles in Research. Kaitlyn Hair, University of Edinburgh, Edinburgh, United Kingdom.
An Unexpected Journey through Another Researcher’s Data. Practical Exercise with All Presenters
How Journals and Publishers Can Support Open Data Standards. Paul Whaley, Lancaster University, Lancaster, United Kingdom.
Tools to Facilitate Compliance with Open Data Standards. David Mellor, Center for Open Science, Charlottesville, VA.
Panel Q&A: Challenges and Opportunities in Data Standards—Your Questions Answered. All Presenters
To show how next-generation risk assessment (NGRA) concepts are being put into practice with a variety of tools and scenarios, this course will offer participants an opportunity to gain in-depth knowledge of the available computational approaches often used as the components of an NGRA through presentation of the basic structure and functional purpose of these approaches, supplemented by real-world application in case examples that have been or will be used in regulatory decision-making contexts.
Next-generation risk assessment (NGRA) is an approach to understanding the potential risks of ingredients and chemicals using new approach methodologies (NAMs)—specifically to assess the exposure, bioactivity, and metabolic and kinetic behavior of a chemical for a specified use. Following an NGRA approach for any given chemical or use scenario usually involves building an Integrated Approach to Testing and Assessment, including nonanimal computational and in vitro methods, as well as relevant chemical-specific information. There is a recognized need for increased education about the NGRA approach in general but especially how information sources are selected and the data from them analyzed to decide whether to continue gathering more information or whether a decision can be made. A number of case studies have been published in the literature and reviewed by regulatory authorities in different contexts.
To show how NGRA concepts are being put into practice with a variety of tools and scenarios, this course will offer participants an opportunity to gain in-depth knowledge of the available computational approaches often used as the components of an NGRA through presentation of the basic structure and functional purpose of these approaches, supplemented by real-world application in case examples that have been or will be used in regulatory decision-making contexts. The course will begin with an introduction to NGRA concepts, the state of the science, and the progress toward regulatory acceptance. In the second talk, participants will gain an understanding of key human exposure modeling approaches and how they can be used to estimate consumer and occupational exposure to a variety of ingredients and chemicals. Next, computational methods for deriving a quantitative effect level from in vitro bioactivity information using in vitro–in vivo extrapolation will be discussed, drawing on learnings from Health Canada and collaborative activities. After the fourth presentation, which draws on case studies, participants will understand how to integrate the information they have gathered as a weight of evidence and determine whether a safety decision can be made or more testing is needed. The penultimate talk will discuss innovative activities to increase confidence in the use and acceptance of approaches for NGRA across regions and sectors, with special considerations relevant to computational approaches. Finally, additional educational resources to further the group’s overall knowledge of computational approaches to conduct NGRA will be provided by the last speaker, who also will incorporate audience participation. Together, these talks will help participants put NGRA approaches into daily practice through building a better understanding of how computational NAMs can support all stages of NGRA decision-making and by demonstrating available tools using case studies.
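As a hedged illustration of the in vitro–in vivo extrapolation step mentioned above, the sketch below applies the widely used reverse-dosimetry simplification: an administered equivalent dose is obtained by dividing an in vitro point of departure by the steady-state plasma concentration predicted for a unit dose. All numeric values are invented for illustration and are not data from this course's case studies.

```python
# Hedged sketch of reverse-dosimetry IVIVE, one common computational step in
# NGRA. The equation form (AED = in vitro AC50 / Css at 1 mg/kg/day) is a
# widely used simplification; all numeric values below are illustrative
# assumptions, not measurements.

def administered_equivalent_dose(ac50_uM, css_uM_at_unit_dose):
    """Convert an in vitro bioactivity concentration (AC50, µM) into an
    administered equivalent dose (mg/kg/day) by dividing by the steady-state
    plasma concentration (µM) predicted for a 1 mg/kg/day dose."""
    return ac50_uM / css_uM_at_unit_dose

# Illustrative values: AC50 = 3 µM; a toxicokinetic model predicts
# Css = 1.5 µM at a 1 mg/kg/day dose rate.
aed = administered_equivalent_dose(3.0, 1.5)
print(f"AED ≈ {aed:.1f} mg/kg/day")  # prints "AED ≈ 2.0 mg/kg/day"
```

In practice this calculation sits inside a larger toxicokinetic workflow (population variability, Monte Carlo sampling of Css), but the one-line ratio is the conceptual core that links bioactivity data to a human-relevant dose metric.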
Next-Generation Risk Assessment Overview and Regulatory Landscape. Gavin Maxwell, Unilever, Bedford, United Kingdom.
Use of Computational New Approach Methodologies in Human Exposure Modeling. John Wambaugh, US EPA, Research Triangle Park, NC.
Use of Computational New Approach Methodologies in Bioactivity Characterization. Tara Barton-Mclaren, Health Canada, Ottawa, ON, Canada.
Use of Computational New Approach Methodologies in Next-Generation Risk Assessment Decision-Making. Alistair Middleton, Unilever, Bedford, United Kingdom.
Building Confidence in New Approach Methodologies to Support Next-Generation Risk Assessment. Nicole Kleinstreuer, NIEHS/NICEATM, Durham, NC.
Bridging the Gap: Educational Needs and Resources. Kristie Sullivan, Institute for In Vitro Sciences Inc., Gaithersburg, MD.
This course will cover specific examples of new approach methodologies that are currently available to assess respiratory toxicity, including their technical challenges and refined dosimetry characterizations.
New approach methodologies (NAMs) anchored to known mechanisms of human toxicity are increasingly being used to assess the potential toxicity of inhaled substances. Various in silico and in vitro systems can be used to assess respiratory toxicity, and the selection of an appropriate system depends on multiple factors, including the goal of the study, the physicochemical properties of the test substance, and the biological effects of interest. Despite the need and the growing use, there is a clear lack of NAMs accepted for regulatory use in respiratory toxicity. However, efforts are underway to fill this gap with NAMs that are anchored to known mechanisms of human toxicity and are well characterized, both critical attributes for their use in risk assessment in support of decision-making.
This course will cover specific examples of methods that are currently available to assess respiratory toxicity, including their technical challenges and refined dosimetry characterizations. This course will discuss case studies of how data are generated using these methods, anchored to adverse outcome pathways, and incorporated into Integrated Approaches to Testing and Assessment for risk assessment and hazard identification of inhaled substances. This course also will explore how to build scientific confidence in such methods to facilitate their use in decision-making and in regulatory acceptance of NAMs.
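As a hedged illustration of the kind of dosimetry characterization referred to above, the sketch below implements a simple deposited-dose estimate of the standard first-approximation form (concentration × minute volume × time × deposition fraction ÷ surface area). Every numeric value is an illustrative assumption, not data from the course.

```python
# Hedged sketch of a simple inhaled deposited-dose estimate. The equation form
# (concentration x minute volume x time x deposition fraction / surface area)
# is a standard first approximation; every numeric value below is an
# illustrative assumption, not data from the course.

def deposited_dose_per_area(conc_mg_m3, minute_vol_m3_min, duration_min,
                            deposition_fraction, surface_area_cm2):
    """Mass deposited per unit airway surface area (mg/cm^2)."""
    inhaled_mg = conc_mg_m3 * minute_vol_m3_min * duration_min
    return inhaled_mg * deposition_fraction / surface_area_cm2

# Illustrative scenario: 1 mg/m^3 aerosol, 7.5 L/min (0.0075 m^3/min) minute
# volume, 60-minute exposure, 10% regional deposition, 100 cm^2 surface area.
dose = deposited_dose_per_area(1.0, 0.0075, 60.0, 0.1, 100.0)
print(f"Deposited dose ≈ {dose:.2e} mg/cm²")
```

Refined dosimetry models replace the single deposition fraction with region-specific predictions, but a per-area dose metric of this shape is what allows in vivo exposures and in vitro (e.g., air-liquid interface) doses to be compared on common ground.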
Establishing Confidence in New Approach Methodologies for Inhalation Toxicity Testing. Dave Allen, Inotiv, Durham, NC.
Human-Derived In Vitro and Ex Vivo Test Systems to Assess Respiratory Toxicity of Chemicals. Holger Behrsing, Institute for In Vitro Sciences Inc., Gaithersburg, MD.
Key Considerations for the Use of In Vitro Systems in the Evaluation of Inhalable Substances for Research and Testing. Shaun McCullough, RTI International, Durham, NC.
Regulatory Use of New Approach Methodologies for Inhalation Risk Assessment. Monique Perron, US EPA, Washington, DC.
Case Study on the Use of an Integrated Approach for Testing and Assessment for New Approach Methodology for Refining Inhalation Risk Assessment from Point-of-Contact Toxicity of the Pesticide Chlorothalonil. Marie Hargrove, Syngenta, Greensboro, NC.
| Registration Category | Early-Bird | Standard | Final |
| --- | --- | --- | --- |
| SOT Member/Global Partner | $65 | $100 | $135 |
| SOT Retired/Emeritus Member | $65 | $100 | $135 |
| Nonmember | $85 | $120 | $155 |
| Postdoctoral (SOT Member/Nonmember) | $65 | $100 | $135 |
| Student (SOT Member/Nonmember/Undergraduate) | $35 | $70 | $105 |