Anchoring any hazard assessment to the dose, or to its antecedent, exposure, is vital not only to characterizing hazard but also to understanding risk. However, there are many ways to define and characterize exposure.
A January 2017 report by the National Academies of Sciences, Engineering, and Medicine (NASEM), Using 21st Century Science to Improve Risk-Related Evaluations, found that “chemicals that have predicted high concentrations in humans and environmental media can then be used to identify toxicity-data gaps and set priorities for toxicity-testing for risk-based applications.” The NASEM report recommended that “interpreting the monitoring data and appropriately applying exposure data in risk-based evaluations will require continued complementary development and evaluation of exposure assessment tools and information, such as fate and transport models, PBPK models, and data on chemical quantity and use, partitioning properties, reaction rates, and human behavior.”
Evaluating Exposure Impacts
Although many efforts have focused exclusively on the contribution of one chemical or stressor to an adverse effect, the modulation of these effects, or of the resulting cellular responses, is in many cases shaped by the overall environment, or “exposome,” of the cell or organism. Wild suggested an exposome that “encompasses life-course environmental exposures (including lifestyle factors), from the prenatal period onwards.” Just as genome-wide association studies (GWAS) can search for genes related to health effects, exposome-wide association studies (EWAS) may allow new toxicological hypotheses for chemical-induced alterations. It is increasingly recognized that exposure to any stressor, chemical or otherwise, needs to be considered within the broader context of diet, behavior, and other agents, endogenous and exogenous alike. Miller and Jones recently proposed a refinement of the original definition of the exposome to capture these ideas, incorporating cumulative assessments of environmental influences and biological responses across the life course.
New Tools in Analyzing Exposure
There is a current paradigm shift in exposure science comparable to the advent of polymerase chain reaction (PCR) in biology and inexpensive computing in analytics[1–4]. The tools available to exposure scientists now include:
- practical passive samplers as simple as wristbands;
- computational models that can make coarse estimates for thousands of chemicals[6, 7]; and
- non-targeted analytical chemistry that can identify thousands of previously untested chemicals in everything from a glass of drinking water to a handful of dust.
When used together, toxicological data and exposure information allow prioritization and analysis of the potential risk posed by chemicals. Traditionally, little or no exposure data have been available for most chemicals to place possible chemical hazard within a relevant human exposure context[1, 10]. New and emerging computational exposure science tools are rapidly changing this[3, 4].
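One simple way to combine the two data streams is to rank chemicals by the ratio of a dose at which bioactivity is expected to an estimated human exposure rate, so that chemicals with the smallest margin between hazard and exposure are prioritized first. The sketch below illustrates that idea only; the chemical names, doses, and exposure estimates are entirely hypothetical.

```python
# Illustrative sketch: prioritizing chemicals by the ratio of a
# bioactive dose to an estimated exposure rate. Smaller ratios mean a
# smaller margin between hazard and exposure, hence higher priority.
# All chemical names and numbers below are hypothetical.

def bioactivity_exposure_ratio(bioactive_dose, exposure_estimate):
    """Ratio of the dose expected to cause bioactivity (mg/kg/day)
    to the estimated human exposure rate (mg/kg/day)."""
    return bioactive_dose / exposure_estimate

# Hypothetical inputs: (bioactive dose, estimated exposure), mg/kg/day
chemicals = {
    "chemical_A": (0.5, 0.01),     # ratio 50
    "chemical_B": (10.0, 0.0001),  # ratio 100000
    "chemical_C": (1.0, 0.5),      # ratio 2 -> highest priority
}

ranked = sorted(
    chemicals,
    key=lambda name: bioactivity_exposure_ratio(*chemicals[name]),
)
print(ranked)
```

Under these made-up numbers, chemical_C is flagged first despite being less potent than chemical_A, because its estimated exposure is far closer to its bioactive dose.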
Advances in exposomics go hand-in-hand with advances in non-targeted analytical chemistry and suspect screening (collectively, untargeted analysis). In contrast to targeted analytical chemistry, which focuses on individual analytes, untargeted high-resolution mass spectrometry now allows indicators of thousands of chemicals to be discovered in biological and environmental media. Identifying these chemicals can be fraught with difficulty, but the necessary tools and databases are developing rapidly. A particularly relevant, novel methodology was suggested by Rager et al., in which, in addition to exposure considerations, data from high-throughput toxicity screening were used to prioritize among potential matches to mass features (i.e., chemicals with greater potential to do harm are considered first). By the nature of their methods, untargeted analyses help identify the mixtures that actually occur in people and the environment.
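The prioritization idea attributed to Rager et al. above can be sketched in miniature: among candidate identities whose monoisotopic mass falls within the instrument's tolerance of an unknown feature, examine the most potent candidates first. This is a hedged illustration, not their implementation; the candidate names, masses, potencies (AC50 values), and the mass tolerance are all assumed for the example.

```python
# Illustrative sketch of toxicity-weighted candidate ranking for an
# unknown mass feature: candidates within a mass tolerance are sorted
# so more potent chemicals (lower AC50) are examined first.
# All names, masses, potencies, and the tolerance are hypothetical.

TOLERANCE_DA = 0.005  # assumed mass-accuracy window, daltons

# Hypothetical candidate database: name -> (monoisotopic mass, AC50 in uM)
candidates = {
    "cand_X": (228.115, 3.2),
    "cand_Y": (228.113, 0.4),   # most potent match
    "cand_Z": (230.101, 1.0),   # outside the mass window
}

def prioritize(feature_mass, db, tol=TOLERANCE_DA):
    """Return (name, AC50) pairs matching the feature mass, most potent first."""
    matches = [
        (name, ac50)
        for name, (mass, ac50) in db.items()
        if abs(mass - feature_mass) <= tol
    ]
    return sorted(matches, key=lambda pair: pair[1])  # low AC50 first

print(prioritize(228.114, candidates))
```

For a feature at 228.114 Da, cand_X and cand_Y both fall inside the window, and cand_Y is ranked first because its lower AC50 implies greater potential to do harm.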
Understanding Exposure to Improve Consumer Products
An area that is currently of keen interest to the exposure science community is consumer products (e.g., cosmetics, cleaning products, building materials, and food contact materials). The presence of chemicals in such “near-field” sources has been shown to be a key driver of high exposure levels in the Centers for Disease Control and Prevention (CDC) National Health and Nutrition Examination Survey (NHANES) exposure biomonitoring data. Information on the chemical constituents of products, while only one prerequisite for exposure, provides useful heuristics for estimating human exposure. New high-throughput measurement strategies that combine high-resolution mass spectrometry with chemo-informatics data could enable rapid forensic analysis of the chemicals present in these products. Government requirements for product testing and new industry initiatives are providing additional inventories of chemicals present in products through either intentional inclusion or contamination. These new data sources, in concert with data-driven or mechanistic modeling approaches, can elucidate potential human exposures to thousands of commercial chemicals and reduce uncertainty in modeling approaches.
Identifying Exposure Pathways
Pharmacokinetics and toxicokinetics (PK/TK) lie at the intersection of toxicology and exposure science. Some ideas from exposure science, such as reverse dosimetry to infer exposures from biomarkers, have already been adapted in toxicology for establishing risk priorities for chemicals in the environment. Efforts to build rapid, “fit-for-purpose” models describing PK/TK for hundreds of chemicals are starting to yield insights that can inform the development of more traditional PK/TK models[20, 21].
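The reverse-dosimetry idea mentioned above can be illustrated with the simplest possible PK assumption: in a one-compartment model at steady state, the plasma concentration equals the average intake rate divided by total clearance, so an observed biomarker concentration can be inverted to an intake rate. The sketch below assumes that steady-state model, and the clearance value and biomarker level are hypothetical.

```python
# Illustrative sketch of reverse dosimetry under a one-compartment,
# steady-state assumption: Css = intake / CL, inverted to estimate
# intake from an observed biomarker concentration.
# The clearance and biomarker values below are hypothetical.

def infer_intake_rate(css_mg_per_l, clearance_l_per_kg_day):
    """Invert Css = intake / CL to estimate intake (mg/kg/day)."""
    return css_mg_per_l * clearance_l_per_kg_day

# Hypothetical inputs: biomarker at 0.002 mg/L, clearance 5 L/kg/day
intake = infer_intake_rate(0.002, 5.0)
print(intake)
```

The inferred intake can then be compared against a bioactive dose, which is how reverse dosimetry supports the risk-prioritization efforts described in this section; real applications use far richer PK models and account for population variability.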
Teeguarden et al. argue for “the aggregate exposure pathway (AEP) concept as the natural and complementary companion in the exposure sciences to the adverse outcome pathway (AOP) concept in the toxicological sciences.” There is already compelling evidence for the need to consider exposure pathways: biomarkers of exposure to chemicals with “near-field” (in the home) sources are significantly higher than those for chemicals with “far-field” sources. New models have been developed to predict, from chemical structure, the probable functional roles of chemicals in consumer products, since that information is often unavailable[22, 23]. These new models have been combined with toxicity information to begin suggesting “green” substitutes: chemicals that may serve similar functions with lower bioactivity.