Chemical Testing Basics
Legislative revisions to the Toxic Substances Control Act (TSCA) will change the way industrial chemicals are regulated. Since chemicals are regulated based on their potential toxic effects, whether, how, and when toxicity testing is conducted is an essential cornerstone—and we must ensure it is laid down correctly.
About Toxicity Testing
Since the 1930s, scientists have used cats, dogs, rats, mice, primates, and other animals to test the safety of chemicals and pharmaceuticals. It was thought that these tests would predict the potential for harm to humans.
Not only can toxicity testing result in unalleviated pain, suffering, and death for the animals used in such tests, but it is also now widely acknowledged that these tests are not particularly predictive of a substance’s effects in humans. This realization, coupled with advances in scientific research and ethical theory, has led to calls for a new kind of toxicology testing anchored in human biology and an understanding of biological pathways.
Reform is also needed because public information on many chemicals is not available, as discussed here. The figure to the left illustrates the current uncertainty that likely exists in the chemical market.1 A chemical-by-chemical approach to reducing this uncertainty is extremely inefficient and will never succeed, especially given the challenges presented by novel chemical products, such as nanomaterials. One analysis published in early 2009 estimates that thoroughly testing all existing nanomaterials—using current methods—could take 34-53 years.2 And nanomaterials are just a small portion of the entire chemical market.
Ideally, a system of nonanimal methods could assess many chemicals in a few weeks at a cost of around $25,000—compared to current estimates of $6 million and three years per chemical.3 In the short term, the best approach is to assess categories of chemicals through Integrated Testing Strategies, combining cell- and computer-based methods with information on levels of human and environmental exposure.4
Toxicity Testing for the 21st Century: A New Standard
Collecting enough information to ensure safer chemicals under TSCA requires a shift to human-relevant, high-throughput testing methods and strategies. Since these methods do not involve the use of animals, they are often referred to as “alternative methods.” Over the next few years, these “alternatives” will become the new standard for toxicology testing because they offer indisputable scientific, practical, and ethical advantages.
The blueprint for development and implementation of these alternatives is the National Research Council’s 2007 report, Toxicity Testing in the 21st Century: A Vision and a Strategy.
“Tox21”—short for “21st century toxicology”—has become synonymous with approaches for modernizing toxicity testing and assessment of chemicals. Tox21 is also the name of the collaboration between selected U.S. government agencies, including the Environmental Protection Agency (EPA) and the National Institutes of Health (NIH).
“This report envisions a not-so-distant future in which virtually all routine toxicity testing would be conducted in human cells or cell lines in vitro by evaluating perturbations of cellular responses in a suite of toxicity pathway assays using high throughput robotic-assisted methodologies.”
The EPA commissioned the writing of the NRC’s landmark toxicity testing report, as it and other scientific bodies had long recognized the inherent shortfalls of the current, animal-based approach.
The “NRC Vision” described in the report outlines a practical plan for revolutionizing toxicity testing by harnessing scientific advances in molecular and cell biology, genetics, robotic testing systems, and computational power. These advances, combined with a robust system of environmental monitoring, will create a system of chemical testing and assessment that is more predictive and more protective of human health.
The NRC Vision is based primarily on new toxicity testing methods that can identify unhealthy changes in normal biological processes in response to a chemical. These tests are conducted in human cells, cell lines, or with cellular components. Changes occur within cells and tissues long before the effects of toxicity would become evident in an animal test. For example, cellular changes that lead up to carcinogenicity can be measured in cells, cell lines, or tissues without the need to look for the eventual disease response—in this case, a tumor—in an animal.
In the NRC Vision, a suite of these tests would replace a single animal test. Each test in the suite would measure different changes that occur in response to a chemical, giving a comprehensive picture of the effect the chemical may have on normal cell and tissue function. These test methods can be automated, so scientists can use machines to test hundreds and even thousands of chemicals at the same time, monitoring most of the possible effects on cells, tissues, and organs resulting from exposure to each chemical. This type of screening is called high throughput screening.
Using this approach, regulatory scientists can simultaneously examine numerous possible health effects of exposure to thousands of chemicals. With these data, they can determine which chemicals present a concern and flag those for further testing, setting aside chemicals that present little or no health concerns. This approach would allow scientists to quickly analyze the backlog of chemicals that may not have undergone rigorous safety testing. That would reduce the amount of uncertainty in the chemical market, optimizing protection of human health and the environment.
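The triage logic described above—screen each chemical against a suite of assays, flag those that perturb multiple pathways for further testing, and set the rest aside—can be sketched in a few lines of code. This is a minimal illustration only: the chemical names, assay names, and the flagging threshold are hypothetical and do not come from any real Tox21 dataset or EPA workflow.

```python
# Toy triage of high-throughput screening results. All names and the
# threshold below are hypothetical, for illustration only.

def triage(assay_results, flag_threshold=2):
    """Split chemicals into 'flag for further testing' and 'low concern'
    groups based on how many pathway assays each chemical perturbs."""
    flagged, low_concern = [], []
    for chemical, hits in assay_results.items():
        # Count the assays in the suite that this chemical perturbed.
        active = sum(1 for is_active in hits.values() if is_active)
        if active >= flag_threshold:
            flagged.append(chemical)
        else:
            low_concern.append(chemical)
    return flagged, low_concern

# Hypothetical screen: True means the chemical perturbed that assay.
results = {
    "chem-A": {"oxidative_stress": True, "dna_damage": True, "receptor_binding": False},
    "chem-B": {"oxidative_stress": False, "dna_damage": False, "receptor_binding": False},
    "chem-C": {"oxidative_stress": True, "dna_damage": True, "receptor_binding": True},
}

flagged, low_concern = triage(results)
print("Flag for further testing:", flagged)   # chem-A and chem-C
print("Set aside:", low_concern)              # chem-B
```

In practice, the per-assay readouts would be concentration–response measurements rather than booleans, and prioritization models weigh exposure data as well, but the screen-then-prioritize structure is the same.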
Visit the Resources and FAQ pages for more information about the Tox21 effort and the scientific and policy advances that are already occurring—and those that still need to occur—in order to accomplish it.
1. REACH, non-testing approaches and the urgent need for a change in mind set. Schaafsma G, Kroese ED, Tielemans ELJP, Van de Sandt JJM, van Leeuwen CJ. Regul Toxicol Pharmacol 2009;53:70–80.
2. The impact of toxicity testing costs on nanomaterial regulation. Choi J, Ramachandran G, Kandlikar M. Environ Sci Technol 2009;43(9):3030–3034.
3. Testing for carcinogens: shift from animals to automation gathers steam—slowly. Schmidt C. JNCI 2009;101(13):910–912.
4. Using chemical categories to fill data gaps in hazard assessment. van Leeuwen K, Schultz TW, Henry T, Diderich B, Veith GD. SAR QSAR Environ Res 2009;20(3–4):207–220.