Environmental Factor

Your Online Source for NIEHS News

November 2017

Breaking new ground in toxicogenomics

Experts reviewed an approach proposed by National Toxicology Program scientists for analyzing and interpreting chemical potency.

Scott Auerbach, Ph.D., spoke frequently on the complex and highly technical modeling approach discussed at the peer review meeting. (Photo courtesy of Steve McCaw)

Experts in toxicogenomics and bioinformatics met Oct. 23-25 at NIEHS to review a new approach, proposed by scientists from the National Toxicology Program (NTP), for analyzing and interpreting chemical potency.

If adopted by the toxicogenomics community, the new method will accelerate the process of determining the biological potency of specific chemicals — that is, the exposure level below which a toxic response is unlikely to occur.

"We can potentially use that dose, identified in these new studies, to determine the dose that you or I are allowed to be exposed to in a product or in the environment," explained Scott Auerbach, Ph.D., head of the NTP Toxicoinformatics Group and lead NTP scientist on the project. "It means we can provide more rapid information on chemicals on the market, and specifically, the dose levels where they are likely to produce an effect."

The process is known as genomic dose-response (GDR) modeling. It is a statistically intensive method that takes advantage of new, highly advanced analytical genomics tools. The tools provide potency estimates at a molecular level using genomic data derived from short-term exposure studies in animals and human cell cultures.

Although its ability to detect specific hazards is limited, GDR modeling could speed up and expand determinations of potency, with a potentially important impact on public health.

The main tool in the proposed GDR toolbox is a freely available software package called BMDExpress 2.0, which is the product of an ongoing collaboration between NTP, Health Canada, the U.S. Environmental Protection Agency (EPA), and Sciome LLC.
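For readers curious what a genomic dose-response fit looks like in practice, the sketch below illustrates the general idea for a single gene: fit a dose-response curve (here, a Hill model) to expression measurements across doses, then solve for the dose that produces a defined change from control, the benchmark dose (BMD). This is only an illustrative Python sketch, not BMDExpress code or the NTP approach itself; the gene, the doses, the expression values, and the one-standard-deviation benchmark response are all assumptions chosen for the example.

import numpy as np
from scipy.optimize import curve_fit, brentq

def hill(dose, intercept, vmax, k, n):
    # Hill dose-response model: expression as a smooth function of dose
    return intercept + vmax * dose**n / (k**n + dose**n)

# Hypothetical log2 expression values for one gene at five doses (mg/kg)
doses = np.array([0.0, 0.1, 1.0, 10.0, 100.0])
expression = np.array([8.02, 8.05, 8.31, 9.10, 9.42])
control_sd = 0.08  # assumed variability among untreated replicates

# Fit the Hill model; the bounds keep the half-maximal dose k and the
# Hill coefficient n in a plausible positive range
params, _ = curve_fit(
    hill, doses, expression,
    p0=[8.0, 1.5, 5.0, 1.0],
    bounds=([-np.inf, -np.inf, 1e-3, 0.1], [np.inf, np.inf, 1e3, 10.0]),
)
fitted_control = params[0]

# Benchmark response: a shift of one control standard deviation
bmr = fitted_control + control_sd

# Benchmark dose (BMD): the dose where the fitted curve crosses the BMR
bmd = brentq(lambda d: hill(d, *params) - bmr, 1e-6, doses.max())
print(f"Estimated BMD for this gene: {bmd:.2f} mg/kg")

In an actual GDR analysis, software such as BMDExpress 2.0 applies this kind of modeling to thousands of genes at once and summarizes the resulting benchmark doses at the gene set level; the sketch above captures only the single-gene core of the calculation.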

Lively exchange

Yauk made good use of the chairperson’s gavel and kept the meeting running smoothly and on time. (Photo courtesy of Steve McCaw)

If the devil is in the details, it was fitting that the peer review panel met just before Halloween. Over the course of the three-day gathering, the experts engaged in spirited debate about the many decision points and parameters involved in the proposed approach (see sidebar).

"The participants were all leaders in the field, so they brought their own practices and informed opinions to the table," Auerbach noted. "The discussions were lively, and we learned an enormous amount that will yield important improvements to the NTP’s proposed approach."

The committee ultimately endorsed a series of recommendations and suggestions to help NTP refine the approach.

"A lot of thought and a huge amount of work have gone into this," observed meeting chair Carole Yauk, Ph.D., from Health Canada. "The approach is based on conventional practices in risk assessment and is grounded in empirical data, two strengths that will help in regulatory acceptance."

At this point, the draft NTP approach is not intended to produce risk assessments; rather, it is meant to serve as what is known as a screening-level tool, to help prioritize which chemicals should be assessed for risk. Eli Lilly researcher James Stevens, Ph.D., urged NTP to treat the proposed approach as a pilot endeavor to inform later efforts in risk assessment.

Stevens spoke from the pharmaceutical industry perspective, particularly on the biological interpretation of GDR data. (Photo courtesy of Steve McCaw)

Auerbach agreed. "You are setting up the situation where you are going to evaluate the use of this for risk assessment. You’re not using it for risk assessment; you’re looking at evaluation," he said.

The peer review panel meeting was preceded by a series of four preparatory webinars, which are available online.

(Ernie Hood is a contract writer for the NIEHS Office of Communications and Public Liaison.)


David Gerhold, Ph.D., from the National Center for Advancing Translational Sciences, presented an automated method to derive information from genomic data. (Photo courtesy of Steve McCaw)
The proposed NTP modeling approach was based in part on pioneering work by Russell (Rusty) Thomas, Ph.D., director of the EPA National Center for Computational Toxicology, and his colleagues. (Photo courtesy of Steve McCaw)
Panelist Fred Wright, Ph.D., right, described the approach used by his group at North Carolina State University. Lyle Burgoon, Ph.D., left, spoke about the approach his U.S. Army research lab uses. (Photo courtesy of Steve McCaw)
Pierre Bushel, Ph.D., from the NIEHS Bioinformatics and Computational Biology Branch, described methods to harmonize transcriptomic data across gene expression platforms. (Photo courtesy of Steve McCaw)
After their suggested recommendations were incorporated into each of the proposed approach segments, the panelists voted their support. (Photo courtesy of Steve McCaw)