The majority of commercial chemicals that enter the market in the United States every year have insufficient health and safety data. For pesticides, the U.S. Environmental Protection Agency uses a variety of techniques to fill data gaps in order to evaluate chemical hazard, exposure and risk. Nonetheless, public concern over the potential threat that these chemicals pose has grown in recent years, along with the realization that traditional animal-testing methods are not practical in terms of speed, economics or ethics. Now, researchers at the George Washington University have developed a new computational approach to rapidly screen pesticides for safety, performance and how long they will endure in the environment. Most importantly, the new approach will aid in the design of next-generation molecules for safer pesticides.
"In many ways, our tool mimics computational drug discovery, in which vast libraries of chemical compounds are screened for their efficacy and then tweaked to make them even more potent against specific therapeutic targets," Jakub Kostal, an assistant professor of chemistry at GW and principal investigator on the project, said. "Similarly, we use our systems-based approach to modify pesticides to make them less toxic and more degradable, while, at the same time, making sure they retain good performance. It's a powerful tool for both industry and regulatory agencies that can help design new, safer analogs of existing commercial agrochemicals, and so protect human life, the environment and industry's bottom line."
Using their model, the team analyzed 700 pesticides from the EPA's pesticide registry. The model considered a pesticide's likely persistence or degradation in the environment over time, its safety, and how well it performed at killing, repelling or controlling the target problem.
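To make the screening idea concrete, the sketch below shows a toy multi-criteria filter over a small compound library. The scores, thresholds and compound names are hypothetical illustrations; the article does not describe the GW model's actual scoring functions or cutoffs.

```python
# Illustrative sketch only: a toy multi-criteria screen of a pesticide library.
# All scores, thresholds and compound names here are hypothetical; the actual
# GW model and its scoring functions are not described in the article.
from dataclasses import dataclass


@dataclass
class Pesticide:
    name: str
    persistence: float  # hypothetical environmental-persistence score (lower is better)
    toxicity: float     # hypothetical hazard score (lower is better)
    efficacy: float     # hypothetical performance score (higher is better)


def passes_screen(p: Pesticide,
                  max_persistence: float = 0.5,
                  max_toxicity: float = 0.4,
                  min_efficacy: float = 0.7) -> bool:
    """Keep only compounds that degrade readily, are low-hazard and still perform."""
    return (p.persistence <= max_persistence
            and p.toxicity <= max_toxicity
            and p.efficacy >= min_efficacy)


library = [
    Pesticide("compound_A", persistence=0.3, toxicity=0.2, efficacy=0.80),
    Pesticide("compound_B", persistence=0.7, toxicity=0.1, efficacy=0.90),
    Pesticide("compound_C", persistence=0.4, toxicity=0.6, efficacy=0.75),
]

candidates = [p.name for p in library if passes_screen(p)]
print(candidates)  # -> ['compound_A']
```

In this toy version, a compound must clear all three criteria at once, mirroring the article's point that reduced toxicity and faster degradation only matter if performance is retained.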