As it happens, I saw three new papers about toxics regulation at about the same time recently. Between the three, they give a clear picture about the U.S. stance on toxic chemicals. I’ll discuss the papers in separate posts this week.
The first paper, by David Markell of FSU, examines the Toxic Substances Control Act (TSCA). Supporters had high hopes for TSCA when it was passed over thirty years ago. It was designed to fix regulatory gaps in other statutes. Russell Train, then head of EPA, called it “one of the most important pieces of preventive medicine ever passed.” Yet the statute is almost universally considered a failure.
What went wrong? To begin with, the task was enormous. There were 62,000 chemicals on the market thirty years ago, and 22,000 more today. This has proved far too many for EPA to keep up with.
EPA has issued formal rules or entered into agreements requiring testing for only about 200 of these, and has performed informal internal reviews for about 1200. A drop in the bucket, in other words. There are currently over 200 in mass production for which EPA has obtained no information whatsoever.
Risk reduction efforts have been similarly limited. As of five years ago, EPA’s new chemical review resulted in some action to reduce risk for only about 10% of the chemicals. The manufacturer is not required to do any tests before submitting a chemical for approval (only about 15% of submissions include health tests), so EPA relies on screening models to try to identify the riskier ones. Even if EPA finds that a chemical is risky, it may still be unable to take action because courts have imposed onerous cost-benefit barriers.
Obviously, TSCA is badly in need of reform. Later this week, I will return to this topic with a discussion of another paper, which contains important proposals for fixing the situation.