January 30, 2005
On Friday, January 28th, Michael Crichton, MD, spoke at the American Enterprise Institute on Science Policy in the 21st Century. His latest book, “State of Fear,” is the product of three years of examining environmental research, activism, and policy. Dr. Crichton, a graduate of Harvard Medical School and a best-selling author with more than 100 million copies sold worldwide, addressed the audience as an informed citizen concerned with how politicized research results in distorted conclusions and misguided government policy. (During the Q & A, a wag asked Crichton whether his talk was a partisan tool for the Bush administration. He replied that he “follows the data,” and if his conclusions are the same as those of the Bush administration, it is a coincidence.)
It should be cause for alarm, according to Crichton, that science policy is being shaped by lawsuits rather than by negotiation and legislation grounded in sound, unbiased scientific research. The first of six questions he posed to stimulate thinking on this situation was, “How do we obtain good (i.e., unbiased) information?”
He proposed two approaches: the “FDR tactic” and the “FDA tactic.” The first refers to that President’s habit of letting proponents of differing ideas “fight it out” in front of him, allowing him to recognize the more robust of the contenders. Crichton laments that we presently lack a suitable forum for such debate. The second approach would involve true “double-blind” testing of research, along the lines required of new products seeking approval from the Food & Drug Administration. As an aside, he suggests that the results of publicly funded research should be available to the public on the Internet.
Considering that we are a society deeply dependent on information, Crichton is surprised that we are slow to think of information as a ‘product’ and foresees product-liability lawsuits in the near future concerning flawed information.
“How do we set policy under uncertainty?” He cited the example of the recent tightening of the ‘safe’ level of arsenic, despite the absence of decisive evidence that the new standard will bring about a significant improvement in public health. He recommends tying policy to research; in the case of arsenic, that would mean establishing very long-term studies at a fraction of the cost of implementing the proposed, untested standards.
Having thus invoked the precautionary principle, he turned to the question, “When to prevent? When to adapt?” Noting that prevention tends to cost more and favor elites, while adaptation costs less and favors the average person, he recommends adaptation, both as a “coping mechanism” as well as a “policy predicate.”
His own answer to the question “How should we promote desirable technology?” was a startling, “We shouldn’t.” Briefly relating California’s intervention in the automobile business – the botched attempt to mandate a percentage of electric vehicles in that state – he concluded that we should specify outcomes, not procedures. The challenge, after all, was smog abatement, not any particular way of achieving it.
“How do we regulate a knowledge society?” Noting that even the infrastructure-intensive nuclear genie was “out of the bottle” and that bio-tech research could be effectively conducted in a garage, Crichton concluded that “we can’t.” Again preferring outcomes over procedures, he suggests that criminalizing certain outcomes might be our only practical way to ‘regulate’.
“Can we manage complex natural systems?” Surprisingly, he sees reason for optimism, despite citing Alston Chase’s history of 100 years of botched ‘management’ of Yellowstone Park. Disposing quickly of the myth of the “balance of nature,” he moved on to the German psychologist Dietrich Dörner’s computer-model studies of complex systems, which were sent out to various scientists for 10-year management experiments. Some simulated systems broke down while others thrived. The determining factor, apparently, was that in the successful experiments the system was observed for a longer initial period and interventions were small and infrequent, gradually developing into a near-constant fine-tuning. The failures, on the other hand, were characterized by an a priori assumption of what was ‘good,’ followed immediately by numerous major interventions, swift system collapse, and a reluctance to acknowledge responsibility for the failure. Crichton concludes that attention to reality rather than blind ideology is the key to a successful management strategy.
It doesn’t matter whether we are conservative or liberal; we are naive to assume that regulatory policy, at home or abroad, is based on unbiased, ‘pure’ science. For generations, so-called “scientific studies” have been used as partisan tools to further ideological agendas. As a contemporary illustration of his point, Crichton cited that day’s issue of the Washington Post which, on its front page, noted the anniversary of the liberation of Auschwitz, a reminder of where politicized science can lead. Yet in the same issue, the editors, apparently deaf to history, urged President Bush to get with the Global Warming program…for political reasons!