I’ve got an interesting little book to recommend to you. And before I go any further, in the spirit of full disclosure, you should know I got this copy free – no strings attached; don’t have to say anything good; don’t have to say anything bad; don’t have to say anything at all – but free, in hopes I would write something. And, with that out of the way, I want to recommend you read this book.
My recommendation rests on something a little different from what the authors and the publisher might expect; my recommendation is not because of the thesis of the book (though that is of interest, also) nor on how well the thesis is explained (though there is good documentation) nor necessarily on how well written the book is (although it is pretty well written). Rather, as an auditor, I found its exploration of flaws in logic and investigative techniques worth the read in itself.
The book is Merchants of Doubt by Naomi Oreskes and Erik M. Conway, and it lays out the recent history of scientific studies and the ways they have been attacked and eroded. Starting with the 1950s and 1960s and the arguments against the conclusion that smoking kills, and winding up smack in the middle of the global warming debate, the book explores how the battle against these studies has been waged.
Now, before you start ranting about the fact or fiction of global warming, that is not the reason I am suggesting this book. (Remember that paragraph above; the one where I mentioned I was not recommending the book because of its thesis?) What the auditor in me found fascinating was that each chapter contains intriguing examples of how some of the arguments (pro and con) struggled under closer scrutiny. These are examples the discerning auditor will recognize as applying to the development of his or her tests, findings, and opinions.
1) In the discussion of strategic defense, one panel noted the Soviets had spent large sums of money on a certain antisubmarine system. When there was no evidence the system was deployed, the panel determined that the Soviets had been successful and then covered it up. In other words, evidence that a particular capability had not been achieved was taken as proof that it had. (The book includes a great quote from C. S. Lewis at this point: "A belief in invisible cats cannot be logically disproved," although it does "tell us a good deal about those who hold it.")
2) In an example of "never play your hand too soon," a scientific report on ozone depletion was rejected by the peer review panel of a national scientific magazine. Perfectly fine; the scientists began revisions based on the comments. However, the preliminary draft was leaked, and the message that got transmitted was the incorrect one, effectively sabotaging the actual results.
3) In rejecting the work done by Rachel Carson in her book Silent Spring, critics attacked the writing style, saying "the book was impassioned, rather than balanced, and read as if written by a prosecutor." By attacking the style rather than the substance, they dismissed the underlying realities. (An excellent example of how perception of an audit report's intent is as important as the content itself.)
4) An interesting exploration into the effect of Type 1 and Type 2 errors (yes, I'm serious, it was really interesting) on the debate regarding the dangers of secondhand smoke. (A Type 1 error is a false positive, finding a danger that is not there; a Type 2 error is a false negative, missing a danger that is.)
On the surface, the book is a good discussion of how scientific research is used and misused in some of the most important debates still occurring today. However, auditors will also find value in the subtext about how conclusions can be skewed and messages can be misinterpreted.