BOSTON U. SESSION DISCUSSES PITFALLS IN EPIDEMIOLOGY

by Ellen Ruppel Shell


Remember "Sleeper," the 1973 Woody Allen classic in which a health food salesman stumbles out of suspended animation in the year 2173, hungry for health food? A bewildered doctor asks a colleague how even so primitive a creature could favor wheat germ and tiger's milk over steak and hot fudge. His colleague replies that at that time "those (foods) were thought to be unhealthy, precisely the opposite of what we now know to be true." Ah, yes, the ebb and flow of dietary recommendations based on "scientific fact."

As Gary Taubes wrote in the July 14 issue of Science, "The news about health risks comes thick and fast these days, and it seems almost constitutionally contradictory." Taubes lays the blame for this squarely at the feet of epidemiology, which, he says, has in numerous cases outlived its usefulness as a predictor of health effects. Meanwhile, some scientists complain that journalists have entered into an "unholy alliance" with epidemiologists, seizing on pilot and preliminary studies and reporting them as "breakthroughs."

Epidemiology, after all, is accessible, dealing not with abstractions but with people and the way we actually live. Epidemiological studies give credence to cocktail-party notions: the association between booze and breast cancer, coffee and high blood pressure, sugar and hyperactivity, electromagnetic fields and leukemia. This is stuff that almost anyone can relate to, and journalists know it, particularly science and medical journalists. So when an epidemiological study comes out suggesting that women of average weight die earlier than do their thinner counterparts (as happened in the New England Journal of Medicine (NEJM) earlier this year), we throw common sense to the wind and go with it, knowing full well that another, contradictory, study is just around the corner.
The fact that thousands of women are frightened and confused by such nonsense seems to count for much less than the scientist's and the journalist's shared lust for front-page real estate. In September, the Science Journalism Program at Boston University attempted to cast a little light on this issue by sponsoring a symposium on epidemiology and the media. The meeting brought together 22 journalists and a handful of scientists to try to reach some useful agreement on the purpose and limitations of the epidemiological approach.

Dr. Charles H. Hennekens, a professor of epidemiology at Harvard's School of Public Health, got the meeting off to a heady start by pronouncing that science and the press are in "collusion to confuse the public." Hennekens confessed that epidemiology is a "crude and inexact science" and that "we tend to overstate findings, either because we want attention or more grant money." He also warned journalists to "beware of scientists bearing press releases rather than peer-reviewed manuscripts." Fair enough. But it seemed to many journalists present that the good doctor was being a bit disingenuous. After all, the peer-reviewed NEJM is often first out of the blocks when it comes to headlining questionable epidemiological findings: that little article about the longevity of skinny women, for instance.

NEJM executive editor Dr. Marcia Angell acknowledged that even the Journal is not perfect, adding that it is not up to scientists, or publications geared toward scientists, to do the journalists' job of digging out the truth of the matter. "The most important job (of the scientific journal) is to publish the best research and to be aware that this is being published for other scientists primarily," she said. Dr. Angell added that journalists need to become more skeptical of what they read in journals, and that the public needs to become more skeptical of press reports.
Ultimately, both scientists and journalists agreed that reading the methods section of a paper, as well as its conclusions, was critical to good reporting. "The methods section may not make for sexy headlines," Hennekens said, "but it is where the science game is won or lost."

Dr. David Allison, a statistician and specialist in obesity at Columbia University's College of Physicians and Surgeons, warned journalists not to treat journal articles as authoritative sources. "If we are writing it in a journal today, it means we are confused about it: We're all working on a puzzle and have stumbled upon a piece. You need to look at the methods of the study: was the study done in a way to lead to valid inferences?" For example, the NEJM report linking moderate weight gain in middle-aged women with shorter life was based on a 16-year longitudinal study of 115,000 female nurses in Boston, not a random slice of the American population. It's possible that a confounder is afoot: that, for example, female nurses gain weight by eating frosted doughnuts on the night shift, and that it is the doughnuts, not the few extra pounds, that cut their lives short. Then again, it is possible that the association between life span and weight gain was grossly overstated, given that its author, Dr. JoAnn E. Manson, serves as a consultant to two diet-pill companies.

Not that the "money trail" is the only path to conflict of interest. "Sometimes dogma is an even more powerful force than dollars when it comes to science," said Dr. Allison. Epidemiologists and other scientists who have a professional stake in a theory are more likely to keep pushing it, even when the evidence suggests they should do otherwise. The pressure to publish compounds this effect: scientists know that a paper is more likely to be accepted for publication if it reports a positive finding, so they can't help but hope to find one. It is unlikely that a study showing no association between moderate weight gain and longevity would have found its way into the prestigious and powerful New England Journal, let alone onto the front page of the New York Times.

Dr. Tim Byers, a professor of preventive medicine at the University of Colorado's School of Medicine, suggested that the public might be better served if scientific information were filtered through a formal scientific clearinghouse, a panel of pre-selected experts who agreed to be "on call" to journalists. This idea sounded good to Kim Pierce of the Dallas Morning News, who explained that she often has less than two hours to write a breaking story and needs all the help she can get. But the suggestion outraged several other journalists, notably
