May 16, 2007

Critiquing Scientific Studies, for the Layperson

Great article in Newsweek about the potential pitfalls of using scientific research to make a point. Sharon Begley details some of the ways research methodology can skew study results. Study design, Begley argues, can lead to results that contradict those of a similar study conducted with different methodology.

For instance, a recent government-funded study evaluated the effect of abstinence-only programs on teens' rates of sexual activity and found that "kids in abstinence-only 'were no more likely to abstain from sex than their control group counterparts ... [both] had similar numbers of sexual partners and had initiated sex' at the same age."

This is very different, of course, from the studies publicized by social conservatives suggesting that abstinence programs do reduce teen sexual activity. Begley points out problems with some of the studies cited by social conservatives, such as:

"Many [studies] evaluated programs where kids take a virginity pledge. But kids who choose to pledge are arguably different from kids who spurn the very idea. 'There's potentially a huge selection issue,' says Christopher Trenholm of Mathematica Policy Research, which did the abstinence study for the government. 'It could lead to an upward bias on effectiveness.' "

Begley cites several other examples of study bias leading to what she calls "bad science" and, at best, to conflicting and confusing results. It's a good start -- especially for those of us trying to teach students about scientific research. But I wish it were a bit more, well, scholarly and meaty.

* Begley, Sharon. "Just Say No--To Bad Science." Newsweek, May 7, 2007, vol. 149, issue 19, p. 57.
* Trenholm, Christopher, Barbara Devaney, Ken Forston, Lisa Quay, Justin Wheeler, and Melissa Clark. Evaluation of Abstinence Education Programs Funded Under Title V, Section 510. Mathematica Policy Research, Inc., April 2007. Available online.

1 comment:

kate said...

A huge part of information literacy is actually scientific literacy. We don't need to understand the science, but knowing how a good study is conducted is hugely important.

It doesn't help that reporters don't get it, either. So news stories leap to conclusions without stopping to say "hmm... sometimes correlation doesn't equal causation."

I agree that the article could have been a little more substantial, but honestly, I'm happy to see anything that tries to bring even a teeny bit of awareness about this to the fore.
/pet peeve rant