“When research should be ignored….”

Gays, bias and phony science
New York Post 1 December 2016
Family First Comment: Here’s a classic example of why you should always be wary of social science research by activists…
“In a paper just published in the peer-reviewed journal Social Science & Medicine, Regnerus examines a widely-publicized 2014 study by Mark Hatzenbuehler of Columbia University, which found that living in an “anti-gay” community reduces the life expectancy of sexual minorities by 12 years—more than a pack-a-day smoker’s habit reduces his lifespan. After ten attempts using the same data, Regnerus was unable to replicate Hatzenbuehler’s results. Despite ample opportunity to respond, Hatzenbuehler, who is an editor of Social Science & Medicine, has not yet done so, and he may not feel the need. As Riley observes, “certain conclusions are simply more acceptable than others,” and neither the academic establishment nor the media that covered Hatzenbuehler’s study may want to hear that his work is questionable.”
Ooops!
#JunkScience #DontLetTheFactsGetInTheWayOfAgendas

The headlines were unsparing and unambiguous. “Anti-gay Stigma Shortens Lives,” wrote US News & World Report.

“Anti-Gay Communities Linked to Shorter Lives,” said Reuters. “LGB Individuals Living in Anti-Gay Communities Die Early,” according to Science Daily.

Two years ago, these stories were hard to ignore when Columbia professor Mark Hatzenbuehler found that gays and lesbians who faced prejudice in their communities had a life expectancy 12 years shorter than those who lived in more accepting areas. Just so we’re clear, that’s bigger than the lifespan gap between regular smokers and nonsmokers.

We always knew prejudice was bad, but an Ivy League researcher had found that there were significant effects on the physical health of those experiencing it.

But where, one might wonder, were the headlines when another researcher tried to replicate Hatzenbuehler’s effects and came up empty?

Last month, Mark Regnerus, a professor at UT Austin, published an article in the journal Social Science & Medicine that concluded that “ten different approaches to multiple imputation of missing data yielded none in which the effect of structural stigma on the mortality of sexual minorities was statistically significant.”

In other words, Regnerus tried 10 ways from Sunday to get the same results as Hatzenbuehler using the exact same data, but failed. Which means, he concluded, that “the original study’s . . . variable (and hence its key result) is so sensitive to subjective measurement decisions as to be rendered unreliable.”

Oops. In case you missed it, there has been a “crisis of replication” in the social sciences, or at least one has only recently come to light. In 2015, a large initiative called The Reproducibility Project, led by Brian Nosek at the University of Virginia, repeated 100 published psychological experiments and replicated the results of only a third of them.

While no academic journal or media outlet has made a peep about Hatzenbuehler in the weeks since Regnerus’ article was published, Regnerus was subjected to immediate public excoriation for his findings in a 2012 paper on the effects of same-sex parenting on children, which ran contrary to accepted academic opinion on the subject. Despite calls for his firing, the University of Texas found no wrongdoing. Critics disagreed with his methodology, but there was no mystery about how he arrived at his conclusions.

The difference between Regnerus and Hatzenbuehler is obvious. Certain conclusions are simply more acceptable than others.

Take, for instance, the fraudulent 2014 study in which UCLA’s Michael LaCour was found to have fabricated segments of his data on shifting attitudes toward same-sex marriage out of whole cloth. LaCour suggested that if someone knocking on doors and asking people about their attitudes toward gay marriage revealed that he himself was gay, that would dramatically change the answers of the respondent.

For years, LaCour’s research was cited by major media and used in political campaigns, but it turned out to be what New York magazine’s Jesse Singal called “one of the biggest scientific frauds in recent memory.”

In the end, neither LaCour nor Hatzenbuehler actually did the work to prove their theses — because there would be no real consequences if they were caught, and anyway academia writ large didn’t want to “catch” them at all. Facts be damned. Academics care only about the “narrative.”
