
Co-Vary Or Die

By Neuroskeptic | April 5, 2012 7:03 AM


I've just come across a striking example of why correcting for confounding variables in statistics might not sound exciting, but can be a matter of life and death.

Imagine you're a doctor or researcher working with HIV/AIDS. You're taking a sample of blood from an HIV+ patient when you slip and, to your horror, jab yourself with the bloodied needle. What do you do?

In a 1997 study, researchers Cardo et al studied hundreds of cases of this kind of accidental HIV exposure ("needlestick injuries") in medical and scientific workers. They wanted to find differences between the people who contracted the virus, and the ones who didn't.

One factor they considered was post-exposure prophylaxis - taking HIV drugs as soon as possible after a suspected exposure. Now these drugs were still pretty new in 1997, and it wasn't clear how well they prevented infection, as opposed to just delaying symptoms. Many people with needlestick injuries were offered a course of drugs - but did they work?

Cardo et al's raw data showed no significant benefit:

By univariate analysis, there was no significant difference between case patients and controls in the use of zidovudine [AZT, the first HIV drug] after exposure.

But it turned out that this was due to confounding variables. When they corrected for other factors...

Infected case patients were significantly less likely to have taken zidovudine than uninfected controls (odds ratio 0.19, P=0.003). This is a classic example of confounding, since the adjusted odds ratio differed from the crude odds ratio (0.7) because zidovudine use was more likely among both case patients and controls after exposure characterized by one or more of the four risk factors in the model.

So while, in the raw data, people who took zidovudine were about as likely to catch HIV as those who didn't, that was because they had also tended to be more severely exposed to the virus in the first place, e.g. to a greater quantity of blood, or through a deeper wound. People were more likely to decide to take the drug after a severe exposure. Once severity was taken into account, zidovudine actually dramatically reduced the risk.
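The arithmetic of this kind of confounding can be sketched with made-up numbers (these are purely illustrative assumptions, not Cardo et al's data): if severe exposures both raise the risk of infection and make people more likely to take the drug, the crude odds ratio drifts toward 1 even though the drug works within every severity stratum.

```python
def odds(p):
    return p / (1 - p)

# All parameters below are hypothetical, chosen only to illustrate the effect:
p_severe = 0.5                              # fraction of exposures that are severe
p_treat = {"mild": 0.2, "severe": 0.8}      # treatment is likelier after severe exposure
p_infect_untreated = {"mild": 0.05, "severe": 0.4}
TRUE_OR = 0.2                               # the drug cuts the odds of infection 5-fold

# Expected (infected, uninfected) counts in a cohort of N exposures,
# broken down by severity stratum and treatment status.
N = 100_000
cells = {}
for sev, p_sev in [("mild", 1 - p_severe), ("severe", p_severe)]:
    for treated in (False, True):
        n = N * p_sev * (p_treat[sev] if treated else 1 - p_treat[sev])
        o = odds(p_infect_untreated[sev]) * (TRUE_OR if treated else 1.0)
        p_inf = o / (1 + o)
        cells[(sev, treated)] = (n * p_inf, n * (1 - p_inf))

def crude_or():
    """Odds ratio ignoring severity (the 'univariate' analysis)."""
    inf_t = sum(inf for (s, t), (inf, uninf) in cells.items() if t)
    uninf_t = sum(uninf for (s, t), (inf, uninf) in cells.items() if t)
    inf_u = sum(inf for (s, t), (inf, uninf) in cells.items() if not t)
    uninf_u = sum(uninf for (s, t), (inf, uninf) in cells.items() if not t)
    return (inf_t / uninf_t) / (inf_u / uninf_u)

def stratum_or(sev):
    """Odds ratio within one severity stratum (recovers the true effect)."""
    inf_t, uninf_t = cells[(sev, True)]
    inf_u, uninf_u = cells[(sev, False)]
    return (inf_t / uninf_t) / (inf_u / uninf_u)

print(f"crude OR:  {crude_or():.2f}")           # ~0.78: looks like little effect
print(f"mild OR:   {stratum_or('mild'):.2f}")   # 0.20
print(f"severe OR: {stratum_or('severe'):.2f}") # 0.20
```

With these numbers the crude odds ratio comes out around 0.78 while the within-stratum odds ratio is 0.2, roughly mirroring the gap between the crude (0.7) and adjusted (0.19) estimates in the actual study; adjusting for severity (by stratification here, by a multivariable model in the paper) is what reveals the drug's benefit.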

Post-exposure prophylaxis has since become standard procedure, and it has undoubtedly saved many lives. Without statistical correction, it might have taken longer for people to see the benefits.

In summary, I guess what I'm saying is, remember to correct for confounds - or die.


Cardo DM, Culver DH, Ciesielski CA, Srivastava PU, Marcus R, Abiteboul D, Heptonstall J, Ippolito G, Lot F, McKibben PS, and Bell DM (1997). A case-control study of HIV seroconversion in health care workers after percutaneous exposure. Centers for Disease Control and Prevention Needlestick Surveillance Group. The New England journal of medicine, 337 (21), 1485-90 PMID: 9366579
