“Exciting new evidence proves that eating capers three times a day will CURE bad morning breath!”
Even those with minimal education would spot this fictional headline for the crock that it is… unless, of course, you’ve never tried capers, which taste like the smell of rubber tires, in my opinion. Unfortunately, not all bogus “evidence” is quite as easy to spot, especially in an age when the media has learned to wrap not-so-great science in shiny paper with all the bells and whistles. Quality journals closely vet the data researchers present and the claims and implications they make; however, a myriad of outside parties will twist and extrapolate the information to further their own agendas.
It takes years to reach true mastery in critiquing scientific research, and I am far from that point. But I can offer some beginner-level tips for making your way through the swamp of health-related information (and misinformation) being thrown your way. Here are some questions to ask yourself when an article touting the latest “scientifically backed” trend or claim comes up on your screen:
-Do the authors use words like “proves,” “cure,” “abolish,” “fool-proof,” or “miracle”? As a general rule, if I see this or similar verbiage in an article headline, I don’t even bother to click. And “fool-proof”? Really? If a treatment were truly fool-proof, it would work even if you forgot to take it, and I have yet to hear of a bottle of pills that can exert any kind of effect from your medicine cabinet.
-Do they cite the research? To me, this seems like a no-brainer. I have read many website articles promoting a food, nutrient, or specific product with reference to “a new study” or “extensive research” supporting their claim. That’s great and all, but if there is no hyperlink or citation that allows me to go directly to the source, i.e., the scientific article(s) published in a peer-reviewed journal, then I am moving on without a second thought. They either failed to do their due diligence and provide a reference (and give credit to the researchers!), or they don’t want the reader to see what the evidence REALLY says. It’s one thing to discuss well-established and widely accepted science without a citation; I don’t need to scrutinize the findings that tell me it is important to drink plenty of water or that vitamin C is vital to the immune system. But if a claim is new, if it is novel, or if it is very specific to one food or product, citations are crucial.
-Is there more than one study supporting their claim? It is one thing to showcase a prestigious study and rave about exciting results, but another thing entirely to endorse a lifestyle change (e.g., inclusion of a new product in your health regimen) based on a lone study. The truth is, studies can be BAD. Have you heard of a confounding factor? A confounding factor is “a variable that influences both the dependent variable and independent variable causing a spurious association” (thanks, Wikipedia). What this means is that the subjects who ate capers thrice a day may have immediately brushed their teeth afterwards to rid themselves of the nastiness (in case it is not clear, I’m not a caper fan). If so, it was likely not the capers that “cured” bad morning breath, but the teeth brushing, aka the confounding factor. Good researchers do everything in their power to avoid confounding factors, but this doesn’t always work out, especially when you are dealing with human subjects. This is why it is important to have multiple studies by different research groups, preferably on different populations, to confirm the validity of an association.
These tips only begin to scratch the surface, but I believe that, employed correctly, they may help you weed out some of the more egregious articles. I applaud those of you who are beginning your quest toward a fine-tuned critical surveillance of the research world! Happy Googling!
By: Danielle Ashley