Jon Noble is a Clinical Scientist and PhD student who lives in London and works at Guy’s and St Thomas’ NHS Foundation Trust. He’s just getting back into CrossFit here in Vauxhall after a shoulder injury over the last year.
After a recent conversation on a topic that Jon clearly knows a great deal about, I asked him to write this article for me.
I am a physicist and my life is dominated by logic, the scientific method and evidence-based practice in a clinical setting. I am aware that this is an unusual start to a CrossFit article but I promise not to discuss particle physics.
Many of you have read news reports, blogs, magazines, or research papers reporting a new exercise regime, diet, or medical discovery. The question is: how reliable is the reported information?
I have wanted to be able to do the splits since I was a child, probably from watching too many Jean Claude Van Damme movies. I recently decided it was time to act. I really want to be able to do the box splits. Like anyone else, I want to achieve the largest gains in the shortest amount of time. I searched the internet for ‘how long does it take to learn the splits’ and ‘best stretch for learning the splits’. Unsurprisingly I found a lot of opinion and very little evidence.
My favourite search result was from a pole dancer who suggested that stretching twice a day for three weeks would achieve the splits easily. This is a great example of a common error: over-generalisation. Just because one girl, who may have been only six inches away from the splits to begin with, achieved the splits in three weeks does not necessarily mean that a 27-year-old man who could not touch his toes can do the same. Although I wish it were true! So I searched the research literature for flexibility studies. This provided some much-needed evidence to structure my stretching plan around, although some studies were poorly designed and potentially misleading.
Therefore, I wanted to share with you the common problems in science reporting so that you can make your own informed decisions when investigating nutritional supplements, exercise routines, or stretching methods.
Some of the most common problems in reporting research are:
1. Over-generalisation and extrapolation of results
This is the problem with my pole dancer’s advice. She believed her experience and results could be extended to everyone else. Sometimes research may be performed on animals, often mice, and results in mice do not always translate to humans.
2. Absolute and relative percentages
Suppose the risk of a particular injury whilst skiing is 2 in 500,000, and a new knee brace is discovered that reduces the occurrence of the injury to 1 in 500,000. This is an absolute improvement of 0.0002%, which is not big news. Using relative percentages, however, the same result can be reported as a new knee brace that reduces the risk of injury by 50%. This is clearly misleading, but it is a common problem.
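The arithmetic behind the skiing example can be sketched in a few lines of Python. The figures are the article’s hypothetical numbers, and the function name is my own invention:

```python
def risk_reductions(old_cases, new_cases, population):
    """Return (absolute, relative) risk reduction, both as percentages."""
    old_risk = old_cases / population
    new_risk = new_cases / population
    absolute = (old_risk - new_risk) * 100             # percentage-point drop
    relative = (old_risk - new_risk) / old_risk * 100  # drop relative to old risk
    return absolute, relative

# The knee-brace example: 2 in 500,000 reduced to 1 in 500,000
absolute, relative = risk_reductions(2, 1, 500_000)
print(f"Absolute reduction: {absolute:.4f}%")  # 0.0002%
print(f"Relative reduction: {relative:.0f}%")  # 50%
```

Both numbers describe the same knee brace; which one a headline quotes makes all the difference.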
3. Study design
The testing protocol goes like this: formulate a question, develop a hypothesis, test it, and analyse the results.
Problems can arise during the testing part. It may be that the study wasn’t designed very well or that important factors were not considered. For a fair test to take place you must control all of the variables and conditions, and the subjects should be as similar as possible. This becomes very difficult and costly.
As an example, imagine conducting a study regarding front squat 1RM and one subject has trained consistently for the last five or ten years and another has never seen the inside of a gym before this experiment. One performs a full depth squat, and the other a partial squat. One is 18 years old, the other 43. One does his front squats first thing in the morning before having eaten anything that day, and the other late in the evening. One had a knee operation last year… you get the idea.
4. Conflict of interest
Always check for conflicts of interest, as these can bias the study design, the data, and the interpretation of the results. Negative experimental results often go unpublished, either because they are deemed ‘uninteresting’ or because the researcher or research sponsor has a vested interest: positive findings will result in larger sales of their product.
5. Correlation and causality
Just because two factors are correlated does not necessarily mean that there is a causal relationship between them. For example, imagine a study found that wearing pink was negatively correlated with squat 1RM. This of course does not mean that wearing pink will make you weaker: more women than men wear pink, and sex is causally related to strength.
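The pink-shirt example can be illustrated with a toy simulation. All of the numbers below are invented for illustration: sex drives both shirt colour and strength, while the shirt itself does nothing, yet pink-wearers still average a lower squat:

```python
import random

random.seed(0)

# Invented probabilities and 1RM figures, purely for illustration.
rows = []
for _ in range(1000):
    is_female = random.random() < 0.5
    wears_pink = random.random() < (0.6 if is_female else 0.1)
    squat_1rm = random.gauss(60 if is_female else 100, 10)  # kg, hypothetical
    rows.append((wears_pink, squat_1rm))

pink = [s for p, s in rows if p]
not_pink = [s for p, s in rows if not p]
avg_pink = sum(pink) / len(pink)
avg_not_pink = sum(not_pink) / len(not_pink)
print(avg_pink < avg_not_pink)  # True: pink 'correlates' with weaker squats
```

The correlation appears because sex is a confounder, not because the shirt has any effect — exactly the trap the example warns against.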
6. Check references
If an article references a research paper, you should read the referenced paper where possible. It is easy to oversimplify someone else’s work to provide evidence to support your opinion. Also, scientists make mistakes: we might have misquoted or misunderstood the research in question.
These problems appear not only in the media, but also within the scientific publications where the original findings are reported. If we go back to the pole dancing blog writer from earlier, we can apply these six principles to her advice. She has generalised her own experience, and possibly that of a few friends, to the entire population, irrespective of age or sex, and we do not know her age or how flexible she was before she started the training plan. Imagine also that she recommended I buy some device that would help me stretch and was being paid to do so, or that what she was telling me wasn’t based on her own experiences but that she’d misinterpreted the experience of someone else.
Therefore, I urge you to consider these points as it could save you time, money, and effort, and help get you the results you are looking for.
Just remember: are there problems with the study design? Can the results be applied to the general population, or someone of your age and gender (or even species)? And are there any conflicts of interest?
The more critically you can think, the more informed your decisions will be.