Finding Credible Information in a Sea of Shit
As I continue my never-ending quest to gather more helpful information about health and fitness, I encounter one issue.
Information is rarely straightforward.
In an age where starting a business and selling a product or service is remarkably easy, being a qualified expert is no longer a prerequisite.
Furthermore, as consumer attention becomes the most valuable form of currency for business, the need to stand out and market shiny objects and magic pills is at an all-time high.
The product of these variables is an enormous surplus of information that can pull you in a million different directions.
If you've landed on this page and are reading this article, I can assume a couple of things.
First, you are someone who doesn't take everything you see and read at face value. You like to analyze and find the truth.
This is an outstanding quality that already sets you miles ahead.
Second, you want to learn more about interpreting information.
You are not someone who wishes to move through life blissfully ignorant.
You want answers.
Again, these are great qualities and ones that cannot be taught.
I tip my hat to you for being one of these people.
With that being said, my area of expertise is fitness.
In this article, I'll outline problems associated with searching the internet for answers to fitness-related questions.
And, of course, offer some tools to help you solve them.
I'll also emphasize research and the many ways it can both provide clarity and become weaponized.
Let's begin.
To fully understand this topic, we need to rewind and examine why health and fitness information seems so vast and contradictory.
You need to understand that the discipline of health and fitness is still very much in its infancy.
Gyms are not exactly new (in fact, the first recorded gym in history was apparently established in 3000 BC), but their prevalence and the large part they play in our lives are definitely new.
Your grandparents probably didn't go to Crunch or have an Equinox membership growing up.
I won't go through the entire history and evolution of gym culture, but it's crucial to appreciate how relatively new the science surrounding it is.
The way training and fitness have exploded into our everyday lives highlights why it doesn't feel like there are many concrete, indisputable pieces of information.
The second thing we need to appreciate is just how vast the category of fitness has become.
To you, fitness might mean going on a run.
To your neighbor, it might mean competing in CrossFit.
To my sister, it means running ultramarathons.
To your grandma, it might mean being able to go on walks and feed her pets.
In other words, the information can feel overwhelming because it lacks a lot of context.
This brings us to our third consideration.
The answers to what works and what doesn't in fitness are not binary.
Rarely can questions be answered with a simple "do this, not that."
I'm a coach.
When people find out I work in the fitness space, they ask questions like "What are your thoughts on ____?"
And almost always, I need to warn them that I have a TON of thoughts on ____.
But without more context, it's hard to give them an answer they'll appreciate.
Think about your car when the battery dies.
You know you need to jump the car but might not know how to do it.
So you find someone willing to give you a jump, and if neither of you is sure how to go about it, you do what any sensible person would do.
You Google it.
You'll get millions and millions of results from Google.
Endless articles and YouTube videos.
But want to know the nice part?
There's really only one way to jump a car.
And the variance in instruction you'll get is insanely small.
So sure, there are a thousand videos you can watch on how to do it.
But everyone will give you the same answer.
But imagine now that there are thousands of ways to jump a car.
And what might work for your car might not work for someone else's car.
So, you start running through the videos and articles, trying one thing after another.
Not only will you need to try a potentially large number of things, but when they don't work, you might not fully understand why.
Did it not work because this isn't the right method for my car? Or was I doing it wrong? Was there an important detail I missed?
This is fitness.
The answers to your fitness-related questions are rarely (if ever) straightforward.
There are many other points we can discuss here, but in the interest of time, I'll leave you with one more extremely important consideration.
Fitness is a billion-dollar industry.
In fact, Google tells me the global fitness market is valued at $96 billion and growing rapidly.
To put $96 billion into perspective, if I paid you a million dollars daily, it would still take you roughly 263 years to reach it.
In other words, there's a lot of money to be made and a lot of people want a piece of the action.
With more people trying to earn your attention, there are more and more sources of information, with almost no checks and balances for how accurate that information is.
Now at this point, you might be thinking: Bro, how the FUCK am I supposed to trust anything these days?
That is precisely what the rest of this article will focus on.
As we search the internet for information about our health and fitness, we're attempting to answer questions.
This is a form of the scientific method.
The scientific method essentially works like this: we ask a question we want answered, do some research around the topic, formulate a hypothesis based on our findings, predict what the outcome will be, and then test that hypothesis.
Heading to Google, Instagram, or YouTube to find the best exercises to grow your pecs is essentially this process.
You want to know how to grow your pecs, so you watch some YouTube videos, determine which exercises to perform, put them in your program for a few weeks on the assumption they'll grow your pecs, and then analyze your results.
This is a healthy process that will lead to a lot of learning and discovery.
But with all the factors we discussed earlier, how do we validate this research?
How do we know that the dude on YouTube is giving us solid advice?
What do we do when one Instagram influencer says the bench press will maximize pec growth, and another says it won't?
The first major step you can take to ensure you're getting the correct information is clarifying exactly what question you want answered.
This may sound like a "no shit" step, but let me illustrate.
Let's say your primary goal is growing your pecs.
But what you search in Google is "best pec exercises."
"Best" is a very contextual term.
You may get results for the best exercises to grow your pecs, but you might also get the best exercises to strengthen your pecs.
While they overlap, training for strength and muscle growth are not the same, and different exercises may be better suited depending on your goal.
So before diving into what the best exercises are, try to get crystal clear on what exact question you want answered.
This will allow you to start from a much better position.
The next major consideration is going to be your sources.
I don't expect any of you reading this to dive deeply into the peer-reviewed research of pec hypertrophy (and doing so can carry its own issues, as we'll see later).
But sources don't always have to mean they come directly from the scientists studying it.
Sources, even in places like YouTube and Instagram, can have varying levels of credibility.
Ideally, you're looking for people who have extensive experience coaching other people, demonstrate an ability to apply the knowledge themselves, and have credentials to support their status.
It's true that credentials don't always mean credibility. There are some highly credentialed coaches out there who are only interested in selling their own systems or ideas. However, credentials are often a great place to start to determine, at the very least, if this person has the formal education necessary to make the claims they're making.
For example, I have a bachelor's in Applied Exercise Science from Springfield College (informally known as The College of Coaches), I'm a Certified Strength and Conditioning Specialist through the National Strength and Conditioning Association, and I'm a USA Weightlifting Level 2 coach.
So, without any additional knowledge, it's fair to assume that I'm more qualified than someone without those credentials.
Again, there are exceptions, but this is a great place to start.
I also emphasize having experience coaching others because there are a lot of people out there giving fitness advice who have only ever trained themselves.
This is not inherently bad, but if someone lacks formal education and has only ever trained themselves, they are essentially experts in their own body.
Not yours or anyone else's.
Another way to vet your sources is based on particular character traits.
The main one I look for is dogmatic tendencies.
It's just a harsh truth that selling a solution is easier when you have a boogeyman.
People love to take sides, whether consciously or unconsciously.
Unfortunately, having firm conviction doesn't mean you're right.
One of my favorite examples is the highly credentialed, well-educated folks who swear by carnivore diets.
Based on my original advice about credentials and experience, some of these folks seem to know exactly what they're talking about.
However, their intense demonization of fruits and vegetables should be a red flag.
Not because it goes against conventional wisdom but because it leaves no room for context or interpretation.
Let's walk through a hypothetical.
It's very well documented at this point that being obese leads to many health issues.
So, let's say Phil's doctor has determined that his body fat percentage is leading to several potential health issues, such as pre-diabetes.
Phil tries several restrictive diets and sees no results because he's unable to stick to his plan and reduce his calories effectively.
Eventually, he tries a carnivore diet.
After a month or two, his weight has gone down, and his doctor is happy to see his health improving.
The carnivore fanatics on the internet will use Phil as a prime example of how eating fruits and vegetables (like Phil previously did) causes health issues and how eating a strictly carnivore diet made him healthy.
This dogmatic perspective undercuts any ability to investigate WHY this happened.
When we analyze the situation, we can observe that Phil's health improved because he lost body fat, not because he stopped eating fruits and vegetables.
Phil simply found a diet that allowed him to remain in a caloric deficit, something he had not managed before.
So when sources of information create big bad boogeymen, like carbs, fruits, or vegetables, they're trying to navigate around the mechanisms that actually create change.
Usually, the end goal is to sell a product or increase their notoriety.
This should be a major red flag in their credibility.
Note: There is a difference between someone explaining what THEY do and what they think EVERYONE should do. It's not a hit on someone's credibility when they explain that they avoid high-glycemic carbs, but it's a red flag when they try to persuade everyone that high-glycemic carbs are terrible for your health.
A major green flag in credibility is someone who can comfortably analyze both sides of a dispute or view a problem from various angles.
Suppose our buddy Phil explains what happened to a less dogmatic coach, and that coach points out that his improved health could be due to a variety of factors, analyzing how his activity levels, total caloric intake, and increased protein intake all contributed. That is a major green flag.
At the end of the day, everyone has biases. It's not reasonable to assume that just because someone has a bias (e.g., I personally am very against fad diets like keto and carnivore), their credibility is non-existent.
But the deeper someone is into one specific ideology, the less likely they are to give you complete and objective information.
I also mentioned an ability to demonstrate the knowledge they're sharing themselves.
This is in line with our inherent biases.
The "believe what we can observe" part of our brains.
For example, if you read two different blog posts on building muscle, and one is written by an extremely jacked dude and one is written by a dude who isn't, we typically are biased towards the more jacked fella because we can visually see some form of proof.
In recent years, there has been a major pushback on this notion.
This tends to be a hard line to walk.
On one hand, we should absolutely not disqualify someone's input because they aren't jacked or insanely strong.
But it's worth considering the claims they're making and what you see.
This is where context becomes extremely important.
What I mean by context is that it's useful to do some digging into what service or information someone provides and how the advice they're giving aligns with what you see.
Let's do another hypothetical.
Let's say you're on the internet looking for sources of information on growing your pecs.
Your goal is to get as jacked as humanly possible.
You stumble upon a blog post about chest training but realize the author has small pecs and an unimpressive physique.
Is this person's advice worth taking?
Well, we need to expand our search and figure out exactly what this person's area of expertise is.
Suppose they're someone who gives information mostly on mobility or shoulder health. In that case, you can piece together that the reason they don't seem to be walking the walk is that the walk they're walking isn't getting jacked as all hell.
They're walking the walk of mobility and shoulder health.
This doesn't mean that they're not totally justified in making a post about chest training, but they might be taking a perspective centered around health and mobility rather than getting jacked at all costs.
Having this new perspective, you can take in the information with an understanding of where their values live.
This brings us to the last point on credibility.
Expertise.
It's not uncommon for someone whose primary background is in powerlifting to share content about building muscle or gaining mobility.
But their expertise is in powerlifting.
This is to say, don't be afraid to designate categories for the sources from which you get your information.
I have certain people within the field whom I turn to for information on building muscle and others about eating healthy.
I may have others I look to for biomechanics, sports performance, mobility, etc.
It doesn't mean I won't take their advice on areas outside their expertise; it just means that if what they say contradicts someone who specializes in that category, I have a hierarchy I can consult to help create clarity.
This categorization of sources is a great tool for gaining a more complete picture of health and fitness.
Here's another example to illustrate this point.
I'm a huge NFL fan.
I love watching it all, but I never miss a Steelers game.
While I can share a lot of information about all of the NFL teams, I can give you far more about the Steelers.
They're my area of expertise within the NFL.
Another NFL fan might be an expert on the Vikings or the Patriots.
This means we will all have opinions on all of the teams, but we all have specific teams we pay far more attention to and are way more knowledgeable about.
This is how it often works in health and fitness.
Now, let's move on to some other considerations when finding information on the internet.
As the amount of fitness information on the internet has expanded, it's created a need to build credibility.
This is why, now more than ever, you see people cite studies in their Instagram posts, YouTube videos, and blogs.
On the surface, this seems like a great thing.
After all, if they can cite their sources, then they must be providing completely valid information.
Right?
Well…
Not exactly.
This usually comes as an unfortunate shock to many people, but just because someone cites sources in their content doesn't mean they are a valid source of information.
The reason is usually ignorance, but it can also be a little more devious.
Let's cover ignorance first.
When I say ignorance, I mean that most people are pretty bad at reading and interpreting research.
I bring this knowledge to you as someone who is also not the best at reading and interpreting research, and I try pretty hard to do so.
You see, research is not as cut and dried as many people believe.
Various types of research papers carry different weight.
For example, a meta-analysis or systematic review is typically regarded as a type of paper we can trust, while something like a narrative review is subject to much more scrutiny and leaves far more room for opinion.
So just because someone says, "A 2019 study found ____," it doesn't mean that those findings hold a lot of weight or that what they're saying is even accurate.
A personal example: I recently read Physiological and Physical Profile of Snowboarding: A Preliminary Review.
It went through the physical characteristics we assume make up a good snowboarder.
The problem is that this is a preliminary review.
This essentially means it's a paper that evaluates the current body of literature, mostly to identify gaps and areas that need more attention.
These tend to be less structured and aren't necessarily designed to draw conclusions on what we know but instead identify what it seems we don't know enough about.
Sure, there is definitely information that can be pulled from papers like this, but it would be misguided to draw hard conclusions.
Sometimes, a content creator's lack of understanding can cause them to misrepresent the data.
A while back, a former client sent me a post from a nutrition "expert."
They made a detailed post about how a keto diet led to various positive health outcomes.
Based on what I knew, the conclusions from the research that the content creator was sharing seemed way too strong.
If a paper had this much concrete evidence, we'd be hearing about this paper more.
So I read it.
And I'll admit that at the time, my own ability to read and interpret research methodically was average at best.
But as I made my way to the conclusion, it read clear as day (and I'm paraphrasing because this was years ago): "More research is needed to validate these findings."
After seeing that, I went back and looked at the sample size. I don't remember the exact number, but I remember it being comically small.
And piece by piece, it became abundantly clear that even the authors were not sold on their findings.
And again, I'm no expert.
However, the authors themselves clearly stated that the findings were not statistically significant and that more research was needed.
So, was this person devious about their content?
Possibly.
But more likely, they were just naive.
You see, once researchers get data, it doesn't end there.
There is a whole element of statistical analysis that needs to be done.
An unfortunately common practice is for someone to look at the data from a study and utilize it to support an opinion they already have.
They completely miss the rather important part where we validate the significance of that data and interpret the results objectively.
So when you have a bunch of content creators who are way better at editing videos than they are at reading research, you end up with a lot of extreme claims that generally misrepresent the actual findings.
We also need to be aware of certain limitations in research.
Things like sample size and demographics matter.
For example, women are horrendously underrepresented in sports and nutrition research.
I don't know why, but I do know it makes it harder to generalize findings across sexes.
We also can't generalize research done on trained 20-25-year-old males to untrained geriatric women.
Yet this happens a lot on the internet.
This whole topic could be its own book, so I'll leave it at this.
There is a LOT that goes into research and the data that it collects, so I feel confident saying that most people are not great at representing it well (take it from someone who is actively trying to improve in this area).
So, that covers ignorance.
But what about the more devious content?
Well, let's consider everything that we just mentioned.
With all of this nuance and the public's general lack of understanding of research, it becomes pretty easy to paint pictures in favor of whatever stance we are supporting.
You can see this when large companies use research to support their products.
Side note: if you want a funny example, most wearables (Garmin, Whoop, Apple Watch, etc.) claim that their research says their data is a bazillion times more accurate than their competitors.
But it happens on smaller scales as well.
Someone has an opinion, and instead of evaluating that opinion objectively, they go out and look for data until they find something that supports their stance.
Or they purposely misrepresent data to fit whatever narrative they're preaching.
Okay, let's take a deep breath.
I know I made it sound like anyone who uses research in their content is either a dumb asshole or a diabolical asshole, but that's not the take-home message (I promise).
All I'm trying to do is highlight that research does not equal credibility.
So that's cool.
But like, how do we know what research to trust or when to trust content creators who cite research to support their claims?
Well, the most straightforward way is to actually read the research yourself.
But a less painful way of doing it is to scrutinize the information they're presenting.
You can do this in several ways, combining them to get a better idea of the information's credibility.
The first step is everything listed above.
What are their credentials?
What is their area of expertise?
How are they presenting the information?
Are they dogmatic? Or do they offer multiple perspectives?
But when it comes to actually analyzing what they are citing, you can use a similar process.
How groundbreaking is this data?
Typically, if the research being cited seems too good to be true, then you're probably missing a lot of context.
Are they offering the full context?
Sometimes, findings might be from studies done on animals or with very small sample sizes, making them less generalizable or prone to outliers.
Do they have a clear understanding of what the findings mean?
Sometimes, people misrepresent findings simply because they don't fully understand them. This is when asking questions or seeking expanded viewpoints can be crucial.
Are there conflicts of interest?
Is this person trying to sell you something, and how does that align with the information they're providing?
Okay.
We've spent a lot of time discussing all of the ways we can scrutinize information on the internet, and frankly, it sounds exhausting.
The goal is not to make researching how to be healthier or get in better shape seem daunting.
The goal is to push you to ask more (and better) questions about the information you're getting to enhance your learning and do your best to get the most accurate information possible.
So, let's end on a positive note.
What are some ways we can learn more about health and fitness in a way that minimizes faulty information?
The first is to communicate with many individuals from different schools of thought.
If you walk into a CrossFit gym to meet fitness experts, you'll be exposed to a bias toward CrossFit.
If you read a bunch of nutrition articles written by a vegetarian, then you'll be exposed to more plant-based biases.
It's not wrong to get information from these sources; you just need to bring in other sources to balance out the biases and expose you to a greater range of ideas.
The next (and arguably best) thing you can do is do some self-experimenting.
People get frustrated with learning new things when they are impatient.
If you can look at this as a lifelong process of finding what works for you and your body, then you can experience the freedom of trying new things for the sake of learning.
If you do this and then bring in the considerations we've laid out here, then you'll start to develop a good eye for bullshit and enhance the quality of your learning.
Remember to take all of the variables listed into consideration.
Who:
Who is this person?
Do they have any credentials?
Do they have experience coaching others?
Are they dogmatic or balanced?
Do they practice what they preach?
What:
Does the information seem too good to be true?
Does this demonize one thing in favor of another?
How many perspectives are being shared?
How:
Is this person validating their sources with high-quality research?
Is this established information or a personal anecdote?
Is the full context being presented to you?
Ultimately, finding reliable information boils down to being open-minded with a hint of skepticism.
Engage with the process, expand your schools of thought, and challenge your biases. You'll have a much better experience finding information on the internet.
Hope this helps.
P.S. This topic is a major rabbit hole. The most significant help I've received in formulating this healthy skepticism is the fellas over at MASS Research Review. Dr. Eric Helms, Dr. Eric Trexler, Dr. Mike Zourdos, and Lauren Colenso-Semple have done awesome work to help break down the current research into something extremely digestible. And their guides on how to read and interpret research have been a massive help in my development. So, a huge thank you to them, and I encourage you to check out MASS (not affiliated).