
With Great Power Comes Great Responsibility: Common Errors in Meta-Analyses and Meta-Regressions in Strength & Conditioning Research, Kadlec…, 2022

Discussion in 'Research methodology news and research' started by cassava7, Oct 10, 2022.

  1. cassava7

    cassava7 Senior Member (Voting Rights)

    Messages:
    794
    Background and Objective

    Meta-analysis and meta-regression are often highly cited and may influence practice. Unfortunately, statistical errors in meta-analyses are widespread and can lead to flawed conclusions. The purpose of this article was to review common statistical errors in meta-analyses and to document their frequency in highly cited meta-analyses from strength and conditioning research.

    Methods

    We identified five errors in one highly cited meta-regression from strength and conditioning research: implausible outliers; overestimated effect sizes that arise from confusing standard deviation with standard error; failure to account for correlated observations; failure to account for within-study variance; and a focus on within-group rather than between-group results. We then quantified the frequency of these errors in 20 of the most highly cited meta-analyses in the field of strength and conditioning research from the past 20 years.

    Results

    We found that 85% of the 20 most highly cited meta-analyses in strength and conditioning research contained statistical errors. Almost half (45%) contained at least one effect size that was mistakenly calculated using standard error rather than standard deviation. In several cases, this resulted in obviously wrong effect sizes, for example, effect sizes of 11 or 14 standard deviations. Additionally, 45% failed to account for correlated observations despite including numerous effect sizes from the same study and often from the same group within the same study.

    Conclusions

    Statistical errors in meta-analysis and meta-regression are common in strength and conditioning research. We highlight five errors that authors, editors, and readers should check for when preparing or critically reviewing meta-analyses.


    Key Points
    • A meta-analysis combines data from single studies to test specific hypotheses, but statistical errors can substantially impact the calculated results and lead to flawed conclusions.

    • We describe five common statistical errors that are easy to spot and serious enough to markedly impact results.
    • We identified statistical errors in 85% of the 20 most highly cited meta-analyses in strength and conditioning research over the past 20 years.

    • Sixty percent of all effect sizes (standardized mean differences) greater than 3.0 were due to a standard error/standard deviation mix-up, meaning that effect sizes this large should have a high index of suspicion for error.

    • Understanding common sources of statistical error in meta-analyses helps the reader evaluate published research.
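    To see why the standard error/standard deviation mix-up produces such implausible effect sizes, here is a minimal sketch of the arithmetic (not from the paper; the numbers are made up for illustration): a standardized mean difference divides the mean difference by the standard deviation, and because SE = SD / √n, dividing by the SE instead inflates the effect size by a factor of √n.

    ```python
    import math

    # Hypothetical study: per-group sample size, mean difference, and pooled SD
    n = 25
    mean_diff = 0.5
    sd = 1.0

    se = sd / math.sqrt(n)  # standard error of the mean = 0.2

    d_correct = mean_diff / sd  # correct standardized mean difference: 0.5
    d_wrong = mean_diff / se    # SE mistakenly used as SD: 2.5

    # The erroneous effect size is inflated by exactly sqrt(n)
    print(d_correct, d_wrong, d_wrong / d_correct)
    ```

    With larger samples the inflation gets worse (√100 = 10), which is how effect sizes of 11 or 14 standard deviations can appear in print.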
    https://link.springer.com/article/10.1007/s40279-022-01766-0
     
    alktipping, lycaena, CRG and 6 others like this.
  3. RedFox

    RedFox Senior Member (Voting Rights)

    Messages:
    643
    Location:
    Pennsylvania
    This is concerning, to say the least.
     
    alktipping, Sean, Trish and 1 other person like this.
  4. rvallee

    rvallee Senior Member (Voting Rights)

    Messages:
    10,193
    Location:
    Canada
    Seems expected, considering what usually happens when errors are pointed out: nothing. We're not the only ones noticing this; even, uh, traditional skeptics run into the same frustration: they chase down errors and find no interest from editors in addressing them. Once published, it seems to take nothing short of deliberate fraud for a paper to be retracted, even one with glaring mistakes, because retraction clearly affects their reputation more than not caring does.

    And that also applies to papers. Meta-reviews are supposed to be the best form of evidence, and yet they are clearly lousy at it, compounding their own errors, which then become impossible to correct. It's like everyone understands the system is broken, but it's locked firmly in place, so nothing ever happens no matter how many times it's pointed out.
     
    Sean, RedFox, Hutan and 4 others like this.
  5. RedFox

    RedFox Senior Member (Voting Rights)

    Messages:
    643
    Location:
    Pennsylvania
    You're hitting the nail on the head. Science is prioritizing quantity over quality. To answer scientific questions, you need the opposite. One large, rigorous, definitive study has more value than 100 small, poorly controlled ones, in the same way one sharp photo is worth more than 100 blurry ones. Pilot studies have a place too, but medicine needs a general shift towards quality over quantity.
     
    Mithriel, Sean, rvallee and 2 others like this.

Share This Page