Publication bias, statistical power and reporting practices in the Journal of Sports Sciences: potential barriers to replicability, Mesquina+, 2023

Discussion in 'Research methodology news and research' started by cassava7, Nov 29, 2023.

  1. cassava7

    cassava7 Senior Member (Voting Rights)

    Messages:
    986
    Authors from the Technological University Dublin (Ireland) & the Eindhoven University of Technology (Netherlands)

    Received 30 Jan 2023, Accepted 04 Oct 2023, Published online: 29 Nov 2023

    Abstract

    Two factors that decrease the replicability of studies in the scientific literature are publication bias and studies with underpowered designs. One way to ensure that studies have adequate statistical power to detect the effect size of interest is by conducting a-priori power analyses. Yet, a previous editorial published in the Journal of Sports Sciences reported a median sample size of 19 and the scarce usage of a-priori power analyses.
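    As a rough illustration of what an a-priori power analysis involves (this is not the authors' procedure; the effect size, alpha and power targets below are hypothetical), the sample size needed for a two-group comparison can be estimated like this:

        # Minimal sketch of an a-priori power analysis for an independent-samples
        # t-test. The inputs (effect size d = 0.5, alpha = 0.05, power = 0.80) are
        # illustrative assumptions, not values taken from the paper.
        from statsmodels.stats.power import TTestIndPower

        n_per_group = TTestIndPower().solve_power(effect_size=0.5, alpha=0.05, power=0.80)
        print(f"Required sample size per group: {n_per_group:.0f}")  # roughly 64 per group

    Under these illustrative assumptions, a study with 19 participants would fall well short of the conventional 80% power target.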

    We meta-analysed 89 studies from the same journal to assess the presence and extent of publication bias, as well as the average statistical power, by conducting a z-curve analysis. In a larger sample of 174 studies, we also examined a) the usage, reporting practices and reproducibility of a-priori power analyses; and b) the prevalence of reporting practices of t-statistic or F-ratio, degrees of freedom, exact p-values, effect sizes and confidence intervals.

    Our results indicate that there was some evidence of publication bias and that the average observed power was low (53% for significant and non-significant findings and 61% for only significant findings). Finally, the usage and reporting practices of a-priori power analyses, as well as of statistical results including test statistics, effect sizes and confidence intervals, were suboptimal.
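    For readers unfamiliar with the observed power figures above: z-curve analysis estimates the average power of published studies from the distribution of their z-statistics. As a simplified sketch of the underlying idea only (the actual z-curve method fits a mixture model and is not reproduced here), the power implied by a single observed z-value can be computed as:

        # Simplified sketch: power implied by an observed z-value, treating that
        # z as the true noncentrality parameter. This illustrates the notion of
        # "observed power"; the full z-curve procedure is more involved.
        from scipy.stats import norm

        def implied_power(z: float, alpha: float = 0.05) -> float:
            crit = norm.ppf(1 - alpha / 2)  # two-sided critical value, ~1.96
            return norm.sf(crit - z) + norm.cdf(-crit - z)

        print(f"{implied_power(2.2):.2f}")  # an observed z of 2.2 implies roughly 0.59 power

    Naively averaging such values over only the significant results would overestimate power because of selection for significance, which is why z-curve models the whole distribution of z-statistics instead.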

    Open access link (Journal of Sports Sciences): https://www.tandfonline.com/doi/full/10.1080/02640414.2023.2269357
     
    Last edited: Nov 29, 2023
  2. cassava7

    cassava7 Senior Member (Voting Rights)

    Messages:
    986
    Hutan and Andy like this.
  3. rvallee

    rvallee Senior Member (Voting Rights)

    Messages:
    12,508
    Location:
    Canada
    The issue is not with replicability, it's with validity. And it's not limited to sports sciences; it's basically present in all studies using the EBM/pragmatic approach. If anything, there is an excess of replicability, because it's actually easy to replicate bad results. These are not small studies with high bias for random reasons; it's a format that allows researchers to 'validate' invalid conclusions.

    The bigger issue is excessive bias and an industry that rewards a high quantity of low-quality work and is entirely driven by supply-side thinking and incentives. If only it were limited to sports sciences. Instead it applies to everything related to health, and it's a human problem that will take either human solutions, which are unlikely to happen, or a massive paradigm shift through technology, likely with the upcoming AI revolution.

    Only the scientific method works. It's no surprise that not using the scientific method in research doesn't yield good results.
     
  4. Sean

    Sean Moderator Staff Member

    Messages:
    7,243
    Location:
    Australia
    This.

    Replicating in this context means identifying, isolating, and maximising known biases and confounders, and then redefining them as therapeutic benefits. Which is not science. It is the complete opposite: anti-science.
     
    Peter Trewhitt and bobbler like this.
  5. Adrian

    Adrian Administrator Staff Member

    Messages:
    6,488
    Location:
    UK
    It does seem interesting that bad methodology can lead to replicated results. In some cases it's not surprising, as experiments are designed to elicit bias and hence get certain results.

    However, when we talk about replication, shouldn't we be talking about replicating results, and shouldn't part of that involve using different methods?
     
    Sean and Peter Trewhitt like this.
