The piece by Munafo and Bishop et al is all very well but I think it could be described as 'fluff-wiping'. (The practice of starting to do a job by picking something up, wiping the fluff off and then putting it down and forgetting to do the job.)
Pretty much everything in the manifesto is obvious and has been known about for generations. It makes nice lists but these are exactly the sorts of lists that give rise to the 'validated tools' of assessment that allow people like Cochrane to conveniently not notice the things that really matter. It seems a little bit like the mafia going to church on Sunday.
What I would suggest we actually need are things like exposure to public review. Any study published publicly (and every study that is started should be published publicly with open access) should appear on a webpage which has a mechanism for uncensored scientific feedback. The PACE paper should appear alongside any criticisms people can raise.
This might frighten some scientists, but it is worth noting that this was once a standard method in some branches of medical science. Until at least 1980 the Physiological Society in the UK published papers in its journal only if they had first been presented at a meeting and, after any comments anyone wanted to make in response, been voted suitable for publication by consensus. I well remember a paper being blocked because the method for restraining an animal was considered unethical.
I've just had a read over the manifesto. The full doc is here.
It seems to be authored by a lot of people who have genuinely interesting things to say about how to reform science. Bishop is on there, but probably one of the least heavyweight figures on the author list.
I find the document makes a genuine and useful contribution to science, by presenting in a simple condensed form a lot of the key issues in science today and the social, political and financial factors that shape them. It mentions the phenomena commonly identified as factors in poor reproducibility of science, including:
p-hacking. This is when you don't get the result you want, so you run another study in a slightly different way and only report the second, successful study. It also applies if your main outcome measure showed nothing significant, so you find another outcome measure that does, and report only that in the abstract.
HARKing (Hypothesising After the Results are Known). This is when your result goes the opposite way to what you expected, but you fit your hypothesis to the findings and present the paper as if you had been predicting that outcome all along.
Publication bias. Only papers with significant results end up getting published.
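The p-hacking problem above is easy to show with numbers. Here is a minimal simulation (my own sketch, not from the manifesto): under a true null hypothesis, p-values are uniformly distributed, so a hypothetical lab that quietly re-runs a "failed" study once and reports only the significant attempt will cross p < 0.05 far more often than the nominal 5%.

```python
import random

random.seed(1)

ALPHA = 0.05
N_LABS = 10_000  # hypothetical labs, each studying a true null effect

def study_p_value():
    # Under a true null hypothesis, p-values are uniform on (0, 1).
    return random.random()

honest_hits = 0  # report the first study, whatever it shows
hacked_hits = 0  # if the first study "fails", run one more and report only that

for _ in range(N_LABS):
    if study_p_value() < ALPHA:
        honest_hits += 1
        hacked_hits += 1
    elif study_p_value() < ALPHA:  # second attempt, reported only if significant
        hacked_hits += 1

# Analytically: honest rate is 0.05; two tries give 1 - 0.95**2 = 0.0975,
# nearly doubling the false-positive rate without anyone fabricating data.
print(f"honest false-positive rate: {honest_hits / N_LABS:.3f}")
print(f"hacked false-positive rate: {hacked_hits / N_LABS:.3f}")
```

With more hidden attempts, or many outcome measures to choose from, the inflation only gets worse, which is why the registered-reports idea below targets exactly this.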
There are a few measures suggested, including lots of forms of "independent oversight", but I'm sceptical of that, because apparently the PACE trial had some form of it, and we all know what went down there. There can also be problems finding independent people who have sufficient expertise but, at the same time, no motive to either assist or obstruct the work. But some of the others are good, and it's nice to have them collected in the one place:
Registered Reports. This means the journal publishes the details of the rationale and method of the study before the data is in. In true registered reports, acceptance of the paper is based on the rationale and methods presented there, and not on the outcome. The idea is that then publication will not depend on whether the study "worked out" or not.
Disclosure of COIs. It's good that they mention non-financial COIs. Like wanting to get published.
Funding replications. We don't have enough people out there bothering to find out whether they can replicate the results of some big study in their own lab. This is because it's hard to get replication work published (journals and grant bodies are biased towards "exciting new" ideas).
Transparency and Openness Promotion guidelines. This is good: providing guidelines on best practice, including how to deal with issues like confidentiality.
Incentives. Encouraging employers to preferentially employ and promote/reward researchers that demonstrate a commitment to open science practices.
None of these ideas are novel - they've occurred to all of us interested in reform - but it's useful to have them integrated into a single document.