Discussion in 'Advocacy Action Alerts' started by John Mac, Sep 20, 2018.
Good point, but I suspect the connections were in place.
Hardly a high-level statement of support; no politician has put their name to it. The PACE team really were overhyping this by quoting it in the naughty newsletter.
That's how I see it. They were using an appeal to authority to bolster the bias effect on participants. It's a generic comment in itself, it simply does not belong in a clinical trial and shows their complete lack of respect for ethics. They basically did so much to bias participants it could very well have been called the BIAS trial.
The participants' quotes about how it's been successful for them are much worse by themselves, but the PM's office quote is just plain bizarre; it reflects how confident they felt to really overdo their attempt to bias the results without fear of it being found out.
#BiasTrial - Yup! @rvallee Sure seems that way. Perhaps we need to rename it!
It could be used as a question to an MP: to ask whether it is normal for No 10 to make comments on live trials before results are published, and in a way that the response is used to influence participants' responses to the trial, which is in direct contradiction to how clinical trials should be carried out.
I would also ask if they have records of the quote that was used, to see how it was actually given out by No 10; then it would be possible to see if/how it was manipulated for the newsletter.
As this is contrary to how trials are supposed to be done, there might be political interest in clearing this up: if it were more publicly known, it would presumably be politically embarrassing.
Maybe also asking the question to friendly MPs (Carol M might be interested in finding the info) would get a better answer and also add fuel to any of the arguments they use when they are debating this for us, to show how badly this trial was conducted.
Number 10 commented as a result of a petition, so it would be normal to respond to one. It's not No 10 that has done anything dubious; it is the PACE team who misrepresented part of the government's response.
But asking the question would still make MPs/politicians aware of the misuse/misrepresentation of the quote, and the conduct of the researchers? Someone reading the newsletter would have no way of knowing it wasn't a statement made to the trial participants.
@John Mac - maybe it would be worth posting the video into the first post of the thread, just for new people who click on it?
Is this the best link for it?
ironically, MS has retweeted
If a trial had ever needed to be strangled by red tape, it was PACE.
I’ve noticed it’s often people who are a little bit illogical who don’t like following rules and call sensible procedures ‘red tape’.
Don’t get me wrong, sometimes it’s good to break rules and disrupt the norm, but in the case of the rules broken for PACE, it’s akin to calling ‘do no harm’ red tape, it’s so basic.
Evidence? We don't need no stinking evidence.
Of course Sharpe would be a fan of "lobster" Peterson.
Also, if anyone has an opinion this is the article: https://www.universityaffairs.ca/op...we-minimizing-harm-or-maximizing-bureaucracy/
I strongly suspect the project was a waste of time - asking students questions in a 'qualitative study'. One of the commenters makes the point that ethics committees have to protect people who feel obliged to help, like students and junior staff, from repeated intrusive requests to be part of studies. Studies done on volunteers recruited through adverts are usually a waste of time too.
I agree that ethics committees are a grind, but when I put in an ethics application to set up a rituximab trial for ME it was dealt with rapidly. Although I was told it might help if I attended the committee in person, it was to reduce the risk of delay and I was happy to do so. I was asked some very tough questions that reminded me of the key things I had forgotten to do which I should have done. I thanked them for their help, and within a couple of days of me doing what I should have done they sent approval. The committee consisted of about ten people who were clearly going to be there for a couple of hours after a full day's work, and who took turns to ask questions in a ruthlessly efficient and courteous manner, indicating that they knew exactly how to work as a team. I had some years before been on the same committee myself and was impressed to see that efficiency and professionalism had gone a step up.
Peterson is a treasure trove of stupid statements and ideas and one was something like aggressive male behavior is normal and natural because look at how lobsters are naturally aggressive in their mating. Maybe I got it wrong, doesn't even matter, it's all just nonsense. The guy just loves to say dumb stuff that riles up incels and is cashing in on his limelight.
I'd like to reply with something quite witty but I'm utterly spent at the moment.
But thanks for the explanation @rvallee.
he seems to like a lot of guardian articles.........hmm
Some highlights from Brian Hughes' presentation:
"Every single thing I say about psychology can be said about the PACE trial and the way that this condition [ME] has been dealt with. And therefore I use it as sort of the climax of the whole book."
"..the claim was in 2011, that positive change had occurred as a result of CBT and exercise therapy, compared to standard medical care. And in 2013 it was even reported that 22% of patients in the trial who received CBT and/or GET actually recovered from CFS."
"That by using this psychotherapy you are effectively reverse engineering the condition and fixing it".
"When this single study is treated as the final word on a topic then you are not dealing with good science per se because there is a big issue around replication. And science is a field of empirical study that relies on replication."
"You just take a hundred studies, you do them again, and most of them show no result at all. So why does that happen?
One of the reasons that it happens, and it happens very much with the PACE trial, is what I call rampant methodological flexibility."
"..because there is no standard methodology in psychology research that means that it is very difficult to control what goes on in the research context.
And the PACE trial took advantage of rampant methodological flexibility in all sorts of ways."
"That flexibility is not good science it opens the door to confirmation bias, it opens the door to something that scholars call Harking (hypothesising after the results are known)."
"Moving the goal posts, we’ll hear about that a little bit later."
"Fishing for findings: If you have a lot of data in a computer you can pull out a fraction of it and report that, and make your study look very strong when in fact most of the data don’t show anything."
"Method blaming, which means that if your study finds something different to the other guy’s study you can say, well, it’s because they’re different methods, because no two studies are alike."
"So, the PACE trial then:
The basic problem with the PACE trial in scientific terms is that when you have this open-ended flexibility you end up with studies that are weak by design. Studies that rely on self-reported data require a thing called blinding."
"So, when you have flexible non-standardised methods, you make the study design up yourself, you open the door to unconscious biases by the researchers, perhaps conscious biases in some cases"
"The PACE trial is full of problems. But I would simply say this: even if you knew nothing more about the PACE trial except that it is a non-blinded trial based on self-report, you know enough to know that you cannot rely on that trial; that trial is not a good study."
"[The researchers], between collecting the data and publishing the paper, changed the criteria."
"..the protocol was published before data collection. So we all know that they moved the goalposts."
"another problem here that we call the ‘winner’s curse’.
Which is, when we do lots of studies, or a study with lots of bits, the temptation is to look at the bits that worked, publish that, and then quietly forget about the other bits."
"The boring findings, the non-findings they are in the researchers file drawer. We call that the file-drawer problem."
"The PACE trial, the original study, had three principal investigators. All of them have a working history of promoting CBT and cognitive non-biological theories in their field. Each of them has published books prior to the PACE trial, and they show their hand.
Their view is that CBT is the cure for lots of things, the cure for, for example, chronic fatigue."
"So the risk here is that there is a bias, what we call a therapeutic allegiance, in these people. That they were so wedded to their theories, that they pre-empted the data and interpreted all the data in a weird way to justify their prior assumptions."
"We are guilty of confirmation bias all the time even when we have little grounds to draw conclusions.
And one of the problems here is, that we know from people who have looked at this in psychology research, if you have a strong expectancy about your research you are more likely to have the finding that you were looking for."
"....the PACE trial stopped being independently verified or independently replicated. All the studies all the papers emanating from the PACE trial dataset are by the same network of people. They are all connected."
"Psychology has a measurement crisis"
"A regular study would triangulate. They would use the objective measure to allow or disallow the subjective measure, but that’s not what they do on PACE."
"I mentioned earlier that the researchers moved the goal posts."
"So they had to defend themselves, and in the written report in the journal they said that the reason they did this is because they pitched too high to begin with. They were asking too much of patients. They were saying that if you had a score of 85, half the population wouldn’t have a score of 85.
It’s what they said, in writing.
And they literally point out, that threshold would mean that approximately half the general working age population would fall outside the normal range. So they said ‘we got it wrong we should never have said 85 so that’s why we’re reducing it to 60’."
"But they base this conclusion on prior data showing that the average score was 85. But it was the mean average.
Now I don’t want to be patronising, but in school we learn the difference between the mean, the median and the mode. And on this scale, this general functioning scale, the mean is 85 but the median is close to 100, so most people score 95 or better on this scale.
It’s inaccurate to conclude that, just because the average, the mean average, is down at around the 85 point, this means that half the population are above and half the population are below."
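To see how the mean-versus-median point works, here is a small sketch with made-up numbers (these are illustrative only, not the real population data for the scale): a skewed distribution where most people score at or near the ceiling can have a mean near 85 while the median sits at 100.

```python
# Hypothetical functioning scores out of 100: most people are at or
# near the ceiling, a minority score very low (illustrative only).
scores = [100] * 50 + [95] * 25 + [60] * 15 + [20] * 10

mean = sum(scores) / len(scores)          # dragged down by the low tail
median = sorted(scores)[len(scores) // 2] # unaffected by the low tail

print(mean)    # 84.75
print(median)  # 100
```

The low-scoring tail pulls the mean down to about 85 even though half the (made-up) population scores 100, which is why "half the population falls below the mean" is the wrong conclusion for a skewed scale.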
"The PACE trial entry criterion was 65, so in order to be considered sick enough to take part you had to have a score of 65 or below, but in order to be considered recovered in the published paper a score of 60 would do. Which means your score could go down and you would still be considered recovered."
"the PACE trial uses the Oxford criteria for determining whether or not people had ME or CFS."
"in the PACE trial 47% of the people wouldn’t meet the CCC for CFS.
So if the PACE trial was funded and conducted in Canada half the participants wouldn’t even be in the study because they wouldn’t be diagnosed with CFS."
"Finally then, it all culminates in a notion of exaggeration. So, when you break it down, there is an awful lot of information in the PACE trial, but when you step back and draw a picture, this is what people come up with.
No group, whether CBT, GET or control, stands out as having improved much more than all the others. So even if you just step away from all the noise and all the debate and just look at the findings as they are, they are very, very modest.
And this is what I referred to as an exaggeration crisis."
"Psychologists and clinical professionals, they want therapies to work.
And there is a big problem in therapy research, not just psychotherapy research, of over optimistic interpretations of rather banal data."
"In a nutshell.
Psychology is full of potential, full of strengths, but the PACE trial and the ME controversies, CFS controversies, they put psychology in a negative light"
"..it’s a type of shame I feel when I hear my profession being talked about as a source of damage and a source of arrogance and a source of delusion. That it affects people's lives."
"There is a lot of scepticism about bad science in psychology and a lot of concern that these types of cases get defended, sometimes at the highest level."
full transcript coming up