"The impact of leading questions on ME/CFS research: bias and stigma in study design", 2025, Jason et al

boolybooly


Research Article

The impact of leading questions on ME/CFS research: bias and stigma in study design

Anthony T. T. Campolattara
Leonard A. Jason
Katherine C. Tuzzolino


Courtesy feed from Dr Marc-Alexander Flux
 
Full abstract.

ABSTRACT

Background
Myalgic Encephalomyelitis/Chronic Fatigue Syndrome (ME/CFS) is a complex and often misunderstood illness, characterized by post-exertional malaise, unrefreshing sleep, and cognitive impairments.

Objective
To investigate how question phrasing in ME/CFS research may influence participant attributions of fatigue/energy problems and unintentionally reinforce psychosomatic assumptions.

Methods
A total of 2248 individuals with ME/CFS from an international sample completed a survey assessing fatigue-related attributions. We analyzed how question wording influenced whether participants attributed their symptoms to physical or psychosocial causes. Particular focus was given to a fatigue attribution item that framed causes in terms of ‘personal life’ or ‘environmental factors.’

Results
Participants were significantly more likely to attribute their fatigue/energy problems to psychosocial factors when prompted with psychosocial framing. Many respondents who previously indicated physical causes as the primary source of their symptoms shifted to psychosocial explanations in response to the differently phrased item. This shift was especially pronounced among participants reporting higher levels of psychological distress.

Conclusions
Leading or biased question phrasing may distort participant responses in ME/CFS research, potentially inflating psychosomatic interpretations of the illness. Researchers should critically examine survey language to avoid introducing unintended bias that could compromise research validity and reinforce stigma.
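
The abstract doesn't include the authors' analysis code, so purely as an illustration, here is a minimal sketch of what comparing attributions across two question wordings can look like. Everything in it is assumed: the synthetic counts, the "physical"/"psychosocial" coding, and the choice of pandas plus a McNemar test for paired categorical responses; it is not the study's actual method or data.

# Illustrative sketch only: synthetic data and hypothetical coding, not the study's
# dataset or analysis. One row per participant, with their attribution under the
# neutral item and under the psychosocially framed item.
import pandas as pd
from statsmodels.stats.contingency_tables import mcnemar

df = pd.DataFrame({
    "neutral_item": ["physical"] * 170 + ["psychosocial"] * 30,
    "framed_item":  ["physical"] * 110 + ["psychosocial"] * 60
                    + ["physical"] * 5 + ["psychosocial"] * 25,
})

# Cross-tabulate attributions under the two wordings.
table = pd.crosstab(df["neutral_item"], df["framed_item"])
print(table)

# Share of initially "physical" respondents who switched after the framed item.
pivoted = table.loc["physical", "psychosocial"] / table.loc["physical"].sum()
print(f"Pivot rate among 'physical' responders: {pivoted:.0%}")

# McNemar's test asks whether the switching is systematic rather than symmetric noise.
print(mcnemar(table.values, exact=False))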
 
This is actually a rather interesting study.

Using the open, neutral question "What do you think is the cause of your problem with fatigue/energy?" as a baseline showed that patients viewed their illness as overwhelmingly physically driven. But when patients were prompted with a psychosocial "frame" ("Do you think anything specific in your personal life or environment accounts for your problem with fatigue/energy?"), around one-third of those who had endorsed a principally physical view pivoted towards psychosocial responses.

Just one suggestively worded follow-up question can seriously shift the distribution of patients' attributions - quite an insight into how easy it is to manipulate survey responses with leading questions.
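
To put rough numbers on that, here is a small back-of-the-envelope sketch; only the "around one-third" pivot figure comes from the summary above, and the 90% baseline share of physical attributions is an assumed value for illustration.

# Hypothetical arithmetic: the ~1/3 pivot rate is taken from the summary above;
# the 90% baseline share of "physical" attributions is an assumed figure.
baseline_physical = 0.90   # assumed share endorsing physical causes on the neutral item
pivot_rate = 1 / 3         # roughly one-third switch after the psychosocial framing
after_framing = baseline_physical * (1 - pivot_rate)
print(f"'Physical' attributions: {baseline_physical:.0%} -> {after_framing:.0%}")
# A single reworded follow-up question moves the headline figure by about 30 percentage points.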
 
But do people really pivot, or is it simply that they try to answer the question and this is then interpreted by the reader as "the most important reasons" or whatever? I don't see that the second question in the example would necessarily cancel out the first in the participants' minds.

On framing and suggestions:
I had a patient who had a stable weight, but based on their activity level, the food intake they were describing was way below what they needed to maintain that weight. After asking a bit more about their daily routine, I asked about pre- and post-workout meals and "found" the missing ~1000 kcal (about a third of the estimated required intake). The patient hadn't provided this information when I asked about meals, because in their mind pre/post-workout food was pre/post-workout food, not a meal.
 
How many of us can say we live in an ideal environment?

Environment covers such a broad range of factors, from noisy neighbours to family responsibilities to lack of support with personal hygiene, cleaning, cooking, dealing with bureaucracy, etc., as well as a lack of understanding from paid carers/medical staff/family/friends - all of which have the potential to negatively affect us and worsen symptoms.

Good to see this issue of questionnaire design highlighted.
 
I assume that many respondents are also interpreting the “environment” part as a catch-all for exposure to pathogens, allergens, etc., similar to how people reference “environmental triggers” in epidemiology.

[edit: it’s just that the question intended “environment” to mean psychosocial factors without clarifying that, and simply assumed that respondents had the same interpretation]
 
Of course, but the real problem here is that it's precisely the goal of this language. It can't be critically examined, because the entire point is to mislead.

Still, this is valuable research, and it's good to have it... examined critically.
 
A principle that has rarely failed me: the answer to a question is rarely the solution to the problem. Biopsychosocial is basically the systematic asking of irrelevant questions.
But do people really pivot, or is it simply that they try to answer the question and this is then interpreted by the reader as "the most important reasons" or whatever? I don't see that the second question in the example would necessarily cancel out the first in the participants' minds.
Yup. They were just answering the question as it was asked.

It's obvious that asking a psychosocial-framed question will get psychosocial-framed answers in many cases, especially from those of us who did not have a clear infectious illness trigger after which everything changed. In fact, my initial notes about the early symptoms were entirely psychological, even though they began right after a bad case of the flu - I had no clue and nothing else to go on, and I simply didn't think of that flu, since it seemed like just a typical illness after all.

Shows what total BS the models that explicitly assert otherwise are, but it's not as if any more evidence is needed for this.
 
I haven’t read the paper, but the basic premise that different framing gets different answers has long been known, and sometimes exploited, by any pollster worth their salt.
Senator, when did you stop beating your wife?

Political polling also has the same problem with vague questions, such as asking people whether the country is headed in the right or wrong direction. Many will give the same answer for completely opposite reasons, while others will give different answers for the same reason.

This is why biopsychosocial ideology has zero depth, like most political polling. It isn't meant to find out what's going on out there; it's usually meant to work out what needs to be influenced - how to get people to agree with you on a question, even if they fully disagree with you about what the problem even is.
 