The Inside Story Of How An Ivy League Food Scientist Turned Shoddy Data Into Viral Studies (Brian Wansink, Cornell)

Cheshire

Senior Member (Voting Rights)
When Siğirci started working with him, she was assigned to analyze a dataset from an experiment that had been carried out at an Italian restaurant. Some customers paid $8 for the buffet, others half price. Afterward, they all filled out a questionnaire about who they were and how they felt about what they’d eaten.

Somewhere in those survey results, the professor was convinced, there had to be a meaningful relationship between the discount and the diners. But he wasn’t satisfied by Siğirci’s initial review of the data.

“I don’t think I’ve ever done an interesting study where the data ‘came out’ the first time I looked at it,” he told her over email.

More than three years later, Wansink would publicly praise Siğirci for being “the grad student who never said ‘no.’” The unpaid visiting scholar from Turkey was dogged, Wansink wrote on his blog in November 2016. Initially given a “failed study” with “null results,” Siğirci analyzed the data over and over until she began “discovering solutions that held up,” he wrote. Her tenacity ultimately turned the buffet experiment into four published studies about pizza eating, all cowritten with Wansink and widely covered in the press.

But that’s not how science is supposed to work. Ideally, statisticians say, researchers should set out to prove a specific hypothesis before a study begins. Wansink, in contrast, was retroactively creating hypotheses to fit data patterns that emerged after an experiment was over.
https://www.buzzfeed.com/stephaniem...cking?utm_term=.cb9BaqMQN&bftwnews#.brXZqwz1x
 
Really brings home the perils of changing the question (or just the numbers in it) after scouring the data for answers to a question you did not originally ask, and which the experiment may not have properly controlled for. Presumably there will sometimes be genuine cases where someone stumbles on an apparently interesting correlation post hoc, but they would then need to run another, independently designed study with freshly acquired data to establish whether or not it was a fluke.
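The danger is easy to demonstrate with a quick simulation (a sketch only, nothing to do with the actual Cornell dataset): if you run enough subgroup tests on pure noise, some of them will come out "significant" almost every time. Here each simulated "failed study" asks 20 questions of data with no real effect in it, using a standard two-sided test for a difference in means:

```python
import math
import random

def two_sample_p(a, b):
    """Two-sided p-value for a difference in means, using the normal
    approximation (adequate for the sample sizes used here)."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    z = (ma - mb) / math.sqrt(va / na + vb / nb)
    # Two-sided tail probability of a standard normal beyond |z|
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

random.seed(0)
TESTS_PER_STUDY = 20   # e.g. 20 survey questions, all pure noise
N = 50                 # diners per comparison group

studies_with_hit = 0
for _ in range(200):   # 200 simulated null studies
    # Did at least one of the 20 noise comparisons reach p < 0.05?
    hit = any(
        two_sample_p([random.gauss(0, 1) for _ in range(N)],
                     [random.gauss(0, 1) for _ in range(N)]) < 0.05
        for _ in range(TESTS_PER_STUDY)
    )
    studies_with_hit += hit

rate = studies_with_hit / 200
print(f"{rate:.0%} of null studies produced at least one p < 0.05")
```

With 20 independent tests at the 0.05 level, the expected chance of at least one false positive per study is about 1 - 0.95^20 ≈ 64%, so "analyze the data until something holds up" nearly always succeeds, even when there is nothing there.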
 
This is such an elegant story - a beautifully perfect train-wreck, so to speak.

The Cornell prof in question got caught out by blogging about how to fish for relationships in data sets. He'd had a 30-yr academic career, and apparently had no idea that his normal research methodology (p-hacking) is unscientific; a recipe for false positives. (Why else would he tell the world that this is how to do research?)
 
Yep. A mindset so steeped in bad methodology that he had no recognition of that fact. Sort of sounds familiar.
 
I feel for the "dogged" unpaid grad student from Turkey "who never said 'no'"

My sympathy is somewhat limited by having read so many crappy CFS dissertations.

There are so many PhD students working under quacks, churning out dodgy work, and when they play along with it they can do real harm to others. The money made from PhD students seems to be another corrupting influence in academia (although I don't really know enough about it to be sure).
 
Good point. I suspect I wouldn't have made that comment if the research hadn't been about subsidised pizza eating.
 