Fake data, fake science

It’s easy to assume that no one would falsify important scientific findings. Unfortunately, recent events show some people will.

In 2014, the prestigious journal Science published a provocative paper, "When contact changes minds: An experiment on transmission of support for gay equality," by Donald Green, a professor of political science at Columbia University, and Michael LaCour, a graduate student at UCLA. Its findings on political persuasion ran contrary to the established understanding of the topic.

"Persuasion is famously difficult: study after study — not to mention much of world history — has shown that, when it comes to controversial subjects, people rarely change their minds, especially if those subjects are important to them," The New Yorker reported about the controversial paper in 2015. "You may think that you’ve made a convincing argument about gun control, but your crabby uncle isn’t likely to switch sides in the debate. Beliefs are sticky, and hardly any approach, no matter how logical it may be, can change that."

Not according to the paper, however. The authors reported that over a nine-month period in 2013 they had sent canvassers from a Los Angeles LGBT centre into neighbourhoods where voters had supported Proposition 8, which banned same-sex marriage. "The canvassers followed standardized scripts meant to convince those voters to change their minds through non-confrontational, one-on-one contact," the magazine reported. "The survey highlighted a surprising distinction. When canvassers didn’t talk about their own sexual orientations, voters’ shifts in opinion were unlikely to last. But if canvassers were openly gay — if they talked about their sexual orientations with voters — the voters’ shifts in opinion were still in evidence in the survey nine months later."

It was a stunning result, one that garnered extensive national media attention. "The messenger, it turned out, was just as important as the message," the magazine noted.

The results deeply impressed David Broockman, a third-year political science doctoral student at UC Berkeley. He met with LaCour to discuss the findings, "which flew in the face of just about every established tenet of political persuasion," The New York Times said.

Out of admiration, Broockman decided to attempt to replicate the results. He began by calculating the cost of the survey. LaCour had told him that some 10,000 people had been canvassed and each was paid about US$100, which works out to a budget of roughly US$1 million. How, Broockman wondered, could a graduate student afford that?

"He sent out a request for proposal to a bunch of polling firms, describing the survey he wanted to run and asking how much it would cost," the Times said. "Most said they couldn’t pull off that sort of study, and definitely not for a cost that fell within a graduate researcher’s budget. It didn’t make sense."

Eventually, Broockman’s curiosity would lead to LaCour’s downfall. In May 2015, he and fellow graduate student Joshua Kalla, along with Yale professor Peter Aronow, posted a document entitled "Irregularities in LaCour (2014)." They argued that the study’s survey data showed multiple statistical irregularities and was likely "not collected as described," The New Yorker reported. Not long after, Science retracted the paper.

It wasn’t clear whether the paper was a deliberate fraud or a product of wishful thinking. The New Yorker posited that confirmation bias (the people involved in the study wanted to believe the findings) could have played a role in the paper’s favourable reception, from submission through publication. "We know that studies confirming liberal thinking sometimes get a pass where ones challenging those ideas might get killed in review," the magazine said. "The same effect may have made journalists more excited about covering the results."

No matter the motivation, the discredited paper was a black eye for Science and for the media outlets that lapped it up. It was not, however, an isolated occurrence. Science fraud is, in fact, more common than most people likely suspect.

In November 2015, the website Quartz noted that "the number of published science papers that have been retracted due to misconduct or fraud has ballooned in the last decade."

A month later, The Scientist echoed that conclusion. "Recent years have seen a spate of scientific scandals," it said. "Whether this is due to an increase in dishonesty or foul play in the lab or simply closer attention to the issue, research misconduct is now squarely in the public eye."

In November 2015, Stanford University reported that two of its academics had published a paper in the Journal of Language and Social Psychology describing a method that could help identify fraudulent research before it is published. "Even the best poker players have ‘tells’ that give away when they’re bluffing with a weak hand. Scientists who commit fraud have similar, but even more subtle, tells," the university said, explaining that the researchers had identified tell-tale writing patterns in papers built on falsified data.

The study expanded on "studies [that] have shown that liars generally tend to express more negative emotion terms and use fewer first-person pronouns. Fraudulent financial reports typically display higher levels of linguistic obfuscation — phrasing that is meant to distract from or conceal the fake data — than accurate reports."

The researchers compared 253 retracted papers, mostly from biomedical journals, to unretracted papers on the same topics from the same journals and in the same publication years.

"Scientists faking data know that they are committing a misconduct and do not want to get caught," David Markowitz, one of the researchers, said. "Therefore, one strategy to evade this may be to obscure parts of the paper. We suggest that language can be one of many variables to differentiate between fraudulent and genuine science. Fraudulent papers had about 60 more jargon-like words per paper compared to unretracted papers. This is a non-trivial amount."

China in particular has been plagued by scientific fraud. "As China tries to take its seat at the top table of global academia, the criminal underworld has seized on a feature in its research system: the fact that research grants and promotions are awarded on the basis of the number of articles published, not on the quality of the original research. This has fostered an industry of plagiarism, invented research and fake journals that Wuhan University estimated in 2009 was worth US$150 million, a fivefold increase on just two years earlier," The Economist reported in 2013.

In 2015, Science reported "China’s main basic research agency is cracking down on scientists who used fake peer reviews to publish papers in international journals, demanding that many return research funding. A separate Chinese scientific organization released the results of an investigation revealing the role of China’s many unscrupulous paper brokers, which peddle ghostwritten or fraudulent papers, in the peer-review scandal. In some cases brokers suggested reviewers for their clients’ papers, provided email addresses to accounts they controlled, and then reviewed the authors’ work themselves. The National Natural Science Foundation is now revoking funding from authors found to have committed egregious offenses. But critics say the measures don’t go far enough to stave off fraud."

One of the most egregious examples of science fraud occurred in South Korea. On February 12, 2005, the South Korean government issued a postage stamp that featured silhouettes of a paralyzed man rising from a wheelchair, taking a tentative step, then leaping into the air in joy, and finally standing in an embrace with another person, presumably a loved one.

The silhouettes, which were superimposed over an image of growing stem cells, were accompanied by a simple but stunning statement: "Successful Establishment of Human Cloned Embryonic Stem Cells." It offered extraordinary hope to millions of people around the world.

The stamp celebrated the work of renowned scientist Dr. Hwang Woo-Suk, whose papers in Science in 2004 and 2005 claimed the creation of the world’s first human embryonic stem cells cloned from patients’ own cells. If true, such cells could be transplanted into people suffering from degenerative diseases, among other conditions.

The technique, known as therapeutic cloning, promised replacement cells that could repair damaged tissue without risk of rejection by the body’s immune system. Hwang’s research held out the prospect that paralyzed people would be able to walk, as well as possible cures for Alzheimer’s disease and diabetes. Tragically, the research had been faked.

In May 2006, Hwang was charged with embezzlement and violations of bioethics law after a Seoul National University panel concluded that he had fabricated all of the research related to the findings. Three years later, he received a suspended two-and-a-half-year prison term and admitted to having faked his findings.

Many people have great respect for science, perhaps because of a limited understanding of the discipline or a sense that no one would wilfully falsify important findings. It’s apparent, however, that some people will. If an investigator takes on a file that involves scientific findings, it’s important not to assume that published research must be accurate and reliable. There are too many examples, many of them peer-reviewed, proving that is not always the case.

About the Author

David Malamed


David Malamed, CPA, CA•IFA, CPA (Ill.), CCF, CFE, CFI, is a partner in forensic accounting at Grant Thornton LLP in Toronto.
