It’s always interesting to know how many researchers in any given field engage in so-called questionable research practices that don’t rise to the level of out-and-out fraud: honorary authorship, citing articles they don’t read, choosing reference lists that would please editors or reviewers, for instance. And when the researchers work in a field with potential health implications, the findings are even more compelling. Lauren Maggio and Anthony R. Artino, Jr. from the Uniformed Services University spoke to us recently about the findings from their survey (posted on bioRxiv) of health professions education researchers, a relatively new field that studies how future health professionals are trained.

Retraction Watch: You note that 90% of the people who volunteered to complete the survey admitted to at least one questionable research practice. Was that surprising?

Lauren Maggio and Anthony R. Artino, Jr.: Yes, we were quite surprised! We had an idea that many of these practices were happening, but we didn’t know the extent of the problem and weren’t sure if respondents would be honest about their practices. For example, one of our survey respondents said he was happy we were doing the survey, but he cautioned that respondents would not admit to these practices, even if they were doing them. It seems he was wrong, and we suspect that he too would be quite surprised by our findings.

RW: Your main results: “The three most frequently reported QRPs were adding authors to a paper who did not qualify for authorship (60.6%), citing articles that were not read (49.5%), and selectively citing papers to please editors or reviewers (49.4%). Additionally, respondents reported misrepresenting participants’ words (6.7%), plagiarizing (5.5%), inappropriately modifying results (5.3%), deleting data without disclosure (3.4%), and fabricating data (2.4%).” Were any of these surprising, and why?

LM and AA: We knew honorary authorship was a big issue, but we wouldn’t have predicted that the prevalence was so high. Also, the fact that 14 people (2.4%) said they had fabricated data was sort of mind-blowing. It’s not a large percentage, but even one researcher fabricating data is one too many.

RW: The findings focused on health professions education researchers — what types of researchers does that include? It’s a relatively small field, correct?

LM and AA: Health professions education is a relatively new and highly applied field that includes a wide variety of researchers: many health professionals (e.g., physicians, nurses, dentists), as well as PhD-trained scientists, such as educational psychologists, basic scientists, linguists, biostatisticians, and information scientists. This diversity allows us to bring a variety of methods and methodologies to our common goal of educating the health professional workforce to improve human health. At the same time, it made survey development quite challenging, because our researchers are so heterogeneous in their research practices.

RW: Given that so many admitted to at least one QRP (and, as you note, self-reports often underestimate prevalence), could this have a potential impact on human health?

LM and AA: Yes! HPE research guides how we educate our healthcare providers to practice medicine, ranging from how to generate a diagnosis to how to share a prognosis with a patient and their loved ones. So, for example, if medical educators are using teaching strategies that are based on research evidence that was not responsibly conducted, then it has the potential to significantly – and negatively – impact human health and well-being.

RW: You posed an interesting question on Twitter: Which of the QRPs might pose the most damage to medical education research and science? What kinds of responses did you get, and what’s your reaction to them?

LM and AA: We were quite surprised that the greatest percentage of those polled (41%) selected “citing unread articles” as the most damaging practice. Although this practice is not optimal, we probably would have guessed that salami slicing or honorary authorship was more problematic. While it may seem that salami slicing is a minor offense, it can clutter the literature, unfairly reward authors, and inflate the significance of a finding, which in turn can negatively impact meta-analyses and other types of systematic reviews. Unfortunately, salami slicing is tough to identify, since it’s not necessarily clear cut and reasonable people may disagree. To us, honorary authorship is less about damaging science and more about fairness to researchers and their careers.

RW: You note: “…most HPE scientists surveyed did not report the vast majority of QRPs. They are presumably doing good science. Nevertheless, we believe reforms are needed.” What kinds of reforms? (An aside: We’ve seen research suggesting that some interventions designed to increase responsible conduct in research aren’t always effective.)

LM and AA: Coming from our perspective as health professions educators who train future researchers, we think responsible research must start with us. That is, we need to lead by example. This means showing junior researchers what it means to conduct research with integrity; for example, not accepting authorship when we don’t deserve it and not p-hacking messy data. It also means thinking out loud with our learners about how we navigate the ethical choices we make every day in our research, especially when dealing with ethical dilemmas that are almost never black and white.

Ultimately, we think responsible research is a cultural issue, and we agree with Peter Drucker who noted that “culture eats strategy for breakfast.” With this in mind, we feel more regulations are probably NOT the answer. Instead, we want to focus on culture and the contextual complexities of QRPs. In fact, we are moving this program of research forward using qualitative approaches to dig into the cultural aspects of research integrity in health professions education.

