When scholars express concern about trust in science, they often focus on whether the public trusts research findings. This study explores a different dimension of trust and examines whether and how frequently researchers misrepresent their research accomplishments when applying for a faculty position. We collected all of the vitae submitted for faculty positions at a large research university for 1 year and reviewed a 10% sample for accuracy. Of the 180 applicants whose vitae we analyzed, 141 (78%) claimed to have at least one publication, and 79 of these 141 (56%) listed at least one publication that was unverifiable or inaccurate in a self-promoting way. We discuss the nature and implications of our findings, and suggest best practices for both applicants and search committees in presenting and reviewing vitae.

Introduction

When scholars express concern about trust in science, they often focus on whether the public trusts research findings. This article explores a different dimension of trust in science: whether researchers can trust each other. In the increasingly social world of science, researchers need to trust their collaborators and other scholars at nearly every point of the research process, including literature reviews, data collection, data analysis, manuscript preparation, and peer review. Evidence from non-academic settings, academic administration, and academic medical centers suggests that this trust might not be well placed. According to a recent study, 55% of resumes contain erroneous information and 31% of resumes include “misrepresentations that are purposely designed to mislead recruiters” (Henle, Dineen, & Duffy, 2017, p. 2). While we would like to believe that academics are more trustworthy, a number of sensational cases within higher education cast doubt on their integrity as well. In 2004, Henry Zimon, then president of Albright College, resigned after he was “accused of lying about his academic and publishing record” (Basinger, 2004, p. 1). Among his many embellishments were listing a forthcoming book for which he had neither a manuscript nor a publishing contract, and claiming a postdoctoral position at Harvard when in fact he had only given a guest lecture (Basinger, 2004). In 2007, Marilee Jones, then dean of admissions at the Massachusetts Institute of Technology (MIT), resigned after admitting that she had falsified her resume (Lewin, 2007). Ms. Jones misrepresented her academic degrees when she first applied for a job at MIT, and later confessed that over the 28 years of her employment with the university she “did not have the courage to correct my resume” (Lewin, 2007, p. 1). Such sensational cases of misrepresentation are not limited to academic administration; they have also featured researchers in academic departments.
In 2012, Anoop Shankar, then professor of epidemiology at West Virginia University, was accused of falsifying his credentials and research accomplishments (Aronowitz & Dokoupil, 2014). The resulting investigation found that he had falsified not only his credentials and research accomplishments, but also the data for several of his research publications (Aronowitz & Dokoupil, 2014). These high-profile cases of professional misrepresentation in academia generate news, but they do not reveal the frequency or severity of the problem. Instead, they raise concerns about the truthfulness of academic vitae and questions about how often academics falsify their credentials and accomplishments when applying for jobs. Older studies in academic medical centers found that 5% of applicants for clinical faculty positions submitted vitae that contained falsified clinical credentials (Shaffer, Rollo, & Holt, 1988) and that 15.6% of applicants submitted vitae that contained falsified research citations (Goe, Herrera, & Mower, 1998). More recent studies of applicants to residency and fellowship programs have shown that an average of 22% have falsified research citations, with internal medicine reporting the lowest rate (2%) and pediatric pulmonology the highest (50%). While much work has been done on vitae falsification in the academic health sciences, the prevalence and scope of vitae falsification in academic environments outside of clinical medicine is largely unexamined. To establish an initial measure of the incidence and types of vitae falsification among faculty applicants to non-health science programs, we conducted a pilot study of curricula vitae (CVs) submitted to faculty searches at a large, land-grant, doctoral university with very high research productivity.

Method

After obtaining approval from our institutional review board (IRB), and from the provost and general counsel at our field site, we monitored all searches during the 2015-2016 academic year. Our inclusion criteria for searches were (a) a faculty position (as opposed to staff), (b) with research expectations (as opposed to solely instructional or clinical faculty), and (c) in a non-health science program. To avoid a potential conflict between obligations to protect the confidentiality of human subjects and obligations to report suspicions of employee misconduct, our exclusion criteria for applicants within a search were (a) successful applicants and (b) any applicant who was an employee of our host institution when applying for the position. Given the objective of the study, and the potential for self-selection bias, we did not seek consent from applicants and instead obtained a waiver of informed consent from the IRB. In addition, the online portal through which applicants submitted their materials included a statement that materials could be used for, among other things, “statistical purposes” and “academic research.” Because the study had the potential to affect the reputations of the programs conducting the searches, we sought permission from the supervising chair or dean for every search that met our inclusion criteria. For the searches that met our inclusion criteria and for which we received administrative permission, we collected electronic copies of the CVs for all of the applicants (except those who met our exclusion criteria). After collecting the CVs, we waited 18 to 30 months to conduct our analysis and verification process, to give forthcoming publications time to appear in print. We used systematic sampling with a random start to select approximately 10% of the CVs for analysis, and two coders independently analyzed all of the vitae in the 10% sample.
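The sampling step can be illustrated in code. Below is a minimal sketch of systematic sampling with a random start, assuming the CVs are held in an ordered list; the function name and interface are hypothetical, not the authors' actual procedure:

```python
import random

def systematic_sample(items, fraction=0.10, seed=None):
    """Select roughly `fraction` of `items` by systematic sampling with a
    random start: take every k-th item, where k is the rounded inverse of
    the sampling fraction, beginning at a random offset in the first interval."""
    k = round(1 / fraction)      # sampling interval, e.g. 10 for a 10% sample
    rng = random.Random(seed)
    start = rng.randrange(k)     # random start within the first interval
    return items[start::k]

# Example: a pool of 1,837 CVs yields roughly 184 sampled CVs.
pool = list(range(1837))
sample = systematic_sample(pool, fraction=0.10, seed=42)
```

Unlike a simple random sample, this design spreads the selected CVs evenly across the ordered applicant pool, which is why consecutive sampled items are a fixed interval apart.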
For each CV, the coders extracted basic demographic information, including the type of position sought, the location (domestic vs. international) and Carnegie classification of each applicant’s degree-granting institution, and time since degree. In addition, the coders tallied and categorized reported publications according to type (article, book, or book chapter) and status (published or forthcoming). All other types of publications (e.g., conference proceedings, blog posts, letters to the editor) were not recorded. The coders then completed a verification process for all published and forthcoming articles, published books, and published book chapters. For published and forthcoming articles, the coders proceeded through a five-step search process: searching for the article title and author name(s) in (a) Google Scholar, (b) Academic Search Premier, (c) the aggregated database available through the university library, and (d) Google; if those searches failed, the coders also (e) searched Google for the title of the journal in an attempt to locate and search the journal’s website. For books and book chapters, the coders used the same process, searching for book titles and author name(s) in (a) Google Scholar, (b) Academic Search Premier, (c) the aggregated database available through the university library, and (d) Google, and also searched (e) Google Books. If that failed, the coders searched the publisher’s name in Google in an attempt to locate and search the publisher’s website. If the coders were unable to find a publication through this process, they coded the publication as “unverified.” The coders then categorized the unverified publications as either “unverified journal” or “unverified publication.” Unverified journals were cases in which the coders could not find the journal.
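The tiered verification search amounts to a fallback chain: try each source in priority order and stop at the first hit. A minimal sketch under that assumption, with each source wrapped in a hypothetical search function that returns True on a match (the study's actual process was manual, not automated):

```python
def verify_citation(title, authors, sources):
    """Run a prioritized chain of search sources; the first hit verifies the
    citation, and exhausting the list leaves the citation 'unverified'."""
    for search in sources:
        if search(title, authors):
            return "verified"
    return "unverified"

# Toy usage with stand-in search functions (real coding was done by hand):
status = verify_citation("Example Title", ["A. Author"],
                         [lambda t, a: False,   # e.g., first database: no match
                          lambda t, a: True])   # e.g., second database: match
# status == "verified"
```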
Unverified publications were cases in which the coders (a) found the journal but could not find the article, (b) could not find the book, or (c) found the book but could not find the book chapter. If the coders found the publication, they compared the publication’s official citation data to the citation information presented on the CV to determine whether there were discrepancies. The coders then categorized the discrepancies by type: authorship insertion, authorship promotion, and authorship omission. Authorship insertions were cases in which applicants listed themselves as an author on their CV but did not appear as an author in the published version of the work. Authorship promotions were cases in which the applicant claimed a better authorship position than what appeared on the actual publication. Authorship omissions were cases in which the published version of the work included more authors than the applicant listed on their vitae. The coders also noted other errors that were not self-promoting, such as incorrect titles, journals, and publication dates. The two coders met on a weekly basis to compare coding results and resolve discrepancies. Each time the coders disagreed (e.g., one verified a publication and one did not), they went through the verification process again together. After all discrepancies were resolved, the coders created a new code sheet for each CV with their agreed-upon codes, attached the two original code sheets, and gave the code sheets to a research assistant who entered them into a database maintained in Excel (without identifiers).
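The three authorship-discrepancy categories reduce to a comparison between the author list claimed on the CV and the author list of record. A hedged sketch of that logic (the function name and list representation are illustrative assumptions, not the coders' instrument):

```python
def classify_authorship(claimed_authors, published_authors, applicant):
    """Classify a CV-vs-publication authorship discrepancy into one of the
    three self-promoting categories described above, or None otherwise."""
    if applicant not in published_authors:
        return "insertion"        # listed on the CV, absent from the published work
    if applicant in claimed_authors:
        if claimed_authors.index(applicant) < published_authors.index(applicant):
            return "promotion"    # CV claims a better (earlier) authorship position
    if len(published_authors) > len(claimed_authors):
        return "omission"         # publication lists co-authors the CV omits
    return None

# Example: the CV lists the applicant first, but the publication lists her second.
result = classify_authorship(["Smith", "Jones"], ["Jones", "Smith"], "Smith")
# result == "promotion"
```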

Results

Sample Characteristics

In the 2015-2016 academic year, there were 45 searches across 26 programs that met our inclusion criteria. We asked 25 chairs and one dean for permission to include their searches in our study: 23 chairs and one dean granted permission; two chairs did not. Our sample included rejected applicants, who were not otherwise employed by the institution, for 43 searches across these 24 programs. The applicant pools for each search ranged in size from seven to 204 individuals, and the aggregated applicant pool was 1,837. A 10% random sample yielded 180 CVs to code. Most of the applicants in our sample were applying for entry-level faculty positions (see Table 1). Of the 180 applicants included in the analysis, four had applied for a post-doctoral position, 21 had applied for a visiting assistant professorship, 150 had applied for an assistant professorship, two had applied for an associate professorship, two had applied for a senior faculty position, and one had applied for a department chair.

Table 1. Applicants by Rank of Open Position.

Findings

Of the 180 applicants whose CVs we reviewed, 141 (78%) reported at least one publication on their CV; 39 (22%) reported no publications (see Table 2). The 141 applicants who claimed to be an author reported a range of 1 to 77 publications, with an average of eight and a median of four. Controlling for the career stage of the open position, the average number of publications for applicants to entry-level positions (post-doctoral positions, visiting assistant professorships, and assistant professorships) was 6.7; the average number of publications for applicants to mid-level and senior positions (associate professorships, senior faculty positions, and chair) was 34.

Table 2. Unverified and Inaccurate Research Citations.
Of the 141 applicants who claimed to be authors, 79 (56%) had at least one unverified or inaccurate research citation on their CV (see Table 2). The number of unverified or inaccurate citations per author ranged from 1 to 17, with an average of 2.4. Of these 79 authors, 35 had one unverified or inaccurate research citation and 44 had two or more. The percentage of unverified or inaccurate research citations per author ranged from 3.9% to 100%, with an average of 40.3%. The 141 applicants who claimed to be authors reported a total of 1,127 publications as published or forthcoming: 967 journal articles, 27 books, 76 book chapters, and 57 forthcoming journal articles (see Table 2). Of the 1,127 publications, 193 (17%) were unverified or inaccurately represented: 139 (14%) of the journal articles, 10 (37%) of the books, 20 (26%) of the book chapters, and 24 (42%) of the forthcoming articles (see Table 2). These 193 instances of unverified or inaccurate research citations included the following: six articles in journals that we could not locate (unverified journal), 72 articles we could not find in journals that we could locate (unverified publication, article), 31 books and book chapters that we could not find (unverified publication, book/book chapter), 24 forthcoming articles we could not find in journals that we could locate (unverified publication, forthcoming article), four instances of authorship insertion, 27 instances of authorship promotion, 27 instances of authorship omission, and five generally categorized as “other” (see Table 3).

Table 3. Types of Unverified and Inaccurate Citations.
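The per-category rates reported above follow directly from the counts: each percentage is the number of unverified or inaccurate citations in a category divided by the total reported in that category. A short arithmetic check (variable names are illustrative):

```python
# Re-derive the per-category rates from the reported counts.
totals   = {"article": 967, "book": 27, "chapter": 76, "forthcoming": 57}
problems = {"article": 139, "book": 10, "chapter": 20, "forthcoming": 24}

rates = {kind: round(100 * problems[kind] / totals[kind]) for kind in totals}
overall = round(100 * sum(problems.values()) / sum(totals.values()))
# rates   -> {'article': 14, 'book': 37, 'chapter': 26, 'forthcoming': 42}
# overall -> 17  (193 of 1,127 publications)
```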
Demographics

We analyzed the rates of unverified and inaccurate research citations according to the demographic information we collected for each applicant (see Table 4). When we compared authors with a doctoral degree from an international institution with authors with a doctoral degree from a domestic institution, we found a higher rate among applicants with a graduate degree from an international institution (67% vs. 52%). However, caution should be exercised with this finding, as international doctoral graduates are more likely to publish in foreign-language journals, which were more difficult for the coders to verify. When we compared the Carnegie classification of doctoral-granting institutions among applicants with a doctoral degree from a domestic institution, we found that 76 (78%) of domestically educated authors had a PhD from an institution with very high research productivity (institutions formerly known as R1), and 22 (22%) had a PhD from a non-R1 institution. We found a slightly higher rate of unverified and inaccurate research citations among R1 doctoral graduates (54% vs. 45%). When we compared early-career authors (103, 73%) with authors who were more advanced in their careers (36, 26%), we found virtually no difference in the rates (56% and 53%, respectively). We were unable to determine the career stage of two (1%) authors, and unable to determine the location of the PhD-granting institution for 10 (7%) authors.

Table 4. Demographics.

Best Practices

There are a number of things that individuals and institutions in the academic community can and should do to minimize both the incidence and the effect of CV falsification (see Table 5).

Table 5. Best Practices.

Just as academics should report their research findings clearly and completely, they should report their education, academic experience, and research accomplishments on their CV in a clear and complete fashion. This includes (a) choosing a single citation style (ideally one endorsed by the applicant’s discipline) and using it consistently throughout the CV; (b) bolding one’s name in the list of authors in the order in which it appears on the publication; (c) including the digital object identifier (DOI), when available, at the end of the citation; (d) separating published, forthcoming, under-review, and in-progress (not submitted) works into different sections with subheadings; and (e) providing complete information about credentials and degrees (e.g., including the city in which schools are located). In presenting their publications, academics should not use “with” or “co-authored with” language in lieu of providing the order of authors in a citation, because this language can misrepresent each party’s contribution to the project. Finally, when a publication appears in print, academics should cross-reference the final version with the citation reported on their CV to check for errors and ensure consistency. Titles, journals, and authorship order may change as projects evolve, and errors can be perpetuated when academics “cut and paste” entries as they transition from “works in progress” to “under review” to “forthcoming” to “published.” Graduate programs should include formal instruction and mentoring for students on constructing a truthful CV. In addition, academics serving as graduate student advisors should do four things to promote the accurate reporting of their students’ research accomplishments.
First, advisors should help students understand why it is important to present their research accomplishments accurately. Second, advisors should help students understand how to report their research accomplishments accurately and completely, including helping advisees understand the different stages of publication and how to accurately present research projects at each stage. Third, advisors should review the CVs of advisees for whom they write letters of recommendation and address any misrepresentation promptly. Fourth, advisors should provide advisees with a website or other place in which to publicly share their profile and CV, and make sure the public version of the CV matches the accomplishments they report to the advisor. Academics serving on search committees or in administrative positions overseeing hiring processes also have responsibilities for ensuring the integrity of CVs. Job announcements and calls for applications should provide specific instructions on what applicants should include in their CV, and warn applicants that the search committee might fact-check submitted vitae. While it may seem unduly suspicious for a department to issue such a warning to applicants, early adopters of programs such as Turnitin likely experienced a similar reluctance to check manuscripts for plagiarism. Now such verification is not only common, but it is considered a best practice in teaching (see Note 22). Warning applicants that CVs will be fact-checked might be sufficient to deter some applicants who would otherwise have reported falsified research accomplishments. We also recommend that search committee members actually fact-check the CVs of applicants shortlisted for interviews. Ideally, this would happen when a short list of candidates is identified, but at the very latest, a careful review should be conducted before candidates are invited for on-campus interviews.
The process is not onerous: it took our coders an average of 29 minutes to fact-check the individual CVs in our sample, even though five applicants were applying for senior-level positions and had multiple publications, and the coders were also collecting demographic data. We estimate that it would take an average of 20 minutes per CV to verify publications for entry-level positions (without collecting demographics). Finally, journal editors should help authors understand how to report the status of their manuscripts by making it clear when an article is “under review,” “conditionally accepted,” “accepted,” “forthcoming,” or “published.” For example, when corresponding with an author to confirm receipt of a manuscript, the message could include the statement, “At this point in the process you may say that your manuscript is under review.” While some journals already include such statements in their communications with authors, and others provide this information more generally in their online instructions to authors, this is by no means a common practice. Making the manuscript’s status clear might help to reduce genuine confusion among graduate students and early-career scholars about the various stages of the publishing process, and serve as a reminder that presenting the project otherwise is a lie.

Research Agenda

This research is, to our knowledge, the first investigation of unverified and inaccurate content in CVs for positions in non-health science disciplines. As with many initial investigations, these results leave us with more questions than answers, and thus many directions for future work. Methodologically, we have three recommendations for future research: (a) a complete analysis (rather than a 10% sample) of all applicants to more than one institution, to fully ascertain disciplinary differences in falsification and to ensure there is no bias in the estimate from this sample; (b) a comparative analysis of applicants at a variety of institutions, to determine whether CV inaccuracies are a problem only for research institutions or also for teaching institutions, small liberal arts colleges, and regional institutions; and (c) more robust demographic analyses, to determine whether falsifications are more likely to come from particular groups of applicants. Substantively, we have three recommendations for future research. First, researchers should examine whether the recommendations from earlier studies have been adopted and whether these practices have been effective. Many of the studies on applicants to residency and fellowship programs recommended that programs modify their application instructions to require that applicants include the PubMed identifier for their articles, or a reprint of any article not indexed in PubMed. We could find no follow-up studies assessing whether programs implemented these changes and, if so, whether they found a reduction in the incidence of unverified and inaccurate citations on applicant vitae. Second, researchers should study the behavior of administrators and search committee members as they review applicants’ CVs to identify best practices for assessing candidates.
We know that applicants who include unverified or inaccurate information on their CV bias the job market in their favor, but we do not know whether or how this bias would be corrected if and when search committee members discovered misrepresentations that they believe are deliberate. A recent study in a non-academic setting suggests that the discovery of misrepresentation does not always disqualify an applicant (Kuhn, Johnson, & Miller, 2013). Finally, researchers should study the relationship between an applicant’s willingness to include unverified or inaccurate information on their vitae and their likelihood of engaging in other types of unethical behavior. We contend that at least some of the unverified and inaccurate citations we identified were willful misrepresentations, and we are concerned that academics who are willing to misrepresent themselves on their CV might also be willing to misrepresent their research findings through questionable research practices (QRPs), detrimental research practices (DRPs), or fabrication, falsification, and plagiarism (FFP).

Educational Implications

As discussed above, graduate programs should provide instruction and mentoring to their senior students on how to construct a truthful CV based on disciplinary standards. Formal and informal instruction in the responsible conduct of research can address both why it is important to the integrity of the research enterprise for scholars to present their education, employment, and publications accurately to others, and how to craft a CV that conveys such information honestly and completely. In particular, as students and young investigators learn to cite others’ work, they should take note of the ways in which forthcoming work is referred to in the literature, and how they can describe and discuss their own pending contributions.

Acknowledgements

The authors thank Ian Rockett for his inspiration, and Franchesca Nestor, Maxwell Nimako, Ashley Brash, and Madison Canales for their contributions to this project. The authors would also like to thank the WVU ADVANCE Center and the Eberly College of Arts and Sciences at West Virginia University for their support.

Declaration of Conflicting Interests

The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.

Funding

The author(s) disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: In kind support from the Eberly College of Arts and Sciences at West Virginia University; partial funding from a subaward from the WVU Program for Retaining Institutional Diversity and Equity (NSF Award 1007978).

ORCID iDs

Trisha Phillips https://orcid.org/0000-0001-7982-4608
Elizabeth Heitman https://orcid.org/0000-0002-4855-8551