The exemplars identified a core set of 8 practices and provided strategies for employing them. The practices included holding regular team meetings, encouraging shared ownership, providing supervision, ensuring adequate training, fostering positive attitudes about compliance, scrutinizing data and findings, and following standard operating procedures. Above all, the use of these practices aims to create a psychologically safe work environment in which lab members openly collaborate to scrutinize their work and share in accountability for rigorous, compliant research.

Using a qualitative research design, we interviewed 52 principal investigators working in the United States at top research universities and the National Institutes of Health Intramural Research Program. We solicited nominations of researchers meeting two criteria: (1) they are federally funded researchers doing high-quality, high-impact research, and (2) they have reputations for professionalism and integrity. Each investigator received an initial nomination addressing both criteria and at least one additional endorsement corroborating criterion 2. A panel of researchers and our research team reviewed the nominations to select finalists who were invited to participate. The cohort of “Research Exemplars” includes highly accomplished researchers in diverse scientific disciplines. The semi-structured interview questions asked them to describe the routine practices they employ to foster rigor and regulatory compliance. We used inductive thematic analysis to identify common practices.

Conducting rigorous scientific inquiry within the bounds of research regulation and acceptable practice requires a principal investigator to lead and manage research processes and personnel. This study explores the practices used by investigators nominated as exemplars of research excellence and integrity to produce rigorous, reproducible research and comply with research regulations.

Funding: This study was funded by the National Human Genome Research Institute (K01HG008990 to ALA). This work also received support from the U.S. Office of Research Integrity (IR170030-01-00 to JMD) and a National Center for Advancing Translational Sciences Award (UL1 TR002345). The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.

Data Availability: Interview data cannot be shared publicly because participants were promised confidentiality. There is the potential for the identities of participants to be revealed through indirect personally identifying information, even if direct identifiers are removed. The manuscript includes relevant, de-identified excerpts. Upon request, and subject to required approval by relevant ethics committees, transcript data will be made available after removal of direct identifiers and details that may indirectly compromise confidentiality of the participants. Data requests may be made to Alison L. Antes (aantes@wustl.edu). Data access requests may also be directed to the Joint Research Office for Contracts (researchcontracts@wusm.wustl.edu), Office of the Vice Chancellor for Research, Washington University in St. Louis.

Copyright: © 2019 Antes et al. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.

Our aim was to identify the practices and habits PIs engage in to foster high-quality research. We anticipated that managerial, “task-oriented” behaviors would be particularly important in the domains of research rigor and compliance, as these domains require attention to detail, accuracy, consistency, and careful coordination of work activities [ 28 – 30 ].

The managerial and leadership behaviors identified in empirical research generally fit into two key domains. Task-oriented behaviors focus on providing structure and direction, for example clarifying roles, setting objectives, planning work, and coordinating tasks [ 24 ]. Relationship-oriented behaviors focus on ensuring the welfare and engagement of personnel, for example providing support, showing appreciation, providing mentoring, sharing in decision-making, and encouraging teamwork [ 24 , 26 ]. Other leadership behaviors include those required for innovation, adaptability, and change, such as identifying new opportunities, building relationships with potential collaborators, and anticipating how a proposal will be evaluated by key constituencies [ 27 ].

The research setting may be one of the most complex domains in which to engage in leadership and management. PIs are often simultaneously required to control and monitor details to ensure accurate work, clarify rules to promote compliance with policies and standards, and emphasize hard work while striking a pace that balances speed and accuracy. They must also foster the engagement of staff and trainees, develop, motivate, and support personnel, and coordinate their team members’ shared efforts. Furthermore, PIs must innovate and anticipate long-term scientific agendas and compete successfully for funding [ 25 ]. Thus, a PI must engage in a variety of complex, multi-faceted behaviors, and know how to adapt to the tensions inherent in their roles—e.g., the need for control but also creativity; the need for detailed oversight of projects but also staff engagement and autonomy; the need for accurate but also quick work. Effectively executing these demands as a PI requires a deep behavioral repertoire and the ability to execute behaviors at the right time and in the right fashion [ 25 ].

We define management and leadership in the following manner. Management focuses on how to do work; managers align processes and people so that work is consistent and efficient [ 23 ]. Leadership focuses on what to do and why; leaders articulate a vision and inspire people to work well together to do complex, innovative work [ 24 ]. In both roles, leaders and managers exercise influence; they influence people or work processes to achieve shared objectives [ 23 ]. An integrated model suggests the distinction between leadership and management roles is less important than achieving the right balance and enacting leadership behaviors and management structures that support efficiency and process reliability, human relations, and innovation and adaptability [ 24 ].

Leadership and management are necessary when the efforts of people must be coordinated to achieve objectives [ 19 ]. When work is complex and innovative, and when it is high stakes, leadership is particularly critical to performance outcomes [ 20 , 21 ]. Although these are central characteristics of the work of researchers, the role of PIs as leaders and managers in producing the highest quality research has received limited attention [ 22 ].

Thus, the purpose of this study was to identify the practices that “Research Exemplars”—PIs nominated by their colleagues as exemplary scientists and models of professionalism and integrity—employ to foster rigorous research and regulatory compliance within their teams. Our aim was to identify concrete approaches and strategies they employ to manage the responsible conduct of research in their groups. We apply a leadership and management framework to address our two primary research questions.

Increasing attention to the interpersonal dynamics of effective research is reflected in the emergence of the “science of team science,” examining how research groups effectively perform research and foster interpersonal dynamics that facilitate good science [ 13 , 14 ]. Concurrently, concerns about reproducibility and transparency in science [ 15 ] and failures to establish healthy research work environments [ 16 – 18 ] reflect additional attention to “doing good research in a good manner.” We contend that the decision-making and behavior of PIs, as the leaders of research teams, are at the heart of ensuring the best possible execution of science.

Long-standing scholarly study of science has presented key values and norms that guide the practice and ethics of research (e.g., accuracy, disinterestedness, honesty, trust, respect, and fairness) [ 6 ]. Other work suggests key characteristics of researchers (e.g., persistence and openness) [ 7 , 8 ] and has elucidated situational influences (e.g., mentoring and competition) on scientific achievement [ 9 ]. It is in the day-to-day practice of science that individual characteristics, situational factors, and professional values operate to influence the behavior of researchers. Recognition that the daily conduct of research often presents tensions between the ideals of science and actual practice [ 10 , 11 ] has led, in part, to education in the responsible conduct of research as a key remedy to these issues [ 12 ]. However, reconciling these tensions and realizing optimal research practices also requires attention to the social and human dynamics within research labs.

Addressing these stakeholder interests in the research enterprise requires attending to several dimensions of research ethics—or “doing good research in a good manner”—at the institutional level and within research labs and teams [ 4 , 5 ]. In their labs, principal investigators (PIs) must foster research rigor and reproducibility, compliance with research rules and regulations, and effective workplace relationships. Increasingly, researchers are also called upon to ensure the social value of their research. These dimensions are important to justify the financial investment in research and the risks posed to animal and human subjects.

The United States invests billions of dollars in research annually [ 1 – 3 ]. An array of stakeholders have an interest in how this research is carried out [ 4 ]. Funding agencies and the public are primarily interested in advancing scientific knowledge and solving real-world problems. Institutions have an interest in attracting the highest caliber students and faculty to advance their missions, and a record of high achievement in research serves this end. Investigators are interested in research productivity, which allows them to secure additional resources to advance their research agendas. Junior researchers training with investigators want to learn the skills for success to advance their careers and research goals. Other researchers in the scientific community have an interest in the trustworthiness of research findings.

We report the frequency of all of the practice codes linked to rigor and compliance across the 52 exemplar transcripts, provide illustrative quotes, and elucidate the themes represented within the codes. We include specific strategies used by exemplars to provide practical insights. Our discussion focuses on the higher-order connections and key findings cutting across the results.

The key purpose of this report is to identify practices linked to rigor and compliance. Thus, we examined each practice and its links with these outcomes by reading the excerpts for each practice to identify the themes represented within that concept. When a practice linked to more than one outcome, we read and identified themes for each outcome. After reviewing the themes within excerpts for rigor and compliance, we reviewed the excerpts linked to the general “doing exemplary science” code to ensure that we had already identified all key ideas. During this process of analyzing themes associated with codes, we explored how the concepts and themes related to each other, identifying higher-level themes.

We utilized Dedoose qualitative data analysis software [ 39 ] to train coders and perform coding. In Dedoose, coders mark excerpts of text within transcripts with codes representing concepts in those excerpts. This process generates present (1) and not-present (0) quantitative data for each of the codes within each transcript, and allows the textual data linked to each code to be downloaded for subsequent thematic analysis. To train on the final codebook and add rules to the codebook, AA and AK coded the same set of 10 transcripts and discussed discrepancies. To obtain an estimate of reliability, they selected and both coded 50 chunks of narrative from transcripts spanning scientific disciplines, male and female participants, and the 18 questions. Kappa, an estimate of inter-rater agreement, was .81. Kappa estimates of .61–.80 are considered very good, and those above .80 excellent [ 40 ]. AA and AK then each coded 26 of the remaining transcripts. After coding 10 each, we selected a set of chunks, one for each of the 18 questions, from across the transcripts and recalculated Kappa (.73) to ensure we did not have major rater drift in applying the codebook.
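As a concrete illustration, the agreement statistic above can be reproduced with a short script. The code labels and the ten paired excerpts below are invented for illustration, and the hand-rolled function is a sketch of Cohen's kappa, not Dedoose's internal calculation.

```python
from collections import Counter

def cohen_kappa(rater_a, rater_b):
    """Cohen's kappa: observed agreement corrected for chance agreement."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    categories = set(counts_a) | set(counts_b)
    expected = sum(counts_a[c] * counts_b[c] for c in categories) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical codes applied by two coders to ten shared excerpts.
coder_1 = ["meetings", "supervision", "meetings", "training", "sop",
           "meetings", "ownership", "sop", "training", "supervision"]
coder_2 = ["meetings", "supervision", "meetings", "training", "sop",
           "ownership", "ownership", "sop", "training", "meetings"]

print(round(cohen_kappa(coder_1, coder_2), 2))  # → 0.75
```

Because kappa discounts agreement expected by chance, it is a more conservative reliability estimate than raw percent agreement (here 80%).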

After developing this initial codebook framework, one of the interviewers (AK) read 10 transcripts to identify the workability of the code framework and to elaborate on the child codes. The full team (AA, JD, and both interviewers) met to discuss and revise the codebook. Next, AK and AA tested the codebook on a second batch of 10 transcripts and continued to develop a comprehensive list of child codes. We also generated a codebook document with the definitions of the codes, example quotes illustrating each code, and rules for applying each code. This process repeated until we reviewed all transcripts and identified no more child codes. After each batch, team meetings focused on clarifying code definitions, rules for applying codes, and collapsing redundant codes.

We developed codes to summarize concepts in the data inductively through several rounds of open-coding. First, AA and JD read the transcripts and marked emerging concepts to identify a coding framework consisting of parent codes (codes providing superordinate categories) for child codes (codes capturing specific concepts). The framework ( S1 Table ) included parent codes in several domains: e.g., research operations practices, relational and self-management practices, professional priorities, traits, and experiences. In addition, a parent-level “outcomes” code included the child codes of rigor and reproducibility, compliance, good relationships, balancing professional demands, and doing exemplary research. We linked practices to the outcomes when coding the excerpts of text containing practice codes by marking an accompanying outcome code each time we coded a practice. We only used the general “doing exemplary research” code when the practice discussed by the participant was not explicitly linked to the more specific outcome codes. We tracked outcome codes in this fashion to identify the causal links exemplars made between the practices and their intended outcomes.
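The outcome-tracking described above reduces, in effect, to a per-transcript set of (practice, outcome) links that can then be tallied across transcripts. A minimal sketch, with invented transcripts and code names:

```python
from collections import Counter

# Each transcript's coding reduces to the set of (practice, outcome) links
# marked in it; the three transcripts below are invented for illustration.
transcripts = [
    {("team_meetings", "rigor"), ("supervision", "compliance")},
    {("team_meetings", "rigor"), ("team_meetings", "compliance")},
    {("supervision", "rigor")},
]

link_counts = Counter()
for links in transcripts:
    link_counts.update(links)  # sets count presence per transcript, not raw excerpts

for (practice, outcome), n in sorted(link_counts.items()):
    print(f"{practice} -> {outcome}: {n}/{len(transcripts)} transcripts")
```

Using sets per transcript means a practice mentioned several times in one interview still counts once, matching the present/not-present (1/0) scheme described above.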

We adopted a causation coding framework to identify causal explanations for why the exemplars had reputations for exemplary research [ 38 ], which we defined as fostering rigorous research, regulatory compliance, and good relationships [ 4 ]. In the interviews, we asked the exemplars to discuss their own theory of what explains their success and reputations by asking them to identify the routine practices and habits they employed. In coding, we linked the practices they described to the outcomes of interest: rigor, compliance, and relationships. Our analysis of themes focused on building an understanding of the exemplars’, and our own, theory of the practices contributing to these outcomes. We anticipated leadership practices would emerge according to an existing broad framework of leadership behaviors: “initiating structure”—task-related behaviors focused on processes and procedures—and “consideration”—relationship-oriented behaviors focused on interpersonal aspects of leading people [ 26 ].

We asked the participants to focus on work within their own research lab or team, even though they collaborate across labs and teams. We did not explicitly tell participants we were studying “leadership and management” to allow their leadership practices to emerge naturally. The interview script was developed by AA (an industrial-organizational psychologist with expertise in workplace psychology, leadership, and responsible conduct in research) [ 19 , 32 – 34 ] and JD (a psychologist and bioethicist with expertise in research professionalism, research integrity, and professional decision-making, along with experience as PI of large federally funded grants) [ 35 – 37 ]. Two public health graduate student research assistants conducted the interviews. They received 3 hours of training in interviewing practices and performed a practice interview observed by AA. After each interview, they documented how it went, and the team used this information to further standardize procedures between interviewers, although minimal problems arose. The interviews were digitally recorded and transcribed verbatim. They yielded 600 single-spaced pages of text.

The interview script consisted of 18 questions with follow-up prompts ( S1 Appendix ). First, the interview focused on rapport-building and background questions. Then, we told participants we wanted to understand the factors that contribute to the success and reputation for integrity of individuals nominated as research exemplars. We asked them to focus on things they do consciously to be the best researcher they can be, or things they do that others in their field might not do routinely. The next 11 questions asked about the individual traits, work strategies, and environmental or experiential factors that contributed to their success and reputation for integrity. The habits and routine practices questions focused on rigor and reproducibility, good team relationships, and compliance. Finally, we asked three questions about key career experiences and lessons.

We conducted one-hour, semi-structured telephone interviews. Before completing the telephone interview, the participants received a consent document electronically and indicated consent to participate. Questions about the study were addressed via email or on the phone before interviews began. Participants also completed a brief online demographics survey and a short questionnaire about their work (e.g., hours worked per week and grant applications submitted per year).

In our assessment of the panelists’ feedback and the materials to select finalists, we aimed to ensure inclusion of researchers who met the criteria and who were diverse in research discipline (e.g., clinical, lab science, social/behavioral research), gender, and nation of birth. We sought at least 30 researchers; however, the high caliber of nominees ultimately led to 55 finalists. We emailed the 55 researchers selected as finalists, and all but three agreed to enroll in the study and participated in the interview.

Notably, we did not define “professionalism and integrity” for nominators. Instead, we opted to allow nominators to make recommendations according to their views of professionalism and integrity in research. We found the exemplars were described as hardworking, enthusiastic, collaborative, meticulous, transparent, fair, and altruistic. Many held leadership positions at their institutions or in their fields. Nearly all were described as role models, advocates for others, or outstanding mentors. Additionally, exemplars were described as well-liked, highly respected, and sought after for their expertise and advice. The nomination and endorsement narratives offered examples of these qualities in practice: for example, taking the utmost care with their data and methods; being known for questioning their assumptions and bringing in outside perspectives; creating collaborative, engaged work environments; refusing to publish findings without confidence in their validity; being disciplined in requiring standardized procedures; discussing ethical considerations actively with their groups; sharing their data outside the lab; making sure team members receive due recognition; and taking personal time to help colleagues and mentees.

Our team (AA and JD) reviewed the panelists’ scores and comments, and we evaluated all of the nomination and endorsement narratives. We focused our review on nominees scoring at or above 5. The average reviewer rating of the final cohort of research exemplars was M = 6.02, SD = .60. Our aim was to verify, to the extent possible, evidence of behaviors or traits of the nominees that reflected professionalism and integrity. We examined the nomination and endorsement materials for examples of what set the researchers apart in how they approached doing their work.

Panelists scored the nominees on the two criteria using a 1 (far below expectations) to 7 (far above expectations) scale. We also asked reviewers to respond to the yes-no question, “Do you have any reservations about this person being identified as an exemplar?”, and to briefly comment on their ratings (and reservations, if necessary). We produced mean rating scores across the two criteria and two reviewers.
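Arithmetically, each nominee's composite score is simply the mean over the two criteria and two reviewers; a minimal sketch with hypothetical ratings on the 1–7 scale:

```python
# Hypothetical ratings for one nominee: two reviewers, two criteria (1-7 scale).
reviewer_scores = {
    "reviewer_1": {"research_quality": 6, "integrity": 7},
    "reviewer_2": {"research_quality": 5, "integrity": 6},
}

# Flatten all four ratings and average them into the composite score.
all_ratings = [score for ratings in reviewer_scores.values()
               for score in ratings.values()]
mean_rating = sum(all_ratings) / len(all_ratings)
print(mean_rating)  # → 6.0
```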

We provided the review panelists with a one-page instruction sheet describing the two criteria for inclusion as an exemplar: scientific accomplishment as a federally-funded investigator and a reputation for professionalism and integrity. We instructed panelists that considerations for the expectation of high-quality, high-impact research might include: “receiving substantial grant funding, producing high-quality peer-reviewed journal articles, and addressing issues of social import.” Regarding the expectation of professionalism and integrity, we instructed panelists to consider “whether the characteristics or practices described in the materials are those that would be ideal for others in the scientific community to learn from or follow in their approach to lab management, mentorship, or leadership.”

The panel of volunteer reviewers consisted of 10 federally funded researchers. We matched the reviewers’ scientific disciplines to the nominees’ fields, and two panelists reviewed each nomination. Five of the reviewers working in biomedical, public health, and social sciences reviewed the bulk of nominations (12 reviews each). The remaining three reviewers reviewed smaller sets of nominations from engineering, earth sciences, and physical sciences. Reviewers received the narrative information provided by the nominators and endorsers about the nominees’ research program and their professionalism and integrity. They also received the nominees’ CVs and a summary of their academic rank, sources of funding, number of publications, and amount of grant funding (with a note that NIH intramural researchers do not seek external grants).

We asked the nominators and endorsers to characterize the nature and length of their relationship with the nominee. The nominators knew the nominees, on average, for 9.96 years (SD = 6.37), and 80% were institutional administrators, academic deans, or department chairpersons. Endorsers, on average, knew the nominees for 14.62 years (SD = 8.95); 59% were faculty collaborators, and the remainder were department chairpersons, program directors, or trainees. We received 81 nominations, and 74 received at least one additional endorsement, qualifying them to move to the review panel. Before panel review, we emailed the nominees to confirm their interest in participating in the project if selected as finalists. All but one individual responded and agreed to participate, thus 73 nominations were sent to the panel for review.

We also asked nominators to submit the nominee’s CV and the contact information for three individuals who work closely with the researcher whom we could contact for a secondary endorsement of the nominee. To obtain some verification that others held the same view of the nominee, we sent emails to the three individuals identified, asking them to endorse the nominee if they agreed the researcher exemplified professionalism and integrity. The endorsers completed the two questions about how the researcher demonstrated professionalism and integrity and whether this view was widely held.

The nomination form asked the nominators to describe the nominee’s high-quality, high-impact federally-funded research program. Next, we asked for a specific description of how the nominee exemplified professionalism and integrity as a researcher. Finally, we asked the nominator to discuss how confident they were that their view was widely held by the researcher’s colleagues.

The announcement indicated we sought nominations of federally-funded researchers in any career stage who lead a research lab or team. We indicated that the nominees must meet two criteria: (1) perform high-quality, high-impact research in any scientific discipline, and (2) have an outstanding reputation for professionalism and integrity. The email provided a web-link to additional information about the project and a nomination form. We indicated we would invite finalists to participate in a 1-hour interview about the practices they employ to lead their teams, and we would send them a plaque and post their biography on the project website ( https://integrityprogram.org/exemplar-project/ ).

We solicited nominations from the Carnegie-classified “highest” and “higher” research activity universities, accredited medical schools and schools of public health, and the National Institutes of Health (NIH) Intramural Research Program in the United States. We sent an initial email announcement and three reminder emails to nearly 1,500 academic deans, department chairs, and institutional research administrators (e.g., Vice Presidents for Research, Research Integrity Officers, Scientific Directors). We also emailed the PIs at institutions affiliated with the National Center for Advancing Translational Sciences’ Clinical and Translational Science Awards program [ 31 ]. The announcements invited the recipients to make a nomination and to forward the call for nominations to their colleagues.

We employed an open-ended, qualitative methodology because we aimed to identify and relay concrete strategies employed by investigators to lead responsible research. We used purposive sampling to identify researchers representing a range of scientific disciplines who were exemplary in their scientific achievements and reputations for professionalism and integrity. Colleagues of the researchers provided initial nominations and secondary endorsements of their professionalism and integrity. A panel reviewed the nominations, and our research team reviewed their feedback and all of the nominations to select finalists. We conducted in-depth, semi-structured interviews with the participants we collectively refer to as “Research Exemplars.” The Washington University Institutional Review Board provided ethical approval for the study (IRB# 201601121).

Results

Hold regular team meetings Discussed by 83% of exemplars, holding regular meetings is the cornerstone of rigorous research. As one exemplar put it: “…the first round of vetting of what they have accomplished is through our internal group meeting or lab meeting” (Participant 41). Nearly all exemplars hold weekly meetings. Meetings serve several distinct purposes. First and foremost, meetings provide the opportunity to share data. This fosters transparency about data and permits team members to scrutinize results. This routine helps lab members feel comfortable putting their data, findings, and interpretations on display for others to critique. It creates a norm of openness about data, helps to identify potential problems in the data, and fosters a sense of team collaboration on projects. Several exemplars emphasized that a PI must never make lab members fear that the PI will be unhappy with the data presented. One noted, “…I let them know that progress isn’t getting the results I want…progress is getting the results that the experiment gives us” (Participant 17). A related purpose of meetings is to provide updates, coordinate, and identify potential problems with a study that affect the quality of data or compliance with a scientific or ethical protocol. Meetings allow team members, particularly when several must coordinate to execute protocols, to double-check procedures and talk about concerns. These meetings also allow investigators to understand what has been accomplished and allow the team to plan. The exemplars talked about using these meetings to make sure they are fully aware of how the research is going and to create accountability for next steps. Several PIs mentioned the importance of agendas, to-do lists, and action steps to foster awareness and accountability among team members.
For example, one exemplar described requesting an outline of goals and progress: “They have to prepare a write up every single time that we meet…essentially outlines their goals, and then the next time, what they met on those goals” (Participant 51). Whether it relates to ensuring sound, high-quality research or compliance with regulations and ethical standards, the exemplars repeatedly emphasized that people must be comfortable communicating about mistakes. Overall, communication at team meetings must be transparent and open. Exemplars stressed that people must feel the environment is “non-threatening.” A few exemplars even described sharing their own mistakes and trials to model openness. One exemplar put it this way: “I tell [my team] that the only mistake that can’t be fixed is the one they don’t tell me about…if they’ve done something wrong, or they’ve discovered that something went wrong…I want to know about it.… I don’t get angry…because everyone’s human and we all make mistakes, including me. The only thing I ever get angry about is if I discover that they’ve hidden something from me. So, transparency and displaying that transparency is really key.…” (Participant 4) Exemplars also hold meetings to discuss new study designs and plan out study protocols that have not yet launched. Occasionally, meetings serve the purpose of developing or training lab members by discussing new topics or having outside experts come to present on new regulations or important compliance issues.

Encourage shared ownership and decision-making Another top practice, shared by 73% of exemplars, was establishing an environment where everyone shares in responsibility for compliance and integrity. For example, one strategy was to design compliance protocols together as a team. This not only ensures that everyone knows what each team member is responsible for, but also that the protocol can be discussed and revised if a problem arises. Another aspect of shared ownership was appointing a compliance point person. Many exemplars described someone they rely on as a source of expertise and consistency on matters of compliance. Typically, this person was an experienced member of the lab or a designated compliance manager. The point person helps the PI oversee compliance matters and double-check procedures and documentation. A point person might perform specific checks of protocol compliance; for example, they might regularly review documentation like protocol checklists and signed informed consents. Moreover, this team member serves as the go-to person for daily compliance questions or issues. One exemplar described it this way: “…different staff or the research faculty are in charge of various components of the laboratory…they serve as an additional, immediate onsite resource for laboratory processes in a way that I can’t, given my administrative duties” (Participant 10). Other than selecting an experienced individual, some PIs mentioned appointing a person who is good with details, or a “perfectionist.” This person then contacts the PI when needed. Also, the point person is often involved in training newcomers and ensuring they are knowledgeable of the rules. One exemplar relied on a graduate student in this role: “…I also have a safety officer in the lab…it’s an advanced grad student…and he’ll basically train any new student coming in on a one-on-one basis…go through all of the things that we need to check” (Participant 45).

Provide supervision and guidance The next practice, mentioned by a majority (71%) of exemplars, was providing comprehensive, ongoing supervision and feedback to their students and personnel. Meetings provide an oversight mechanism for PIs, but exemplars also emphasized oversight and feedback more generally. They described regular, typically daily, interactions with smaller groups and individual team members. They make sure they interact and communicate with everyone on their team. One described it simply: “…I go to the lab every day and engage all of them” (Participant 38). At its core, this practice is about being an involved and engaged PI and mentor. The exemplars value awareness of how the research is going day-to-day so they can provide timely guidance and direction. The form that oversight took varied by the level of the student or staff member: more intensive daily supervision for newcomers, inexperienced staff, or a struggling trainee; regular “check-in” meetings with more experienced individuals; and as-needed one-on-one meetings with the most senior personnel. Above all, their objective was to interact with everyone to identify potential problems and help find ways to overcome them. One exemplar noted: “I’m meeting with them…and making sure there aren’t problems, that if there is a problem, they’re bringing it to my attention” (Participant 43). Regarding rigor specifically, exemplars described being involved in and overseeing the execution of research and the production of data. Through their awareness of how data collection is progressing, they can help troubleshoot when studies are not going according to plan, or when they or lab members identify inconsistencies in data. For those doing experimental work, this typically meant they were in and out of the lab interacting with people. Those doing clinical work emphasized making sure that people know they are available.
Further, they are regularly in contact with lab members, so they can learn immediately if there are any questions, concerns, or needs for guidance about protocols. Access to the PI was of critical importance; typically, PIs interacted daily with their teams. Some exemplars mentioned that they have an “open door” or share their calendar with their team. Other PIs noted it is imperative that their team works in a space in close proximity to their office. Similar to their oversight of research and data collection, engagement as a supervisor allows their teams to address problems related to compliance as quickly as possible. Often, supervision related to compliance focused on checking in with the compliance point person and establishing the procedure that dictates how the compliance person communicates concerns directly to the PI. Several exemplars expressed the importance of being mindful about the possibility that compliance or research integrity issues could arise, like compliance protocol violations, plagiarism, or data manipulation, and of being vigilant in monitoring for them. They described not being suspicious, but mindful. Several exemplars specifically noted the importance of supervision and feedback for new members to the team and for people who are doing new, unfamiliar tasks. Through regular interaction, the PI gains confidence that lab members are prepared to work more independently. The PIs noted that regular supervision and feedback are the key means to develop trust in the knowledge and skills of their team members, which ultimately allows them to delegate and share in decision-making and ownership. It is worth noting that the supervisory role of a PI evolves. With a smaller lab, or at an earlier career stage, some described being extremely involved, even working alongside students in the lab. 
With time, a PI’s involvement in the smallest details may diminish, but this heightens the need for regular interaction, feedback, and open lines of communication. Also, PIs with large teams or labs may oversee managers or coordinators who help provide some of this supervisory support. However, the exemplars described remaining highly engaged with those managers or coordinators while still interacting regularly with everyone on their teams. Overall, the investigators described being aware, present, and available, and interacting regularly with individuals. Approachability, in particular, was a prominent theme in their responses. Most exemplars explicitly noted that people should feel any concern or question may be brought to their attention at any time. They consistently reiterated the importance of openness and communication. By knowing the day-to-day activities of the team, they can provide timely, effective guidance and help their teams overcome challenges. Importantly, some exemplars explicitly pushed back against the idea that this kind of engagement and regular contact is “micromanagement.” Many felt that, through their engagement on matters of data quality and compliance, they created the sense that every team member should play a role in monitoring and creating checks and balances around data and compliance. Furthermore, many noted that problems, mistakes, and concerns should be approached in an evenhanded fashion. One stated: “I think the key is to make sure that I am talking to them, and I’m not judging them or evaluating them when we talk. It’s all about discussion…” (Participant 39). Another noted: “If you yell at people, if you brand them as incapable researchers…then people will go into safe mode. If they go into safe mode, they are not really a scientist anymore…” (Participant 6).
Finally, another exemplar reflected on the aim of establishing openness: “I want to create an environment of openness with my students so that they’re able to present to me the truth and not be afraid that I’m going to get angry” (Participant 35).

Ensure sufficient training Another common practice, noted by 67% of exemplars, was the need to ensure adequate training of newcomers to their research teams to foster rigor and compliance. Several mentioned the adult learning heuristic ‘tell, show, do’: tell someone how to do a task, then show them how to do it, and then watch them as they practice doing it. Some described assigning tasks with specific metrics or outcomes they could check to ensure the person knew how to carry out the task accurately. Others described using a tiered, experience-based system in which experienced lab members train the beginners while the mid-level trainees observe. Later, mid-level trainees may become the experienced trainers. Several exemplars noted the need to draw on the help of lab members to train others, and to formalize training as much as possible, especially as the lab gets bigger and the PI accumulates greater responsibilities. Several exemplars noted using training guides and standardized training manuals to ensure that everyone is taught to do things the proper, rigorous way. Regarding compliance, several noted that they do not assume that people know the rules, and they make sure that team members meet institutional compliance training requirements. Other strategies for training on compliance-related matters were to work together on regulatory applications, to hold occasional special meetings about regulatory issues, or to bring in outside regulatory experts if extra training needs arise. One exemplar relayed their approach as follows:

“We have a monthly staff meeting, and we talk about compliance and regulations. I will send reminders as needed about the importance of compliance with regulations, compliance with our research protocols with the university requirements and regulations, and we discuss a lot of that also when we’re writing proposals.” (Participant 12)

Another described building staff knowledge through experience with compliance protocols:

“I require all of them to write and submit IRB or IACUC application, of course with my supervision, so that we all will be familiar with the issues, not just in terms of research, but also in treating humans and animals in ethical ways. I also require them to personally get trained in every aspect of research regardless of whether they’re going to do that type of work in their research or not.” (Participant 38)

Overall, the PIs described the critical need to ensure that no one, whether an undergraduate, graduate, or post-doctoral researcher, performs a task until they reach adequate mastery. Without this level of mastery, the team risks wasting resources, producing bad data, or failing to ensure compliance. By attending carefully to training, the PI develops trust in their team members, allowing them to work independently. Notably, several investigators mentioned that, because the period of necessary learning before being productive is extensive, they ask people to commit at least two years to their lab.

Foster positive attitudes about compliance In line with the idea that shared ownership fosters compliance, 65% of the exemplars described the need to foster positive attitudes about regulatory compliance. This included several elements. First and foremost was the need to have good relationships with regulatory agencies and institutional officials. PIs noted the importance of viewing these officials as helpful experts rather than antagonists. As noted above, some even reported inviting such experts into their labs to give presentations or share best practices. They saw compliance experts as part of a team’s competitive advantage: a resource to support the good work of the team rather than criticize it. However, this particular strategy will be most effective at institutions whose compliance offices have robust educational and support services. Another element of this practice was to provide continued professional development around issues of compliance, for example, by dedicating a lab meeting to discussing changes in regulations or a case example. Keeping the importance of compliance salient in the lab, particularly by making it a topic of regular discussion, allows the team to integrate compliance into how they naturally go about their work. An exemplar described encouraging awareness as follows: “We just have kind of ‘safety moments’ and safety updates every couple of months to make sure that everybody is remembering the proper set of procedures to go through if there’s an emergency in the lab…” (Participant 11). Additional strategies for fostering a culture of compliance were to set expectations by explicitly insisting that lab members follow the rules, and to make compliance easy, or the default approach to doing science, for example, by using standard operating procedures (SOPs) and ensuring adequate training.
Several exemplars went even further, noting that the team must value doing what is ethically best for patients, animals, the lab, or science, even when that goes above and beyond the requirements. Finally, several exemplars noted that even when the PI finds compliance matters a burden, it is important not to communicate that attitude to team members.

Scrutinize data and findings The exemplars described meticulousness and skepticism towards raw data and their team’s interpretations of findings; this strategy was mentioned by 61% of exemplars. They sought to cross-check data and findings as thoroughly as possible and to ensure a culture of transparency about data. To achieve this, they urged people to talk about data regularly and encouraged lab members to show each other their data, creating a culture in which people are comfortable discussing it. Exemplars ensure that lab members participate in both receiving and offering feedback so that they become comfortable with critique. One PI noted people should never be afraid to show their data: “I tell them I want all data. There is no good data, there is no bad data unless somebody makes a decision…I am ultimately responsible, or…the team is.… You have to have all original data.” (Participant 6). Some exemplars described bringing in outside colleagues who can offer fresh perspectives on their findings, or inviting others to collaborate, especially those they know will disagree, to generate a greater level of scrutiny. They set this tone of objectivity and scrutiny towards data especially through regular meetings and interactions about data. Many exemplars described that this mindset carries beyond meetings into the way members of their teams approach data as individuals and the way they work together collaboratively outside of meetings to ensure the best research.

Express values and expectations We heard from 60% of exemplars about the significance of a PI explicitly stating what is important as the team goes about their research: that is, what the PI values and, in turn, what the PI expects. To foster rigor, the exemplars communicate high standards. Specifically, they express the importance of doing research accurately, meticulously, transparently, and with integrity. They expect lab members to work hard, but they also tell them not to sacrifice quality by cutting corners or rushing. Several exemplars mentioned the need to maintain the credibility of the lab, noting they must always be able to defend their results with confidence. Furthermore, PIs tell lab members it is essential to maintain openness and objectivity as researchers. One exemplar described the mindset they expect from team members in the following way:

“I try to, on a daily basis, remind my students…be very, very skeptical…it starts first with each individual being skeptical of their own data, and not assuming anything. Never assume anything. That’s the fatal flaw in a scientist is to assume that you know what you’re doing.” (Participant 22)

Included in this notion of objectivity was the idea that lab members must never feel the PI expects or demands a particular result. Furthermore, several exemplars noted the need to explicitly tell lab members that they must be willing and able to admit mistakes, and that it is OK to “be human.” Similarly, PIs express high standards regarding compliance. We heard from several exemplars that they tell lab members the rules are important and insist that members make it a priority to know and follow them. In addition, exemplars state the consequences of compliance failures: they tell team members that the privacy of participants, or the safety of fellow lab members, is at stake.
Importantly, many noted the need to state these expectations particularly when a new member starts in the lab, and not to assume that people know the rules and expectations. Some exemplars also explicitly expected everyone to work together as a team to ensure all lab members understand and follow the rules. For example, some exemplars appointed existing lab members to help newcomers learn the rules. Other strategies to reinforce the importance of compliance were to have compliance concerns as the first agenda item at meetings and to ask new members to sign an agreement stating their commitment to following the rules.

Establish and follow standard operating procedures Another practice, mentioned by 50% of the exemplars, was to establish and follow standard operating procedures (SOPs) for scientific and compliance-related procedures. Exemplars stated it is unwise to take for granted that everyone understands how to perform procedures, or that they reliably remember every step. As one put it:

“…we write protocol manuals…we actually list out every step of what someone is supposed to do as they go through the study…so everything is laid out. We don’t assume that people know or will always remember every single step. So there’s a checklist for every protocol that we use, and people have to sign off.” (Participant 4)

Exemplars described the importance of highly specific SOPs and the need to use them for everything in their research. SOPs give PIs some peace of mind that lab members will perform procedures consistently. They also ensure that best practices will not be lost over time as membership of the lab changes. Moreover, SOPs provide a means to discuss and update scientific protocols and regulatory compliance in meetings, particularly to troubleshoot problems. Some exemplars noted drawing on the guidance and expertise of compliance offices to develop or update SOPs. As illustrated in Table 4, the remaining set of 9 less frequently mentioned practices emphasized procedures to ensure that teams carry out research in a meticulous and coordinated manner. These practices focused almost exclusively on rigor; only one related to compliance.

Table 4. Lab management practices fostering rigor and compliance. https://doi.org/10.1371/journal.pone.0214595.t004

Verify findings We heard of the need to verify findings from 48% of exemplars. For those doing experimental work in particular, it was essential to verify findings through repeated experiments, often by having multiple people replicate them. As one expressed: “…you repeat it, and you repeat it…we want to be sure…repeat is the mother of success” (Participant 13). Exemplars recommended finding help outside the lab if the lab is having trouble replicating. They also shared the importance of digging into results when they find something they expected, not only when they find something they did not expect. They stated the lab must figure out whether their result is real or not. One exemplar described it as follows:

“…being extremely slow and methodical and careful and going back and checking and rechecking and looking at things from different perspectives, bringing in a lot of people to consult, others to independently replicate your results, others to still independently replicate those. I’ve never put anything out that hasn’t been replicated at least twice independently by people.” (Participant 14)

The bottom line was to be skeptical of what they think they have in front of them and to look at their findings from new angles, for example, by repeating experiments, assessing new controls, or looking to other studies to triangulate their findings. Verifying findings was mentioned by about half of exemplars, and most investigators meant replicating findings. It might be expected that replication would be mentioned more frequently, but we think this may be a function of two factors. First, the investigators were conducting diverse types of research: some were doing basic laboratory studies where explicit replication is customary, whereas others performed clinical or behavioral research, where explicit replication of datasets is not possible. Instead, in these fields “scrutinizing data and findings” captures the standard approach to validating findings.
Second, we asked investigators to describe practices they thought others in their field might not do routinely, which may have led them to focus less on practices they potentially viewed as typical.

Document procedures We heard from 44% of exemplars that rigorous research requires meticulous, constant documentation of every decision made, even the smallest detail. One investigator noted: “I also ask my students, even though we’re a computational group, I ask them to maintain a research log, a lab notebook” (Participant 9). Whether documentation takes the form of lab notebooks, written plans, detailed notes, or metadata, exemplars conveyed that everything should be documented and researchers should plan ahead for documentation in their studies. There should be an expectation that personnel keep detailed records and notes about virtually everything. A few PIs provided the advice to give students or staff example lab notebooks, research reports, or lab meeting agendas and meeting minutes to illustrate the desired format and detail.

Design scientifically sound studies Another issue, raised by 40% of exemplars, was that rigor starts at step one. They described the need to spend the necessary time to plan everything as carefully as possible from the outset. As one PI put it: “You think before the study, and you don’t think during the study, or you think only if you have to, as needs arise” (Participant 42). Ensuring that their team planned the most scientifically sound, rigorous designs included several elements. A solid scientific foundation was one concern. They noted the need to study the literature to determine a theoretical framework, a conceptual model, or what findings are already available. This allows the team to make sound research design decisions and find creative ways to extend and improve upon what has already been done. A few PIs noted using outside guidelines and checklists for rigor available from agencies like the NIH and from journals. Some also described going outside their teams to obtain external feedback and criticism on ideas and study methods from the very beginning. Finally, specific study design considerations included the following: blinding, the use of controls, validating measures and techniques, calibrating equipment, using multiple methods of data collection to triangulate findings, and taking repeated measures of the same data point. Several described working backwards: beginning with the end goal in mind and designing the study from there. Above all, they reiterated that during the study design phase, PIs must not be afraid of findings that run counter to their hypotheses.

Coordinate the study team Most of the exemplars have multiple team members working together to carry out studies. As alluded to in other practices, coordinating the activities of the team is essential, and it was explicitly noted by 33% of exemplars. This practice emphasized assigning and discussing roles, down to the minute details, from the very beginning of the process, including getting the team together at the start of a project to map out goals and roles. As noted by one exemplar: “…we are planning an experiment, we all sit down as a group and go through it and say here’s what we’re asking, here’s what we need to do…who’s going to do what…” (Participant 5). It was then essential to repeat these check-ins during data collection and when creating a data analysis plan, executing analyses, or drafting results. Coordination and specification of roles was particularly important in how some exemplars ran their large research teams, because students were doing research under the supervision of post-doctoral researchers; clear delineation of roles ensured that people knew whom to contact for immediate, day-to-day guidance.

Handle data properly Finally, 31% of exemplars emphasized the importance of recording, storing, and backing up data appropriately. The PIs described storing data in a central place (often a shared drive) that is accessible to them at all times (and often to multiple lab members) and over a long period of time. They insisted that all data be stored in this central location, or that there be documented justification for anything that is not archived or is missing. They felt multiple researchers should be involved to improve data integrity—that is, typically, just one individual should not control data collection, analysis, and storage. Some PIs had co-leads on projects, and they made practices like double data entry and group data analysis the norm. One investigator described data management this way:

“…I always tell my students…back up all your data…at least in two places…and everyone is able to go into every single user’s directory…we are all collaborative, we look at everyone’s raw data in the analysis…and we maintain the data as long as we can.” (Participant 9)

Involve multiple researchers on projects Many practices implied the importance of teamwork, but several investigators (25%) explicitly noted the importance of internal collaboration for rigor and integrity. By having multiple people work on the same or parallel projects, access to and review of data are not isolated to one person, improving the integrity of the project. Strategies included “co-leads” on projects, group data entry and analysis, and the explicit expectation of co-authorship in the group. This approach lends itself to a collaborative environment and better research because cross-checks and feedback are built into the scientific process in the lab. One even noted the practice can affect the experiments researchers take on: “They also help each other so they co-author on each other’s papers, and so they’re a little less afraid…to do a labor extensive experiment because they know that they can have help for it” (Participant 35). It was noted, however, that this approach may be less feasible in smaller groups.

Share data outside the lab Some exemplars (21%) discussed sharing their data or findings outside their lab early and throughout the research process. Exemplars felt this transparency promotes accountability, reproducibility, and rigor, and, overall, advances science. Critique from other investigators on emerging findings could be obtained by presenting at conferences or by seeking out the input of other investigators. Sometimes this latter strategy includes inviting outside collaborators to join their projects. One exemplar described this practice as follows:

“…show the data to others and get feedback from people who may be naïve so that we can get a fresh and unbiased perspective, because if you’re only talking to people in your group you can get a bit myopic…it’s essential to take your findings outside of the group, ideally when they’re still in their preliminary stages so you can get feedback as the work is developing.…” (Participant 46)

Sharing also included submitting data or methods to repositories, as one exemplar described:

“…you have to get your data onto some kind of data repository and have it well explained enough that other people can understand what it is and make use of it…that even goes to code…everything you do is potentially retrievable and can be checked. It adds a burden to us, but I think it’s really a good thing, and it really kind of increases the value of the work if you can document everything and put everything out there…” (Participant 16)

Report findings completely and accurately Another practice shared by 17% of the exemplars was to ensure as much certainty in findings as possible before publication. This included, at times, explicitly ensuring they did not rush to publication. As one noted, “…we are careful not to talk about our results until we have confidence in them…I tend to be conservative in not publishing too quickly…we’re telling a story in each paper, and is the story really clear?” (Participant 21). They also discussed that it is essential to report all findings completely and precisely.