We used an integrated knowledge translation approach [9, 10] to engage stakeholders in a consensus-based process to develop a minimum set of core competencies for scientific editors of biomedical journals that was informed by a scoping review and editors’ perspectives. At the program outset, the team from the Centre for Journalology at the Ottawa Hospital Research Institute (JG, DM, KDC, and LS) assembled a core group of experts to represent scientific editing and publisher stakeholder groups. The experts included scientific editors from different parts of the world and various types and sizes of journals, editors-in-chief, and representatives from editorial organizations, biomedical journals, and publishers (Table 1). Our goal was to include diverse perspectives representing the spectrum of work involved in scientific editing.

Table 1 List of participating stakeholder groups

We followed a three-step process to develop the core competencies, with a fourth step to be implemented post-publication:

1. Pre-meeting activities (conduct scoping review and environmental scan; survey editors' perceptions and training needs; run a modified Delphi exercise)
2. Face-to-face consensus meeting (present results of pre-meeting research; hold consensus-based discussions)
3. Post-meeting activities (finalize competencies; solicit feedback from managing editors; survey editors on the usefulness of the competencies)
4. Post-publication activities (seek endorsement; plan dissemination and implementation activities)

Pre-meeting activities

Scoping review and environmental scan

A subset of authors from the current publication (VB, PB, SB-S, KDC, JD, JG, PG, HM, DM, LS, SS, PT, EW, and MW) conducted a scoping review and environmental scan of the literature related to core competencies for scientific editors [2]. This included a review of the published and unpublished scientific and non-scientific literature containing competency-related statements pertaining to scientific editors. They identified a total of 225 full-text documents, 25 of which were research articles. From these 225 documents, they extracted a total of 1566 statements potentially related to core competencies for scientific editors of biomedical journals, which yielded a list of 202 unique competency-related statements after de-duplication [2] (Fig. 1).

Fig. 1 Flow diagram for core competency development

Survey of editors’ perceptions and training needs

Another subset of authors from the current publication (VB, PB, SB-S, KDC, JD, JG, PG, DM, LS, SS, PT, and MW) engaged stakeholder organizations by inviting their scientific editor members to participate in an online survey of editors' perceptions and training needs [11]. Participants were respondents to advertisements seeking current or former scientific editors of journals. Advertisements for the research were sent to organizations with large scientific editor memberships (e.g., the World Association of Medical Editors [WAME], Council of Science Editors [CSE], European Association of Science Editors [EASE], and Cochrane), which forwarded an announcement about the survey to their members. The survey collected demographic data and invited respondents to share their perceptions of the relevance of 38 competency-related statements to their role as editors, as well as their perceptions of their own competence in relation to these statements. The 38 statements were developed from data collected in the scoping review [2] and from input from the publication's authors, and were chosen to broadly cover the major areas of the scientific editor role, including editors' knowledge, expertise, skills, and experience. Finally, respondents were asked to create a ranked list of their training needs. A total of 148 participants from around the world contributed to the needs assessment survey. The ranked list of needs yielded an additional 12 unique competency-related statements not previously identified in the scoping review and environmental scan (Fig. 1). This provided valuable insight into the views and needs of scientific editors across demographics and circumstances in the journal publishing landscape.

Modified Delphi process

A final subset of authors from the current publication (VB, PB, SB-S, KDC, JD, JG, PG, DM, LS, SS, PT, and MW) invited the respondents from the editor survey to participate in a three-round modified Delphi process to rate the importance of the 214 competency-related statements arising from the scoping review, environmental scan, and editor survey [2]. During the first round of the Delphi, participants were also invited to suggest any missing items; a further 16 unique items were identified, bringing the total number of competency-related statements to 230. A total of 105 participants took part in the Delphi: 27 completed one round, 20 completed two rounds, and 58 completed all three rounds. Their responses produced a list of 23 "highly rated" and 86 other "included" competency-related statements to inform the decision-making process during the consensus meeting (Fig. 1). This process, together with the survey of editors' perceptions and training needs, is described in detail elsewhere [11].

Face-to-face consensus meeting

In early June 2016, the Centre for Journalology group, in consultation with the other authors of the pre-meeting activities publications, assembled a group of 23 stakeholders in Strasbourg, France for a one-and-a-half-day meeting to work towards a minimum set of core competencies for scientific editors of biomedical journals. This group included nine stakeholders previously involved in the program (PB, SB-S, JG, PG, HM, DM, PT, EW, and MW) and 13 new stakeholders (SA, KB, JC, AG, KG, FH, SJ, DK, JL, AM, JM, JS, and GZ). The group was purposively sampled using snowballing principles; we invited our core group of experts to attend the consensus meeting and also asked them to contribute the names of other relevant editors (and others) who could represent a range of perspectives (for example, by geographical location, size and type of journal, and experience with the publishing process). Participants were invited via a formal letter of invitation emailed by the lead author. We did not specifically solicit representatives of author and peer reviewer groups, as most of the consensus meeting participants were, or had been at one time, authors and/or peer reviewers and therefore could provide insight concerning these perspectives. The results of the scoping review and environmental scan, survey of editors' perceptions and training needs, and modified Delphi were presented to the group. The presentation was followed by focused discussions on the 23 highly rated competency-related statements resulting from the Delphi, which were divided into four broad categories. Within these discussions, the group identified the competency-related statements that represented core competencies and suggested how to improve each statement. Other competency-related statements from the list of 86 included statements were also considered.
Following these discussions, the selected core competencies were reviewed to determine whether there were any missing competencies. At the conclusion of the consensus meeting, the group emerged with a draft list of 24 core competencies (Fig. 1).

Post-consensus meeting activities

Finalizing the competencies

Following the consensus meeting, numerous email rounds of editing and feedback took place among consensus meeting participants (led by JG), stakeholders who did not attend the consensus meeting (KDC, JD, LS, and SS), and other stakeholders who were invited to the consensus meeting but were unable to attend (VB, LC, and TG). After removing redundancies and overlap between items, combining similar items, refining wording, and removing items after further discussion, the group arrived at a final set of 14 core competencies for scientific editors of biomedical journals (Table 2).

Table 2 Minimum set of core competencies for scientific editors of biomedical journals

External validation

We also asked two managing editors (one not involved in this initiative) to review the proposed competencies, and we incorporated their feedback into the refining process. The managing editor of The Journal of the American Medical Association (JAMA) and Jason Roberts of Headache: The Journal of Head and Face Pain are responsible for facilitating peer review operations at their respective journals, implementing editorial policies and procedures, and ensuring that accepted manuscripts are formatted to meet the needs of the publisher.

Survey of editors on the usefulness of the core competencies

After reaching agreement on the final version of the competencies, we solicited feedback from scientific editors at a small (Headache) and a medium-sized (Canadian Medical Association Journal [CMAJ]) journal. These editors were asked to take 2–3 weeks to consider and reflect on the relevance of the competencies in the context of their role as scientific editors. Eight editors answered a short survey (hosted on SurveyMonkey.com) asking about the usefulness, aspirational qualities, and relevance of the competencies and whether any important competencies were missing. Responses generally endorsed the competencies as useful and relevant, though views on their aspirational qualities were mixed. Two new items were suggested, but both were later determined to already be covered by the list of core competencies.