There is currently a paucity of evidence-based strategies that have been shown to increase citations of peer-reviewed articles following their publication. We conducted a 12-month randomized controlled trial to examine whether the promotion of article links in an online cross-publisher distribution platform (TrendMD) affects citations. In all, 3,200 articles published in 64 peer-reviewed journals across eight subject areas were block randomized at the subject level to either the TrendMD group (n = 1,600) or the control group (n = 1,600) of the study. Our primary outcome compares the mean citations of articles randomized to TrendMD versus control after 12 months. Articles randomized to TrendMD showed a 50% increase in mean citations relative to control at 12 months. The difference in mean citations at 12 months for articles randomized to TrendMD versus control was 5.06 (95% confidence interval [2.87, 7.25]); this difference was statistically significant (p < .001) and was observed in three of eight subject areas. At 6 months following publication, articles randomized to TrendMD showed a smaller, yet statistically significant (p = .005), 21% increase in mean citations relative to control. To our knowledge, this is the first randomized controlled trial to demonstrate how an intervention can be used to increase citations of peer-reviewed articles after they have been published.

1 INTRODUCTION The number of published scholarly articles is growing exponentially, forcing publishers, journals, and authors to compete avidly for readers' attention. There are now an estimated 80 to 120 million scholarly articles available online (Himmelstein et al., 2018), and over 8,000 newly published articles are added daily to the 33,100 journals in the scientific, technical, and medical disciplines (Johnson, Watkinson, & Mabe, 2018). Despite this avalanche of content, the average academic reads only about 250 articles a year (Tenopir, Christian, & Kaufman, 2019; Van Noorden, 2014). While ongoing improvements to search technology (for example, PubMed and Google Scholar) and the increasing availability of Open Access have vastly enhanced access to content, their effectiveness in enhancing the visibility of content depends heavily on the extent to which the reader already has a preconceived interest in, and anticipation of, a given topic or article. Neither search nor Open Access helps readers serendipitously discover content they did not already know they were looking for; the growing number of articles makes discovering new and relevant articles increasingly challenging for readers (Kudlow, Rutledge, Shachak, McIntyre, & Eysenbach, 2016). For authors, the deluge of articles makes it less likely that any given article will be read, used, and ultimately cited (Van Noorden, 2017). Van Noorden's data suggest that roughly 35% of articles published between 1990 and 2015 remain uncited, a figure that may be increasing. The growing mismatch between the number of articles published and readers' ability to come across those that are relevant has created a pressing need for discoverability strategies that augment the visibility, usage, and impact of scholarly content. However, there is currently a paucity of evidence showing which strategies yield a positive effect on scholarly article impact, such as that reflected by citation counts.
One of the most widely studied strategies to augment citations is publishing content in Open Access (OA) journals, but the results remain controversial (Björk & Solomon, 2012). Several extensive studies have compared the citation counts of OA articles to closed-access articles and have found that OA articles accrue more citations than their closed-access counterparts, an effect that has been named the "open access citation advantage" (OACA; Eysenbach, 2006; Koler-Povh, Južnič, & Turk, 2013; Kurtz et al., 2005; Laakso & Björk, 2013; Mikki, 2017; Ottaviani, 2016; Piwowar et al., 2018; Wang, Liu, Mao, & Fang, 2015). For example, Archambault, Côté, Struck, and Voorons (2016) described a 40% OACA in a large sample of over 1 million articles. A 2018 study examining 67 million articles found an 18% citation advantage for OA articles relative to closed-access articles (Piwowar et al., 2018). The OACA, however, has not been universally accepted (Anderson, 2014; Davis, 2011; Davis, Lewenstein, Simon, Booth, & Connolly, 2008). For example, Philip Davis has conducted two major studies on the OACA and has not found evidence of an effect (Anderson, 2014; Davis, 2011; Davis et al., 2008). Davis et al. have also criticized much of the early research in support of the OACA on methodological grounds (Walters & Davis, 2011). Among the biggest confounders likely to be driving the OACA in observational studies is selection bias: authors select their best articles for OA publication, which in itself leads to the greater citation counts observed among OA articles. However, recent observational studies that attempted to control for selection bias have found that the OACA still exists. For example, McCabe and Snyder (2014) used a statistical model to remove the confounding effects of author selection and reported a small but meaningful 8% OACA.
In 2016, Ottaviani utilized a natural experiment study design that controlled completely for author selection; citation rates of closed articles were monitored as they became OA due to the natural 1-year embargo period being lifted (Ottaviani, 2016). Ottaviani (2016) found that OA articles received 19% more citations than closed articles. Greater article accessibility and time likely drive the OACA; OA articles are more accessible and tend to appear online earlier than in print, which in turn may lead to more citations. The estimated size of the OACA varies across and even within studies, but is often measured at between 15% and 200% more citations than closed-access articles (Ottaviani, 2016). However, the evidence supporting the OACA is still mixed, and the extent to which the OACA is real or due to a confounding effect of author selection bias is still not known. Aside from publishing in OA, very few strategies to increase citations of academic articles have been identified. A 2016 study published by employees of Academia.edu found that articles uploaded to Academia.edu received 16% more citations after 1 year than similar articles not available online, 51% more citations after 3 years, and 69% more citations after 5 years (Niyazov et al., 2016). These data, however, have been challenged in the literature (Davis, 2015). The first issue raised was that of selection bias. The Academia.edu study used a case–control observational design; 66.5% of On-Academia articles were found freely available from other websites, compared to 36.9% of Off-Academia articles. The citation advantage found by Niyazov et al. (2016) may therefore be mediated by the differences in accessibility between cases and controls. For further examination of the limitations of the Academia.edu study, see the review by Davis (2015).
Another strategy, the promotion of articles in social media, has been widely studied, but whether social media confers any visibility, usage, or impact advantage remains inconclusive (Dixon, Fitzgerald, & Gaillard, 2015; Erdt, Aung, Aw, Rapple, & Theng, 2017; Fox et al., 2015, 2016; Hawkins, Hunter, Kolenic, & Carlos, 2017; Hayes, Kobner, Trueger, Yiu, & Lin, 2015; Trueger, Bokarius, Carroll, April, & Thoma, 2018). Recent studies published by Hawkins et al. (2017) and Trueger et al. (2018) found that intensive social media promotion significantly increased pageviews of scholarly articles. In contrast, two rigorous randomized controlled trials completed by Fox et al. (2015, 2016) found that the promotion of articles in social media did not yield any increase in article pageviews. There are, however, numerous limitations associated with these studies. Few studies yield the same outcomes, and none of the social media studies have examined the effects on article-level metrics other than pageviews and downloads. No studies have examined how the promotion of articles in social media affects Mendeley saves, a metric that predicts citations, or citations directly. Therefore, the extent to which promoting articles in social media leads to increases in citations is not known. Our group previously published studies examining the extent to which the promotion of articles in a cross-publisher distribution platform (TrendMD) increases article visibility and usage (Kudlow et al., 2016, 2017; Kudlow, Rutledge, & Eysenbach, 2014). We first examined the effects of promoting articles in the TrendMD Network on pageviews. TrendMD conferred an 87% increase in pageviews relative to control in a 4-week randomized controlled trial (Kudlow et al., 2014).
We replicated these initial results in a 3-week crossover trial and found that promotion of articles in the TrendMD Network yielded 30% and 49% weekly increases in pageviews relative to control (Kudlow et al., 2016). In 2017, we completed a 4-week randomized controlled trial examining whether the promotion of articles in the TrendMD Network confers an increase in article Mendeley saves. We examined Mendeley saves as an outcome because it is a robust article usage metric that is strongly correlated with future citations (Aduku, Thelwall, & Kousha, 2017; Li & Thelwall, 2012; Thelwall, 2017; Thelwall & Nevill, 2018; Thelwall & Sud, 2016). In that study, articles randomized to TrendMD had a 77% increase in article saves on Mendeley relative to control (Kudlow et al., 2017). The results so far suggest that promotion of scholarly articles in the TrendMD Network increases article visibility and usage. However, there are significant limitations that warrant mention. First, all of the studies completed to date have examined only articles published in a single OA journal, the Journal of Medical Internet Research. This journal, although multidisciplinary, likely has a technology-savvy audience. Therefore, the extent to which the earlier results are generalizable, as opposed to being confounded by an audience bias, is unknown. Second, although several lines of evidence demonstrate that Mendeley saves correlate with citations, we have yet to measure how article promotion in the TrendMD Network affects citation counts directly. In the current study, we conducted a 12-month multidisciplinary randomized controlled trial to address the limitations of our earlier work and to examine if, and to what extent, promotion of articles in the TrendMD Network confers a citation advantage. We also sought to determine whether TrendMD's effect on citation counts is specific to particular disciplines or consistent across all disciplines.
We hypothesized that promotion of articles in TrendMD would yield a citation advantage at 12 months and a Mendeley save advantage at 6 months.

2 METHODS We conducted a 12‐month randomized controlled trial that included 3,200 articles published in 64 peer‐reviewed journals across eight subject areas. The prespecified length of the study was 6 months for the intervention and an additional 6 months of observation for a total of 12 months. Citations and Mendeley reader counts (that is, saves) were measured as outcomes of the study. We measured citations at 6 and 12 months and Mendeley saves at 6 months. We did not measure Mendeley saves at 12 months because the intervention only lasted 6 months; it is highly unlikely that TrendMD could yield any benefit to Mendeley saves once articles are no longer promoted since Mendeley saves occur immediately following a click on an article. The subject areas/categories were selected based on the eight categories listed in Google Scholar (https://scholar.google.ca/citations?view_op=top_venues&hl=en). The categories selected were: Business Economics and Management; Chemical and Materials Sciences; Engineering and Computer Science; Health and Medical Sciences; Humanities, Literature and Arts; Life Sciences and Earth Sciences; Physics and Mathematics; and Social Sciences. For each subject area, the top 20 journals ranked by Google Scholar's h‐5 index were selected (Google Scholar only displays the top 20 journals in each subject area). Please see the supplementary material (Appendix S1) for our rationale for using Google Scholar and the h‐5 index in our selection criteria. Eight journals were then randomly selected with a random number generator from the top 20 in each subject area to be included in the study; we did this as opposed to selecting the top journals in each subject area so that our sample would include a randomized mixture of journals of high and lower impact in each subject area. 
Including both high and lower impact journals was important to our study because we wanted to mitigate the potential confounder that TrendMD promotion is only effective in high-impact journals. Journals that were not indexed in Scopus and/or Web of Science were excluded from the study. Preprint servers such as ArXiv were also excluded from the study. Starting from April 2018, 50 of the most recently published original articles or review articles in each journal were selected for inclusion in the study. Articles selected for inclusion were published online between January 2017 and April 2018; this includes early view articles. Articles were excluded if they did not contain an abstract or DOI. Block randomization using a random number generator at the subject level was used to randomize articles to either the control or the intervention arm of the study. For each subject area, 200 articles were randomized to control and 200 articles were randomized to intervention. In total, 1,600 articles were randomized to control, and 1,600 articles were randomized to the intervention. The overall study design is presented in Figure 1 (overall study design). 2.1 Intervention TrendMD (www.trendmd.com) is a cross-publisher article recommendation and distribution platform that, as of May 2019, was embedded in over 4,700 journals and websites from 300 publishers and seen by ~125 million readers per month. Participating publishers use TrendMD to distribute their published article links within the article recommendations displayed on articles within their journals (nonsponsored recommendations) or third-party journals within the TrendMD Network (sponsored recommendations; Figure 2).
TrendMD's content distribution model is benchmarked to similar services in the consumer web, where the leading networks Outbrain (www.outbrain.com) and Taboola (www.taboola.com) generate the "From the web" and "You may like" recommendations seen alongside the content on many popular websites like CNN or BBC (Kudlow et al., 2016; Figure 3). Figure 2 shows the difference between sponsored and nonsponsored TrendMD links; Figure 3 shows how TrendMD works. The intervention consisted of promotion of 1,600 articles in the TrendMD Network for 6 months, between May 1, 2018 and November 2, 2018. Articles included in the TrendMD Network are displayed as recommended article links (Kudlow et al., 2016). Links to articles randomized to TrendMD were displayed as sponsored recommended links on publications participating in the TrendMD Network. There was an average of 4,300 participating journals and 121 million readers per month during the course of the study. The frequency of sponsored article link placements was determined by a relevancy score based on the following: relatedness (that is, keyword overlap), collaborative filtering (similar to Amazon's "people who bought this item also bought that item"), and user clickstream analysis (the Netflix approach, basing recommendations on the users' interests expressed through their online history; Kudlow et al., 2014, 2016). As a result of the relevancy scoring system, some articles randomized to TrendMD were both seen more often (that is, accrued more link impressions) and clicked on more frequently than others in the TrendMD Network. The sponsored links are displayed in the TrendMD Network as long as they are relevant and the advertiser account balance is greater than $0. An account in TrendMD was created for this study.
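The keyword-overlap ("relatedness") component of the relevancy score described above can be illustrated with a simple Jaccard similarity over title words. This is an illustrative sketch only: TrendMD's actual scoring algorithm is proprietary and also blends collaborative filtering and clickstream signals, and the titles below are hypothetical.

```python
def keyword_overlap(a: str, b: str) -> float:
    """Jaccard similarity between the word sets of two titles.

    Illustrative stand-in for the 'relatedness' signal; TrendMD's
    real relevancy score is proprietary and multi-factor.
    """
    words_a = set(a.lower().split())
    words_b = set(b.lower().split())
    if not (words_a or words_b):
        return 0.0
    return len(words_a & words_b) / len(words_a | words_b)

# Rank hypothetical candidate articles against the article being read.
reading = "open access citation advantage in medical journals"
candidates = [
    "citation advantage of open access articles",
    "deep learning for protein folding",
]
ranked = sorted(candidates, key=lambda c: keyword_overlap(reading, c), reverse=True)
```

In a production recommender, scores like this would be combined with behavioral signals before deciding which sponsored links to display and how often.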
The 1,600 articles randomized to TrendMD received a maximum total budget of $9,600 at a cost-per-click of $0.10 USD, for 96,000 sponsored TrendMD clicks. The account was allowed to spend up to a maximum of $1,600 per month, or 16,000 clicks per month, over the 6-month study period for a total of 96,000 clicks. The actual amount spent by the account was $1,600 per month; all clicks were delivered each month throughout the 6-month study. A summary of how TrendMD works and the outcomes measured can be seen in Figure 4. 2.2 Control Articles randomized to control (n = 1,600) received no promotion in the TrendMD Network. Articles randomized to control received traffic by organic means (for example, Google, Google Scholar, PubMed, and so on) and other means implemented by publishers and/or authors of content outside the context of this study. 2.3 Primary outcome The primary outcome of our study was the mean citation count for articles randomized to TrendMD compared to control. Article citation counts at 12 months were abstracted through the Scopus API on May 1, 2019. 2.4 Secondary outcomes 2.4.1 Twelve-month analysis Twelve-month mean citation counts were compared for articles randomized to TrendMD versus control for each of the eight subject areas. This analysis was completed in order to assess whether the effects of TrendMD promotion were discipline-specific. Article citation counts were abstracted from the Scopus API on May 2, 2019. 2.4.2 Six-month analysis Differences in mean citation counts were compared at 6 months for articles randomized to TrendMD versus control. Mean citation counts were compared at the aggregate level as well as within each of the eight subject areas included in the study. Six-month citation counts were abstracted from the Scopus API on November 2, 2018.
The 6‐month mean difference in Mendeley reader counts (that is, saves) for articles randomized to TrendMD versus control was also a secondary outcome. Mendeley saves were compared for articles randomized to TrendMD versus control in aggregate and within each of the eight subject areas. A Mendeley save is counted when an article has been saved to a Mendeley user library account. Mendeley saves were selected because this metric is a sensitive future predictor of citations across all academic disciplines (Ebrahimy, Mehrad, Setareh, & Hosseinchari, 2016; Thelwall & Wilson, 2016). Mendeley reader counts were abstracted through the Mendeley API (Mendeley, n.d.) on November 2, 2018. TrendMD pageview data (that is, click, impression, and click‐through rate) were collected for articles randomized to TrendMD. A TrendMD click is counted when a user clicks on a promoted article link displayed in the TrendMD widget; a TrendMD click leads to a TrendMD pageview. A TrendMD impression is counted when a hyperlink is displayed to a user viewing a TrendMD widget. The click‐through rate is calculated by dividing TrendMD clicks by TrendMD impressions. TrendMD pageview data were abstracted from the TrendMD database on November 2, 2018. 2.5 Statistical analysis We performed an a priori power calculation to determine the necessary sample size to detect differences in our primary outcome of mean citation counts between groups at 12 months. Based on prior research, we assumed that both the primary and secondary outcomes would have a log‐normal distribution (Fox et al., 2016). We estimated that the difference in citation counts at 12 months would be five, with a standard deviation of 45. Therefore, assuming a log‐normal distribution for 12‐month citation counts, an effect of the intervention could be detected at 80% power using a two‐tailed independent samples t‐test (alpha = .05) by a sample size of 1,272 articles in each group (Kadam & Bhalerao, 2010). 
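The a priori sample-size calculation described above can be reproduced with the standard normal-approximation formula for a two-tailed two-sample t-test. This is a minimal sketch (not the authors' SPSS procedure) using only the Python standard library; with the paper's stated assumptions of a 5-citation difference and a standard deviation of 45, it recovers the reported 1,272 articles per group.

```python
from math import ceil
from statistics import NormalDist

def two_sample_n(diff: float, sd: float, alpha: float = 0.05, power: float = 0.80) -> int:
    """Per-group sample size for a two-tailed two-sample t-test
    (normal approximation, equal variances and equal group sizes)."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # ~1.96 for alpha = .05
    z_beta = NormalDist().inv_cdf(power)           # ~0.84 for 80% power
    d = diff / sd                                  # standardized effect size
    return ceil(2 * (z_alpha + z_beta) ** 2 / d ** 2)

# Assumptions from the paper: mean difference of 5 citations, SD of 45.
n_per_group = two_sample_n(diff=5, sd=45)
print(n_per_group)  # 1272, matching the paper's 1,272 articles per group
```

The normal approximation is appropriate here because the resulting sample is large; exact t-based power calculations would differ only negligibly at this size.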
Given the uncertainty in our citation count assumptions, we added additional articles in order to have 1,600 articles in each arm of the study. The study was also powered to detect differences in Mendeley saves at 6 months. However, the study was not powered to detect differences in citation counts at 6 months, nor was it powered to detect differences across the subject area-level comparisons. Baseline characteristics of articles at the start of the study were tabulated and compared across randomized arms of the study. We categorized articles by subject area, access type (closed vs. open access), Journal Impact Factor, as well as citations and Mendeley saves. Both the primary and secondary outcomes were analyzed with the two-sample t-test on log-transformed data. In the event that mean differences were statistically significant, we calculated effect sizes using Cohen's d (Cohen, 1977) on log-transformed data. Cohen's d is defined as the difference between two means divided by a standard deviation for the data. Lastly, we performed a stepwise multivariate Ordinary Least Squares (that is, linear) regression analysis to determine the predictors of citations at 6 and 12 months, and Mendeley saves at 6 months; these were log-transformed dependent variables in the model. Journal Impact Factor, access type (that is, OA vs. closed), baseline Mendeley saves and citations, and TrendMD clicks (that is, clicks on promoted article links) and impressions (that is, displays of promoted article links) were covariates in the regression model; they were selected as covariates because each has known independent effects on citations (for example, Journal Impact Factor is a predictor of citation counts) and because they did not exhibit multicollinearity. All regressions adjusted the standard errors for clustering of citations and Mendeley saves using Huber–White standard errors; this corrected for heteroscedasticity (White, 1980).
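The core of the analysis plan above, a two-sample t-test on log-transformed counts, with Cohen's d computed for significant differences, can be sketched as follows. This is an illustrative reimplementation on synthetic log-normally distributed data, not the authors' SPSS analysis of the study dataset; the p-value uses a normal approximation to the t distribution, which is reasonable at n = 1,600 per arm.

```python
import math
import random
from statistics import NormalDist, mean, stdev

def log_ttest_cohens_d(group_a, group_b):
    """Two-sample comparison on log-transformed counts.

    Returns (Cohen's d on the log scale, two-tailed p-value).
    The p-value uses a normal approximation to the t distribution,
    acceptable for the large per-arm sample sizes in this study.
    """
    la = [math.log(x + 1) for x in group_a]  # log(x + 1) handles zero-citation articles
    lb = [math.log(x + 1) for x in group_b]
    ma, mb = mean(la), mean(lb)
    na, nb = len(la), len(lb)
    pooled = math.sqrt(((na - 1) * stdev(la) ** 2 + (nb - 1) * stdev(lb) ** 2)
                       / (na + nb - 2))
    d = (ma - mb) / pooled                                # Cohen's d
    t = (ma - mb) / (pooled * math.sqrt(1 / na + 1 / nb))  # t statistic
    p = 2 * (1 - NormalDist().cdf(abs(t)))                 # two-tailed
    return d, p

# Synthetic citation counts, NOT the study data: log-normal draws with a
# higher location parameter for the simulated "promoted" arm.
rng = random.Random(0)
treated = [int(math.exp(rng.gauss(1.5, 1.0))) for _ in range(1600)]
control = [int(math.exp(rng.gauss(1.2, 1.0))) for _ in range(1600)]
d, p = log_ttest_cohens_d(treated, control)
```

A full replication would additionally fit the OLS regression with Huber–White (heteroscedasticity-robust) standard errors described above, for which a dedicated statistics package is the practical choice.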
A two-tailed p < .05 was considered statistically significant. In order to mitigate the type I error rate, the Bonferroni correction was used to control for the multiple comparisons made across the eight disciplinary differences in mean citation and Mendeley reader counts (Chen, Feng, & Yi, 2017). A two-tailed p < .00625 was considered statistically significant for mean differences in Mendeley reader and citation counts across subject areas. Arithmetic mean values are shown with 95% confidence intervals (CIs) on nonlog-transformed data unless otherwise specified. Tests for normality were included in the analysis. SPSS v. 25 (IBM, Armonk, NY) was used to complete the statistical analyses.
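The Bonferroni correction applied to the eight subject-area comparisons amounts to dividing the significance threshold by the number of comparisons, which yields the .00625 cutoff stated above. A minimal sketch (the p-values below are hypothetical, not study results):

```python
def bonferroni_threshold(alpha: float, m: int) -> float:
    """Per-comparison significance threshold under the Bonferroni correction."""
    return alpha / m

def significant(p_values, alpha: float = 0.05):
    """Flag which of m comparisons survive the corrected threshold."""
    cutoff = bonferroni_threshold(alpha, len(p_values))
    return [p < cutoff for p in p_values]

# Eight subject-area comparisons, as in the study: cutoff = .05 / 8 = .00625.
# These p-values are invented for illustration.
flags = significant([0.001, 0.004, 0.02, 0.3, 0.007, 0.0001, 0.5, 0.05])
```

Note that under this cutoff a comparison with p = .007 is not significant even though it would be at the uncorrected .05 level; this is exactly the type I error control the correction buys.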

4 DISCUSSION To the best of our knowledge, this was the first randomized controlled trial to demonstrate that an intervention can be used to increase citations of peer-reviewed articles after they have been published. Promotion of articles in the TrendMD Network conferred an overall mean Mendeley save and citation advantage relative to nonpromoted articles after 6 months. Although promotion of articles lasted only 6 months, TrendMD conferred a larger increase in citations at 12 months than at 6 months. The effects of TrendMD, however, were not consistent across all disciplines. At 6 months, TrendMD increased Mendeley saves and citations for articles published in seven and two out of the eight disciplines tested, respectively. At 12 months, TrendMD increased citations for three of the eight disciplines tested. At both 6 and 12 months, TrendMD conferred the largest citation advantage to articles published in the health, medical, and life sciences, and the smallest in Humanities, Literature and Arts; the same disciplines had the largest and smallest increases in Mendeley saves, respectively, at 6 months. This study significantly adds to the scant corpus of literature examining the efficacy of strategies to distribute peer-reviewed content. The most well-studied strategy to increase citations of articles is publishing content in OA journals (Eysenbach, 2006; Koler-Povh et al., 2013; Kurtz et al., 2005; Laakso & Björk, 2013; Lawrence, 2001; Mikki, 2017; Ottaviani, 2016; Piwowar et al., 2018; Wang et al., 2015). However, the extent to which the OACA is mediated by author selection bias, as opposed to the increased accessibility driven by OA publication, is not yet known (Anderson, 2014; Davis, 2011; Davis et al., 2008). The rise of OA publications reduces one barrier to effective dissemination by making literature freely available to all who wish to consult it.
However, OA articles still rely on readers actively seeking out content and knowing what to search for, which dramatically limits OA's ability, in and of itself, to increase article impact. In our current study, OA was not found to confer any citation advantage relative to closed-access articles (Table 6). Other, more active dissemination strategies, such as promoting articles in social media channels, have similarly yielded inconclusive results (Dixon et al., 2015; Erdt et al., 2017; Fox et al., 2015, 2016; Hawkins et al., 2017; Hayes et al., 2015; Trueger et al., 2018). Data presented in this study, therefore, address a pressing unmet need of authors, publishers, and funders for evidence-based strategies that can be used to augment the impact of peer-reviewed content. Article promotion in the TrendMD Network likely conferred the observed citation advantage by both direct and indirect means. The direct effects of TrendMD were the result of readers clicking on promoted article links displayed in the widget. A portion of readers who clicked on promoted article links also saved these articles to their Mendeley reference libraries and ultimately went on to cite these articles while creating other scholarly work. The results of this study provide evidence for this direct mechanism; pageviews (that is, clicks) driven by TrendMD were an independent predictor of both Mendeley saves at 6 months and citations at 6 and 12 months (Table 6). In addition to direct means, the results of this study suggest that TrendMD conferred a citation advantage through indirect means. The indirect mechanism of TrendMD was not readers clicking on promoted article links, but rather readers seeing the recommended article links displayed in the widget while reading other relevant scholarly material. Once readers saw the links, they may have saved the promoted articles as bookmarks, shared the articles with their colleagues over email, or taken note of the articles to visit later.
This indirect mechanism is evidenced by the finding that link impressions were an independent predictor of citations at both 6 and 12 months. In contrast to the direct mechanism, the indirect mechanism of TrendMD does not appear to be mediated by users saving articles more frequently to their Mendeley libraries; impressions were not shown to predict Mendeley saves at 6 months. Although TrendMD conferred an overall citation advantage to promoted articles, this was not true for articles in all subject areas. One possible explanation for the differing effects of TrendMD on citations by subject area could be the varying length of publication cycles among subject areas. Articles in the fields of medicine, life sciences, and physics are typically published faster than articles in the humanities and social sciences (Johnson et al., 2018). The results of this study indicate that time had a significant impact on citation growth; TrendMD was found to have a greater impact on citations measured at 12 months than at 6 months. Had we measured citations at 2 years, we may have found an increase in citations across more disciplines; this possibility is strengthened by the fact that we found a Mendeley save advantage for seven out of eight disciplines. Replicated evidence suggests that for some disciplines, like the humanities and social sciences, a Mendeley save advantage may take years to materialize into citations (Thelwall, 2018). In contrast, it is also possible that TrendMD was not able to confer a citation advantage after 1 year in all disciplines because of the uneven distribution of audience interests and publications participating in the TrendMD Network at the time of the study. As shown in Table 5, subject areas received differing amounts of impressions and clicks from TrendMD, a reflection of the disciplinary composition of the TrendMD Network.
Impressions and clicks were both independent predictors of citations, so it is possible that the relatively low numbers of impressions and clicks for particular subject areas were the underlying reason for the lack of efficacy of TrendMD to increase citations. If true, the implication is that even if the observation length of this study had been increased, we still would not have found an increase in citations across all disciplines. Extending the observation window of the study to 2 or 3 years would help to discern whether TrendMD's lack of efficacy in all disciplines was due to time or to other factors, such as the uneven subject area distribution of the TrendMD Network. The research presented here has several strengths. First, the outcomes, Mendeley saves and citation counts, are unbiased and objective, increasing the reproducibility of the results. Second, we employed a rigorous randomized controlled trial design, which minimizes the likelihood of bias and confounding. Third, our sample size was large, and our study was adequately powered for our primary outcome of differences in mean citations at 12 months. There are, however, several limitations to this research. First, authors P.K., D.B.D., A.R., and G.E. have a conflict of interest with the results presented as creators, owners, and/or employees of TrendMD. Risk of bias, however, was mitigated by the randomized controlled trial design as well as the inclusion of coauthor A.S., who does not have a conflict of interest. Second, our inclusion criteria selected only for articles published in a random sample of eight out of the top 20 journals with the highest h-5 index in Google Scholar categories, which may make our results less generalizable to articles published in journals with lower impact factors.
Since the journal name is displayed in the TrendMD widget, users may have recognized these names and clicked on them more frequently than they otherwise would have; it is, therefore, possible that had we included lower impact factor journals, TrendMD would have had less effect on article usage and visibility. Future studies are needed to determine whether TrendMD still confers a usage and citation advantage to articles published in lower impact factor journals. That said, we did not find a correlation between Journal Impact Factor and TrendMD clicks or impressions in this study. Another possible limitation, discussed earlier, is the length of observation in this study. Our observation period may explain why we found an increase in citations at 6 and 12 months for disciplines like medicine, which have relatively short publication cycles, but not for articles published in the humanities, which tend to have longer publication cycles (Johnson et al., 2018). In addition, it is possible that TrendMD's effect on citations and Mendeley saves may saturate and diminish over a longer period of time as users become accustomed to seeing the same recommended articles. We plan to examine the citation rates again at 2 and 3 years. Fourth, although our study was adequately powered to detect mean differences across all articles, it may not have been adequately powered to detect mean differences in citation counts between disciplines. We therefore cannot refute the possibility of type II errors, that is, falsely concluding there were no differences in mean citation counts within disciplines. Another limitation is that the number and type of publishers participating in the TrendMD Network may change over time, which may affect the reproducibility of these findings.
Just as the efficacy of social media to promote articles is dependent on the account's number of followers, the efficacy of TrendMD promotion is likely dependent on the number and type of publishers in the Network (Hawkins et al., 2017; Hayes et al., 2015; Trueger et al., 2018). In general, the more publishers using TrendMD, the greater the efficacy of the channel to promote articles, and the greater the likelihood it confers a citation benefit. If, in the future, publishers stopped using TrendMD, the channel is unlikely to be as effective as described in this study. Lastly, our results were limited by the fact that no other online interventions, including social media, were tested. Future studies are planned to test the effects of the distribution of articles via both paid and unpaid social media channels, in parallel with cross‐publisher recommendations via TrendMD. Despite the limitations, this study demonstrates that the promotion of articles in a cross‐publisher online distribution channel (TrendMD) can be used to increase citations of articles across several academic disciplines. The citation advantage conferred to promoted articles was observed as early as 6 months and increased at 12 months.

ACKNOWLEDGMENTS This research was supported in part by two investment grants made by the Ontario Centres of Excellence (OCE), a Canadian government‐funding program. The grants were OCE Market Readiness CC (Grant Number 22292) and OCE Market Readiness CB (Grant Number 23811). We also thank members of the TrendMD team for help with data acquisition. Dr. Reithmeier is thanked for his comments on the article.

CONFLICT OF INTEREST Dr. Paul Kudlow, Alan Rutledge, and Dr. Gunther Eysenbach are cofounders and owners of TrendMD Inc. Devin Bissky Dziadyk was a full‐time employee of TrendMD Inc. at the time of the study. Dr. Aviv Shachak was cosupervising Dr. Paul Kudlow for his graduate work but declares no financial conflict of interest with TrendMD Inc. or the data presented herein.

Supporting Information: Appendix S1 (asi24330-sup-0001-Supinfo.docx, 17.4 KB).