Results Policies were highly variable. Of the 23 companies eligible from the top 25 by revenue, 21 (91%) committed to registering all trials and 22 (96%) committed to sharing summary results; however, policies commonly lacked timelines for disclosure, and only six (26%) included trials on unlicensed medicines and off-label uses. Seventeen companies (74%) committed to sharing the summary results of past trials; the median start date for this commitment was 2005. Twenty two companies (96%) had a policy on sharing CSRs, mostly on request: two committed to sharing only synopses, and only two policies included unlicensed treatments. Twenty two companies (96%) had a policy to share IPD; 14 included phase IV trials, and only one included trials on unlicensed medicines and off-label uses. Policies in the exploratory group of smaller companies made fewer transparency commitments. Two companies fell short of industry body commitments on registration, and three on summary results. Examples of contradictory and ambiguous language were documented and summarised by theme. 23/42 companies (55%) responded to feedback; 7/1806 scored policy elements (0.4%) were revised in light of feedback from companies. Several companies committed to changing their policies; some made changes immediately.

Individual patient data (IPD) is the raw data collected during a clinical trial, with detailed information on each individual participant. As such, it presents important opportunities for research: for example, by allowing third parties to verify trialists’ initial analyses; permitting meta-analysis of pooled IPD for more accurate point estimates of benefits; giving greater power for subgroup analyses; and allowing new hypotheses to be explored in existing data, including on abandoned products and treatments. 26 27 28 However, it also presents a risk of re-identification of pseudonymised participants. Because of this, IPD is not generally posted publicly but is shared through various controlled access mechanisms, as with other forms of rich electronic health record data used by epidemiologists.

Clinical study reports (CSRs) are large documents, sometimes thousands of pages long, which are generated for regulatory purposes and follow a standard format set out under international guidance. 22 They are routinely created for industry trials but are less well known in the academic community; they contain a wealth of detail on methods and results that is often missing from other sources 23: one recent study estimates that CSRs contain twice as much information on benefits and harms as academic papers on trials. 24 From 2010 the European Medicines Agency began releasing CSRs on request, after a European ombudsman ruling of maladministration against the agency for withholding such information. 25

Registration is the most basic level of transparency: an entry on a publicly accessible registry to note that the trial exists, with some core information on features such as the intervention and the patient population. 15 Registration does not guarantee disclosure of results, but it can be used by researchers to identify completed trials as a step towards establishing whether they have subsequently disclosed their results.

Audit is a simple tool that is widely used throughout medicine to help improve standards by establishing a reference standard against which performance can be measured. 7 Through audit, those performing badly can be targeted for action to improve standards, and those performing to the highest standards can be identified, so that others can learn from their best practice. Audit data can be used by stakeholders such as regulators, patient groups, professional bodies, ethical investors, and healthcare workers to advocate for improvements in poorly performing companies and to help improve standards: for example, by informing individual consumer decisions and policy activity, 8 or by informing decisions made by ethical investors, as exemplified by the Access to Medicines Index. 9 There have been calls for audits of access to trial results for performance monitoring and comparison, 8 with several recent examples including trials from individual research centres, 10 11 12 all drugs approved in one year, 13 and all trials approved by individual ethics committees. 14 No attempts have, however, been made to systematically compare funder or sponsor policies on transparency.

The methods and results of completed clinical trials are routinely left unpublished. 1 This is a longstanding structural problem that impacts negatively on patient care. 2 3 Anecdotally, policies and actions on trial transparency vary widely between companies. For example, GlaxoSmithKline has publicly committed to share clinical study reports (CSRs) for all clinical trials back to 2000, 4 and it has set up a unit within the company to deliver this. 5 In contrast, AbbVie and InterMune sued the European Medicines Agency in a bid to prevent the regulator from disclosing the equivalent documents. 6

We generated descriptive statistics summarising the proportion of companies making key commitments and the extent to which commitments applied retrospectively. For companies that were members of an industry body, we assessed whether their policy was consistent with the minimum commitments made by four pharmaceutical industry bodies: EFPIA, PhRMA, International Federation of Pharmaceutical Manufacturers and Associations (IFPMA), and Japan Pharmaceutical Manufacturers Association (JPMA). All raw underlying data are shared online, 40 permitting others to critically review assessments or create composite measures of overall transparency commitments to compare companies.
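The tallying behind these descriptive statistics can be sketched as follows. This is a minimal illustration only: the data layout, company names, and policy element names below are hypothetical, not the audit's actual coding sheet, which is shared online. 40

```python
# Minimal sketch (hypothetical data layout): tallying the proportion of
# companies making each commitment from a coded extraction sheet. Each
# policy element is coded "yes", "no", or "unclear", as in the audit.
from collections import Counter

# Hypothetical coded data: {company: {policy element: code}}
coded = {
    "Company A": {"registers_all_trials": "yes", "shares_summary_results": "yes"},
    "Company B": {"registers_all_trials": "yes", "shares_summary_results": "unclear"},
    "Company C": {"registers_all_trials": "no",  "shares_summary_results": "yes"},
}

def proportion_committing(coded, element):
    """Fraction of companies coded 'yes' for a given policy element."""
    codes = Counter(company[element] for company in coded.values())
    return codes["yes"] / sum(codes.values())

print(round(100 * proportion_committing(coded, "registers_all_trials")))
```

Note that "unclear" codes count against a commitment here, matching the audit's principle of scoring only explicit commitments.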

In 2015, before commencing this study, we wrote to each company’s representatives inviting them to meet us for an hour. This was to explain our project, allow for feedback, and ensure that it was understood we would be collecting data on policies and publishing our findings. In June 2016 we sent each company the full set of data extracted for its policy and invited responses setting out any disagreements on any element, by making reference to the text of its publicly accessible policy as it stood at 17 April 2016. The data sent did not indicate how the company compared with other companies. We sent emails to the chief executive officer, the medical director, and other individuals in relevant roles who had previously responded to us on related queries about transparency. All responses were read, themes and disagreements extracted and reviewed by at least two members of the team (BG and SL), and changes made where appropriate.

Five experienced researchers (BG, CH, KRM, IO, and SL) with a background in clinical trials, transparency, or research integrity (or a combination of specialties) independently extracted the data from retrieved documentation and websites into a data extraction sheet reflecting the questions in web appendix 1. At least three researchers independently extracted data, and then met to agree the final coding by consensus. In some circumstances it was not possible to code answers as “yes” or “no”: these were coded as “unclear.” We attempted to minimise use of this code and achieve consensus where possible. Additionally, we collected examples of ambiguous, contradictory, or problematic commitments during coding, and grouped these by theme.

In line with best practice for audits, we established the reference standard for a transparency policy and developed a data structure to reflect this standard. Our reference standard for transparency was that all trials should be registered as per International Committee of Medical Journal Editors (ICMJE) requirements, 30 31 World Health Organisation guidance, 32 33 and legislative requirements 34; that methods and results be reported in summary form within 12 months of trial completion through online results reporting or other publication, as required under WHO guidance, 1 EU legislation, 35 36 and the Food and Drug Administration Amendments Act (FDAAA) of 2007; that CSRs be made publicly available if they have been created, in accordance with current EU legislation 36 and various calls from civic society and academia; 23 37 and that IPD be available on request in some form to researchers. 38 39 We then operationalised these broad commitments into structured questions across the four domains of registration, methods and results sharing, CSRs, and IPD, assessing the policy commitments on each domain in detail (see web appendix 1). Prospective and retrospective commitments were coded separately. Because some companies’ retrospective commitments only applied to recent trials, whereas others went back several decades, we extracted the start dates for retrospective commitments into our coding sheet. We also assessed whether certain categories of trial, such as phase IV trials conducted after approval of a new product, or trials of unlicensed medicines and off-label uses, were included under each policy.

We set out to include 50 companies: the top 25 pharmaceutical companies by global sales 29 and an arbitrary selection of smaller companies for exploratory analysis of their policies. Baxter was excluded as it no longer makes pharmaceutical products, and Teva was excluded as it is principally a generics company; during the audit period six smaller companies ceased to exist, largely through merger, leaving 42 companies. We searched Google for company policies and statements on clinical trial transparency using the key terms “company name” “clinical trial transparency” and “company name” “clinical trials” and by navigating through company websites for formal standalone transparency policies. We also searched for policies on clinical trial transparency from the European Federation of Pharmaceutical Industries and Associations (EFPIA) and Pharmaceutical Research and Manufacturers of America (PhRMA). We saved archive copies of all company website pages containing transparency commitments, downloaded archive copies of all standalone documents such as PDFs, and downloaded policy documents from EFPIA and PhRMA, as they stood at 17 April 2016.

Results

Overall, 42 companies were assessed: 22 were based in the European Union, 13 in the USA, six in Japan, and one in Canada. Forty companies (95%) had a publicly accessible policy, and in total we reviewed 527 pages of policy documentation. Table 1 shows the proportion of companies that met each transparency criterion. In total, 21 (91%) of the 23 top companies by revenue had a commitment to register all trials, 15 (65%) described their registration policy covering past trials, and two (9%) conducted an audit of compliance. Twenty two companies (96%) made a commitment to make all summary results available, and 17 (74%) committed to sharing the summary results of past trials; however, policies commonly did not include timelines for disclosure, and only six (26%) included trials on unlicensed medicines and off-label uses. Twenty two companies (96%) had a policy on clinical study reports (CSRs), of which 21 offered some form of sharing (17 on request and two sharing synopses); two included trials on unlicensed medicines and off-label uses. Twenty two companies (96%) had a policy to share individual patient data (IPD); one included unlicensed medicines and off-label uses, and 14 included phase IV trials. Table 1 shows that the exploratory group of smaller companies made fewer transparency commitments.

Table 1 Proportion of companies meeting each transparency criterion

Table 2 lists all the companies and policy documents coded, and summarises whether each committed to the standards described. The full extracted data on every company’s detailed commitments are available online. 40 The median start date for retrospective policies on both registration and summary results reporting was 2005. The median start date for sharing of both CSRs and IPD was 2012. Table 3 shows the range of start dates for policy commitments.

Table 2 List of companies and summary policy commitments

Table 3 Summary of start dates for policy commitment in each of the four policy domains

Problematic, inaccurate, and contradictory language

Table 4 gives examples of problematic, inaccurate, and contradictory language in policy documents, grouped by theme (see web appendix 2 for a longer list). Several policies used the word “all” problematically: a commitment was stated or implied in one place to apply to “all” trials, but a caveat was applied elsewhere in the documentation. Several policies used ambiguous language. For example, Merck Serono stated, as a commitment: “All Merck Serono clinical trials in patients will be considered [our emphasis] for publication in the scientific literature, regardless of outcome.” Similarly, several policies included poorly defined caveats about which trials were covered by the commitments. For example, Purdue “has committed to publish in a publicly available database the results of many of its clinical trials”; and Sanofi commits to posting results for “phase I to IV clinical trials conducted in patients, and for some [our emphasis] vaccines trials conducted in healthy subjects.” Some companies made commitments so broad that they were either improbable or contradicted by other parts of the policy. For example, Lundbeck lists a series of exclusions to its transparency policy, setting out the trials for which it will not report results, but also states that it adheres to the Declaration of Helsinki, which requires all results to be made publicly available. Further anomalies were identified. Some companies committed to sharing results on platforms that do not appear to exist. Several companies’ policies contained clauses implying concern that sharing summary results within the 12 month timeline required by various regulations would compromise academic journal publication.

Table 4 Examples of shortcomings in pharmaceutical company transparency policies, arranged by theme

We did, however, also find examples of good policies and exemplary clear language.
For example, Merck explicitly states that it applies “the same ethical standards to clinical trials in all countries including the developing world,” and Bristol-Myers Squibb makes an uncommonly explicit commitment to submit all phase IV trials for journal publication (“We commit to submitting all phase III and IV clinical trials regardless of outcome to peer reviewed journals for publication”). Conversely, Novartis explicitly excludes phase IV trials from CSR and IPD sharing: although less than ideal, the policy is clear and unambiguous on this issue.

Coding challenges

We encountered various coding challenges. Some companies had different policies for trials in different territories; because clinical trials research is a mobile global enterprise, we coded according to the elements of the policy that applied globally. Some companies committed to adhering to the law on sharing summary results (eg, Pfizer: “after the completion of those studies, we provide results on clinicaltrials.gov and other registries, in accordance with local regulations and guidelines”). However, such statements are difficult to interpret, since the regulations and guidelines themselves are often poorly specified and implemented: for example, the rules implementing the Food and Drug Administration Amendments Act of 2007 had still not been published when this audit was completed, and they do not apply to all trials. We therefore scored companies for a commitment only if they explicitly stated what they would share, rather than alluding to compliance with regulations. Some companies stated that they shared information on trials for treatments that were approved or for which the research programme had been terminated. We coded these as not committing to share results of trials on unapproved medicines (or off-label uses of approved medicines), because the process of termination is discretionary and may not happen within a consistent timeframe. Relatedly, many companies stated that they would submit results of previous trials to clinicaltrials.gov but gave no timeline: again, this meant results might never appear and yet not formally breach the company’s policy commitments. Undated documents often made it impossible to assess how far back policies went; for these companies we could not give a policy start date.
Where policy start dates were given, it was challenging to make dates comparable between companies, as some stated “trials started after” a given date, some “trials completed after” a given date, some “drugs approved after” a given date, and some simply gave a date without further specification. We normalised dates using the method described in web appendix 1.
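Capturing the qualifier attached to each start date, before applying any normalisation, can be sketched as follows. This is purely illustrative: the function name and qualifier strings are hypothetical, and the audit's actual normalisation rules are those set out in web appendix 1, which this sketch does not reproduce.

```python
import re

# Illustrative sketch only: extract the year and its qualifier from a policy
# start-date statement, so that differently phrased dates can later be
# normalised. The qualifier strings below are assumed categories drawn from
# the phrasings the audit encountered.
QUALIFIERS = [
    "trials started after",
    "trials completed after",
    "drugs approved after",
]

def parse_start_date(statement):
    """Return (year, qualifier) for a policy start-date statement.

    Statements with no recognised qualifier are tagged 'unspecified'.
    """
    year = int(re.search(r"(19|20)\d{2}", statement).group())
    lowered = statement.lower()
    for qualifier in QUALIFIERS:
        if qualifier in lowered:
            return year, qualifier
    return year, "unspecified"

print(parse_start_date("Trials completed after 2007"))
```

Keeping the qualifier alongside the year makes the subsequent normalisation step explicit, rather than silently treating "started", "completed", and "approved" dates as equivalent.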

Company feedback

When sent details of our extracted summary of their commitments, 23/42 companies (55%) replied. These replies were highly variable in length (mean 12 pages, range 1-39) and content. Two companies (Pfizer and GlaxoSmithKline) provided publicly accessible documents we had not found by searching companies’ websites. Neither document was easily discoverable by anyone seeking a company’s public policy: both had been prepared by communications departments (a press release and a policy PDF), both were housed in the media section of the website, neither was linked from other parts of the site, and both had only minimal links from the wider internet (on a refined Google search for “pages linking to” the web addresses for these documents). We accepted both, however, as the companies provided evidence that they were publicly accessible at April 2016. Across the two additional policy documents and 217 pages of company responses, which together raised over 300 points of contention, we changed seven elements in our database in light of companies’ critical feedback on our assessments: 0.4% (7/1806) of all coded policy elements. The full raw text of all responses is shared as underlying raw data alongside our own data sheets for each company. 40 Several companies acknowledged that their policy was ambiguous, or indicated that their policy did not reflect their actions. Specifically: 11 stated that their actions exceeded their public policy commitments; four stated that they would change their public policy in response to our contact; three stated that they planned to change their policy, without being explicit that this was in response to our contact; and four stated that they had already made a change to their public policies on the issues we raised, without being clear whether this was in response to our contact. Some responses about policy changes were problematic.
In challenging our coding, some company responses were constructive, citing the specific phrases in their policies whose reading they wished to contest. However, most were lengthy, used vague language, and failed to address the specific questions raised. There were recurring themes in responses. Four companies argued that our coding of their public policy was wrong, but the evidence they gave for this assertion was a private document not available in the public domain, such as an internal standard operating procedure. Five companies stated that our coding of their public policy was wrong, but gave no evidence as to why, and their response was not consistent with their public policy. Three companies argued that it was unfair to describe the limitations on their IPD or CSR sharing commitments because they also considered applications for data and documents that fall outside these commitments (this commitment was coded separately in our audit). Notably, Novo Nordisk stated in its response to our coding: “Just for reference, we do not say it anywhere, because we do not want to encourage submission of research proposals that go beyond the stated scope, but we do actually consider all received proposals.”