Notes on this post:

Stuart Buck of the Laura and John Arnold Foundation, as well as Prof. Steven Goodman of METRICS, reviewed a draft of this post.

Throughout the post, “we” refers to GiveWell and Good Ventures, who work as partners on GiveWell Labs. [Added August 27, 2014: GiveWell Labs is now known as the Open Philanthropy Project.]

We’re very excited about the launch of the Meta-Research Innovation Center at Stanford (METRICS), co-founded by John Ioannidis and Steven Goodman and supported by a grant from the Laura and John Arnold Foundation (LJAF). METRICS will bring together researchers to study the state of medical research quality, including questions such as how concerned we should be about publication bias (a topic the founders have published helpful papers on in the past), and to advocate for potential solutions. (Also see coverage in The Economist.)

Our work on GiveWell Labs was responsible for initially connecting the METRICS founders to LJAF, which is providing a commitment of up to ~$6 million to help METRICS through its initial years, during which time METRICS will be seeking more sources of support. We find it worthwhile to lay out the events that led to this connection, partly because they indicate some degree of impact on our part (though not of our usual kind) and partly because they make for an interesting case study in how to source giving opportunities.

Our role in connecting METRICS and LJAF

In 2012, we investigated the US Cochrane Center, in line with the then-high priority we placed on meta-research. As part of our investigation, we were connected – through our network – to Dr. Steven Goodman, who discussed Cochrane with us (notes). We also asked him about other underfunded areas in medical research, and among others he mentioned the idea of “assessing or improving the quality of evidence in the medical literature” and the idea of establishing a center for such work.

During our follow-up email exchange with Dr. Goodman, we mentioned that we were thinking of meta-research in general as a high priority, and sent him a link to our standing thoughts on the subject. He responded, “I didn’t know of your specific interest in meta-research and open science … Further developing the science and policy responses to challenges to the integrity of the medical literature is also the raison d’etre of the center I cursorily outlined, which is hard to describe to folks who don’t really know the area; I didn’t realize how far down that road you were.” He mentioned that he could send along a short proposal, and we said we’d like to see it.

At the same time, we were in informal conversations with Stuart Buck at the Laura and John Arnold Foundation (LJAF) about the general topic of meta-research. LJAF had expressed enthusiasm over the idea of the Center for Open Science (which it now supports), and generally seemed interested in the topic of reproducibility in the social sciences. I asked Stuart whether he would be interested in doing some funding related to the field of medicine as well, and in reviewing a proposal in that area that I thought looked quite promising. He said yes, and (after checking with Dr. Goodman) I sent along the proposal.

From that point on, LJAF followed its own process, though we stayed posted on the progress, reviewed a more fleshed-out proposal, shared our informal thoughts with LJAF, and joined (by videoconference) a meeting between LJAF staff and Drs. Ioannidis and Goodman.

Following the meeting with Drs. Ioannidis and Goodman, we told Stuart that we would consider providing a modest amount of co-funding (~10%) for the initial needs of METRICS. He said this wouldn’t be necessary as LJAF planned to provide the initial funding.

Takeaways

Case study in “active funding.” We’ve written before about the evolution of our thinking on active vs. passive funding. Specifically, while we used to imagine reviewing a large number of project proposals and recommending the best for funding, we now take the approach of choosing “causes” – broad areas/problems of interest, even when it isn’t obvious what the interventions or grantees would be – and looking for giving opportunities only after we’ve identified causes to focus on. Part of the reasoning behind this approach is that project proposals often emerge in response to funder interests, so it may not be possible to find all the possible projects to fund in an area until one starts actively expressing interest in the area.

The above case is a good example of this. The conversations we had with LJAF and with Dr. Goodman sprang out of a pre-declared interest in the cause of meta-research, and in Dr. Goodman’s case it seems he would never have sent the proposal if not for our clearly articulated interest in the subject.

Our impact. GiveWell never issued an official recommendation in favor of funding METRICS. While we shared informal opinions with LJAF, LJAF followed its own evaluation process and decided on its own to provide funding. So we don’t think the LJAF grant ought to be considered GiveWell money moved in the usual sense. In addition, we don’t know what the counterfactual impact of our role was; it’s possible that LJAF would eventually have found and funded METRICS. With that said, it seems to us that by actively prioritizing the area of meta-research and following the lines of investigation we did, we increased the speed and probability of this project’s moving forward. On reviewing this post, Stuart stated, “I’m certain that I wouldn’t have come across [the] initial proposal in any other way.”

Note that this post initially included a discussion of another project that we thought had had a similar dynamic. We introduced Elizabeth Iorns, founder of the Reproducibility Initiative, to Stuart Buck of LJAF in April 2013 (after a February 2013 conversation with Dr. Iorns as part of our continuing meta-research investigation); LJAF has since provided support for the Reproducibility Initiative through the Center for Open Science. However, Stuart corrected us on this point, pointing out that he was introduced to Elizabeth independently about a month after our intro; we now believe we played little to no role (relative to the counterfactual) in that connection.

Where we stand on meta-research

In mid-2013, we gave two updates on our thinking regarding meta-research: a landscape of the “open science” community and a more general update listing several things that “meta-research” could mean. We noted that we were planning to pause work on this front until we had examined some other causes.

At this time, our take is that: