<p>How can we evaluate all evidence in order to draw conclusions, now that a single study is no longer enough? An obvious answer is the meta-analysis, which is considered to sit at the top of the "evidence pyramid". But meta-analyses are not perfect: even systematic collections of studies might be limited or biased, and a meta-analyst might fail to uncover file-drawer studies. In addition, meta-analyses become outdated as soon as new studies on the same topic emerge. Finally, the results reported in a paper, including a meta-analysis, are always a selection of possible analyses, and the specific subset or moderator test that would have been most useful for a particular purpose might not be included.</p> <p>In response to these concerns, community-augmented meta-analyses (CAMAs; Tsuji, Bergmann, & Cristia, 2014) were proposed, which make meta-analyses compatible with cumulative open science, in line with recent recommendations to improve the reliability of meta-analyses (Lakens, Hilgard, & Staaks, 2016). This concept is implemented and extended in the MetaLab project, which focuses on Developmental Psychology (<a href="http://metalab.stanford.edu" rel="nofollow">http://metalab.stanford.edu</a>; Bergmann et al., 2017). MetaLab provides interactive visualizations, dynamic reports, tutorials, and open data and scripts. Different levels of engagement with the meta-analyses are thus possible, ranging from visualizing effect sizes to downloading data for further analysis. MetaLab continues to grow, and currently contains 15 datasets (with 5 meta-analyses in progress). Crucially, MetaLab can be adapted to other domains of Psychology, thanks to being built on open science principles.</p> <p>MetaLab (or related tools like metaBUS) can be useful at all stages of a typical experimental study: from providing a comprehensive literature overview for hypothesis generation, through methodological and design advice (particularly regarding power), to the contextualization of results.
MetaLab also shows how open data and code, paired with easy-to-use interactive visualizations, can facilitate the adoption of best research practices.</p> <p>References</p> <p>Bergmann, C., Tsuji, S., Piccinini, P. E., Lewis, M. L., Braginsky, M. B., Frank, M. C., & Cristia, A. (2017). Promoting replicability in developmental research through meta-analyses: Insights from language acquisition research. In press at Child Development.</p> <p>Lakens, D., Hilgard, J., & Staaks, J. (2016). On the reproducibility of meta-analyses: Six practical recommendations. BMC Psychology, 4(1), 24.</p> <p>Lewis, M. L., Braginsky, M., Tsuji, S., Bergmann, C., Piccinini, P. E., Cristia, A., & Frank, M. C. (2017). A quantitative synthesis of early language acquisition using meta-analysis. Preprint DOI: 10.17605/<a href="http://OSF.IO/HTSJM" rel="nofollow">OSF.IO/HTSJM</a></p> <p>Tsuji, S., Bergmann, C., & Cristia, A. (2014). Community-augmented meta-analyses: Toward cumulative data assessment. Perspectives on Psychological Science, 9(6), 661-665.</p> <p>-- Researcher at the Max Planck Institute for Psycholinguistics, Language Development Department, Nijmegen, The Netherlands Website: <a href="http://sites.google.com/site/chbergma" rel="nofollow">sites.google.com/site/chbergma</a> Blog: <a href="http://cogtales.wordpress.com" rel="nofollow">cogtales.wordpress.com</a></p>
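<p>To make the "further analysis" and power-planning use cases above concrete, the following is a minimal sketch of what a researcher might do after downloading effect sizes: pool them with fixed-effect (inverse-variance) weighting and derive a per-group sample size for a planned study. The effect sizes and variances are made up for illustration and the pooling method is one simple choice; they are not MetaLab's actual data or recommended procedure.</p>

```python
import math

# Hypothetical effect sizes (Cohen's d) and sampling variances for four
# studies; these numbers are invented for illustration, not from MetaLab.
effects = [0.42, 0.18, 0.55, 0.30]
variances = [0.04, 0.02, 0.09, 0.05]

# Fixed-effect (inverse-variance) pooling: weight each study by 1/variance.
weights = [1 / v for v in variances]
pooled = sum(w * d for w, d in zip(weights, effects)) / sum(weights)
se = math.sqrt(1 / sum(weights))
ci = (pooled - 1.96 * se, pooled + 1.96 * se)

# Prospective power: per-group n for a two-sample t-test detecting the
# pooled effect with 80% power at alpha = .05 (normal approximation).
z_alpha, z_power = 1.96, 0.8416
n_per_group = math.ceil(2 * ((z_alpha + z_power) / pooled) ** 2)

print(f"pooled d = {pooled:.2f}, 95% CI [{ci[0]:.2f}, {ci[1]:.2f}], "
      f"n per group for 80% power = {n_per_group}")
```

<p>A random-effects model (which adds between-study heterogeneity to the weights) would usually be preferred in practice; the fixed-effect version is shown only because it fits in a few lines.</p>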