The Camelyon16 ISBI challenge took place on Wednesday, 13 April 2016. The presentations from the organizing team are now available. Camelyon16 was closed in November 2016 following the launch of Camelyon17.

Title of presentation | Presenter
How computers shape the future of pathology | Jeroen van der Laak
Camelyon16: Aim, dataset, and evaluation | Babak Ehteshami Bejnordi
Statistics, Leaderboards, Results and Comparison to Pathologist | Babak Ehteshami Bejnordi

Public Leaderboard 1 - Whole-slide-image classification

The results are computed on the independent test set.

Evaluation 1: Teams are ranked by the area under the ROC curve (AUC).
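The challenge's official evaluation code is not reproduced here, but the ranking metric itself is standard. A minimal sketch, assuming each test slide has a binary label (1 = tumour, 0 = normal) and a predicted tumour probability, computes the AUC via the equivalent Mann-Whitney pairwise statistic:

```python
def auc(labels, scores):
    """Area under the ROC curve via the Mann-Whitney statistic.

    labels: 1 for tumour slides, 0 for normal slides.
    scores: predicted per-slide tumour probabilities (hypothetical inputs;
    the challenge ranked teams on such per-slide scores).
    """
    pos = [s for l, s in zip(labels, scores) if l == 1]
    neg = [s for l, s in zip(labels, scores) if l == 0]
    # Fraction of (positive, negative) pairs ranked correctly; ties count half.
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Toy example: four slides, perfectly separated scores.
print(auc([1, 1, 0, 0], [0.9, 0.6, 0.4, 0.1]))  # -> 1.0
```

The pairwise form avoids explicitly building the ROC curve and gives the same value as trapezoidal integration of the curve.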

Top five ranked teams as of the challenge event deadline (Apr 1, 2016):

Rank | Team | AUC | Submission date
01 | Harvard Medical School and MIT, Method 1 | 0.9234 | 01 Apr 2016
02 | EXB Research and Development co., Germany | 0.9156 | 01 Apr 2016
03 | Independent participant, Germany | 0.8654 | 01 Apr 2016
04 | Middle East Technical University, Departments of EEE, NSNT and HS, Turkey | 0.8642 | 01 Apr 2016
05 | NLP LOGIX co., USA | 0.8298 | 01 Apr 2016

Leaderboard including all submissions

* Indicates that the team has achieved an AUC value that surpasses the AUC of the pathologist in our study.

Rank | Team | AUC | Submission date
01 * | Harvard Medical School and MIT, Method 2 (updated) | 0.9935 | 06 Nov 2016
02 * | Harvard Medical School, Gordon Center for Medical Imaging, MGH, Method 3 | 0.9763 | 24 Oct 2016
03 | Harvard Medical School, Gordon Center for Medical Imaging, MGH, Method 1 | 0.9650 | 07 Sep 2016
04 | The Chinese University of Hong Kong (CU lab, Hong Kong), Method 3 | 0.9415 | 29 Aug 2016
05 | Harvard Medical School and MIT, Method 1 | 0.9234 | 01 Apr 2016
06 | EXB Research and Development co., Germany | 0.9156 | 01 Apr 2016
07 | The Chinese University of Hong Kong (CU lab), Hong Kong, Method 1 | 0.9086 | 08 Jun 2016
08 | Harvard Medical School, Gordon Center for Medical Imaging, MGH, Method 2 | 0.9082 | 24 Oct 2016
09 | The Chinese University of Hong Kong (CU lab), Hong Kong, Method 2 | 0.9056 | 20 Jul 2016
10 | DeepCare Inc, China | 0.8833 | 05 Nov 2016
11 | Independent participant, Germany | 0.8654 | 01 Apr 2016
12 | Middle East Technical University, Departments of EEE, NSNT and HS, Turkey | 0.8642 | 01 Apr 2016
13 | NLP LOGIX co., USA | 0.8298 | 01 Apr 2016
14 | Smart Imaging Technologies co., USA | 0.8207 | 14 May 2016
15 | University of Toronto, Electrical and Computer Engineering, Canada | 0.8149 | 01 Apr 2016
16 | The Warwick-QU Team, United Kingdom | 0.7958 | 01 Apr 2016
17 | Radboud University Medical Center (DIAG), Netherlands | 0.7786 | 01 Apr 2016
18 | HTW-BERLIN, Germany | 0.7676 | 01 Apr 2016
19 | University of Toronto, Electrical and Computer Engineering, Canada | 0.7621 | 01 Apr 2016
20 | BioMediTech, University of Tampere, Finland | 0.7612 | 01 Apr 2016
21 | Smart Imaging Technologies co., USA | 0.7574 | 01 Apr 2016
22 | Technical University of Munich (CAMP), Germany, Method 2 | 0.7367 | 30 Aug 2016
23 | Osaka University, Department of Bioinformatic Engineering, Japan | 0.7319 | 01 Apr 2016
24 | University of South Florida, Computer Science and Engineering, USA | 0.7270 | 01 Apr 2016
25 | NSS College of Engineering, India | 0.7269 | 01 Apr 2016
26 | BioMediTech, University of Tampere, Finland | 0.7132 | 01 Apr 2016
27 | Technical University of Munich (CAMP), Germany | 0.6910 | 01 Apr 2016
28 | United Institute of Informatics Problems, Belarus | 0.6890 | 01 Apr 2016
29 | VISILAB, University of Castilla-La Mancha, Spain | 0.6531 | 01 Apr 2016
30 | VISILAB, University of Castilla-La Mancha, Spain | 0.6513 | 01 Apr 2016
31 | Mines ParisTech, France | 0.6277 | 01 Apr 2016
32 | Sorbonne Universités, Laboratoire d'Imagerie Biomédicale, France | 0.5561 | 01 Apr 2016

Public Leaderboard 2 - Tumor localization

The results are computed on the independent test set.

Evaluation 2: Detection/localization performance is summarized using a free-response receiver operating characteristic (FROC) curve. The final score is defined as the average sensitivity at six predefined false-positive rates: 1/4, 1/2, 1, 2, 4, and 8 false positives per whole-slide image.
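Given an already-computed FROC curve, the final score is just the mean of six sensitivity readings. A minimal sketch, assuming the curve is given as sorted operating points (average false positives per image, sensitivity) and read off as a step function, could look like:

```python
def froc_score(avg_fps, sensitivities, targets=(0.25, 0.5, 1, 2, 4, 8)):
    """Average sensitivity at the six predefined false-positive rates.

    avg_fps, sensitivities: operating points on the FROC curve, sorted by
    increasing avg_fps (hypothetical inputs; the challenge derived the curve
    from lesion-level detections). At each target FP rate we take the
    sensitivity of the last operating point at or below that rate.
    """
    total = 0.0
    for t in targets:
        sens = 0.0
        for f, s in zip(avg_fps, sensitivities):
            if f <= t:
                sens = s
            else:
                break
        total += sens
    return total / len(targets)

# Toy curve: sensitivity rises as more false positives are tolerated.
fps = [0.25, 0.5, 1, 2, 4, 8]
sens = [0.40, 0.50, 0.60, 0.70, 0.80, 0.90]
print(froc_score(fps, sens))  # mean of the six sensitivities -> 0.65
```

Reading the curve as a step function (rather than interpolating) is an assumption of this sketch; the exact convention used by the organizers' evaluation script may differ.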

Top five ranked teams as of the challenge event deadline (Apr 1, 2016):

Rank | Team | Score | Submission date
01 | Harvard Medical School and MIT, Method 1 | 0.6933 | 01 Apr 2016
02 | Radboud University Medical Center (DIAG), Netherlands | 0.5748 | 01 Apr 2016
03 | EXB Research and Development co., Germany | 0.5111 | 01 Apr 2016
04 | Middle East Technical University, Departments of EEE, NSNT and HS, Turkey | 0.3889 | 01 Apr 2016
05 | NLP LOGIX co., USA | 0.3859 | 01 Apr 2016

Leaderboard including all submissions