Convolutional neural networks for an automatic classification of prostate tissue slides with high-grade Gleason score

Jimenez–del–Toro, Oscar (University of Applied Sciences and Arts Western Switzerland (HES-SO Valais-Wallis) ; University of Geneva, Switzerland) ; Atzori, Manfredo (University of Applied Sciences and Arts Western Switzerland (HES-SO Valais-Wallis)) ; Otalora, Sebastian (University of Applied Sciences and Arts Western Switzerland (HES-SO Valais-Wallis) ; University of Geneva, Switzerland) ; Andersson, Mats (ContextVision AB, Stockholm, Sweden) ; Eurén, Kristian (ContextVision AB, Stockholm, Sweden) ; Hedlund, Martin (ContextVision AB, Stockholm, Sweden) ; Rönnquist, Peter (ContextVision AB, Stockholm, Sweden) ; Müller, Henning (University of Applied Sciences and Arts Western Switzerland (HES-SO Valais-Wallis) ; University of Geneva, Switzerland ; Martinos Center for Biomedical Imaging, Charlestown, USA)

The Gleason grading system was developed for assessing prostate histopathology slides and is correlated with the outcome and incidence of relapse in prostate cancer. Although this grading is part of a standard protocol performed by pathologists, the visual inspection of whole slide images (WSIs) is inherently subjective and varies between pathologists. Computer-aided pathology has been proposed to generate an objective and reproducible assessment that can help pathologists in their evaluation of new tissue samples. Deep convolutional neural networks are a promising approach for the automatic classification of histopathology images, as they can hierarchically learn subtle visual features from the data. However, a large number of manual annotations from pathologists is commonly required to obtain sufficient statistical generalization when training new models that can evaluate the large amounts of pathology data generated daily. A fully automatic approach that detects prostatectomy WSIs with a high-grade Gleason score is proposed. We evaluate the performance of several deep learning architectures by training them with patches extracted from automatically generated regions of interest rather than from manually segmented ones. Relevant parameters for training the deep learning models, such as the size and number of patches and whether data augmentation is used, are compared across the tested architectures. 235 prostate tissue WSIs with their pathology reports from the publicly available TCGA data set were used. An accuracy of 78% was obtained on a balanced set of 46 unseen test images with different Gleason grades in a 2-class decision: high vs. low Gleason grade. Grades 7-8, which represent the boundary decision of the proposed task, were particularly well classified. The method scales to larger data sets with straightforward re-training of the model to include data from multiple sources, scanners and acquisition techniques. The automatically generated heatmaps for the WSIs could be useful for improving patch selection when training networks on big data sets and for guiding the visual inspection of these images.
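A minimal sketch of the pipeline described in the abstract (automatic region-of-interest detection, patch extraction and a patch-level convolutional network for the high vs. low Gleason grade decision), for illustration only. The intensity-threshold tissue mask, 224-pixel patch size, stand-in CNN and mean aggregation of patch probabilities are assumptions made for this sketch; the paper compares several deep learning architectures and does not prescribe these particular choices.

# Sketch only (not the authors' code): threshold-based tissue mask,
# patch extraction from the masked regions, and a generic CNN scoring
# each patch for the 2-class decision (high vs. low Gleason grade).
import numpy as np
import torch
import torch.nn as nn


def tissue_mask(thumbnail_rgb: np.ndarray, threshold: float = 200.0) -> np.ndarray:
    # Rough foreground mask: stained tissue is darker than the white slide background.
    return thumbnail_rgb.mean(axis=2) < threshold


def extract_patches(image_rgb, mask, patch_size=224, min_tissue=0.5):
    # Yield non-overlapping patches whose tissue fraction exceeds min_tissue.
    h, w, _ = image_rgb.shape
    for y in range(0, h - patch_size + 1, patch_size):
        for x in range(0, w - patch_size + 1, patch_size):
            if mask[y:y + patch_size, x:x + patch_size].mean() >= min_tissue:
                yield image_rgb[y:y + patch_size, x:x + patch_size]


class PatchCNN(nn.Module):
    # Small stand-in network; the paper evaluates several deep architectures.
    def __init__(self, num_classes=2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(64, num_classes)

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))


if __name__ == "__main__":
    # Synthetic stand-in for a downsampled WSI; in practice the image would be
    # read from the slide file with a WSI reader.
    slide = (np.random.rand(1120, 1120, 3) * 255).astype(np.uint8)
    patches = list(extract_patches(slide, tissue_mask(slide)))
    batch = torch.stack([
        torch.from_numpy(np.ascontiguousarray(p)).permute(2, 0, 1).float() / 255.0
        for p in patches
    ])
    model = PatchCNN().eval()
    with torch.no_grad():
        patch_probs = torch.softmax(model(batch), dim=1)[:, 1]  # per-patch score
    # Mean of the patch probabilities as one simple slide-level aggregation.
    print("Slide-level high-grade probability: %.3f" % patch_probs.mean().item())

Averaging the per-patch probabilities is one simple way to reach a slide-level decision; mapped back to patch coordinates, the same per-patch scores give the kind of heatmap mentioned in the abstract.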


Keywords:
Conference Type:
full paper
Faculty:
Economie et Services
School:
HEG-VS
Institute:
Institut Informatique de gestion
Subject(s):
Informatique
Publisher:
SPIE
Conference:
Orlando, USA, 12-13 February 2017
Date:
2017
Pagination:
9 p.
Published in:
Proceedings of SPIE Medical Imaging 2017 : Digital Pathology
Appears in Collection:


