000002223 001__ 2223
000002223 005__ 20190611210131.0
000002223 037__ $$aCONFERENCE
000002223 245__ $$aConvolutional neural networks for an automatic classification of prostate tissue slides with high-grade Gleason score
000002223 260__ $$c2017$$b12-13 February 2017$$aOrlando, USA
000002223 269__ $$a2017-02
000002223 300__ $$a9 p.
000002223 506__ $$avisible
000002223 520__ $$9eng$$aThe Gleason grading system was developed for assessing prostate histopathology slides. It is correlated with the outcome and incidence of relapse in prostate cancer. Although this grading is part of a standard protocol performed by pathologists, the visual inspection of whole slide images (WSIs) is inherently subjective, and evaluations can differ between pathologists. Computer-aided pathology has been proposed to generate an objective and reproducible assessment that can help pathologists evaluate new tissue samples. Deep convolutional neural networks are a promising approach for the automatic classification of histopathology images, as they can hierarchically learn subtle visual features from the data. However, a large number of manual annotations from pathologists is commonly required to obtain sufficient statistical generalization when training new models that can evaluate the large amounts of pathology data generated daily. A fully automatic approach that detects prostatectomy WSIs with high-grade Gleason score is proposed. We evaluate the performance of various deep learning architectures by training them with patches extracted from automatically generated regions of interest rather than from manually segmented ones. Relevant parameters for training the deep learning models, such as the size and number of patches and whether data augmentation is applied, are compared across the tested architectures. 235 prostate tissue WSIs with their pathology reports from the publicly available TCGA data set were used. An accuracy of 78% was obtained on a balanced set of 46 unseen test images with different Gleason grades in a 2-class decision: high vs. low Gleason grade. Grades 7-8, which represent the boundary decision of the proposed task, were particularly well classified. The method is scalable to larger data sets, with straightforward re-training of the model to include data from multiple sources, scanners, and acquisition techniques. Automatically generated heatmaps for the WSIs could be useful for improving the selection of patches when training networks on big data sets and for guiding the visual inspection of these images.
000002223 592__ $$aHEG-VS
000002223 592__ $$bInstitut Informatique de gestion
000002223 592__ $$cEconomie et Services
000002223 65017 $$aInformatique
000002223 6531_ $$aprostate cancer grading$$9eng
000002223 6531_ $$aconvolutional neural networks$$9eng
000002223 6531_ $$acomputer aided pathology$$9eng
000002223 655_7 $$afull paper
000002223 700__ $$uUniversity of Applied Sciences and Arts Western Switzerland (HES-SO Valais-Wallis) ; University of Geneva, Switzerland$$aJimenez-del-Toro, Oscar
000002223 700__ $$uUniversity of Applied Sciences and Arts Western Switzerland (HES-SO Valais-Wallis)$$aAtzori, Manfredo
000002223 700__ $$uUniversity of Applied Sciences and Arts Western Switzerland (HES-SO Valais-Wallis) ; University of Geneva, Switzerland$$aOtalora, Sebastian
000002223 700__ $$uContextVision AB, Stockholm, Sweden$$aAndersson, Mats
000002223 700__ $$uContextVision AB, Stockholm, Sweden$$aEurén, Kristian
000002223 700__ $$uContextVision AB, Stockholm, Sweden$$aHedlund, Martin
000002223 700__ $$uContextVision AB, Stockholm, Sweden$$aRönnquist, Peter
000002223 700__ $$aMüller, Henning$$uUniversity of Applied Sciences and Arts Western Switzerland (HES-SO Valais-Wallis) ; University of Geneva, Switzerland ; Martinos Center for Biomedical Imaging, Charlestown, USA
000002223 711__ $$d12/02/2017 / 13/02/2017$$cOrlando, USA$$aSPIE Medical Imaging 2017 : Digital Pathology
000002223 773__ $$tProceedings of SPIE Medical Imaging 2017 : Digital Pathology
000002223 8564_ $$uhttps://hesso.tind.io/record/2223/files/Jimenez_2017_convolutional_neural_networks.pdf$$s4996810
000002223 8564_ $$xpdfa$$uhttps://hesso.tind.io/record/2223/files/Jimenez_2017_convolutional_neural_networks.pdf?subformat=pdfa$$s4636740
000002223 906__ $$aGREEN
000002223 950__ $$aI1
000002223 980__ $$aconference