Large-scale retrieval for medical image analytics: a comprehensive review

Li, Zhongyu (University of North Carolina at Charlotte, USA) ; Zhang, Xiaofan (University of North Carolina at Charlotte, USA) ; Müller, Henning (University of Applied Sciences and Arts Western Switzerland (HES-SO Valais-Wallis)) ; Zhang, Shaoting (University of North Carolina at Charlotte, USA)

Over the past decades, medical image analytics has been greatly facilitated by the explosion of digital imaging techniques, which produce huge amounts of medical images with ever-increasing quality and diversity. However, conventional methods for analyzing medical images have achieved limited success, as they are not capable of handling such large amounts of image data. In this paper, we review state-of-the-art approaches for large-scale medical image analysis, which are mainly based on recent advances in computer vision, machine learning, and information retrieval. Specifically, we first present the general pipeline of large-scale retrieval and summarize the challenges and opportunities of medical image analytics at large scale. Then, we provide a comprehensive review of algorithms and techniques relevant to the major stages of the pipeline, including feature representation, feature indexing, and searching. Building on existing work, we introduce the evaluation protocols and multiple applications of large-scale medical image retrieval, covering a variety of exploratory and diagnostic scenarios. Finally, we discuss future directions of large-scale retrieval that can further improve the performance of medical image analysis.
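The pipeline stages named in the abstract (feature representation, feature indexing, searching) can be illustrated with a minimal sketch. This is not the paper's method: the toy features, the random-projection (LSH-style) binary codes, and the Hamming-distance ranking are all hypothetical stand-ins for the families of techniques the review surveys.

```python
import numpy as np

# Illustrative sketch of a large-scale image-retrieval pipeline:
# (1) feature representation, (2) feature indexing via random-projection
# hashing (an LSH-style technique), (3) searching by Hamming distance.
# All function names and parameters are hypothetical.

rng = np.random.default_rng(0)

def extract_features(images):
    """Toy feature representation: flatten each image and L2-normalize."""
    feats = images.reshape(len(images), -1).astype(float)
    norms = np.linalg.norm(feats, axis=1, keepdims=True)
    return feats / np.maximum(norms, 1e-12)

def build_index(feats, n_bits=32, rng=rng):
    """Index features as compact binary codes: sign of random projections."""
    planes = rng.standard_normal((feats.shape[1], n_bits))
    codes = (feats @ planes > 0).astype(np.uint8)
    return planes, codes

def search(query_feat, planes, codes, k=3):
    """Rank the database by Hamming distance between binary codes."""
    q = (query_feat @ planes > 0).astype(np.uint8)
    dists = np.count_nonzero(codes != q, axis=1)
    return np.argsort(dists)[:k]

# Usage: 100 random 8x8 "images"; querying with a database image's own
# features should return that image as the top match (Hamming distance 0).
images = rng.random((100, 8, 8))
feats = extract_features(images)
planes, codes = build_index(feats)
top = search(feats[42], planes, codes, k=3)
print(top[0])
```

Compact binary codes are the key to scale here: comparing 32-bit codes by Hamming distance is far cheaper, in both memory and time, than exact nearest-neighbor search over high-dimensional float features.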


Keywords:
Article Type:
Scientific
Faculty:
Economie et Services
School:
HEG-VS
Institute:
Institut Informatique de gestion
Subject(s):
Informatique
Date:
2018-01
Pagination:
19 p.
Published in:
Medical image analysis
Numeration (vol. no.):
2018, vol. 43, pp. 66-84
DOI:
ISSN:
1361-8415
Appears in Collection:



 Record created 2018-04-11, last modified 2019-03-22

Fulltext:
PDF
