Medical case-based retrieval: integrating MeSH terms into visual query reweighting

García Seco de Herrera, Alba (University of Applied Sciences and Arts Western Switzerland (HES-SO Valais-Wallis)) ; Foncubierta-Rodriguez, Antonio (University of Applied Sciences and Arts Western Switzerland (HES-SO Valais-Wallis)) ; Müller, Henning (University of Applied Sciences and Arts Western Switzerland (HES-SO Valais-Wallis))

Advances in medical knowledge give clinicians more objective information for a diagnosis. There is therefore an increasing need for bibliographic search engines that help speed up the information search. The ImageCLEFmed benchmark proposes a medical case-based retrieval task. This task aims at retrieving articles from the biomedical literature that are relevant for the differential diagnosis of query cases consisting of a textual description and several images. In the context of this campaign, many approaches have been investigated, showing that the fusion of visual and text information can improve retrieval precision. However, fusion does not always lead to better results. In this paper, a new query-adaptive fusion criterion is presented that decides when to use a multimodal (text and visual) approach and when to use a text-only one. The proposed method integrates the text information contained in extracted MeSH (Medical Subject Headings) terms with visual features of the images to find synonym relations between them. Given a text query, the query-adaptive fusion criterion decides when it is suitable to also use visual information for the retrieval. Results show that this approach can decide whether a text-only or multimodal approach should be used with 77.15% accuracy.
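The query-adaptive gating idea described in the abstract can be sketched as follows. This is an illustrative assumption, not the paper's actual criterion: the `visualness` scores per MeSH term, the fusion weight `alpha`, and the `threshold` are all hypothetical placeholders.

```python
# Hypothetical sketch of a query-adaptive fusion gate: when the query's MeSH
# terms are strongly associated with visual content, combine text and visual
# retrieval scores (late fusion); otherwise fall back to text-only retrieval.
# The visualness scores, alpha, and threshold values are illustrative
# assumptions, not values taken from the paper.

def fuse_scores(text_scores, visual_scores, alpha=0.6):
    """Weighted late fusion of per-document retrieval scores."""
    docs = set(text_scores) | set(visual_scores)
    return {d: alpha * text_scores.get(d, 0.0)
               + (1 - alpha) * visual_scores.get(d, 0.0)
            for d in docs}

def query_adaptive_retrieval(query_mesh_terms, text_scores, visual_scores,
                             visualness, threshold=0.5):
    """Use multimodal fusion only when the query looks 'visual' enough."""
    if not query_mesh_terms:
        return text_scores  # no MeSH evidence: stay with text-only retrieval
    avg = sum(visualness.get(t, 0.0) for t in query_mesh_terms) / len(query_mesh_terms)
    if avg >= threshold:
        return fuse_scores(text_scores, visual_scores)
    return text_scores
```

A query annotated with an imaging-related term such as "Radiography" would pass the gate and trigger fusion, while a purely conceptual term would keep the retrieval text-only.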

Conference contribution type:
full paper
Economie et Services
Institut Informatique de gestion
Bibliographic address:
Orlando, USA, 21-26 February
10 p.
Published in
SPIE medical imaging 2015
Record created 2015-07-30, last modified 2018-12-20
