Please use this identifier to cite or link to this item: http://hdl.handle.net/2440/73304
Type: Journal article
Title: Shape similarity analysis by self-tuning locally constrained mixed-diffusion
Author: Luo, L.
Shen, C.
Zhang, C.
Van Den Hengel, A.
Citation: IEEE Transactions on Multimedia, 2013; 15(5):1174-1183
Publisher: Institute of Electrical and Electronics Engineers
Issue Date: 2013
ISSN: 1520-9210
EISSN: 1941-0077
Statement of Responsibility: Lei Luo, Chunhua Shen, Chunyuan Zhang and Anton van den Hengel
Abstract: Similarity analysis is a powerful tool for shape matching/retrieval and other computer vision tasks. Various shape (dis)similarity measures have been introduced in the literature, each specializing in different aspects of the data. In this paper, we consider the problem of improving retrieval accuracy by systematically fusing several different measures. To this end, we propose the locally constrained mixed-diffusion method, which partly fuses the given measures into one and propagates on the resulting locally dense data space. Furthermore, we advocate the use of self-adaptive neighborhoods to automatically determine the appropriate neighborhood size in the diffusion process, with which the retrieval performance is comparable to that of the best manually tuned kNNs. The superiority of our approach is demonstrated empirically on both shape and image datasets. Our approach achieves a score of 100% in the bull's eye test on the MPEG-7 shape dataset, which is the best reported result to date.
Keywords: Shape similarity analysis; shape/image retrieval; locally constrained mixed-diffusion.
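To illustrate the general idea behind the abstract, the sketch below shows a generic locally constrained diffusion process on a similarity matrix: affinities are restricted to each item's k nearest neighbors, row-normalized into a transition matrix P, and then propagated by iterating W ← P·W·Pᵀ. This is a minimal illustration of the diffusion family the paper builds on, not the paper's own mixed-diffusion algorithm or its self-tuning neighborhood selection; all function names here are hypothetical.

```python
import numpy as np

def knn_mask(W, k):
    # Local constraint: keep only each row's k largest affinities,
    # zeroing out the rest (the diagonal self-affinity survives
    # because an item is maximally similar to itself).
    idx = np.argsort(-W, axis=1)[:, :k]
    M = np.zeros_like(W)
    rows = np.arange(W.shape[0])[:, None]
    M[rows, idx] = W[rows, idx]
    return M

def locally_constrained_diffusion(W, k=5, iters=10):
    # Generic locally constrained diffusion (illustrative sketch):
    # 1. restrict the affinity graph to k nearest neighbors,
    # 2. row-normalize to obtain a transition matrix P,
    # 3. propagate similarities by iterating W <- P @ W @ P.T.
    P = knn_mask(W, k)
    P = P / P.sum(axis=1, keepdims=True)
    A = W.copy()
    for _ in range(iters):
        A = P @ A @ P.T
    return A
```

Restricting the transition matrix to local neighborhoods is what makes the diffusion robust: similarity only propagates along reliable short-range edges, so noisy long-range affinities are suppressed rather than amplified.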
Rights: Copyright © 2012 IEEE
RMID: 0020131818
DOI: 10.1109/TMM.2013.2242450
Grant ID: http://purl.org/au-research/grants/arc/LP120200485
http://purl.org/au-research/grants/arc/FT120100969
Appears in Collections:Computer Science publications

Files in This Item:
File: hdl_73304.pdf
Description: Accepted version
Size: 403.85 kB
Format: Adobe PDF


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.