Please use this identifier to cite or link to this item: https://idr.l2.nitk.ac.in/jspui/handle/123456789/14810
Full metadata record
DC Field                  Value                 Language
dc.contributor.author     Gupta D.
dc.contributor.author     Anantharaman A.
dc.contributor.author     Mamgain N.
dc.contributor.author     Sowmya Kamath S.
dc.contributor.author     Balasubramanian V.N.
dc.contributor.author     Jawahar C.V.
dc.date.accessioned       2021-05-05T10:15:48Z  -
dc.date.available         2021-05-05T10:15:48Z  -
dc.date.issued            2020
dc.identifier.citation    Proceedings - 2020 IEEE Winter Conference on Applications of Computer Vision (WACV 2020), pp. 1198-1206  en_US
dc.identifier.uri         https://doi.org/10.1109/WACV45572.2020.9093384
dc.identifier.uri         http://idr.nitk.ac.in/jspui/handle/123456789/14810  -
dc.description.abstract   Object detection has been at the forefront of higher-level vision tasks such as scene understanding and contextual reasoning. Therefore, solving object detection for a large number of visual categories is paramount. Zero-Shot Object Detection (ZSD) - where training data is not available for some of the target classes - provides semantic scalability to object detection and reduces dependence on large amounts of annotations, thus enabling many applications in real-life scenarios. In this paper, we propose a novel multi-space approach to ZSD in which we combine predictions obtained in two different search spaces. We learn projections of the visual features of proposals into the semantic embedding space and of class labels from the semantic embedding space into the visual space. We predict similarity scores in each space and combine them. We present promising results on two datasets, PASCAL VOC and MS COCO. We further discuss the problem of hubness and show that our approach alleviates hubness with performance superior to previously proposed methods. © 2020 IEEE.  en_US
dc.title                  A multi-space approach to zero-shot object detection  en_US
dc.type                   Conference Paper  en_US
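
The abstract describes scoring region proposals in two search spaces: projecting visual features into the semantic embedding space, projecting class-label embeddings into the visual space, and combining the two sets of similarity scores. The sketch below is only a rough, hypothetical illustration of that two-space scoring idea; the cosine-similarity scoring, the projection matrices W_v2s and W_s2v, the fusion weight alpha, and all dimensions are assumptions for illustration and are not taken from the paper.

# Minimal, hypothetical sketch of scoring region proposals in two spaces,
# loosely following the idea in the abstract above. The projections
# (W_v2s, W_s2v), the cosine-similarity scoring, and the fusion weight
# `alpha` are illustrative assumptions, not the authors' published method.
import numpy as np

def l2_normalize(x, axis=-1, eps=1e-8):
    """Normalize rows to unit length so dot products become cosine similarities."""
    return x / (np.linalg.norm(x, axis=axis, keepdims=True) + eps)

def multi_space_scores(visual_feats, class_embs, W_v2s, W_s2v, alpha=0.5):
    """Combine similarities computed in the semantic space and in the visual space.

    visual_feats: (num_proposals, d_v)  visual features of region proposals
    class_embs:   (num_classes, d_s)    semantic (word) embeddings of class labels
    W_v2s:        (d_v, d_s)            learned projection, visual -> semantic space
    W_s2v:        (d_s, d_v)            learned projection, semantic -> visual space
    alpha:        weight for fusing the two per-space score matrices
    """
    # Space 1: project proposals into the semantic space, compare with class embeddings.
    proj_sem = l2_normalize(visual_feats @ W_v2s)
    sem_scores = proj_sem @ l2_normalize(class_embs).T        # (num_proposals, num_classes)

    # Space 2: project class embeddings into the visual space, compare with proposals.
    proj_vis = l2_normalize(class_embs @ W_s2v)
    vis_scores = l2_normalize(visual_feats) @ proj_vis.T      # (num_proposals, num_classes)

    # Fuse the two score matrices; unseen classes are scored the same way,
    # since only their embeddings (not training images) are required.
    return alpha * sem_scores + (1.0 - alpha) * vis_scores

# Toy usage with random data standing in for detector features and word vectors.
rng = np.random.default_rng(0)
scores = multi_space_scores(
    visual_feats=rng.standard_normal((5, 2048)),   # 5 region proposals
    class_embs=rng.standard_normal((20, 300)),     # 20 class-label embeddings
    W_v2s=rng.standard_normal((2048, 300)) * 0.01,
    W_s2v=rng.standard_normal((300, 2048)) * 0.01,
)
print(scores.shape)  # (5, 20) -> per-proposal, per-class scores

In practice the projections would be learned on seen classes and the fusion weight tuned on validation data; random matrices are used here only to keep the example self-contained.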
Appears in Collections: 2. Conference Papers

Files in This Item:
There are no files associated with this item.


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.