Please use this identifier to cite or link to this item: https://idr.l2.nitk.ac.in/jspui/handle/123456789/7306
Full metadata record
DC Field | Value | Language
dc.contributor.author | Rao, T.J.N. | -
dc.contributor.author | Girish, G.N. | -
dc.contributor.author | Rajan, J. | -
dc.date.accessioned | 2020-03-30T09:58:48Z | -
dc.date.available | 2020-03-30T09:58:48Z | -
dc.date.issued | 2017
dc.identifier.citation | Advances in Intelligent Systems and Computing, 2017, Vol. 459 AISC, pp. 133-147 | en_US
dc.identifier.uri | http://idr.nitk.ac.in/jspui/handle/123456789/7306 | -
dc.description.abstract | Anomalous event detection is the foremost objective of a visual surveillance system. Using contextual information and probabilistic inference mechanisms is a recent trend in this direction. The proposed method is an improved version of the previously introduced Spatio-Temporal Compositions (STC) concept. Specific modifications are applied to the STC method to reduce time complexity and improve performance. The non-overlapping volume and ensemble formation employed reduce the number of iterations in the codebook construction and probabilistic modeling steps. A simpler procedure for codebook construction is also proposed. A non-parametric probabilistic model and adaptive inference mechanisms that avoid reliance on a single experimental threshold value are further contributions. An additional feature, event-driven high-resolution localization of unusual events, is incorporated to aid surveillance applications. The proposed method produced promising results compared to STC and other state-of-the-art approaches when evaluated on seven standard datasets covering simple and complex actions in non-crowded and crowded environments. © Springer Science+Business Media Singapore 2017. | en_US
dc.title | An improved contextual information based approach for anomaly detection via adaptive inference for surveillance application | en_US
dc.type | Book chapter | en_US
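The abstract above refers to codebook construction over non-overlapping spatio-temporal volumes. The sketch below is a minimal, hypothetical illustration of that general idea in Python: it splits a synthetic clip into non-overlapping volumes and greedily merges similar volumes into codewords under a correlation threshold. The function names, patch size, and threshold value are assumptions chosen for illustration only and do not reproduce the authors' implementation.

```python
import numpy as np

def extract_volumes(video, patch=(5, 5, 5)):
    """Split a (T, H, W) clip into non-overlapping spatio-temporal volumes.

    Returns one flattened row per volume. Patch size is an illustrative
    assumption, not a value taken from the paper.
    """
    t, h, w = video.shape
    pt, ph, pw = patch
    vols = []
    for ti in range(0, t - pt + 1, pt):          # non-overlapping steps in time
        for hi in range(0, h - ph + 1, ph):      # and in both spatial axes
            for wi in range(0, w - pw + 1, pw):
                vols.append(video[ti:ti + pt, hi:hi + ph, wi:wi + pw].ravel())
    return np.asarray(vols, dtype=np.float32)

def build_codebook(volumes, sim_threshold=0.9):
    """Greedy codebook construction (illustrative): a volume joins the first
    codeword whose normalised correlation exceeds sim_threshold, otherwise it
    starts a new codeword; each codeword is the running mean of its members."""
    codebook, counts = [], []
    for v in volumes:
        v = v / (np.linalg.norm(v) + 1e-8)
        for i, c in enumerate(codebook):
            if float(v @ (c / (np.linalg.norm(c) + 1e-8))) > sim_threshold:
                counts[i] += 1
                codebook[i] = c + (v - c) / counts[i]   # update running mean
                break
        else:
            codebook.append(v.copy())
            counts.append(1)
    return np.asarray(codebook)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    video = rng.random((20, 40, 40), dtype=np.float32)   # synthetic clip
    vols = extract_volumes(video)
    cb = build_codebook(vols)
    print(vols.shape, cb.shape)
```

Non-overlapping volumes keep the number of codebook assignments proportional to the clip size divided by the patch volume, which is the kind of iteration saving the abstract attributes to the non-overlapping volume and ensemble formation.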
Appears in Collections: 2. Conference Papers

Files in This Item:
There are no files associated with this item.


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.