SAM: an interoperable metadata model for multimodal surveillance applications
13 April 2009
Peter Schallauer, Werner Bailer, Albert Hofmann, Roland Mörzinger
Abstract
Metadata interoperability is crucial for various kinds of surveillance applications and systems, e.g. metadata mining in multi-sensor environments, metadata exchange in networked camera systems, or information fusion in multi-sensor and multi-detector environments. Different metadata formats have been proposed to foster metadata interoperability, but they show significant limitations: ViPER, CVML and the MPEG Visual Surveillance MAF support only the visual modality, CVML's frame-based approach leads to inefficient representation, and MPEG-7's comprehensiveness hinders its efficient use in a specific application. To overcome these limitations, we propose the Surveillance Application Metadata (SAM) model, capable of describing online and offline analysis results as a set of time lines containing events. A set of sensors, detectors, recorded media items and object instances is described centrally and linked from the event descriptions. The time lines can be related to a subset of sensors and detectors, for any modality and at different levels of abstraction. Hierarchical classification schemes are used for many purposes, such as types of properties and their values, event types, object classes, coordinate systems, etc., in order to allow application-specific adaptations without modifying the data model while ensuring the controlled use of terms. The model supports efficient representation of dense spatio-temporal information such as object trajectories. SAM is not bound to a specific serialization but can be mapped to different existing formats, within the limitations imposed by the target format. SAM specifications and examples have been made available.
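The paper itself does not include code, but the organisation summarized in the abstract can be sketched roughly as follows: central registries of sensors, detectors, media items and object instances, referenced by ID from events that live on time lines, with controlled terms drawn from hierarchical classification schemes. This is a minimal, hypothetical Python rendering; all class and attribute names are illustrative assumptions and do not come from the published SAM specification.

# Hypothetical sketch of the SAM structure described in the abstract.
# Events live on time lines; sensors, detectors, media items and object
# instances are described once centrally and referenced by ID from events.
from dataclasses import dataclass, field
from typing import Dict, List, Tuple

@dataclass
class Term:
    """Reference into a hierarchical classification scheme (scheme URI + term ID)."""
    scheme: str
    term_id: str

@dataclass
class Sensor:
    sensor_id: str
    modality: Term          # e.g. visual, audio, ...

@dataclass
class Detector:
    detector_id: str
    output_type: Term       # type of events/properties this detector produces

@dataclass
class ObjectInstance:
    object_id: str
    object_class: Term

@dataclass
class Event:
    event_type: Term
    start: float            # seconds on the time line
    end: float
    object_refs: List[str] = field(default_factory=list)   # IDs of ObjectInstances
    # dense spatio-temporal data, e.g. a sampled trajectory of (t, x, y) points
    trajectory: List[Tuple[float, float, float]] = field(default_factory=list)

@dataclass
class TimeLine:
    sensor_refs: List[str]      # subset of sensors this time line relates to
    detector_refs: List[str]    # detectors that produced the events
    events: List[Event] = field(default_factory=list)

@dataclass
class SamDescription:
    sensors: Dict[str, Sensor] = field(default_factory=dict)
    detectors: Dict[str, Detector] = field(default_factory=dict)
    objects: Dict[str, ObjectInstance] = field(default_factory=dict)
    media_items: Dict[str, str] = field(default_factory=dict)  # ID -> media locator
    time_lines: List[TimeLine] = field(default_factory=list)

In this reading, an application resolves sensor_refs and object_refs against the central dictionaries, so each sensor or object is described only once, while dense per-sample data stays inside the event's trajectory.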
Peter Schallauer, Werner Bailer, Albert Hofmann, and Roland Mörzinger "SAM: an interoperable metadata model for multimodal surveillance applications", Proc. SPIE 7344, Data Mining, Intrusion Detection, Information Security and Assurance, and Data Networks Security 2009, 73440C (13 April 2009); https://doi.org/10.1117/12.818481
KEYWORDS: Sensors, Data modeling, Surveillance, Video surveillance, Systems modeling, Video, Visualization