Person Tracking in Camera and Microphone Network

Integrated audio-visual monitoring can provide detailed reporting on who is speaking, when, and towards whom or what. This is especially relevant for realizing multi-modal interfaces that operate at a distance, and for analyzing social interactions in a closed space. Based on spatial reasoning, detected acoustic events are either associated with one or more speaking individuals tracked persistently by the cameras, or dismissed as background noise. Early fusion of audio-visual cues also yields a more precise estimate of the speaker's head orientation.
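The spatial association step can be illustrated with a minimal sketch: an acoustic event localized in 3D by the microphone network is assigned to the nearest camera-tracked person within a distance threshold, or discarded as background noise. Function names, the data layout, and the threshold below are illustrative assumptions, not the project's actual implementation.

```python
import math

def associate_event(event_pos, tracked_persons, max_dist=0.5):
    """Return the id of the nearest tracked person within max_dist metres
    of the localized acoustic event, or None for background noise.
    (Illustrative sketch; names and threshold are assumptions.)"""
    best_id, best_d = None, max_dist
    for pid, pos in tracked_persons.items():
        d = math.dist(event_pos, pos)  # Euclidean distance in 3D
        if d <= best_d:
            best_id, best_d = pid, d
    return best_id

# Two tracked persons; one acoustic event near person "A", one far from both.
persons = {"A": (1.0, 2.0, 1.6), "B": (3.5, 0.5, 1.7)}
print(associate_event((1.1, 2.1, 1.5), persons))  # A
print(associate_event((5.0, 5.0, 1.5), persons))  # None (background noise)
```

In a full system this gating would feed a joint audio-visual particle filter rather than a hard nearest-neighbour decision, so that uncertain localizations are weighted instead of assigned outright.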

In collaboration with the SHINE team of FBK, we combine real-time tracking with acoustic source localization techniques into an integrated solution for audio-visual monitoring in smart spaces.

Selected publications:

A. Brutti, O. Lanz: A joint particle filter to track the position and head orientation of people using audio visual cues. European Signal Processing Conference - EUSIPCO, Aalborg, Denmark, August 23-27, 2010

A. Brutti, O. Lanz: An Audio-Visual Particle Filter for Monitoring Interactive People Behaviour. Workshop on Pattern Recognition and Artificial Intelligence for Human Behaviour Analysis, Reggio Emilia, Italy, December 12, 2009

R. Brunelli, A. Brutti, P. Chippendale, O. Lanz, M. Omologo, P. Svaizer, F. Tobia: A Generative Approach to Audio-Visual Person Tracking. International Evaluation Workshop on Classification of Events, Activities and Relationships - CLEAR, Southampton, UK, April 6-7, 2006
