A shadow elimination approach in video-surveillance context

Year: 2006

Authors: Leone A., Distante C., Buccolieri F.

Authors Affiliation: CNR, IMM, Ist Microelettron & Microsistemi, I-73100 Lecce, Italy

Abstract: Moving object tracking is an important problem in many applications such as video surveillance. Monitoring systems can be improved using vision-based techniques able to extract and classify objects in the scene. However, problems arise due to unexpected shadows: shadow detection is critical for accurate object detection in video streams, since shadow points are often misclassified as object points, causing errors in the localization, segmentation, measurement, tracking and classification of moving objects. The paper presents a new approach for removing shadows from moving objects, starting from a frame-difference method using a grey-level textured adaptive background. The shadow detection scheme uses photometric properties and the notion of a shadow as a semi-transparent region which retains a reduced-contrast representation of the underlying surface pattern and texture. We analyze the problem of representing texture information in terms of redundant systems of functions for texture identification. The method for discriminating shadows from moving objects is based on a pursuit scheme using an over-complete dictionary. The basic idea is to use the simple but powerful Matching Pursuit (MP) algorithm to represent texture as a linear combination of elements of a large set of functions. In particular, MP selects the best small set of atoms from a 2D Gabor dictionary as features representative of the texture properties in the image. Experimental results validate the algorithm.
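For readers unfamiliar with the core technique named in the abstract, the following is a minimal, illustrative sketch of Matching Pursuit over a 2D Gabor dictionary. It is not the authors' implementation: the patch size, frequencies, orientations and Gaussian width are assumed values chosen purely for illustration.

```python
import numpy as np

def gabor_atom(size, freq, theta, sigma):
    """Real 2D Gabor function on a size x size grid, L2-normalized.
    Parameter values here are illustrative assumptions, not the paper's."""
    coords = np.arange(size) - (size - 1) / 2.0
    x, y = np.meshgrid(coords, coords)
    xr = x * np.cos(theta) + y * np.sin(theta)  # rotated coordinate
    g = np.exp(-(x**2 + y**2) / (2 * sigma**2)) * np.cos(2 * np.pi * freq * xr)
    return g / np.linalg.norm(g)

def build_dictionary(size=16, freqs=(0.1, 0.2, 0.3), n_orient=8, sigma=4.0):
    """Over-complete set of Gabor atoms, each flattened to a vector."""
    atoms = [gabor_atom(size, f, k * np.pi / n_orient, sigma).ravel()
             for f in freqs for k in range(n_orient)]
    return np.stack(atoms)  # shape: (n_atoms, size * size)

def matching_pursuit(patch, D, n_atoms=5):
    """Greedy MP: repeatedly pick the atom with the largest |inner product|
    with the residual, subtract its projection, and record the coefficient.
    Returns the selected (atom index, coefficient) pairs and the residual."""
    residual = patch.ravel().astype(float).copy()
    coeffs = []
    for _ in range(n_atoms):
        inner = D @ residual               # correlation with every atom
        k = int(np.argmax(np.abs(inner)))  # best-matching atom
        coeffs.append((k, inner[k]))
        residual -= inner[k] * D[k]        # atoms are unit-norm, so this
                                           # removes the projection exactly
    return coeffs, residual

# Example: decompose a random 16x16 patch over the dictionary.
D = build_dictionary()
patch = np.random.rand(16, 16)
coeffs, residual = matching_pursuit(patch, D)
```

In the spirit of the abstract, one would run such a decomposition on corresponding patches of the current frame and the adaptive background: a cast shadow, being semi-transparent, should yield a similar set of selected Gabor atoms at reduced contrast, whereas a genuine moving object changes the underlying texture and hence the selected atoms. The specific decision rule is described in the paper itself.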

Journal/Review: PATTERN RECOGNITION LETTERS

Volume: 27 (5)      Pages from: 345  to: 355

KeyWords: GABOR FILTERS; TEXTURE SEGMENTATION; MATCHING PURSUITS; OBJECT DETECTION; DICTIONARIES
DOI: 10.1016/j.patrec.2005.08.020

Citations: 16
data from “WEB OF SCIENCE” (of Thomson Reuters) updated at: 2024-04-28