Foreground Estimation Based on Linear Regression Model With Fused Sparsity on Outliers

Abstract

Foreground detection is an important task in computer vision applications. In this paper, we present an efficient foreground detection method based on a robust linear regression model. First, we propose a novel framework in which foreground detection is cast as an outlier signal estimation problem in a linear regression model. We regularize this problem by imposing a so-called fused sparsity constraint on the outlier signal, which encourages both sparsity and smoothness of its coefficients. Second, we convert this outlier estimation problem into an equivalent Fused Lasso problem and use existing solvers to obtain the solution. Third, we present a new foreground detection method that applies this model to the 2D image domain by merging the results obtained from different vectorizations. Experiments on a set of challenging sequences show that the proposed method is not only superior to many state-of-the-art techniques but also robust to noise.
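
The per-vector estimation step can be illustrated with a small convex-optimization sketch. This is not the paper's optimized Fused Lasso reformulation; it is a minimal sketch assuming a background basis D (e.g., built from background frames), hypothetical regularization weights lam1 and lam2, and CVXPY as a generic solver.

import numpy as np
import cvxpy as cp

def estimate_foreground(y, D, lam1=0.2, lam2=0.5):
    """Estimate the outlier (foreground) signal s for one vectorized frame y.

    Model as in the abstract: y = D a + s + noise, where D a is the linear
    regression fit of the background and s is the outlier signal, regularized
    with a fused sparsity penalty (sparsity plus smoothness of its coefficients).
    The function name and parameters here are illustrative placeholders.
    """
    n, k = D.shape
    a = cp.Variable(k)                  # background regression coefficients
    s = cp.Variable(n)                  # outlier / foreground signal
    data_fit = cp.sum_squares(y - D @ a - s)
    sparsity = cp.norm1(s)              # encourages a sparse foreground
    smoothness = cp.norm1(cp.diff(s))   # encourages piecewise-constant foreground
    objective = cp.Minimize(0.5 * data_fit + lam1 * sparsity + lam2 * smoothness)
    cp.Problem(objective).solve()
    return s.value, a.value

# Toy usage: a 1-D "frame" of length 100 with an injected foreground blob.
rng = np.random.default_rng(0)
D = rng.standard_normal((100, 5))
y = D @ rng.standard_normal(5) + 0.05 * rng.standard_normal(100)
y[40:55] += 2.0                         # foreground region
s_hat, _ = estimate_foreground(y, D)

In practice, each 2D frame would be vectorized in several different orders, the estimate computed for each ordering, and the resulting foreground maps merged, as described above.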

Publication
IEEE Transactions on Circuits and Systems for Video Technology
Gengjian Xue
Master's Student

I am interested in reading and running. I also often go to the gym to keep fit.

Li Song
Professor, IEEE Senior Member