Foreground Estimation Based on Linear Regression Model With Fused Sparsity on Outliers


Foreground detection is an important task in computer vision applications. In this paper, we present an efficient foreground detection method based on a robust linear regression model. First, a novel framework is proposed in which foreground detection is cast as an outlier-signal estimation problem in a linear regression model. We regularize this problem by imposing a so-called fused-sparsity constraint on the outlier signal, which encourages both sparsity and smoothness of the coefficient vector. Second, we convert this outlier-estimation problem into an equivalent Fused Lasso problem and apply existing solvers to obtain an optimal solution. Third, a new foreground detection method is presented that applies this model to the 2D image domain by merging the results of different vectorizations. Experiments on a set of challenging sequences show that the proposed method is not only superior to many state-of-the-art techniques but also robust to noise.
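To make the formulation concrete, the sketch below estimates a sparse, piecewise-smooth outlier vector f from a background-subtracted residual y, i.e. it minimizes 0.5·||y − f||² + λ₁·||f||₁ + λ₂·Σ|f[i+1] − f[i]| (the Fused Lasso penalty described in the abstract). All variable names, the penalty weights, and the plain subgradient solver are illustrative assumptions, not the paper's algorithm or parameters.

```python
def sign(x):
    # Valid subgradient choice for |x|: returns 0 at x == 0.
    return (x > 0) - (x < 0)

def fused_lasso_outliers(residual, lam1=0.1, lam2=0.1, iters=5000):
    """Estimate an outlier vector f minimizing
       0.5*||residual - f||^2 + lam1*sum|f[i]| + lam2*sum|f[i+1]-f[i]|
    with a simple subgradient method (illustrative solver only)."""
    n = len(residual)
    f = [0.0] * n
    for k in range(iters):
        step = 0.5 / (k + 1) ** 0.5  # diminishing step size
        g = [0.0] * n
        # Data-fit and sparsity (l1) subgradient terms.
        for i in range(n):
            g[i] = f[i] - residual[i] + lam1 * sign(f[i])
        # Fused (total-variation) subgradient terms couple neighbors.
        for i in range(n - 1):
            s = sign(f[i + 1] - f[i])
            g[i] -= lam2 * s
            g[i + 1] += lam2 * s
        for i in range(n):
            f[i] -= step * g[i]
    return f
```

In use, `residual` would be the difference between an observed frame (vectorized) and the background predicted by the linear regression model; entries of `f` well above zero mark foreground pixels, while the fused penalty keeps the detected region spatially coherent.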

IEEE Transactions on Circuits and Systems for Video Technology
Li Song
Professor, IEEE Senior Member

Professor and Doctoral Supervisor; Deputy Director of the Institute of Image Communication and Network Engineering, Shanghai Jiao Tong University; Double-Appointed Professor of the Institute of Artificial Intelligence and the Collaborative Innovation Center of Future Media Network; Deputy Secretary-General of the China Video User Experience Alliance and head of its standards group.