
Sparse regressions for joint segmentation and linear prediction


Abstract:

By regularizing the least-squares criterion with the total number of coefficient changes, it is possible to estimate time-varying (TV) autoregressive (AR) models with piecewise-constant coefficients. Such models emerge in various applications, including speech segmentation using linear predictors. To cope with the large-scale optimization task, the problem is cast as a sparse regression and solved with an efficient block-coordinate descent algorithm. This enables joint segmentation and identification of the linear-prediction coefficients with linear computational complexity per iteration. Modern trends in linear prediction for speech processing also envision sparsity in the model residuals; indeed, sparse residuals allow for an improved representation of voiced speech. So far, however, sparse linear coding has been proposed only for the stationary scenario, i.e., after speech segmentation. This paper extends joint segmentation and linear-prediction coefficient identification to sparse linear coding. Fortunately, coordinate descent approaches remain applicable to carry out the optimization tasks. Numerical tests demonstrate the benefits of the proposed algorithm.
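The abstract describes the formulation only at a high level: a penalty on the number of coefficient changes yields piecewise-constant TV-AR coefficients, and the problem is cast as a sparse regression. The sketch below is a rough illustration of that general idea, not the authors' algorithm: it reparameterizes the coefficient path through its jumps, uses a convex group-lasso surrogate for the change-count penalty, and solves the resulting sparse regression with a simple FISTA-style proximal-gradient loop instead of the paper's block-coordinate descent. All names and parameter choices (tv_ar_group_lasso, lam, the AR(2) toy signal) are illustrative assumptions.

```python
import numpy as np


def tv_ar_group_lasso(y, order=2, lam=1.0, n_iter=3000):
    """Estimate piecewise-constant AR(order) coefficients for signal y.

    Returns A of shape (len(y) - order, order): the coefficient vector
    in effect at each prediction instant.
    """
    T = len(y)
    n = T - order
    # Per-time regressors: row t holds [y_{t-1}, ..., y_{t-order}].
    X = np.stack([y[order - p - 1 : T - p - 1] for p in range(order)], axis=1)
    target = y[order:]

    # Reparameterize a_t = sum_{s <= t} d_s. Block s of the design equals
    # X with rows t < s zeroed out, so a piecewise-constant coefficient path
    # corresponds to group-sparse jump vectors d_1, ..., d_{n-1}.
    Z = (np.tril(np.ones((n, n)))[:, :, None] * X[:, None, :]).reshape(n, n * order)

    step = 1.0 / np.linalg.norm(Z, 2) ** 2  # 1 / Lipschitz constant of the gradient
    d = np.zeros(n * order)
    z, tk = d.copy(), 1.0
    for _ in range(n_iter):
        # Gradient step on the squared prediction error.
        d_new = z - step * (Z.T @ (Z @ z - target))
        # Group soft-thresholding of the jump blocks (base block d_0 is unpenalized).
        D = d_new.reshape(n, order)
        norms = np.linalg.norm(D[1:], axis=1, keepdims=True)
        D[1:] *= np.maximum(0.0, 1.0 - step * lam / np.maximum(norms, 1e-12))
        # FISTA momentum update.
        tk_next = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * tk * tk))
        z = d_new + ((tk - 1.0) / tk_next) * (d_new - d)
        d, tk = d_new, tk_next
    return np.cumsum(d.reshape(n, order), axis=0)


# Toy usage: an AR(2) signal whose coefficients switch halfway through.
rng = np.random.default_rng(0)
y = np.zeros(300)
for t in range(2, 300):
    a1, a2 = (1.5, -0.7) if t < 150 else (0.2, 0.3)
    y[t] = a1 * y[t - 1] + a2 * y[t - 2] + 0.1 * rng.standard_normal()

A = tv_ar_group_lasso(y, order=2, lam=1.0)
print("coefficients near t=50: ", A[50])   # roughly (1.5, -0.7) for a suitable lam
print("coefficients near t=250:", A[250])  # roughly (0.2, 0.3) for a suitable lam
```

With a suitable λ, the estimated coefficient path is roughly constant on each half of the toy signal, with a jump near the true change point; the nonzero jump blocks mark the segment boundaries. The paper's sparse-residual extension would further replace the squared loss with a sparsity-promoting loss on the prediction errors; that variant is omitted here.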
Date of Conference: 04-09 May 2014
Date Added to IEEE Xplore: 14 July 2014
Electronic ISBN: 978-1-4799-2893-4

Conference Location: Florence, Italy
