
The generalization performance of kernelized elastic net regularization based on exponentially strongly mixing observations


Abstract:

The study of kernelized elastic net regularization (KENReg) is a promising topic in the machine learning community. Unlike the elastic net regularizer of Zou and Hastie [1], which focuses on selecting groups of correlated features, KENReg aims to obtain a sparse and stable approximation of the regression function. Since KENReg outperforms the kernelized Lasso in terms of generalization, sparseness, and stability, this paper is concerned with its generalization performance. Specifically, we study the generalization ability of KENReg for dependent samples, taking the samples to be an exponentially strongly mixing sequence. To this end, we first apply the stepping-stone technique proposed by Wu and Zhou [2], then establish generalization bounds for KENReg based on exponentially strongly mixing samples, and finally sharpen these results for uniformly ergodic Markov chain (u.e.M.c.) samples.
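As a concrete illustration of the estimator the abstract describes, the following is a minimal sketch of a kernelized elastic net fit: the coefficient vector alpha minimizes a least-squares data term plus an l1 penalty (sparseness) and a quadratic RKHS-norm penalty (stability), solved here by proximal gradient descent. The Gaussian kernel, the specific objective form (1/n)||y - K alpha||^2 + lam1*||alpha||_1 + lam2*alpha^T K alpha, and all function names are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def gaussian_kernel(X1, X2, sigma=1.0):
    # Gram matrix of the Gaussian (RBF) kernel -- an illustrative choice
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def kenreg_fit(K, y, lam1=0.01, lam2=0.01, n_iter=2000):
    """Sketch of a kernelized elastic net fit: minimize
        (1/n) * ||y - K @ alpha||^2 + lam1 * ||alpha||_1
                                    + lam2 * alpha @ K @ alpha
    by proximal gradient descent (ISTA): a gradient step on the two
    smooth terms, then soft-thresholding for the l1 term."""
    n = K.shape[0]
    alpha = np.zeros(n)
    # Lipschitz bound on the smooth part's gradient gives a safe step size
    kn = np.linalg.norm(K, 2)
    step = 1.0 / (2.0 * kn ** 2 / n + 2.0 * lam2 * kn)
    for _ in range(n_iter):
        grad = 2.0 * K @ (K @ alpha - y) / n + 2.0 * lam2 * (K @ alpha)
        z = alpha - step * grad
        # soft-thresholding = proximal operator of the l1 penalty
        alpha = np.sign(z) * np.maximum(np.abs(z) - step * lam1, 0.0)
    return alpha
```

Predictions at new points then take the representer form `gaussian_kernel(X_test, X_train, sigma) @ alpha`; a larger `lam1` drives more entries of `alpha` to exactly zero, which is the sparseness property the abstract contrasts with the kernelized Lasso.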
Date of Conference: 29-31 July 2017
Date Added to IEEE Xplore: 25 June 2018
Conference Location: Guilin, China

References

References are not available for this document.