research-article

An improved GLMNET for l1-regularized logistic regression

Published: 21 August 2011

ABSTRACT

GLMNET, proposed by Friedman et al., is an algorithm for fitting generalized linear models with elastic-net regularization. It has been widely applied to solve L1-regularized logistic regression. However, recent experiments indicate that the existing GLMNET implementation may not be stable for large-scale problems. In this paper, we propose an improved GLMNET that addresses several theoretical and implementation issues. In particular, as a Newton-type method, GLMNET achieves fast local convergence but may fail to obtain a useful solution quickly. By carefully adjusting the effort spent on each iteration, our method is efficient whether the optimization problem is solved loosely or strictly. Experiments demonstrate that the improved GLMNET is more efficient than a state-of-the-art coordinate descent method.
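The objective discussed in the abstract is L1-regularized logistic regression: minimize ||w||_1 + C * sum_i log(1 + exp(-y_i w^T x_i)). The sketch below solves this objective with plain proximal gradient descent (iterative soft-thresholding); it is an illustrative baseline only, not the paper's improved GLMNET, which instead uses a Newton-type outer iteration with a coordinate descent inner solver. All function names here are our own.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1: elementwise soft-thresholding.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def l1_logreg(X, y, C=1.0, step=None, iters=1000):
    """Minimize ||w||_1 + C * sum_i log(1 + exp(-y_i * x_i @ w))
    by proximal gradient descent (ISTA). Labels y must be in {-1, +1}.
    Illustrative baseline only -- NOT the paper's improved GLMNET."""
    n, d = X.shape
    w = np.zeros(d)
    if step is None:
        # Step = 1/L, where L = (C/4) * ||X||_2^2 bounds the Lipschitz
        # constant of the smooth (logistic-loss) part of the objective.
        L = C * 0.25 * np.linalg.norm(X, 2) ** 2
        step = 1.0 / L
    for _ in range(iters):
        z = y * (X @ w)
        # Gradient of the logistic loss: -C * sum_i y_i x_i / (1 + exp(z_i)).
        grad = -C * (X.T @ (y / (1.0 + np.exp(z))))
        # Gradient step on the smooth part, then prox step on the L1 part.
        w = soft_threshold(w - step * grad, step)
    return w
```

On a toy dataset where only the first feature is informative, the L1 penalty keeps the uninformative weight exactly zero, which is the sparsity behavior the paper's solvers exploit.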

References

  1. J. Friedman, T. Hastie, and R. Tibshirani, "Regularization paths for generalized linear models via coordinate descent," Journal of Statistical Software, vol. 33, no. 1, pp. 1--22, 2010.
  2. A. Genkin, D. D. Lewis, and D. Madigan, "Large-scale Bayesian logistic regression for text categorization," Technometrics, vol. 49, no. 3, pp. 291--304, 2007.
  3. K. Koh, S.-J. Kim, and S. Boyd, "An interior-point method for large-scale l1-regularized logistic regression," Journal of Machine Learning Research, vol. 8, pp. 1519--1555, 2007.
  4. G. Andrew and J. Gao, "Scalable training of L1-regularized log-linear models," in Proceedings of the Twenty-Fourth International Conference on Machine Learning (ICML), 2007.
  5. J. Liu, J. Chen, and J. Ye, "Large-scale sparse logistic regression," in Proceedings of the 15th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pp. 547--556, 2009.
  6. M. Schmidt, G. Fung, and R. Rosales, "Fast optimization methods for l1 regularization: A comparative study and two new approaches," in Proceedings of the European Conference on Machine Learning, pp. 286--297, 2007.
  7. G.-X. Yuan, K.-W. Chang, C.-J. Hsieh, and C.-J. Lin, "A comparison of optimization methods and software for large-scale l1-regularized linear classification," Journal of Machine Learning Research, vol. 11, pp. 3183--3234, 2010.
  8. P. Tseng and S. Yun, "A coordinate gradient descent method for nonsmooth separable minimization," Mathematical Programming, vol. 117, pp. 387--423, 2009.
  9. S. Yun and K.-C. Toh, "A coordinate gradient descent method for l1-regularized convex minimization," Computational Optimization and Applications, vol. 48, no. 2, pp. 273--307, 2011.
  10. K.-W. Chang, C.-J. Hsieh, and C.-J. Lin, "Coordinate descent method for large-scale L2-loss linear SVM," Journal of Machine Learning Research, vol. 9, pp. 1369--1398, 2008.
  11. G.-X. Yuan, C.-H. Ho, and C.-J. Lin, "An improved GLMNET for l1-regularized logistic regression and support vector machines," tech. rep., National Taiwan University, 2011.
  12. H.-F. Yu, H.-Y. Lo, H.-P. Hsieh, J.-K. Lou, T. G. McKenzie, J.-W. Chou, P.-H. Chung, C.-H. Ho, C.-F. Chang, Y.-H. Wei, J.-Y. Weng, E.-S. Yan, C.-W. Chang, T.-T. Kuo, Y.-C. Lo, P. T. Chang, C. Po, C.-Y. Wang, Y.-H. Huang, C.-W. Hung, Y.-X. Ruan, Y.-S. Lin, S.-D. Lin, H.-T. Lin, and C.-J. Lin, "Feature engineering and classifier ensemble for KDD Cup 2010," in JMLR Workshop and Conference Proceedings, 2011. To appear.
  13. R.-E. Fan, K.-W. Chang, C.-J. Hsieh, X.-R. Wang, and C.-J. Lin, "LIBLINEAR: A library for large linear classification," Journal of Machine Learning Research, vol. 9, pp. 1871--1874, 2008.

Published in

KDD '11: Proceedings of the 17th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining
August 2011
1446 pages
ISBN: 9781450308137
DOI: 10.1145/2020408

        Copyright © 2011 ACM

        Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

        Publisher

        Association for Computing Machinery

        New York, NY, United States



Acceptance Rates

Overall acceptance rate: 1,133 of 8,635 submissions, 13%
