Active Learning Query Strategies for Classification, Regression, and Clustering: A Survey

Survey · Journal of Computer Science and Technology

Abstract

Unlabeled data is usually abundant, but annotating it is costly. Both the labeling and the learning cost can be reduced by training on as few labeled instances as possible. Active learning (AL) learns from a small set of labeled instances, with the additional facility of querying an expert annotator, or oracle, for the labels of selected instances. The active learner uses an instance selection strategy to pick those critical query instances that reduce the generalization error as quickly as possible. This process yields a refined training dataset and helps minimize the overall cost. The key to the success of AL is the query strategy, which selects candidate query instances and helps the learner converge to a valid hypothesis. This survey reviews AL query strategies for classification, regression, and clustering under the pool-based AL scenario. Query strategies for classification are further divided into informative-based, representative-based, combined informative- and representative-based, and others. More advanced query strategies based on reinforcement learning and deep learning, along with query strategies under realistic environment settings, are also presented. After a rigorous mathematical analysis of AL strategies, this work presents a comparative analysis of these strategies. Finally, an implementation guide, applications, and challenges of AL are discussed.
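The pool-based scenario the abstract describes can be sketched as a loop: train on the current labeled set, score every instance in the unlabeled pool with an uncertainty measure, query the oracle for the most uncertain instance, and repeat until the budget is exhausted. Below is a minimal sketch using margin-based uncertainty sampling (one of the informative-based strategies the survey covers); the two-blob synthetic dataset, the plain gradient-descent logistic regression, and the budget of 10 queries are illustrative assumptions, not details from the survey.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical pool: two Gaussian blobs, one per class.
X = np.vstack([rng.normal(-2, 1, (200, 2)), rng.normal(2, 1, (200, 2))])
y = np.array([0] * 200 + [1] * 200)

def fit_logistic(X, y, lr=0.1, steps=500):
    """Plain logistic regression trained by batch gradient descent."""
    Xb = np.hstack([X, np.ones((len(X), 1))])   # append bias column
    w = np.zeros(Xb.shape[1])
    for _ in range(steps):
        p = 1 / (1 + np.exp(-Xb @ w))
        w -= lr * Xb.T @ (p - y) / len(y)
    return w

def predict_proba(w, X):
    Xb = np.hstack([X, np.ones((len(X), 1))])
    p = 1 / (1 + np.exp(-Xb @ w))
    return np.column_stack([1 - p, p])

# Seed with a few labels; everything else is the unlabeled pool.
labeled = list(rng.choice(len(X), size=4, replace=False))
pool = [i for i in range(len(X)) if i not in labeled]

for _ in range(10):                              # query budget of 10
    w = fit_logistic(X[labeled], y[labeled])
    probs = predict_proba(w, X[pool])
    margin = np.abs(probs[:, 1] - probs[:, 0])   # small margin = uncertain
    q = pool.pop(int(np.argmin(margin)))         # most uncertain instance
    labeled.append(q)                            # oracle reveals y[q]

w = fit_logistic(X[labeled], y[labeled])
acc = (predict_proba(w, X).argmax(axis=1) == y).mean()
print(f"labeled {len(labeled)} of {len(X)} instances; pool accuracy {acc:.2f}")
```

Swapping the `margin` line for entropy or least-confidence scores gives the other uncertainty-sampling variants; representative-based strategies would instead score pool instances by how well they cover the data distribution.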



Author information

Correspondence to Punit Kumar.


Cite this article

Kumar, P., Gupta, A. Active Learning Query Strategies for Classification, Regression, and Clustering: A Survey. J. Comput. Sci. Technol. 35, 913–945 (2020). https://doi.org/10.1007/s11390-020-9487-4
