Abstract
Curriculum learning mimics cognitive mechanisms observed in human learning, where simpler concepts are presented before more difficult ones are gradually introduced. Until now, the major obstacle for curriculum methods has been the lack of a reliable way to estimate the difficulty of training instances. In this paper we show that, instead of trying to assess the difficulty of individual instances, a simple graph-based measure of instance typicality can be used in conjunction with curriculum methods. We design new batch schedulers that organize the ordered instances into batches of varying size and learning difficulty. Our method requires no changes to the architecture of the trained models; we improve training merely by manipulating the order and frequency with which instances are presented to the model.
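The scheduling idea described above can be sketched in a few lines. This is a minimal illustrative sketch, not the authors' exact algorithm: the typicality scores are assumed to be precomputed (the paper derives them from a graph-based method), and the function name, starting batch size, and growth factor are hypothetical choices.

```python
# Hypothetical sketch of curriculum-style batch scheduling by typicality.
# Assumes `typicality` holds one precomputed score per training instance;
# batches grow incrementally, starting from the most typical examples.

def incremental_batches(instances, typicality, start=4, growth=2):
    """Order instances by descending typicality and yield batches of
    increasing size, so early batches contain the most typical examples."""
    order = sorted(range(len(instances)), key=lambda i: -typicality[i])
    size, pos = start, 0
    while pos < len(order):
        yield [instances[i] for i in order[pos:pos + size]]
        pos += size
        size *= growth  # each subsequent batch admits harder instances

# Toy usage: seven instances with made-up typicality scores.
data = ["a", "b", "c", "d", "e", "f", "g"]
scores = [0.9, 0.2, 0.8, 0.5, 0.7, 0.1, 0.6]
batches = list(incremental_batches(data, scores))
# First batch holds the four most typical instances: ["a", "c", "e", "g"]
```

Because only the order and grouping of instances change, a scheduler like this can wrap any existing training loop without touching the model itself, which matches the paper's architecture-agnostic claim.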
M. Morzy—This work was supported by the National Science Centre, Poland, under decision no. 2016/23/B/ST6/03962.
© 2021 Springer Nature Switzerland AG
Cite this paper
Krysińska, I., Morzy, M., Kajdanowicz, T. (2021). Curriculum Learning Revisited: Incremental Batch Learning with Instance Typicality Ranking. In: Farkaš, I., Masulli, P., Otte, S., Wermter, S. (eds) Artificial Neural Networks and Machine Learning – ICANN 2021. ICANN 2021. Lecture Notes in Computer Science(), vol 12894. Springer, Cham. https://doi.org/10.1007/978-3-030-86380-7_23
Print ISBN: 978-3-030-86379-1
Online ISBN: 978-3-030-86380-7