
Deep multiple instance selection

  • Research Paper
  • Published in Science China Information Sciences

Abstract

Multiple instance learning (MIL) assigns a single class label to a bag of instances, which suits real-world applications such as drug activity prediction. Classical MIL methods focus on identifying the instances of interest, that is, regions of interest (ROIs). However, because the selection process is non-differentiable, these methods cannot be applied directly in deep learning. In this paper, we therefore focus on fusing ROI identification with deep MIL. We propose a novel deep MIL framework based on hard selection, deep multiple instance selection (DMIS), which identifies ROIs automatically in an end-to-end manner. Specifically, we propose DMIS-GS, which selects instances via Gumbel softmax or Gumbel top-k and then makes predictions for the bag without interference from redundant instances. To balance the exploration and exploitation of key instances, we cool down the temperature in DMIS-GS over the course of training and propose a variance normalization method that makes tuning this hyper-parameter much easier. We also give a theoretical analysis of our framework. Empirical investigations on both synthetic and real-world datasets show the proposed framework's superiority over classical MIL methods in generalization ability, ROI localization, and comprehensibility.
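To make the mechanism concrete, below is a minimal PyTorch sketch of hard instance selection with the Gumbel top-k trick, a straight-through estimator, and a cooled temperature. It is an illustration under stated assumptions, not the authors' implementation: the names InstanceSelector and cooled_tau, the linear scoring network, the linear cooling schedule, and the exact placement of variance normalization are all hypothetical.

import torch
import torch.nn as nn
import torch.nn.functional as F

class InstanceSelector(nn.Module):
    """Score instances in a bag and hard-select the top-k, keeping the
    selection trainable via Gumbel noise and a straight-through estimator
    (illustrative sketch, not the paper's code)."""

    def __init__(self, feat_dim: int, k: int = 1):
        super().__init__()
        self.scorer = nn.Linear(feat_dim, 1)  # per-instance relevance score
        self.k = k

    def forward(self, bag: torch.Tensor, tau: float) -> torch.Tensor:
        # bag: (num_instances, feat_dim) -> logits: (num_instances,)
        logits = self.scorer(bag).squeeze(-1)
        # Variance normalization (our reading of the idea): fix the logit
        # scale so one temperature schedule transfers across bags.
        logits = logits / (logits.std(unbiased=False) + 1e-6)
        # Perturb with Gumbel(0, 1) noise; the top-k of the perturbed logits
        # is a sample of k instances without replacement (Gumbel top-k trick).
        gumbel = -torch.log(-torch.log(torch.rand_like(logits) + 1e-20) + 1e-20)
        soft = F.softmax((logits + gumbel) / tau, dim=0)  # relaxed weights
        idx = torch.topk(soft, self.k).indices
        hard = torch.zeros_like(soft).scatter_(0, idx, 1.0 / self.k)
        # Straight-through: hard selection in the forward pass, gradients of
        # the soft relaxation in the backward pass.
        weights = hard + soft - soft.detach()
        return weights @ bag  # (feat_dim,) representation of the selected ROIs

def cooled_tau(step, total_steps, tau_start=5.0, tau_end=0.1):
    """Linearly cool the temperature from exploratory to near-discrete."""
    return tau_end + (tau_start - tau_end) * (1.0 - step / total_steps)

In this sketch, the forward pass uses the hard top-k selection while gradients flow through the soft relaxation, so the selector trains end-to-end; cooling tau over training moves the selector from exploration (near-uniform, stochastic choices) toward exploitation (near-deterministic selection of the highest-scoring instances).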



Acknowledgements

This work was supported by National Natural Science Foundation of China (Grant Nos. 61773198, 61751306) and NSFC-NRF Joint Research Project (Grant No. 61861146001).

Author information


Corresponding author

Correspondence to De-Chuan Zhan.


About this article


Cite this article

Li, XC., Zhan, DC., Yang, JQ. et al. Deep multiple instance selection. Sci. China Inf. Sci. 64, 130102 (2021). https://doi.org/10.1007/s11432-020-3117-3

