
Fruits yield estimation using Faster R-CNN with MIoU

Multimedia Tools and Applications

Abstract

Fruit yield estimation from on-plant images supports smart farming by providing information about the expected produce so that storage and export facilities can be arranged. Detecting and counting fruits on the plant, however, is a challenging vision task in complex scenes. In this paper, we modify the intersection over union (IoU) of the original Faster R-CNN (FR-CNN) for on-plant fruit detection. The modified IoU (MIoU) introduces a better distance metric based on the minimum area enclosing the ground-truth and predicted bounding boxes. The MIoU also pays extra attention to overlapping regions, which overcomes the inefficiency of the original FR-CNN and improves detection accuracy. Under varying imaging conditions, the proposed FR-CNN with MIoU achieved correlation coefficients (R²) of 0.98, 0.92, 0.96, 0.98 and 0.95 for mango, pomegranate, tomato, apple and orange, respectively. On the same image samples, the original FR-CNN achieved R² values of 0.81, 0.91, 0.89, 0.90 and 0.92, respectively. Thus, FR-CNN with MIoU improves detection accuracy compared with the original FR-CNN. The F1 scores for apple, orange, tomato, pomegranate and mango are 0.9534, 0.9794, 0.9424, 0.9534 and 0.9383, respectively. In addition, the proposed method is efficient and less complex than state-of-the-art fruit-detection models. The methodology was also evaluated on other state-of-the-art fruit datasets, namely the ACFR dataset and the KFuji RGB-DS dataset, achieving F1 scores of 0.9523 and 0.9432 for apple and mango yield estimation on the ACFR dataset and 0.8912 on the KFuji RGB-DS dataset.
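The paper's exact MIoU formula is not reproduced in this abstract, so the following is only a minimal Python sketch of an IoU variant matching the description above: it scores the overlap of the ground-truth and predicted boxes and additionally penalises using the minimum area enclosing both boxes (in the spirit of generalised IoU). The function name modified_iou and the (x1, y1, x2, y2) box format are illustrative assumptions, not the authors' implementation.

    def box_area(box):
        # Area of an axis-aligned box (x1, y1, x2, y2); negative extents clip to 0.
        x1, y1, x2, y2 = box
        return max(0.0, x2 - x1) * max(0.0, y2 - y1)

    def modified_iou(gt, pred):
        # Intersection of the ground-truth and predicted boxes.
        inter = box_area((max(gt[0], pred[0]), max(gt[1], pred[1]),
                          min(gt[2], pred[2]), min(gt[3], pred[3])))
        union = box_area(gt) + box_area(pred) - inter
        iou = inter / union if union > 0 else 0.0

        # Smallest box enclosing both ground truth and prediction.
        enclose = box_area((min(gt[0], pred[0]), min(gt[1], pred[1]),
                            max(gt[2], pred[2]), max(gt[3], pred[3])))
        if enclose <= 0:
            return iou
        # Penalise the fraction of the enclosing box not covered by the union,
        # so poorly aligned or non-overlapping boxes receive a lower score.
        return iou - (enclose - union) / enclose

    # Example: a prediction that partially overlaps the ground truth.
    print(modified_iou((10, 10, 50, 50), (30, 30, 70, 70)))  # ~ -0.079

Plugging such a metric into Faster R-CNN would typically mean replacing the standard IoU in anchor and proposal matching (and, where applicable, in the box-regression loss); where exactly MIoU is substituted in the authors' pipeline is not stated here and is left as an assumption.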


Data availability

The data used in this paper were self-collected. The dataset is still being improved and is therefore not available at this time.


Acknowledgements

We would like to thank the reviewers for their thoughtful comments and efforts towards improving our manuscript. In addition, we are thankful to Professor A.K. Kullu, Department of English, Sambalpur University, for correcting grammar and wording in this manuscript.

Code availability

The code is not shared at present.

Funding

This work is supported by a research grant under the "Collaborative and Innovation Scheme" of TEQIP-III, project title "Development of Novel Approaches for Recognition and Grading of Fruits using Image Processing and Computer Intelligence", reference letter No. VSSUT/TEQIP/113/2020.

Author information

Authors and Affiliations

Authors

Contributions

SKB and PKS jointly accomplished the experimental work and wrote the paper under the supervision of AKR.

Corresponding author

Correspondence to Prabira Kumar Sethy.

Ethics declarations

Competing interests

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

Conflict of interest

The authors declare no conflict of interest.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Behera, S.K., Rath, A.K. & Sethy, P.K. Fruits yield estimation using Faster R-CNN with MIoU. Multimed Tools Appl 80, 19043–19056 (2021). https://doi.org/10.1007/s11042-021-10704-7

