
IJAT Vol.12 No.3, pp. 395-404 (2018)
doi: 10.20965/ijat.2018.p0395

Technical Paper:

A 3D Shape-Measuring System for Assessing Strawberry Fruits

Nobuo Kochi*,**,†, Takanari Tanabata**, Atsushi Hayashi**, and Sachiko Isobe**

*R&D Initiative, Chuo University
1-13-27 Kasuga, Bunkyo-ku, Tokyo 112-8551, Japan

**Kazusa DNA Research Institute, Kisarazu, Japan

†Corresponding author

Received: September 23, 2017
Accepted: January 10, 2018
Online released: May 1, 2018
Published: May 5, 2018

Keywords: 3D measurement assessment, structure from motion, registration, plant measurement, strawberry fruits
Abstract

Plant shape measurements in plant science have conventionally been conducted by classifying shape features, by measuring widths and lengths with a vernier caliper, or by similar manual methods. Such measurements rely heavily on human senses and manual labor, making it difficult to acquire data on a large scale, and they are prone to large measurement variation. To cope with these problems of conventional measuring methods, we are developing a three-dimensional (3D) shape-measuring system that uses images, together with a reliable assessment technique. 3D models enable us to assess and measure shape features with high accuracy and to automatically measure volume, which conventional methods cannot. Our new system is thus capable of measuring objects automatically and efficiently, and our goal is wide adoption of the method at actual research sites. Unlike industrial products, plants are difficult to assess properly because of their individual variability and shape complexity. This paper describes our automatic 3D shape-measuring system, our method for assessing measurement accuracy, and the assessment results. The measurement accuracy of the developed system for strawberry fruits is 0.6 mm or less over 90% or more of the fruit surface and 0.3 mm or less over 80%. These results support the system’s suitability for shape assessment. The developed system fully automates photographing, measuring, and modeling objects and semi-automates their analysis, reducing the time required for the entire process from the conventional 6–7 h to 1.5 h. The system is designed so that users with no technical knowledge can easily acquire 3D measurement data on plants. We therefore intend to expand the measurable objects from strawberry fruits to other plants and their parts, including leaves, stalks, and flowers.
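
To make the stated accuracy figures concrete, the following is a minimal Python sketch of this style of deviation assessment. It is an illustration only, not the authors' implementation: the function name, tolerance values, and synthetic point clouds are assumptions, and both models are taken to be point clouds in millimeter units that have already been registered into a common coordinate frame.

# Minimal sketch (not the authors' implementation) of a deviation-based
# accuracy assessment: what fraction of measured points lie within a given
# tolerance of the reference model?
import numpy as np
from scipy.spatial import cKDTree

def deviation_stats(measured, reference, tolerances_mm=(0.3, 0.6)):
    """Fraction of measured points within each tolerance (mm) of the
    nearest point on the reference cloud."""
    tree = cKDTree(reference)        # nearest-neighbor index over the reference
    dists, _ = tree.query(measured)  # per-point distance to the reference
    return {t: float(np.mean(dists <= t)) for t in tolerances_mm}

# Hypothetical usage with synthetic clouds standing in for real models:
rng = np.random.default_rng(0)
reference = rng.uniform(0.0, 30.0, size=(5000, 3))           # stand-in reference cloud (mm)
measured = reference + rng.normal(0.0, 0.2, size=(5000, 3))  # noisy stand-in measurement
print(deviation_stats(measured, reference))                  # {0.3: ..., 0.6: ...}

In practice the two models would first be aligned, for example with an ICP-style registration, before the deviations are computed, and a point-to-surface distance could replace the nearest-point distance when a meshed reference is available.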

Cite this article as:
N. Kochi, T. Tanabata, A. Hayashi, and S. Isobe, “A 3D Shape-Measuring System for Assessing Strawberry Fruits,” Int. J. Automation Technol., Vol.12 No.3, pp. 395-404, 2018.
