A novel hardware plane fitting implementation and applications for bionic vision

  • Original Paper
  • Published in: Machine Vision and Applications

Abstract

This paper presents a high-speed real-time plane fitting implementation on a field-programmable gate array (FPGA) platform. A novel hardware-based least squares algorithm fits planes to patches of points within a depth image captured using a Microsoft Kinect v2 sensor. The validity of a plane fit and the plane parameters are reported for each patch of 11 by 11 depth pixels. The high level of parallelism of operations in the algorithm has allowed for a fast, low-latency hardware implementation on an FPGA that is capable of processing depth data at a rate of 480 frames per second. A hybrid hardware–software end-to-end system integrates the hardware solution with the Kinect v2 sensor via a computer and PCI Express communication link to a Terasic TR4 FPGA development board. We have also implemented two proof-of-concept object detection applications as future candidates for bionic vision systems. We show that our complete end-to-end system is capable of running at 60 frames per second. An analysis and characterisation of the Kinect v2 sensor errors have been performed in order to specify logic precision requirements, statistical testing of the validity of a plane fit, and achievable plane fitting angle resolution.



Acknowledgments

This work was funded through the Australian Research Council Research in Bionic Vision Science and Technology Initiative (SR1000006).

Author information

Corresponding author

Correspondence to Horace Josh.

Appendix: Derivation of resultant error models of normalised plane parameters

Having found the mean and standard deviation of the depth value error (Sect. 3.1), a model for the error in the parameters of our plane fitting algorithm can be developed. As shown in Sect. 2, which discusses our plane fitting implementation, for a plane of the form \(z = Au + Bv + C\), the parameters A, B, and C can be found using the simplified equations from (6). The z values are assumed to be mutually independent normal random variables, so the expectation, variance and standard deviation of the parameters can be found as follows. A, B and C can be represented as linear combinations of \(z_i\):

$$\begin{aligned} A= & {} \frac{1}{\sum _{i=1}^{m}u_i^2}\left( \sum _{i=1}^{m}u_iz_i\right) \nonumber \\ B= & {} \frac{1}{\sum _{i=1}^{m}v_i^2}\left( \sum _{i=1}^{m}v_iz_i\right) \\ C= & {} \frac{1}{n}\sum _{i=1}^{m}z_i\nonumber \end{aligned}$$
(47)
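Because the patch coordinates are centred on the patch (so the cross sums \(\sum u_i\), \(\sum v_i\) and \(\sum u_iv_i\) vanish), the three estimates in Eq. (47) decouple and coincide with the full least squares solution. This can be checked numerically with a minimal numpy sketch; the coordinates \(u, v \in \{-k, 0, k\}\) with \(k = 5\) and the plane coefficients here are illustrative assumptions, not values taken from the paper:

```python
import numpy as np

# Hypothetical 3x3 sample grid with centred coordinates u, v in {-k, 0, k},
# k = 5, i.e. sub-sampling an 11x11 depth patch at its edges and centre.
k = 5
u, v = np.meshgrid([-k, 0, k], [-k, 0, k])
u, v = u.ravel().astype(float), v.ravel().astype(float)

rng = np.random.default_rng(0)
z = 0.3 * u - 0.1 * v + 800.0 + rng.normal(0.0, 1.0, 9)  # noisy plane samples

# Closed-form estimates from Eq. (47): each parameter decouples because
# the centred coordinates make all cross sums zero.
A = np.sum(u * z) / np.sum(u ** 2)
B = np.sum(v * z) / np.sum(v ** 2)
C = np.mean(z)

# Reference: full least squares fit of z = Au + Bv + C.
M = np.column_stack([u, v, np.ones_like(u)])
A_ls, B_ls, C_ls = np.linalg.lstsq(M, z, rcond=None)[0]

assert np.allclose([A, B, C], [A_ls, B_ls, C_ls])
```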

Therefore, assuming \(z_i = (z_\mathrm{true} + e_i )\), where \(e_i\sim {}N(0,\sigma _z)\), all estimates of A, B, and C will have zero mean errors and variances as follows:

$$\begin{aligned}&\sigma _A^2 = \left[ \frac{1}{\sum _{i=1}^{m}u_i^2}\right] ^2\left( \sum \limits _{i=1}^{m}u_i^2\sigma _i^2\right) = \frac{\sigma _z^2}{\sum _{i=1}^{m}u_i^2}\nonumber \\&\sigma _B^2 = \left[ \frac{1}{\sum _{i=1}^{m}v_i^2}\right] ^2\left( \sum \limits _{i=1}^{m}v_i^2\sigma _i^2\right) = \frac{\sigma _z^2}{\sum _{i=1}^{m}v_i^2} \nonumber \\&\sigma _C^2 = \frac{1}{n^2}\sum \limits _{i=1}^{m}\sigma _i^2 = \frac{\sigma _z^2}{n}\nonumber \\&\mathrm{and}~\mathrm{standard}~\mathrm{deviations}~\sigma _A,~\sigma _B~\mathrm{and}~\sigma _C \end{aligned}$$
(48)

From Eq. (9) the standard deviations of A and B can be expressed in terms of k as follows:

$$\begin{aligned} \sigma _A= & {} \frac{\sigma _z}{k\sqrt{6}}\nonumber \\ \sigma _B= & {} \frac{\sigma _z}{k\sqrt{6}}\nonumber \\ \sigma _C= & {} \frac{\sigma _z}{3} \end{aligned}$$
(49)
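With \(u_i, v_i \in \{-k, 0, k\}\) on a 3-by-3 grid, \(\sum u_i^2 = \sum v_i^2 = 6k^2\), which is where the \(k\sqrt{6}\) denominators in Eq. (49) come from. A short Monte Carlo sketch confirms the predicted standard deviations; \(k = 5\) and \(\sigma_z = 2\) are illustrative values:

```python
import numpy as np

# Monte Carlo check of Eq. (49): sigma_A = sigma_z/(k*sqrt(6)), sigma_C = sigma_z/3.
k, sigma_z, trials = 5, 2.0, 200_000
u = np.tile([-k, 0, k], 3).astype(float)   # sum(u**2) = 6*k**2

rng = np.random.default_rng(1)
e = rng.normal(0.0, sigma_z, (trials, 9))  # depth errors e_i, z_i = z_true + e_i
A_err = e @ u / np.sum(u ** 2)             # error in A, one value per trial
C_err = e.mean(axis=1)                     # error in C, one value per trial

print(A_err.std(), sigma_z / (k * np.sqrt(6)))  # should agree closely
print(C_err.std(), sigma_z / 3)
```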

From Eq. (49) it is apparent that the standard deviation of errors scales inversely with patch size. Therefore larger patch sizes reduce errors. From Eq. (19) we can see that \(\alpha \) and \(\beta \) are scaled versions of A and B and so the variances follow as scaled versions of Eq. (48):

$$\begin{aligned}&\sigma _\alpha ^2 = \mathrm{Var}[-AF] = F^2\frac{\sigma _z^2}{\sum _{i=1}^{m}u_i^2} = F^2\frac{\sigma _z^2}{6k^2}\nonumber \\&\sigma _\beta ^2 = \mathrm{Var}[-BF] = F^2\frac{\sigma _z^2}{\sum _{i=1}^{m}v_i^2} = F^2\frac{\sigma _z^2}{6k^2}\\&\mathrm{and}~\mathrm{standard}~\mathrm{deviations}~\sigma _\alpha ~\mathrm{and}~\sigma _\beta \nonumber \end{aligned}$$
(50)

To find the variance of \(\gamma \) we need to expand Eq. (19) as follows, so that all correlated terms are grouped.

$$\begin{aligned} \gamma= & {} C + Au_c + Bv_c \nonumber \\= & {} \frac{\sum _{i=1}^{m}z_i}{n} + u_c\frac{\sum {(\mathrm{column}\,3)} - \sum {(\mathrm{column}\,1)}}{30}\nonumber \\&+\, v_c\frac{\sum {(\mathrm{row}\,3)} - \sum {(\mathrm{row}\,1)}}{30} \nonumber \\= & {} z_1\left( \frac{1}{9} - \frac{u_c}{30} - \frac{v_c}{30}\right) + z_2\left( \frac{1}{9} - \frac{v_c}{30}\right) + z_3\left( \frac{1}{9} + \frac{u_c}{30} - \frac{v_c}{30}\right) \nonumber \\&+\, z_4\left( \frac{1}{9} - \frac{u_c}{30}\right) + z_5\left( \frac{1}{9}\right) + z_6\left( \frac{1}{9} + \frac{u_c}{30}\right) \nonumber \\&+\, z_7\left( \frac{1}{9} - \frac{u_c}{30} + \frac{v_c}{30}\right) + z_8\left( \frac{1}{9} + \frac{v_c}{30}\right) \nonumber \\&+\, z_9\left( \frac{1}{9} + \frac{u_c}{30} + \frac{v_c}{30}\right) \end{aligned}$$
(51)

Now if we choose the worst case values for \(u_c\) and \(v_c\) of 256 and 212, respectively, then the variance can be expressed as follows:

$$\begin{aligned} \sigma _\gamma ^2= & {} \sigma _z^2\left( \frac{-697}{45}\right) ^2 + \sigma _z^2\left( \frac{-313}{45}\right) ^2 + \sigma _z^2\left( \frac{71}{45}\right) ^2 \nonumber \\&+\, \sigma _z^2\left( \frac{-379}{45}\right) ^2 + \sigma _z^2\left( \frac{1}{9}\right) ^2 + \sigma _z^2\left( \frac{389}{45}\right) ^2 \nonumber \\&+\, \sigma _z^2\left( \frac{-61}{45}\right) ^2 + \sigma _z^2\left( \frac{323}{45}\right) ^2 + \sigma _z^2\left( \frac{707}{45}\right) ^2 \nonumber \\= & {} \sigma _z^2\left( \frac{33149}{45}\right) \approx (27.14\sigma _z)^2 \end{aligned}$$
(52)
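The nine squared coefficients in Eq. (52) can be reproduced exactly with rational arithmetic, evaluating the per-sample weights of Eq. (51) at the worst-case centre \((u_c, v_c) = (256, 212)\), i.e. a corner of the 512 × 424 Kinect v2 frame:

```python
from fractions import Fraction as F

# Per-sample weights of gamma from Eq. (51), z1..z9 in row-major order,
# evaluated at the worst-case patch centre u_c = 256, v_c = 212.
uc, vc = F(256), F(212)
ninth, s = F(1, 9), F(1, 30)
coeffs = [
    ninth - uc * s - vc * s,   # z1 -> -697/45
    ninth - vc * s,            # z2 -> -313/45
    ninth + uc * s - vc * s,   # z3 ->   71/45
    ninth - uc * s,            # z4 -> -379/45
    ninth,                     # z5 ->    1/9
    ninth + uc * s,            # z6 ->  389/45
    ninth - uc * s + vc * s,   # z7 ->  -61/45
    ninth + vc * s,            # z8 ->  323/45
    ninth + uc * s + vc * s,   # z9 ->  707/45
]
var_gamma = sum(c * c for c in coeffs)   # variance in units of sigma_z^2
assert var_gamma == F(33149, 45)         # matches Eq. (52)
print(float(var_gamma) ** 0.5)           # approx 27.14, as in Eq. (52)
```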

For the variance of \(\kappa \), we use the following approximation:

$$\begin{aligned}&\mathrm{For}~\mathrm{some}~f = aX^b \nonumber \\&\sigma _f^2\approx (abX^{b-1}\sigma _X)^2 = \left( \frac{fb\sigma _X}{X}\right) ^2 \end{aligned}$$
(53)

Giving us the following variance for \(\kappa \):

$$\begin{aligned} \sigma _\kappa ^2\approx (2C\sigma _C)^2 = \left( \frac{2C\sigma _z}{3}\right) ^2 \end{aligned}$$
(54)
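Equation (54) applies the power-law approximation of Eq. (53) with \(\kappa = C^2\) (so \(a = 1\), \(b = 2\)), giving \(\sigma _\kappa \approx 2C\sigma _C\). A Monte Carlo sketch shows the first-order approximation is accurate when \(\sigma _C \ll C\); the plane offset \(C = 800\) and \(\sigma _z = 2\) are illustrative assumptions:

```python
import numpy as np

# Monte Carlo check of Eqs. (53)-(54): kappa = C^2, so to first order
# sigma_kappa ~ 2*C*sigma_C, with sigma_C = sigma_z/3 from Eq. (49).
C_true, sigma_z, trials = 800.0, 2.0, 200_000
rng = np.random.default_rng(2)

C = C_true + rng.normal(0.0, sigma_z / 3, trials)  # C ~ N(C_true, sigma_z/3)
kappa = C ** 2

print(kappa.std(), 2 * C_true * sigma_z / 3)  # close because sigma_C << C
```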

From Eqs. (50, 52, 54) we can now define standard deviations for the four plane parameters \(\alpha \), \(\beta \), \(\gamma \) and \(\kappa \):

$$\begin{aligned} \sigma _\alpha= & {} F\frac{\sigma _z}{k\sqrt{6}}\nonumber \\ \sigma _\beta= & {} F\frac{\sigma _z}{k\sqrt{6}} \\ \sigma _\gamma= & {} \sigma _z\sqrt{\frac{33149}{45}} \approx 27.14\sigma _z \nonumber \\ \sigma _\kappa\approx & {} \frac{2C\sigma _z}{3}\nonumber \end{aligned}$$
(55)

To find the variances of the normalised plane parameters \(\alpha _n\), \(\beta _n\), \(\gamma _n\) and \(\kappa _n\), we can use the following approximation:

$$\begin{aligned}&\mathrm{For}~\mathrm{some}~f = \frac{X}{Y}\nonumber \\&\sigma _f^2\approx f^2\left[ \left( \frac{\sigma _X}{X}\right) ^2 + \left( \frac{\sigma _Y}{Y}\right) ^2 - 2\frac{\sigma _{XY}}{XY}\right] \end{aligned}$$
(56)
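The ratio approximation in Eq. (56) accounts for the correlation between numerator and denominator through the covariance term. Its accuracy can be checked by Monte Carlo with correlated normal variables; the means, standard deviations and covariance below are illustrative, not values from the paper:

```python
import numpy as np

# Monte Carlo check of Eq. (56) for f = X/Y with correlated X and Y.
mX, mY = 50.0, 120.0
sX, sY, sXY = 2.0, 3.0, 4.0            # std devs and covariance (hypothetical)

rng = np.random.default_rng(3)
cov = [[sX ** 2, sXY], [sXY, sY ** 2]]
X, Y = rng.multivariate_normal([mX, mY], cov, 500_000).T

f = mX / mY
approx = f ** 2 * ((sX / mX) ** 2 + (sY / mY) ** 2 - 2 * sXY / (mX * mY))
print((X / Y).var(), approx)            # agree to first order
```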

We first need to define standard deviation equations for \(\lambda \). From Eq. (22), we know that \(\lambda = |\alpha | + |\beta | + |\gamma |\). There are therefore eight different cases for evaluating the variance, since each of \(\alpha \), \(\beta \) and \(\gamma \) can be positive or negative. It can be shown, however, that only four of these cases are unique, and furthermore we are only interested in the best and worst case variances. This leaves the two following cases for \(\lambda \):

$$\begin{aligned} \lambda _1= & {} (\alpha ) + (\beta ) + (\gamma ) = -FA -FB + (C + Au_c + Bv_c)\nonumber \\ \lambda _2= & {} (\alpha ) + (\beta ) + (-\gamma ) = -FA -FB - (C + Au_c + Bv_c)\nonumber \\ \end{aligned}$$
(57)

Using a similar method as was done for \(\sigma _{\gamma }^2\) in Eqs. (51, 52), we can find the following variance equations for the two cases of \(\lambda \):

$$\begin{aligned} \sigma _{\lambda _1}^2= & {} \frac{53749\,\sigma _z^2}{225} \approx (15.456\sigma _z)^2 \nonumber \\ \sigma _{\lambda _2}^2= & {} \frac{1081477\,\sigma _z^2}{225} \approx (69.329\sigma _z)^2 \end{aligned}$$
(58)

We also need to find the covariances of each of the plane parameters \(\alpha \), \(\beta \), \(\gamma \) and \(\kappa \) with the scaling factor \(\lambda \). This is done by expanding the product of each parameter with \(\lambda \) and removing the uncorrelated cross-terms. The following covariances can be defined:

$$\begin{aligned}&\mathrm{For}~\mathrm{best}~\mathrm{case}~\mathrm{of}~\sigma _{\lambda _1} = 15.456\sigma _z, \nonumber \\&\sigma _{\alpha \lambda _1} = \frac{1342\sigma _z^2}{5} \approx 16.38^2\sigma _z^2 \nonumber \\&\sigma _{\beta \lambda _1} = \frac{9394\sigma _z^2}{25} \approx 19.38^2\sigma _z^2 \\&\sigma _{\gamma \lambda _1} = -\frac{91187\sigma _z^2}{225} \approx -20.13^2\sigma _z^2 \nonumber \\&\sigma _{\kappa \lambda _1} = \frac{\sigma _z^3}{81}\nonumber \end{aligned}$$
(59a)
$$\begin{aligned}&\mathrm{For}~\mathrm{worst}~\mathrm{case}~\mathrm{of}~\sigma _{\lambda _2} = 69.329\sigma _z,\nonumber \\&\sigma _{\alpha \lambda _2} = \frac{37942\sigma _z^2}{25} \approx 38.96^2\sigma _z^2 \nonumber \\&\sigma _{\beta \lambda _2} = \frac{35258\sigma _z^2}{25} \approx 37.55^2\sigma _z^2\\&\sigma _{\gamma \lambda _2} = -\frac{422677\sigma _z^2}{225} \approx -43.34^2\sigma _z^2 \nonumber \\&\sigma _{\kappa \lambda _2} = -\frac{\sigma _z^3}{81}\nonumber \end{aligned}$$
(59b)

Substituting Eqs. (55, 58, 59a, 59b) into Eq. (56) gives the following equations for the variances of the normalised plane parameters \(\alpha _n\), \(\beta _n\), \(\gamma _n\) and \(\kappa _n\):

$$\begin{aligned}&\mathrm{For}~\mathrm{best}~\mathrm{case}~\mathrm{of}~\sigma _{\lambda _1} = 15.456\sigma _z,\nonumber \\&\sigma _{\alpha _n}^2 \approx \sigma _z^2\left( \frac{\alpha }{\lambda _1}\right) ^2\left[ \frac{29.88^2}{\alpha ^2} + \frac{15.45^2}{\lambda _1^2} - \frac{23.17^2}{\alpha \lambda _1}\right] \nonumber \\&\sigma _{\beta _n}^2 \approx \sigma _z^2\left( \frac{\beta }{\lambda _1}\right) ^2\left[ \frac{29.88^2}{\beta ^2} + \frac{15.45^2}{\lambda _1^2} - \frac{27.41^2}{\beta \lambda _1}\right] \\&\sigma _{\gamma _n}^2 \approx \sigma _z^2\left( \frac{\gamma }{\lambda _1}\right) ^2\left[ \frac{27.14^2}{\gamma ^2} + \frac{15.45^2}{\lambda _1^2} + \frac{28.47^2}{\gamma \lambda _1}\right] \nonumber \\&\sigma _{\kappa _n}^2 \approx \sigma _z^2\left( \frac{\kappa }{\lambda _1}\right) ^2\left[ \frac{(0.66C)^2}{\kappa ^2} + \frac{15.45^2}{\lambda _1^2} - \frac{0.024\sigma _z}{\kappa \lambda _1}\right] \nonumber \end{aligned}$$
(60a)
$$\begin{aligned}&\mathrm{For}~\mathrm{worst}~\mathrm{case}~\mathrm{of}~\sigma _{\lambda _2} = 69.329\sigma _z, \nonumber \\&\sigma _{\alpha _n}^2 \approx \sigma _z^2\left( \frac{\alpha }{\lambda _2}\right) ^2\left[ \frac{29.88^2}{\alpha ^2} + \frac{69.33^2}{\lambda _2^2} - \frac{55.09^2}{\alpha \lambda _2}\right] \nonumber \\&\sigma _{\beta _n}^2 \approx \sigma _z^2\left( \frac{\beta }{\lambda _2}\right) ^2\left[ \frac{29.88^2}{\beta ^2} + \frac{69.33^2}{\lambda _2^2} - \frac{53.11^2}{\beta \lambda _2}\right] \\&\sigma _{\gamma _n}^2 \approx \sigma _z^2\left( \frac{\gamma }{\lambda _2}\right) ^2\left[ \frac{27.14^2}{\gamma ^2} + \frac{69.33^2}{\lambda _2^2} + \frac{61.3^2}{\gamma \lambda _2}\right] \nonumber \\&\sigma _{\kappa _n}^2 \approx \sigma _z^2\left( \frac{\kappa }{\lambda _2}\right) ^2\left[ \frac{(0.66C)^2}{\kappa ^2} + \frac{69.33^2}{\lambda _2^2} + \frac{0.024\sigma _z}{\kappa \lambda _2}\right] \nonumber \end{aligned}$$
(60b)

About this article

Cite this article

Josh, H., Kleeman, L. A novel hardware plane fitting implementation and applications for bionic vision. Machine Vision and Applications 27, 967–982 (2016). https://doi.org/10.1007/s00138-016-0764-8

