Integration of deep learning and soft robotics for a biomimetic approach to nonlinear sensing

Abstract

Traditional approaches to sensing have often been aimed at simple sensor characteristics to make interpretation of the sensor outputs easier, but this has also limited the quality of the encoded sensory information. Integrating a complex sensor with deep learning could hence be a strategy for removing current limitations on the information that sensory inputs can carry. Here, we demonstrate this concept with a soft-robotic sensor that mimics fast non-rigid deformation of the ears in certain bat species. We show that a deep convolutional neural network can use the nonlinear Doppler shift signatures generated by these motions to estimate the direction of a sound source with an estimation error of ~0.5°. Previously, determining the direction of a sound source based on pressure receivers required either multiple frequencies or multiple receivers. Our current results demonstrate a third approach that makes do with only a single frequency and a single receiver.
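To make the sensing principle concrete, the following is a minimal illustrative sketch (not the authors' published network) of a convolutional regressor that maps a single-receiver Doppler-shift spectrogram to a direction estimate. The input size, layer widths, and training loss are assumptions chosen for illustration only.

# Minimal sketch: CNN regression from a Doppler-shift spectrogram to a source direction.
# All sizes and hyperparameters below are illustrative assumptions, not the published network.
import torch
import torch.nn as nn

class DirectionRegressor(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
        )
        self.regressor = nn.Linear(64, 1)  # single output: estimated direction in degrees

    def forward(self, spectrogram):
        x = self.features(spectrogram)
        return self.regressor(x.flatten(1))

# Toy usage: a batch of 8 single-channel spectrograms (128 time bins x 128 frequency bins).
model = DirectionRegressor()
spectrograms = torch.randn(8, 1, 128, 128)
predicted_angles = model(spectrograms)                    # shape (8, 1)
loss = nn.MSELoss()(predicted_angles, torch.zeros(8, 1))  # regression against true angles
loss.backward()

Training such a regressor end to end on spectrograms, rather than hand-designing features, is what allows complex, nonlinear sensor signatures to be exploited rather than avoided.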


Fig. 1: Experimental set-up.
Fig. 2: Doppler shift signatures associated with different directions.
Fig. 3: Diagram of the CNN architecture.
Fig. 4: Deep learning of direction finding from Doppler signatures.
Fig. 5: Accuracy of direction finding with a dynamic soft-robotic pinna.
Fig. 6: Direction-finding approaches.

Data availability

The original sound recordings are available from the corresponding author upon reasonable request. The training datasets (spectrograms derived from the recordings) are available via Code Ocean at https://doi.org/10.24433/CO.6834234.v1. Source data are provided with this paper.
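As a rough indication of how spectrograms of this kind can be derived from a recording, the sketch below uses SciPy; the sampling rate, window length, and overlap are illustrative assumptions, and the actual preprocessing used for the published datasets is contained in the Code Ocean capsule.

# Hedged sketch: deriving a log-magnitude spectrogram from a single-channel recording.
# Sampling rate, window length, and overlap are assumptions, not the published settings.
import numpy as np
from scipy.signal import spectrogram

fs = 400_000                                   # assumed sampling rate (Hz)
t = np.arange(0, 0.1, 1 / fs)
recording = np.sin(2 * np.pi * 90_000 * t)     # stand-in for a recorded 90 kHz tone

freqs, times, sxx = spectrogram(recording, fs=fs, nperseg=1024, noverlap=768)
log_sxx = 10 * np.log10(sxx + 1e-12)           # input image for the CNN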

Code availability

All source code for the CNN is available via Code Ocean at https://doi.org/10.24433/CO.6834234.v1.

Acknowledgements

This research has been supported by the Office of Naval Research (award no. N00014-17-1-2376 to R.M.), the National Science Foundation (award no. 1362886 to R.M.), the Naval Engineering Education Consortium (award no. N001741910001 to R.M.) and a fellowship from the China Scholarship Council to X.Y.

Author information

Contributions

R.M. and X.Y. conceived and designed the experiments. X.Y. performed the experiments. X.Y. and R.M. analysed the results. X.Y. and R.M. wrote the manuscript. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Rolf Müller.

Ethics declarations

Competing interests

The authors declare no competing interests.

Additional information

Peer review information: Nature Machine Intelligence thanks Deepak Gala and Ming Zhong for their contribution to the peer review of this work.

Publisher's note: Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary information

Supplementary Video 1

This video shows an example of fast ear motions in a female Pratt’s roundleaf bat (Hipposideros pratti). The video was recorded at a frame rate of 200 Hz and is played back first at the original frame rate and then a second time slowed down by a factor of 10.

Supplementary Video 2

Hardware-in-the-loop implementation of the direction-finding approach. The pinna capable of fast deformations is mounted on a pan-tilt unit while the loudspeaker emits a single ultrasonic frequency (90 kHz); the deep neural network steers the pinna towards the loudspeaker so that a laser pointer mounted next to the pinna comes to point at the piece of paper next to the speaker. This video is a demonstration and does not depict the experimental procedure described in the text.
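A hedged sketch of such a hardware-in-the-loop control loop is given below; the interfaces (record_snapshot, predict, move_by) and the tolerance are hypothetical placeholders, not the authors' code or hardware API.

# Hypothetical sketch of the hardware-in-the-loop alignment loop shown in the video.
# The microphone and pan-tilt interfaces and the 0.5 degree tolerance are placeholders.
def align_pinna(model, microphone, pan_tilt, tolerance_deg=0.5, max_steps=50):
    """Iteratively steer the pinna until the predicted source direction is near zero."""
    for _ in range(max_steps):
        spec = microphone.record_snapshot()   # Doppler-shift signature from one pinna deformation
        offset = model.predict(spec)          # source direction relative to the pinna axis (degrees)
        if abs(offset) < tolerance_deg:
            return True                       # pinna (and laser pointer) aimed at the loudspeaker
        pan_tilt.move_by(offset)              # rotate the pan-tilt unit towards the estimate
    return False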

Source data

Source Data Fig. 5

Source data containing the prediction and measurement errors for Fig. 5.

About this article

Cite this article

Yin, X., Müller, R. Integration of deep learning and soft robotics for a biomimetic approach to nonlinear sensing. Nat Mach Intell 3, 507–512 (2021). https://doi.org/10.1038/s42256-021-00330-1

