
Underwater bubble plumes multi-scale morphological feature extraction and state recognition method

  • Original Article
  • Neural Computing and Applications

Abstract

The motion of underwater bubble plumes carries a large amount of information. However, because of inadequate lighting, mutual adhesion of bubbles, and the absence of a distinct background, extracting features from bubble plume images is difficult and recognition accuracy is low. We present a method that combines the nonsubsampled contourlet transform (NSCT) with a quantum neural network (QNN) to extract bubble plume features and recognize their states. The underwater bubble plume image is first decomposed by the NSCT into multi-scale sub-band images. For the low-frequency sub-band, a fuzzy-set binarization method extracts bright spots, from which morphological features are computed. For the high-frequency sub-bands, the differential box counting (DBC) method computes the fractal dimension, which serves as a directional detail feature. To achieve accurate state recognition of underwater bubble plumes, a quantum gate set convolutional neural network (QCSCNN) is designed that combines the advantages of quantum gates and convolutional neural networks (CNNs). Experimental results show that the proposed method achieves satisfactory performance in both convergence speed and recognition accuracy for underwater bubble plumes.
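To make the high-frequency feature concrete, the sketch below estimates the fractal dimension of a grayscale sub-band by differential box counting. It is a minimal NumPy illustration of the commonly used DBC formulation, not the paper's implementation; the function name dbc_fractal_dimension, the default box scales, and the gray-level handling are assumptions introduced here for illustration, and the fuzzy-set binarization and QCSCNN stages are not sketched.

```python
import numpy as np

def dbc_fractal_dimension(img, scales=None):
    """Estimate the fractal dimension of a grayscale image by
    differential box counting (DBC).

    img    : 2-D array of gray levels (e.g. an NSCT high-frequency sub-band,
             shifted/scaled to be non-negative).
    scales : iterable of box sizes s in pixels; defaults to powers of two.
    Returns the slope of log N_r versus log(1/r), i.e. the DBC estimate.
    """
    img = np.asarray(img, dtype=np.float64)
    M = min(img.shape)
    G = img.max() - img.min() + 1.0            # gray-level span of the image
    if scales is None:
        scales = [2 ** k for k in range(1, int(np.log2(M)))]

    log_inv_r, log_N = [], []
    for s in scales:
        h = max(1.0, s * G / M)                # box height in the gray direction
        rows, cols = img.shape[0] // s, img.shape[1] // s
        if rows == 0 or cols == 0:
            continue                           # scale larger than the image
        blocks = img[:rows * s, :cols * s].reshape(rows, s, cols, s)
        block_max = blocks.max(axis=(1, 3))
        block_min = blocks.min(axis=(1, 3))
        # boxes needed to cover the gray-level range of each s x s block
        n_r = np.ceil((block_max - block_min) / h) + 1
        log_N.append(np.log(n_r.sum()))
        log_inv_r.append(np.log(M / s))        # 1/r with r = s/M

    slope, _ = np.polyfit(log_inv_r, log_N, 1) # least-squares fit of the log-log line
    return slope
```

In the pipeline described above, such an estimate would be computed for each high-frequency NSCT sub-band and combined with the morphological features of the binarized low-frequency sub-band to form the feature vector passed to the classifier.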



Data availability

The datasets generated and/or analyzed during the current study are available from the corresponding author upon reasonable request.


Acknowledgements

This research was supported by the National Natural Science Foundation of China (No. 62201249) and the Natural Science Foundation of Nanjing Institute of Technology (No. CKJB202009).

Author information

Corresponding author

Correspondence to Xue Yang.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Yang, X., Chen, W. Underwater bubble plumes multi-scale morphological feature extraction and state recognition method. Neural Comput & Applic 35, 8437–8451 (2023). https://doi.org/10.1007/s00521-022-08116-1
