Detection of avian influenza-infected chickens based on a chicken sound convolutional neural network

https://doi.org/10.1016/j.compag.2020.105688

Highlights

  • Chicken sounds were intercepted from complex noise.

  • A new method, CSCNN, was proposed in this paper.

  • CSCNN was used to detect avian influenza-infected chickens via chicken sound.

  • The recognition accuracy reached up to 97.43%.

Abstract

The modern poultry industry is large-scale and breeding-intensive, making the spread of disease in poultry easier, faster and more harmful. Avian influenza (AI) is the most important disease in poultry, and the prevention and detection of avian influenza in poultry are a focus of scientific research and the poultry industry. In this paper, a new sound recognition method, the chicken sound convolutional neural network (CSCNN), is proposed for the detection of chickens with avian influenza. Based on the spectral differences among environmental noise, chicken behaviour noise and chicken sounds, a method was designed to extract chicken sounds from complex sound data. Four features of the chicken sounds, namely Logfbank, Mel Frequency Cepstral Coefficients (MFCC), MFCC Delta and MFCC Delta-Delta, were calculated and combined into feature maps. Finally, the sounds of healthy chickens and chickens with avian influenza were recognized using CSCNN. In the experiment, the recognition accuracies of CSCNN via spectrogram (CSCNN-S) were 93.01%, 95.05%, and 97.43% on the 2nd, 4th, and 6th day after injection with the H9N2 virus, respectively, and the recognition accuracies of CSCNN with feature maps (CSCNN-F) were 89.79%, 93.56%, and 95.84%, respectively. The experimental results show that the method proposed in this paper can be used to quickly and effectively detect avian influenza-infected chickens via chicken sound.
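
As a brief illustration of the feature-map construction described above, the sketch below computes the four acoustic features (Logfbank, MFCC, MFCC Delta and MFCC Delta-Delta) and stacks them into one 2-D map. It assumes the librosa library; the sampling rate, filter counts and stacking order are illustrative assumptions rather than the parameters used in the paper.

```python
import numpy as np
import librosa

def chicken_sound_feature_map(wav_path, sr=16000, n_mels=26, n_mfcc=13):
    """Build a feature map (Logfbank + MFCC + Delta + Delta-Delta) for one sound clip.

    A minimal sketch assuming librosa; all parameter values are illustrative.
    """
    y, _ = librosa.load(wav_path, sr=sr)

    # Log Mel filterbank energies ("Logfbank")
    mel = librosa.feature.melspectrogram(y=y, sr=sr, n_mels=n_mels)
    logfbank = librosa.power_to_db(mel)

    # MFCC and its first- and second-order differences
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc)
    mfcc_delta = librosa.feature.delta(mfcc)
    mfcc_delta2 = librosa.feature.delta(mfcc, order=2)

    # Stack the four features along the coefficient axis into a single 2-D map
    return np.vstack([logfbank, mfcc, mfcc_delta, mfcc_delta2])
```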

Introduction

Disease control has always been an important issue in poultry breeding, especially because the modern poultry industry is large-scale and breeding-intensive, facilitating the spread of poultry diseases. Once poultry diseases appear, they usually result in substantial profit losses to farmers and can even endanger human health and safety (Lai et al., 2016). Therefore, the prevention and detection of poultry diseases has remained a focus of scientific research and the poultry industry. Avian influenza (AI) is the most important disease in poultry. According to pathogenicity, AI can be divided into highly pathogenic avian influenza (HPAI) and low-pathogenic avian influenza (LPAI). Infection with HPAI viruses in poultry can result in 100% flock mortality, whereas LPAI viruses primarily cause respiratory disease (Mulatti et al., 2017, Marangon et al., 2003). The spread of AI is rapid and widespread, and when AI appears, it is difficult for humans to make a timely and effective judgement. H9N2 is a widespread avian influenza virus subtype in poultry worldwide. It infects a broad spectrum of host species, including birds and mammals, and has a great economic impact because it can induce disease independently or in association with other pathogens.

With the development of computer technology, sound and video monitoring methods have been increasingly applied to the observation of animal behaviour (Nasirahmadi et al., 2015, Balachandar and Chinnaiyan, 2018, Kumar et al., 2018, Okinda et al., 2019, Fang et al., 2020, Riekert et al., 2020). Compared with traditional manual methods, sound and video monitoring methods offer many advantages: they can reduce the risk of human infection and supply 24-hour continuous real-time monitoring. In recent years, an increasing number of research methods based on sound and video monitoring have been applied to the study of poultry diseases. Banakar et al. (2016) used a data mining method and Dempster-Shafer evidence theory to identify and classify several common diseases of chickens, and the recognition rate was 91.15%. Image processing technology can be used to identify infected chickens by observing their postures and to raise early warnings of broiler disease (Zhuang et al., 2018, Zhuang and Zhang, 2019). Aydin (2017) applied three-dimensional (3D) machine vision to monitor the gait and detect the posture of broilers via pictures, and the classification accuracy was 93%. Based on the time difference of arrival (TDOA) principle of the sound source localization (SSL) method, Du et al. (2018) designed a sound monitoring system for layers using Kinect sensors, and the accuracy of the test in the laboratory was 74.7%. Huang et al. (2019) used support vector machines (SVM) to identify the sounds of chickens infected with avian influenza, and the recognition rate on the test set was 84% to 90%.

Deep learning is an important method in artificial intelligence technology that is widely used in various fields and has shown good performance in sound and speech processing (McLoughlin et al., 2015, Purwins et al., 2019). However, deep learning methods have primarily been used to analyse environmental sounds and human speech (Abdel-Hamid et al., 2014, Piczak, 2015). In recent years, some researchers have begun to use deep learning methods to analyse animal sounds (Colonna et al., 2016, Mac Aodha et al., 2018, Hassan et al., 2017), and for the study of bird songs, a variety of networks and methods have emerged (Grill and Schlüter, 2017, Cakir et al., 2017, Carpentier et al., 2019, Donofre et al., 2020). However, most of the existing methods feed an entire segment of audio into the neural network to extract features. Such methods must process both animal sounds and environmental sound information, so the amount of computation is large and the results are easily affected by environmental sounds.

The objective of this research is to build a method for the recognition of healthy chickens and chickens infected with avian influenza based on their sounds. A new recognition algorithm based on the convolutional neural network is proposed for recognizing the sounds of chickens infected with avian influenza. Moreover, according to the differences among environmental noise, chicken behaviour noise and chicken sounds, a new method for extracting chicken sounds from complex sound was proposed, which can reduce the computation of redundant information and the influence of various noises on the recognition results.
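
For readers who want a concrete starting point, a minimal convolutional classifier for spectrogram or feature-map inputs could look like the sketch below. It is written with TensorFlow/Keras; the input shape, layer sizes and training settings are assumptions for illustration, not the CSCNN architecture reported in this paper.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

def build_sound_cnn(input_shape=(128, 128, 1), n_classes=2):
    # Illustrative CNN for classifying chicken-sound spectrograms as healthy
    # vs. avian influenza-infected; not the published CSCNN layout.
    model = models.Sequential([
        layers.Input(shape=input_shape),
        layers.Conv2D(16, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(64, activation="relu"),
        layers.Dropout(0.5),
        layers.Dense(n_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model
```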

Section snippets

Experimental setup

The virus injection and data collection were conducted in the specific-pathogen-free (SPF) chicken isolation cage at the South China Agricultural University, and all operations were conducted in accordance with the Animal Biosafety Level Three (ABSL-3) standards. The SPF chicken isolation cage isolates chickens with avian influenza, ensures that the chickens have sufficient space to move freely and reduces the influence of outside noises. As shown in Fig. 1, the SPF chicken isolation cage was

Experimental data

The threshold of the noise score is important for chicken sound detection, and the accuracy of chicken sound detection with different noise score thresholds is shown in Fig. 7(a). A higher rate of chicken sounds occurs when the noise score threshold is less than 0.85. Fig. 7(b) shows the mean chicken sound number in each 10-minute sample of sound data with different noise score thresholds, and the chicken sound number increases as the threshold increases. The noise score threshold is set to
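
The threshold analysis summarized above (Fig. 7) can be reproduced in outline with the sketch below. Here `noise_score` is a hypothetical scoring function standing in for the paper's noise score, which is not reproduced in this excerpt, and the threshold grid is an assumption.

```python
import numpy as np

def count_detected_sounds(segments, noise_score, thresholds=np.arange(0.60, 1.00, 0.05)):
    """Count how many candidate segments are kept as chicken sounds per threshold.

    segments    : list of candidate sound clips (e.g. NumPy arrays)
    noise_score : hypothetical function returning a score in [0, 1] for one clip;
                  the paper's actual scoring rule is not given in this excerpt
    """
    counts = {}
    for t in thresholds:
        kept = [seg for seg in segments if noise_score(seg) < t]
        counts[round(float(t), 2)] = len(kept)
    return counts
```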

Conclusion

In this paper, the CSCNN sound recognition method is proposed for the detection of chickens with avian influenza that were injected with the H9N2 avian influenza virus. This method automatically detected chicken sounds from complex sound based on spectral differences and transformed the chicken sounds into spectrograms and feature maps as network inputs. The results showed that the highest accuracy of CSCNN reached 97.43% in the detection of chickens with avian influenza. Compared with three

CRediT authorship contribution statement

Kaixuan Cuan: Conceptualization, Methodology, Software, Writing - original draft. Tiemin Zhang: Supervision, Conceptualization. Junduan Huang: Validation, Writing - review & editing. Cheng Fang: Visualization, Software. Yun Guan: Data curation.

Declaration of Competing Interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

Acknowledgements

Funding: This work was supported by the National Key Research and Development Plan of China [grant No. 2018YFD0500705] and the Guangdong Province Special Fund for Modern Agricultural Industry Common Key Technology R&D Innovation Team, China [grant No. 2019KJ129].

References (39)

  • X. Zhuang et al. Development of an early warning algorithm to detect sick broilers. Comput. Electron. Agric. (2018)

  • X. Zhuang et al. Detection of sick broilers by digital image processing and deep learning. Biosyst. Eng. (2019)

  • O. Abdel-Hamid et al. Convolutional neural networks for speech recognition. IEEE/ACM Trans. Audio Speech Lang. Process. (2014)

  • J.B. Allen et al. A unified approach to short-time Fourier analysis and synthesis. Proc. IEEE (1977)

  • S. Balachandar et al. Internet of Things Based Reliable Real-Time Disease Monitoring of Poultry Farming Imagery Analytics

  • S.S. Bharali et al. Zero crossing rate and short term energy as a cue for sex detection with reference to Assamese vowels

  • E. Cakir et al. Convolutional recurrent neural networks for bird audio detection

  • A. Chowdhury et al. Fusing MFCC and LPC features using 1D triplet CNN for speaker recognition in severely degraded audio signals. IEEE Trans. Inf. Forensics Secur. (2019)

  • J. Colonna et al. Automatic classification of anuran sounds using convolutional neural networks