Sleep Apnea Event Detection from Sub-frame Based Feature Variation in EEG Signal Using Deep Convolutional Neural Network


Abstract:

Automatic detection of sleep apnea, a respiratory sleep disorder affecting millions of patients worldwide, continues to be explored by researchers. The electroencephalogram (EEG) signal is a promising tool because of its direct correlation with neural activity and its ease of acquisition. Here, an innovative approach is proposed to detect apnea automatically by incorporating local variations of temporal features to identify global feature variations over a broader window. An EEG data frame is divided into smaller sub-frames so that local feature variation within one larger frame can be extracted effectively. A fully convolutional neural network (FCNN) is proposed that takes each sub-frame of a single frame individually to extract local features. A dense classifier consisting of a series of fully connected layers is then trained on all the local features extracted from the sub-frames to classify the entire frame as apnea/non-apnea. Finally, a unique post-processing technique is applied that significantly improves accuracy. Both the EEG frame length and the post-processing parameters are varied to find optimal detection conditions. Large-scale experiments are carried out on publicly available data from patients with varying apnea-hypopnea indices to evaluate the performance of the proposed method.
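The sub-frame decomposition and a possible smoothing step can be illustrated with a minimal sketch. Everything below is an assumption for illustration only: the function names (`split_into_subframes`, `majority_smooth`), the equal-length non-overlapping split, and the majority-vote smoothing stand in for the paper's FCNN/dense-classifier pipeline and its unspecified post-processing technique.

```python
import numpy as np

def split_into_subframes(frame, n_sub):
    """Split a 1-D EEG frame into n_sub equal, non-overlapping sub-frames.

    Each row would be fed individually to a feature extractor such as the
    FCNN; trailing samples that do not fill a sub-frame are dropped.
    """
    sub_len = len(frame) // n_sub
    return frame[: n_sub * sub_len].reshape(n_sub, sub_len)

def majority_smooth(labels, k=1):
    """Hypothetical post-processing: relabel each frame by majority vote
    over a window of k neighbouring frames on each side (ties -> non-apnea).
    """
    labels = np.asarray(labels)
    out = labels.copy()
    for i in range(len(labels)):
        window = labels[max(0, i - k): i + k + 1]
        out[i] = 1 if window.sum() * 2 > len(window) else 0
    return out

# e.g. 30 s of EEG at 100 Hz split into 5 sub-frames of 6 s each
frame = np.random.randn(3000)
subs = split_into_subframes(frame, n_sub=5)   # shape (5, 600)

# smoothing suppresses the isolated apnea label at index 1
smoothed = majority_smooth([0, 1, 0, 0, 1, 1, 1, 0], k=1)
```

In this sketch, the smoothing removes isolated single-frame predictions, which is one common way a post-processing step over consecutive frame labels can raise frame-level accuracy.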
Date of Conference: 20-24 July 2020
Date Added to IEEE Xplore: 27 August 2020
PubMed ID: 33019242
Conference Location: Montreal, QC, Canada
