Abstract:
Numerous people suffer from sleep-related problems. To diagnose them, a prerequisite is to divide polysomnography (PSG) data into different sleep stages. Sleep stage classification is therefore an essential step, but collecting PSG data is expensive, time-consuming, and often delayed. To address this issue, accelerometers, which are widely used in smartwatches, are treated as an alternative way to monitor people's sleep conditions. However, the feasibility of deep learning models that use only wrist-worn accelerometer data for sleep stage classification has not been investigated by researchers. To explore this question, in this paper we design a novel axis-aware hybrid fusion-based deep learning model, named AccSleepNet, which takes the accelerometer data from all three axes as input simultaneously. The designed axis-aware hybrid fusion mechanism prompts the model to learn deep features from the three axes collaboratively. Finally, a classification module takes the fused feature representations from the three axes as input and outputs the predicted sleep stage. Experimental results on two public datasets demonstrate the effectiveness of the proposed AccSleepNet for the sleep stage classification task compared with state-of-the-art baselines. Moreover, an ablation study validates the necessity of leveraging three axes' accelerometer data and the superiority of the designed axis-aware hybrid fusion mechanism.
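The abstract outlines a three-branch pipeline: a per-axis feature extractor for each accelerometer axis, a fusion step combining the three branches, and a classification module. The toy sketch below illustrates that flow only; every function name, the feature set, the fusion rule, and the five-stage label set are illustrative assumptions, not the paper's actual AccSleepNet architecture.

```python
# Hypothetical sketch of the "encode per axis, fuse, classify" flow
# described in the abstract. All names, features, and the fusion rule
# are assumptions for illustration, not AccSleepNet's real design.
from typing import List

STAGES = ["Wake", "N1", "N2", "N3", "REM"]  # assumed label set

def encode_axis(signal: List[float]) -> List[float]:
    """Toy per-axis encoder: summary statistics stand in for learned features."""
    n = len(signal)
    mean = sum(signal) / n
    var = sum((x - mean) ** 2 for x in signal) / n
    return [mean, var, max(signal) - min(signal)]

def fuse(features: List[List[float]]) -> List[float]:
    """Toy fusion: concatenate the three axes' feature vectors."""
    return [f for axis in features for f in axis]

def classify(fused: List[float]) -> str:
    """Toy classifier mapping the fused vector to one sleep stage."""
    score = sum(abs(f) for f in fused)
    return STAGES[int(score) % len(STAGES)]

# One windowed reading per axis (x, y, z), as the model's joint input.
x_axis = [0.1, 0.2, 0.1, 0.0]
y_axis = [0.0, 0.1, 0.0, 0.1]
z_axis = [1.0, 1.0, 0.9, 1.0]
fused = fuse([encode_axis(a) for a in (x_axis, y_axis, z_axis)])
print(classify(fused))
```

In the real model each branch would be a learned deep network and the axis-aware hybrid fusion would combine intermediate representations rather than concatenating hand-crafted statistics; the sketch only fixes the data flow the abstract describes.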
Date of Conference: 06-08 December 2022
Date Added to IEEE Xplore: 02 January 2023