Multiclass CNN for Sleep Quality Prediction from Daily Physiological and Behavioral Features
Keywords:
Sleep Quality, Convolutional Neural Network (CNN), Deep Learning, Physiological Signals, Behavioral Data, Multimodal Fusion, Wearable Devices

Abstract
Sleep quality is a vital indicator of overall human health and cognitive performance. Traditional methods for sleep assessment rely on subjective questionnaires or single-modality sensor data, which limits precision and adaptability in real-world monitoring. This study develops a multiclass Convolutional Neural Network (CNN) model to predict sleep quality at four levels (poor, fair, good, and excellent) by integrating physiological features (e.g., heart rate, heart rate variability (HRV), and skin temperature) with daily behavioral features (e.g., step count, sedentary ratio, and smartphone usage patterns). The modality-specific feature representations were fused through fully connected layers followed by a Softmax classifier. Model performance was evaluated using subject-independent validation, with metrics including macro-F1, macro-AUROC, and Expected Calibration Error (ECE). The proposed multiclass CNN achieved an accuracy of 78.8%, a macro-F1 score of 0.76, and a macro-AUROC of 0.87, outperforming classical machine learning baselines such as SVM, Random Forest, and XGBoost. Multimodal fusion improved macro-F1 by 8% over the best unimodal model. Post-training temperature scaling reduced ECE from 0.094 to 0.064, indicating improved reliability of the probabilistic outputs. Grad-CAM analysis revealed interpretable temporal patterns linking stable HRV and consistent bedtime routines to higher sleep quality categories. These results demonstrate that the proposed multiclass CNN effectively captures the complex physiological-behavioral interactions associated with sleep quality. The approach provides a foundation for personalized, data-driven sleep health monitoring systems and highlights the potential of deep learning in predictive sleep analytics. Future research will expand validation using cross-device and polysomnography datasets to enhance clinical generalizability.
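The abstract describes a two-branch multiclass CNN whose physiological and behavioral streams are fused through fully connected layers and a Softmax classifier. The sketch below is a minimal PyTorch illustration of that kind of design; the class names (SleepQualityCNN, Branch1D), channel counts, window length (1440 one-minute steps), and layer sizes are assumptions for illustration, not the authors' reported configuration.

```python
# Hypothetical two-branch multiclass CNN with late fusion (illustrative only).
import torch
import torch.nn as nn

class Branch1D(nn.Module):
    """1D convolutional encoder for one modality (batch, channels, time)."""
    def __init__(self, in_channels: int, hidden: int = 32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(in_channels, hidden, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(hidden, hidden * 2, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),          # global pooling -> fixed-size embedding
        )

    def forward(self, x):
        return self.net(x).squeeze(-1)        # (batch, hidden * 2)

class SleepQualityCNN(nn.Module):
    """Physiological + behavioral branches fused by fully connected layers."""
    def __init__(self, phys_channels=3, behav_channels=3, num_classes=4):
        super().__init__()
        self.phys_branch = Branch1D(phys_channels)
        self.behav_branch = Branch1D(behav_channels)
        self.classifier = nn.Sequential(
            nn.Linear(64 + 64, 64),
            nn.ReLU(),
            nn.Dropout(0.3),
            nn.Linear(64, num_classes),       # logits; softmax applied afterwards
        )

    def forward(self, phys, behav):
        fused = torch.cat([self.phys_branch(phys), self.behav_branch(behav)], dim=1)
        return self.classifier(fused)

# Example forward pass on assumed 24-hour windows sampled once per minute.
model = SleepQualityCNN()
phys = torch.randn(8, 3, 1440)                # e.g., heart rate, HRV, skin temperature
behav = torch.randn(8, 3, 1440)               # e.g., steps, sedentary ratio, phone use
probs = torch.softmax(model(phys, behav), dim=1)  # poor / fair / good / excellent
```

The abstract also reports that post-training temperature scaling reduced ECE from 0.094 to 0.064. The following sketch shows the standard post-hoc calibration recipe, with a simple ECE estimate over equal-width confidence bins; the validation logits and labels here are synthetic placeholders standing in for a held-out subject-independent split.

```python
# Post-hoc temperature scaling and ECE estimation (illustrative placeholders).
import torch
import torch.nn as nn

def expected_calibration_error(probs, labels, n_bins=10):
    """ECE over equal-width confidence bins."""
    conf, pred = probs.max(dim=1)
    edges = torch.linspace(0, 1, n_bins + 1)
    ece = torch.zeros(1)
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (conf > lo) & (conf <= hi)
        if mask.any():
            acc = (pred[mask] == labels[mask]).float().mean()
            ece += mask.float().mean() * (acc - conf[mask].mean()).abs()
    return ece.item()

val_logits = torch.randn(500, 4) * 3           # placeholder model outputs
val_labels = torch.randint(0, 4, (500,))       # placeholder true classes

T = nn.Parameter(torch.ones(1))                # single scalar temperature
optimizer = torch.optim.LBFGS([T], lr=0.01, max_iter=50)
nll = nn.CrossEntropyLoss()

def closure():
    optimizer.zero_grad()
    loss = nll(val_logits / T, val_labels)     # fit T only, model weights untouched
    loss.backward()
    return loss

optimizer.step(closure)

before = expected_calibration_error(torch.softmax(val_logits, dim=1), val_labels)
after = expected_calibration_error(torch.softmax(val_logits / T.detach(), dim=1), val_labels)
print(f"ECE before: {before:.3f}, after temperature scaling: {after:.3f}")
```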
License
Copyright (c) 2025 Aminuddin Indra Permana, Hanna Willa Dhany

This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.




