11am - 12 noon
Tuesday 19 January 2021
Machine learning applied to video monitoring of sleep
Open PhD viva presentation by PhD student Sara Mahvash Mohammadi.
Sleep quality is an important determinant of human health and wellbeing, affecting many aspects of health, from everyday fitness and general alertness to the rate of recovery from serious illness. Two important indicators of sleep quality are body posture and movements during sleep. Currently, clinical diagnosis requires patients to undergo a polysomnography (PSG)-based assessment in a dedicated clinical sleep unit. This involves attaching multiple electrodes to the head and body, which can themselves impair sleep quality. Additionally, the current standard manual or automated scoring of sleep state lacks any comprehensive quantification of body position during sleep.
Non-contact camera-based sleep monitoring technologies offer an alternative approach to sleep quality assessment that addresses these shortcomings. However, these approaches have not been validated against the PSG gold standard or other clinical methods: evaluations to date have largely relied on simulated rather than real sleep, or have required subjects to avoid any occlusion from bedding. These limitations have prevented the routine use of video monitoring to measure body position and movements during sleep.
This thesis presents the design and development of a non-contact sleep monitoring system that automatically analyses body posture and movement using video data captured from actual sleep with a blanket bed covering. Experimental data are compared against the gold standard of PSG assessment and against manual expert annotation of the video data. Starting with simulated sleep data, a variety of deep learning methodologies for automatic sleep pose detection were explored, including a de novo four-layer convolutional neural network (CNN), a two-step deep learning approach, and a combination of deep learning with tensor factorisation to detect body poses when they are occluded by a blanket.
Nocturnal sleep was quantified in 12 healthy participants using infrared camera recordings alongside PSG data collected in the Surrey sleep laboratory. Supervised machine learning strategies, using a transfer learning approach applied to the infrared camera data, successfully quantified the sleep poses of participants covered by a blanket. This represents the first occasion on which such a machine learning approach has successfully detected four predefined poses plus the empty-bed state across 8-10 hour overnight sleep episodes. A Markov chain transition matrix was also used to quantify sleep behaviour and to provide a further comparison against PSG and manual annotation. In this cohort, fine-tuning a pre-trained ResNet-152 CNN achieved the best performance compared with a standard end-to-end CNN and other pre-trained CNNs, outperforming other video-based methods with an accuracy of 95.1% for sleep pose estimation and exceeding the clinical standard of pose estimation using a PSG position sensor.
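The Markov chain transition matrix mentioned above can be sketched as follows. This is a minimal illustration only: the pose labels mirror the four-poses-plus-empty-bed scheme described here, but the example sequence and helper function are invented for illustration, not taken from the thesis.

```python
import numpy as np

# Illustrative state set: four standard sleep poses plus the empty-bed state.
POSES = ["supine", "prone", "left", "right", "empty"]

def transition_matrix(sequence, states):
    """Estimate a row-stochastic Markov transition matrix from a label sequence."""
    index = {s: i for i, s in enumerate(states)}
    counts = np.zeros((len(states), len(states)))
    for a, b in zip(sequence, sequence[1:]):
        counts[index[a], index[b]] += 1
    # Normalise each row to probabilities; rows with no outgoing
    # transitions (states never observed) stay all-zero.
    row_sums = counts.sum(axis=1, keepdims=True)
    return np.divide(counts, row_sums, out=np.zeros_like(counts), where=row_sums > 0)

# Made-up pose sequence standing in for per-epoch pose labels from a night.
seq = ["supine", "supine", "left", "left", "supine", "right", "right", "right"]
P = transition_matrix(seq, POSES)
```

Each row of `P` gives the probability of moving from one pose to each of the others, so matrices estimated from video labels and from PSG/manual labels can be compared directly.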
Unsupervised data-driven pose analysis was also investigated as a potential means of quantifying personalised sleep behaviour. This approach revealed that, during a nocturnal sleep episode, a participant may adopt as many as 17 distinct sleep poses. A pilot analysis was also performed correlating sleep poses and movement with sleep physiology measures such as sleep stage, heart rate, and heart rate variability. It shows that, whilst the four standard sleep poses correlate only weakly with sleep physiology, unsupervised clustering combined with physiological signals such as heart rate may offer an alternative, non-contact route to resolving sleep states. The results of these studies indicate that it is feasible to use video data and machine learning to quantify sleep behaviour.
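The unsupervised clustering idea can be sketched with a minimal k-means routine. The thesis clusters features derived from video frames; the two-dimensional "pose features", the greedy initialisation, and the synthetic data below are all simplifying assumptions for illustration.

```python
import numpy as np

def kmeans(X, k, iters=20):
    """Minimal k-means: greedy farthest-point initialisation, then Lloyd steps."""
    centroids = [X[0]]
    for _ in range(k - 1):
        # Next centroid: the sample farthest from all centroids chosen so far.
        d = np.min([np.linalg.norm(X - c, axis=1) for c in centroids], axis=0)
        centroids.append(X[d.argmax()])
    centroids = np.array(centroids, dtype=float)
    for _ in range(iters):
        # Assign each sample to its nearest centroid.
        dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Move each centroid to the mean of its assigned samples.
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = X[labels == j].mean(axis=0)
    return centroids, labels

rng = np.random.default_rng(0)
# Two well-separated synthetic "pose feature" clusters (illustrative only).
X = np.vstack([rng.normal(0.0, 0.1, (20, 2)), rng.normal(3.0, 0.1, (20, 2))])
centroids, labels = kmeans(X, k=2)
```

In this data-driven setting, the number of clusters is not fixed at four in advance, which is how a personalised analysis can surface many more distinct poses per participant.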
Attend the seminar
You can join the seminar via Zoom.