Pervasive Healthcare Delivery

This project is in collaboration with Prof Guang-Zhong Yang of Imperial College London.
Pervasive healthcare systems provide an effective solution for monitoring the wellbeing of the elderly, quantifying post-operative patient recovery, and tracking the progression of neurodegenerative diseases such as Parkinson's. In practice, pervasive systems are only realizable by integrating multiple sensing modalities, and existing research has shown that wearable and ambient (i.e. background) sensing paradigms are complementary. Wearable sensing enables continuous monitoring of patient motion and physiological parameters through a network of body-worn sensors wirelessly linked to each other; such networks typically comprise accelerometers, pulse oximeters (SpO2), ECG and temperature sensors. However, wearable sensors provide only limited body information and, lacking a global frame of reference, make it difficult to deduce the context of an activity. Ambient sensing employs a large number of sensors placed ubiquitously in the environment, such as video cameras, infrared sensors, water-flow and utility-usage sensors, and pressure sensors mounted on furniture. These systems can provide information about the location and activities of the subject within the environment and enable the detection of critical events such as falls.
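As a minimal illustration of how body-worn accelerometer data of this kind can be used, the sketch below flags candidate falls when a large acceleration spike is followed by a period of stillness; the thresholds, sampling rate and synthetic data are assumptions for illustration only, not values or methods used in this project.

import numpy as np

FS = 50                  # assumed sampling rate (Hz)
IMPACT_G = 2.5           # assumed impact threshold (g)
STILL_G = 1.2            # assumed post-impact stillness threshold (g)
STILL_SECS = 2.0         # assumed inactivity window after the impact (s)

def detect_fall_candidates(acc_xyz):
    """acc_xyz: (n, 3) array of acceleration in g. Returns impact sample indices."""
    mag = np.linalg.norm(acc_xyz, axis=1)          # acceleration magnitude
    win = int(STILL_SECS * FS)
    candidates = []
    for i in np.flatnonzero(mag > IMPACT_G):       # large impact spikes
        after = mag[i + 1:i + 1 + win]
        if after.size == win and np.all(after < STILL_G):
            candidates.append(int(i))              # spike followed by stillness
    return candidates

# Synthetic stream: quiet signal at ~1 g with a single impact at sample 200.
acc = np.tile([0.0, 0.0, 1.0], (500, 1)) + 0.02 * np.random.randn(500, 3)
acc[200] = [2.0, 1.5, 2.0]
print(detect_fall_candidates(acc))                 # -> [200]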

By integrating the strengths of ambient and wearable sensing, it is possible to build truly pervasive systems that accurately infer a subject's condition from activity and physiological parameters. In general, sensory data can be fused at the signal, data, feature or decision level. For instance, sensor signals can be combined using simple hardware thresholds, whereas at the data level, pattern-recognition methods such as Bayesian Networks, Hidden Markov Models (HMMs) and Gaussian Mixture Models (GMMs) are often used for fusion and analysis. Furthermore, due to the large volume of sensing data, dimensionality-reduction techniques such as manifold embedding, Principal Component Analysis (PCA) and feature selection are often applied before the actual activity-classification step. The work in this section was carried out by Dr Mohamed ElHelw while at Imperial College London and is currently pursued by Ahmed Salah, Dr Neamat El-Gayar and Dr ElHelw. Body Sensor Network development kits for this work were donated by Prof Guang-Zhong Yang of Imperial College London.
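As an illustration of the feature-level fusion, dimensionality reduction and GMM-based classification described above, the sketch below fuses hypothetical wearable and ambient feature windows, reduces them with PCA and clusters them with a GMM; the feature dimensions, data and model settings are assumptions, not the project's actual pipeline.

import numpy as np
from sklearn.decomposition import PCA
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Assumed per-window features: 12 from a wearable node (e.g. accelerometer
# statistics) and 6 from ambient vision nodes (e.g. blob position and size).
wearable = rng.normal(size=(500, 12))
ambient = rng.normal(size=(500, 6))

# Feature-level fusion: concatenate the feature vectors of each time window.
fused = np.hstack([wearable, ambient])

# Reduce dimensionality before classification, as suggested in the text.
pca = PCA(n_components=5)
reduced = pca.fit_transform(fused)

# A GMM then groups the windows; in a supervised setting one GMM would
# typically be fitted per activity class instead.
gmm = GaussianMixture(n_components=4, covariance_type="full", random_state=0)
activity_labels = gmm.fit_predict(reduced)
print(activity_labels[:10])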

Ambient sensing is achieved by using vision nodes to extract binary information on subject posture. Shown are scene images captured by vision nodes (left) and the computed blobs after background subtraction (right).
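The background subtraction mentioned in the caption above can be illustrated with a short sketch; the version below uses OpenCV's MOG2 subtractor on a hypothetical video file, with assumed parameter values, and is not necessarily the method used by the vision nodes.

import cv2

cap = cv2.VideoCapture("scene.avi")   # hypothetical recording from a vision node
subtractor = cv2.createBackgroundSubtractorMOG2(history=200, varThreshold=25)
kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))

while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = subtractor.apply(frame)                              # foreground mask
    mask = cv2.threshold(mask, 200, 255, cv2.THRESH_BINARY)[1]  # drop shadow pixels
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)       # remove speckle noise
    mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)      # fill small holes
    # Connected components give the binary blobs whose shape hints at posture.
    n, labels, stats, centroids = cv2.connectedComponentsWithStats(mask)
    # stats[i] = [x, y, width, height, area] for blob i (index 0 is background).

cap.release()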


The e-AR sensor (bottom left), developed at Imperial College London, is based on the Body Sensor Network (BSN) node (top). The sensor measures 3D acceleration as well as physiological parameters such as SpO2 and heart rate.
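As a purely illustrative sketch (the e-AR's actual data format and on-node processing are not described here), the snippet below shows the kind of per-window features, such as per-axis statistics and signal magnitude area, that are commonly derived from 3-axis acceleration before activity classification.

import numpy as np

def window_features(acc_xyz):
    """acc_xyz: (n, 3) acceleration window in g. Returns a small feature vector."""
    mag = np.linalg.norm(acc_xyz, axis=1)
    sma = np.mean(np.sum(np.abs(acc_xyz), axis=1))   # signal magnitude area
    return np.array([
        *acc_xyz.mean(axis=0),   # per-axis means (posture-related)
        *acc_xyz.var(axis=0),    # per-axis variances (motion intensity)
        mag.mean(), mag.std(), sma,
    ])

# Synthetic ~3 s window at 50 Hz: subject roughly still, gravity on the z-axis.
window = 0.02 * np.random.randn(150, 3) + np.array([0.0, 0.0, 1.0])
print(window_features(window))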
