An article published in the journal npj Digital Medicine has evaluated the potential for fusing data collected from wearables, presenting "an effective data fusion framework" intended to contribute to a more holistic approach to monitoring health. The authors highlight that "combining information from distinct sensor systems enhances the dimensionality of measurements and increases our understanding", and that, as such, "fusion techniques will play a pivotal role in transforming healthcare by providing personalised and comprehensive solutions".
The authors begin by comparing wearables collecting health data to "a mini clinic worn directly on our body", and acknowledge that they have "the potential to enable more insightful physical examinations through high-resolution data". However, the article notes that the data does not necessarily provide the full picture – for example, daily heart rate information is useful, but without considering it alongside other outcomes such as sleep quality or activity levels, "it may be meaningless" when trying to gauge overall health or the impact of a treatment.
Data fusion – combining data from different sources to create a fuller picture – is therefore highlighted as a way to “move beyond isolated data points”.
The authors write that "the use of numerous systems and data fusion could ensure increased dimensionality of patient measurement through multiple sensor types". Using the example of accelerometer and electrocardiogram (ECG) signals being fused to identify abnormal heart rhythms, they suggest that an "optimal approach" would be to employ "feature-level fusion" that links features from both modalities into a single feature vector. This would mean that features extracted from the accelerometer signal would be used to identify the type of activity being performed, and then features extracted from the ECG signal would be used to analyse the heart rate and identify any irregularities during that activity. "This two-step process of feature extraction and interpretation from both sensor signals would therefore enable a more comprehensive assessment of the individual's heart health," the article states.
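To illustrate the idea, here is a minimal sketch of feature-level fusion in Python. It is not the authors' implementation: the feature choices (signal-magnitude statistics for the accelerometer, heart-rate statistics and RMSSD for the ECG R-R intervals) and all function names are illustrative assumptions; the point is simply that features from the two modalities are concatenated into one feature vector that a downstream classifier could consume.

```python
import numpy as np

def accel_features(accel_window):
    """Simple activity features from a 3-axis accelerometer window.

    accel_window: array of shape (samples, 3). The features here
    (mean, spread, and average change of the magnitude signal) are
    illustrative choices, not those used in the article.
    """
    mag = np.linalg.norm(accel_window, axis=1)  # per-sample magnitude
    return np.array([mag.mean(), mag.std(), np.abs(np.diff(mag)).mean()])

def ecg_features(rr_intervals_s):
    """Basic heart-rate features from ECG R-R intervals (in seconds)."""
    hr = 60.0 / rr_intervals_s                  # instantaneous heart rate (bpm)
    rmssd = np.sqrt(np.mean(np.diff(rr_intervals_s) ** 2))  # a common HRV measure
    return np.array([hr.mean(), hr.std(), rmssd])

def fuse(accel_window, rr_intervals_s):
    """Feature-level fusion: concatenate both modalities' features
    into a single feature vector."""
    return np.concatenate([accel_features(accel_window),
                           ecg_features(rr_intervals_s)])

# Example with simulated data: 10 s of accelerometer samples at 50 Hz
# and ~12 heartbeats with slight R-R jitter.
rng = np.random.default_rng(0)
accel = rng.normal(0.0, 1.0, size=(500, 3))
rr = 0.8 + rng.normal(0.0, 0.05, size=12)       # ~75 bpm
vector = fuse(accel, rr)
print(vector.shape)  # one fused 6-element feature vector
```

A classifier trained on such fused vectors could then interpret the heart-rate features in the context of the detected activity, which is the "two-step process" the article describes.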
The authors also note that with advancements in tech, wearable sensors are likely to decrease in size and become “more efficient and functional”, potentially able to capture metrics that we cannot capture today. “Consequently,” they say, “fusion will play an increasingly vital role in combining the outcomes of wearable biosensors. This could initiate transforming healthcare by providing comprehensive and personalised solutions for improving individual well-being and enabling proactive healthcare management.”
The article also highlights several factors that the authors recommend should be considered to support optimal outcomes and reliable results, with sensor selection as the primary factor, prioritising "relevance to the target outcome and accuracy". For example, wearable sensors with Internet of Things (IoT) connectivity can provide the benefits of real-time data and remote monitoring; but the researchers warn of the need for "robust security measures" to "safeguard patient data".
Elsewhere in published research, researchers from Moorfields Eye Hospital and the UCL Institute of Ophthalmology have discovered “markers that indicate the presence of Parkinson’s disease in patients on average seven years before clinical presentation”, in “the largest study to date on retinal imaging in Parkinson’s disease”.
Researchers from King’s College London have also explored the environmental impacts of AI-enabled health, with a central focus on how “ethical principles can be integrated to improve the sustainability” of digital health systems.
The journal article on data fusion from wearables can be read in full in npj Digital Medicine.