Deidentifying data from wearable devices may not be enough to protect users’ privacy, according to a review of studies published in The Lancet Digital Health.
The analysis focused on studies that evaluated whether individuals could be reidentified from the biometric signals their wearables record. The researchers included 72 studies in their final review. Most focused on electroencephalogram (EEG), electrocardiogram (ECG) and inertial measurement unit (IMU) data, such as accelerometer or gyroscope readings that capture movement and gait.
Overall, 17 studies demonstrated an ability to identify an individual based on EEG. Five of those studies reported the recording length needed to identify users: 21 seconds on average, with a median of 12.8 seconds. Eight studies were able to reidentify users based on ECG, while 13 could pinpoint individuals based on their walking gait.
“In conclusion, a real risk of reidentification exists when wearable device sensor data is shared. Although this risk can be minimised, it cannot be fully mitigated. Our findings reveal that the basic practices of withholding identifiers from public repositories might not be sufficient to ensure privacy,” the researchers wrote.
“More research is needed to guide the creation of policies and procedures that are sufficient to protect privacy, given the prevalence of wearable-device data collection and sharing.”
WHY IT MATTERS
The study’s authors found that many of the studies they reviewed reported high correct-identification rates, and that users could be identified from relatively small amounts of sensor data. However, they noted that many of the included studies had small participant groups, a limitation that could reduce the generalizability of their results to larger populations. Still, the four studies with larger populations produced results similar to those of the smaller studies.
As health data becomes more widely available and organizations like the FDA and the NIH encourage its use, the study’s authors argue that researchers and data scientists will need to consider new ways to protect user privacy.
“The findings here should not be used to justify blocking the sharing of biometric data from wearable devices. On the contrary, this systematic review exposes the need for more careful consideration of how data should be shared since the risk of not sharing data (eg, algorithmic bias and failure to develop new algorithmic tools that could save lives) might be even greater than the risk of reidentification,” they wrote.
“Our findings suggest that privacy-preserving methods will be needed for open science to flourish. For example, there is an opportunity for regulatory bodies and funding agencies to expand support for privacy-conscious data-sharing platforms that mitigate reidentification risk.”