A Machine Learning-Empowered System for Long-Term Motion-Tolerant Wearable Monitoring of Blood Pressure and Heart Rate With Ear-ECG/PPG

In this paper, we propose a fully ear-worn long-term blood pressure (BP) and heart rate (HR) monitor designed for high wearability. To enable practical application scenarios, we further present a machine learning framework that copes with the severe motion artifacts induced by head movements. We place all electrocardiogram (ECG) and photoplethysmography (PPG) sensors behind the two ears to maximize wearability, and successfully acquire the weak ear-ECG/PPG signals using a semi-customized platform. With head motions introduced to emulate real-world application scenarios, we apply a support vector machine classifier to identify raw heartbeats in the motion-artifact-corrupted signals. We further propose an unsupervised learning algorithm that automatically filters out residual distorted or spurious heartbeats before ECG-to-PPG pulse transit time (PTT) and HR estimation. Specifically, we introduce a dynamic time warping (DTW)-based learning approach that quantifies the distortion of each raw heartbeat relative to a high-quality heartbeat pattern; beats whose distortion exceeds a threshold are discarded. The heartbeat pattern and the distortion threshold are learned by a K-medoids clustering approach and a histogram triangle method, respectively. We then perform a comparative analysis of ten BP learning models based on PTT alone or on both PTT and HR. On the acquired data set, BP and HR estimation with the proposed algorithm achieves errors of −1.4±5.2 mmHg and 0.8±2.7 beats/min, respectively, both much lower than state-of-the-art approaches. These results demonstrate the capability of the proposed machine learning-empowered system in ear-ECG/PPG acquisition and motion-tolerant BP/HR estimation. This proof-of-concept system is expected to illustrate the feasibility of ear-ECG/PPG-based motion-tolerant BP/HR monitoring.
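The unsupervised beat-purification step summarized in the abstract (a K-medoids-learned heartbeat pattern, DTW distortion scores, and a histogram-triangle threshold) can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the 1-medoid simplification, the absolute-difference DTW cost, the bin count, and all function names are assumptions for clarity.

```python
import numpy as np

def dtw_distance(a, b):
    # Classic O(len(a)*len(b)) dynamic-programming DTW with
    # absolute-difference local cost (an assumed cost choice).
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

def learn_template(beats):
    # 1-medoid special case of K-medoids: pick the beat with the
    # smallest summed DTW distance to all other beats.
    dists = np.array([[dtw_distance(x, y) for y in beats] for x in beats])
    return beats[int(np.argmin(dists.sum(axis=1)))]

def triangle_threshold(scores, bins=16):
    # Histogram triangle (Zack) method: threshold at the bin farthest
    # (perpendicular distance) from the line joining the histogram peak
    # and the last non-empty bin.
    hist, edges = np.histogram(scores, bins=bins)
    peak = int(np.argmax(hist))
    last = int(np.nonzero(hist)[0][-1])
    if last <= peak:
        return edges[-1]
    x0, y0, x1, y1 = peak, hist[peak], last, hist[last]
    idx = np.arange(peak, last + 1)
    d = np.abs((y1 - y0) * idx - (x1 - x0) * hist[peak:last + 1]
               + x1 * y0 - y1 * x0)
    return edges[int(idx[np.argmax(d)])]

def purify(beats):
    # Score every raw beat against the learned pattern and keep only
    # beats whose distortion falls below the learned threshold.
    template = learn_template(beats)
    scores = np.array([dtw_distance(b, template) for b in beats])
    thr = triangle_threshold(scores)
    return [b for b, s in zip(beats, scores) if s <= thr]
```

Because both the template and the threshold are derived from the data themselves, no labeled "clean" heartbeats are needed, which is what makes this stage unsupervised.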
