TY - JOUR
T1 - Semi-automated data labeling for activity recognition in pervasive healthcare
AU - Cruz-Sandoval, Dagoberto
AU - Beltran-Marquez, Jessica
AU - Garcia-Constantino, Matias
AU - Gonzalez-Jasso, Luis A.
AU - Favela, Jesus
AU - Lopez-Nava, Irvin Hussein
AU - Cleland, Ian
AU - Ennis, Andrew
AU - Hernandez-Cruz, Netzahualcoyotl
AU - Rafferty, Joseph
AU - Synnott, Jonathan
AU - Nugent, Chris
N1 - Publisher Copyright:
© 2019 by the authors. Licensee MDPI, Basel, Switzerland.
PY - 2019/7/2
Y1 - 2019/7/2
N2 - Activity recognition, a key component in pervasive healthcare monitoring, relies on classification algorithms that require labeled data of individuals performing the activity of interest to train accurate models. Labeling data can be performed in a lab setting where an individual enacts the activity under controlled conditions. The ubiquity of mobile and wearable sensors allows the collection of large datasets from individuals performing activities in naturalistic conditions. Gathering accurate data labels for activity recognition is typically an expensive and time-consuming process. In this paper we present two novel approaches for semi-automated online data labeling performed by the individual executing the activity of interest. The approaches have been designed to address two of the limitations of self-annotation: (i) the burden on the user performing and annotating the activity, and (ii) the lack of accuracy due to the user labeling the data minutes or hours after the completion of an activity. The first approach is based on the recognition of subtle finger gestures performed in response to a data-labeling query. The second approach focuses on labeling activities that have an auditory manifestation and uses a classifier to obtain an initial estimate of the activity, and a conversational agent to ask the participant for clarification or for additional data. Both approaches are described and evaluated in controlled experiments to assess their feasibility, and their advantages and limitations are discussed. Results show that while both studies have limitations, they achieve 80% to 90% precision.
AB - Activity recognition, a key component in pervasive healthcare monitoring, relies on classification algorithms that require labeled data of individuals performing the activity of interest to train accurate models. Labeling data can be performed in a lab setting where an individual enacts the activity under controlled conditions. The ubiquity of mobile and wearable sensors allows the collection of large datasets from individuals performing activities in naturalistic conditions. Gathering accurate data labels for activity recognition is typically an expensive and time-consuming process. In this paper we present two novel approaches for semi-automated online data labeling performed by the individual executing the activity of interest. The approaches have been designed to address two of the limitations of self-annotation: (i) the burden on the user performing and annotating the activity, and (ii) the lack of accuracy due to the user labeling the data minutes or hours after the completion of an activity. The first approach is based on the recognition of subtle finger gestures performed in response to a data-labeling query. The second approach focuses on labeling activities that have an auditory manifestation and uses a classifier to obtain an initial estimate of the activity, and a conversational agent to ask the participant for clarification or for additional data. Both approaches are described and evaluated in controlled experiments to assess their feasibility, and their advantages and limitations are discussed. Results show that while both studies have limitations, they achieve 80% to 90% precision.
KW - Activity recognition
KW - Data labeling
KW - Environmental sound recognition
KW - Gesture recognition
KW - Pervasive healthcare
UR - https://www.scopus.com/pages/publications/85069791596
U2 - 10.3390/s19143035
DO - 10.3390/s19143035
M3 - Article
C2 - 31295850
AN - SCOPUS:85069791596
SN - 1424-8220
VL - 19
JO - Sensors (Switzerland)
JF - Sensors (Switzerland)
IS - 14
M1 - 3035
ER -