Authors:
Alihuén García-Pavioni
and
Beatriz López
Affiliation:
eXiT Group, University of Girona, Catalonia, Spain
Keyword(s):
Accelerometer Wristbands, High-dimensional Time Series, Time Series Classification, Dimensionality Reduction, Feature Extraction, Behavior Recognition, Signal State Changes.
Abstract:
Feature extraction for high-dimensional time series has become a topic of great importance in recent years. In the medical field, the information needed to predict emotions, stress, epileptic seizures, heart attacks, Parkinson's disease, falls in the elderly, and other conditions can be provided by body sensors in the form of time series signals. The commercial adoption of wearable accelerometers has also drawn much attention to time series activity recognition. Since the time series produced by accelerometers can be very long, consuming considerable storage and also hampering the accuracy of machine learning classifiers, it is important to identify which features are relevant in this context, so that the stored data consume as little device memory as possible while activity classification performance remains satisfactory. This work provides a way for these devices to save only the information relevant to machine learning activity classification, by defining a new feature extraction method. The proposed method, called State Changes Representation for Time Series (SCRTS), relies on the relevant data associated with the “state changes” in the time series. These changes are identified according to the conditional probabilities of passing from one state to another over time, and the “relevance” of each state. We show the results of this method in an experiment based on accelerometer data recorded by the ActiGraph wGT3X-BT wristband to recognize sedentary behavior. After applying the method, time series frames of dimension 360 were reduced to vectors of dimension 12, while their classification accuracy was 84%.
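To make the general idea concrete, the sketch below discretizes a signal frame into states, estimates the conditional probabilities of passing from one state to another, and builds a compact feature vector from per-state occupancy (a proxy for state “relevance”) and state-change probabilities. The quantile binning, the number of states, and the feature layout are illustrative assumptions, not the exact SCRTS formulation from the paper.

```python
# Minimal sketch of state-change feature extraction for a 1-D accelerometer
# frame. Quantile-based discretization, the number of states, and the feature
# layout are assumptions for illustration; they are not the authors' exact
# SCRTS method.
import numpy as np

def discretize(signal, n_states=4):
    """Map each sample to a state index using quantile bins."""
    edges = np.quantile(signal, np.linspace(0, 1, n_states + 1)[1:-1])
    return np.digitize(signal, edges)  # state indices in 0..n_states-1

def transition_probabilities(states, n_states):
    """Estimate P(next state | current state) from consecutive samples."""
    counts = np.zeros((n_states, n_states))
    for a, b in zip(states[:-1], states[1:]):
        counts[a, b] += 1
    row_sums = counts.sum(axis=1, keepdims=True)
    return np.divide(counts, row_sums,
                     out=np.zeros_like(counts), where=row_sums > 0)

def state_change_features(signal, n_states=4):
    """Compact vector: per-state occupancy ('relevance') plus the
    probability of leaving each state (a proxy for state changes)."""
    states = discretize(signal, n_states)
    trans = transition_probabilities(states, n_states)
    occupancy = np.bincount(states, minlength=n_states) / len(states)
    leave_prob = 1.0 - np.diag(trans)  # chance of switching out of each state
    return np.concatenate([occupancy, leave_prob])  # dimension 2 * n_states

# Example: a 360-sample frame is reduced to an 8-dimensional vector here;
# the paper reports 12 dimensions, so its exact feature layout differs.
frame = np.random.default_rng(0).normal(size=360)
print(state_change_features(frame))
```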