
Sensing Technology for Human Activity Recognition: a Comprehensive Survey

Fu, Biying and Damer, Naser and Kirchbuchner, Florian and Kuijper, Arjan (2020):
Sensing Technology for Human Activity Recognition: a Comprehensive Survey.
In: IEEE Access (Early Access), ISSN 2169-3536,
DOI: 10.1109/ACCESS.2020.2991891,
[Article]

Abstract

Sensors are devices that quantify the physical aspects of the world around us. This ability is important for gaining knowledge about human activities. Human activity recognition plays an important role in people’s everyday life. In order to solve many human-centered problems, such as healthcare and individual assistance, the need to infer various simple to complex human activities is prominent. Therefore, having a well-defined categorization of sensing technology is essential for the systematic design of human activity recognition systems. By extending the sensor categorization proposed by White, we survey the most prominent research works that utilize different sensing technologies for human activity recognition tasks. To the best of our knowledge, there is no thorough sensor-driven survey that considers all sensor categories in the domain of human activity recognition with respect to the sampled physical properties, including a detailed comparison across sensor categories. Thus, our contribution is to close this gap by providing insight into the state-of-the-art developments. We identify the limitations with respect to the hardware and software characteristics of each sensor category and draw comparisons based on benchmark features retrieved from the research works introduced in this survey. Finally, we conclude with general remarks and provide future research directions for human activity recognition within the presented sensor categorization.

Item Type: Article
Published: 2020
Creators: Fu, Biying and Damer, Naser and Kirchbuchner, Florian and Kuijper, Arjan
Title: Sensing Technology for Human Activity Recognition: a Comprehensive Survey
Language: English

Journal or Publication Title: IEEE Access (Early Access)
Uncontrolled Keywords: Human activity recognition, Surveys, Smart environments
Divisions: 20 Department of Computer Science
20 Department of Computer Science > Interactive Graphics Systems
20 Department of Computer Science > Mathematical and Applied Visual Computing
Date Deposited: 13 May 2020 07:10
DOI: 10.1109/ACCESS.2020.2991891
Official URL: https://ieeexplore.ieee.org/document/9083980