TU Darmstadt / ULB / TUbiblio

User Independent, Multi-Modal Spotting of Subtle Arm Actions with Minimal Training Data

Bauer, Gerald ; Blanke, Ulf ; Lukowicz, Paul ; Schiele, Bernt (2013)
User Independent, Multi-Modal Spotting of Subtle Arm Actions with Minimal Training Data.
San Diego, CA, USA
doi: 10.1109/PerComW.2013.6529448
Conference publication, Bibliography

Abstract

We address a specific, particularly difficult class of activity recognition problems defined by (1) subtle, hardly discriminative hand motions such as a short press or pull, (2) a large, ill-defined NULL class (any other hand motion a person may perform during normal life), and (3) the difficulty of collecting sufficient training data that generalizes well from one user to many. In essence, we intend to spot activities such as opening a cupboard, pressing a button, or taking an object from a shelf in a large data stream of typical everyday activity. We focus on body-worn sensors without instrumenting objects, exploit available infrastructure information, and use a one-to-many-users training scheme to minimize training effort. We demonstrate that a state-of-the-art approach based on motion sensors performs poorly under such conditions (an Equal Error Rate of 18% in our experiments). We present and evaluate a new multi-modal system, combining indoor location with a wrist-mounted proximity sensor, camera, and inertial sensor, that raises the EER to 79%.
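The Equal Error Rate (EER) the abstract reports is the operating point at which the spotter's false-accept rate on the NULL class equals its false-reject rate on the target actions. A minimal sketch of how such a value is obtained by sweeping a detection threshold over classifier scores (the function name and threshold sweep are illustrative, not taken from the paper):

```python
import numpy as np

def equal_error_rate(scores, labels):
    """Return the EER: the error rate at the threshold where the
    false-accept rate (on NULL-class samples, label 0) is closest
    to the false-reject rate (on target-action samples, label 1)."""
    scores = np.asarray(scores, dtype=float)
    labels = np.asarray(labels, dtype=int)
    best_gap, best_eer = np.inf, 1.0
    for t in np.unique(scores):
        pred = scores >= t                # detections fired at threshold t
        far = np.mean(pred[labels == 0])  # NULL samples wrongly accepted
        frr = np.mean(~pred[labels == 1]) # target samples wrongly rejected
        gap = abs(far - frr)
        if gap < best_gap:
            best_gap, best_eer = gap, (far + frr) / 2
    return best_eer
```

For perfectly separable scores the EER is 0; heavily overlapping score distributions push it toward 0.5 and beyond, which is why spotting subtle motions against a large NULL class is hard.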

Type of entry: Conference publication
Published: 2013
Author(s): Bauer, Gerald ; Blanke, Ulf ; Lukowicz, Paul ; Schiele, Bernt
Type of record: Bibliography
Title: User Independent, Multi-Modal Spotting of Subtle Arm Actions with Minimal Training Data
Language: English
Year of publication: March 2013
Publisher: IEEE Computer Society
Book title: 2013 IEEE International Conference on Pervasive Computing and Communications Workshops (PERCOM Workshops)
Venue: San Diego, CA, USA
DOI: 10.1109/PerComW.2013.6529448
Free keywords: Training, Sensor systems, Cameras, Training data, Support vector machines, Printers
ID number: TUD-CS-2013-0471
Department(s)/field(s): Profile Areas
Profile Areas > Cybersecurity (CYSEC)
Date deposited: 24 Aug 2017 16:27
Last modified: 12 Jan 2019 21:20