Abstract
It is challenging to precisely identify the boundaries of activities when annotating the datasets required to train activity recognition systems. This holds for experts as well as for non-experts, who may be recruited in crowd-sourcing paradigms to reduce the annotation effort or to speed up the process by distributing the task over multiple annotators. We present a method to automatically adjust annotation boundaries, assuming correct annotation labels but imprecise boundaries, otherwise known as "label jitter". The approach maximizes the Fukunaga class-separability criterion, applied to time series. Evaluations on a standard benchmark dataset showed statistically significant improvements over the initial jittery annotations.
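The boundary-adjustment idea summarized in the abstract can be sketched as follows: given a feature series and a jittery boundary between two labelled segments, search a window around that boundary for the split that maximizes a between-class/within-class scatter ratio. This is a minimal one-dimensional stand-in for the Fukunaga class-separability criterion; the function names, the search radius, and the 1-D simplification are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def separability(x, boundary):
    """A simple 1-D class-separability score for splitting the
    series `x` into two classes at index `boundary`:
    between-class scatter divided by within-class scatter.
    (Illustrative stand-in for the Fukunaga criterion.)"""
    a, b = x[:boundary], x[boundary:]
    mu = x.mean()
    # between-class scatter, weighted by segment length
    s_b = len(a) * (a.mean() - mu) ** 2 + len(b) * (b.mean() - mu) ** 2
    # within-class scatter
    s_w = ((a - a.mean()) ** 2).sum() + ((b - b.mean()) ** 2).sum()
    return s_b / s_w if s_w > 0 else np.inf

def adjust_boundary(x, jittery, radius=10):
    """Search within `radius` samples of the jittery boundary for
    the split that maximizes the separability score."""
    lo = max(2, jittery - radius)
    hi = min(len(x) - 2, jittery + radius)
    return max(range(lo, hi + 1), key=lambda t: separability(x, t))
```

On a series whose true activity change lies at index 50, a jittery annotation at index 45 or 55 would be pulled back to 50, since that split maximizes the scatter ratio.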
Original language | English |
---|---|
Title of host publication | Proceedings of the 2013 ACM conference on Pervasive and ubiquitous computing adjunct publication |
Place of Publication | New York NY USA |
Publisher | Association for Computing Machinery (ACM) |
Pages | 673-678 |
Number of pages | 6 |
ISBN (Print) | 9781450322157 |
DOIs | |
Publication status | Published - 2013 |
Externally published | Yes |
Event | ACM International Joint Conference on Pervasive and Ubiquitous Computing 2013, Zurich, Switzerland, 8 Sep 2013 → 12 Sep 2013 |
Conference
Conference | ACM International Joint Conference on Pervasive and Ubiquitous Computing 2013 |
---|---|
Abbreviated title | UbiComp 2013 |
Country | Switzerland |
City | Zurich |
Period | 8/09/13 → 12/09/13 |
Keywords
- Annotation errors
- Class separability
- Crowdsourcing
- Human activity recognition