Abstract
Radar signals penetrate, scatter, absorb, and reflect energy into proximate objects, and ground-penetrating and aerial radar systems are well established. We describe a highly accurate system that combines a monostatic radar (Google Soli) with supervised machine learning to support object- and material-classification-based UIs. Building on RadarCat techniques, we explore the development of tangible user interfaces without modifying the objects or requiring complex infrastructure. This affords new forms of interaction with digital devices, proximate objects, and micro-gestures.
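As a minimal sketch of the classification pipeline the abstract describes, the snippet below trains a supervised classifier on fixed-length radar feature vectors. The synthetic data, the feature dimensions, the class names, and the choice of scikit-learn's RandomForestClassifier are illustrative assumptions, not the published implementation; a random forest over per-channel radar features is merely in the spirit of the RadarCat approach.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Stand-in data: each radar "frame" is summarised as a fixed-length
# feature vector (e.g. per-channel amplitude statistics), with 200
# frames per object class. Both numbers are arbitrary here.
N_PER_CLASS, N_FEATURES = 200, 64
classes = ["glass", "wood", "metal", "plastic"]
X = np.vstack([rng.normal(loc=i, scale=1.0, size=(N_PER_CLASS, N_FEATURES))
               for i in range(len(classes))])
y = np.repeat(classes, N_PER_CLASS)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y)

# A random forest is a reasonable baseline for low-latency
# classification of tabular radar features.
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)

print(f"held-out accuracy: {accuracy_score(y_test, clf.predict(X_test)):.2f}")
```

In a real deployment the feature vectors would come from the Soli sensor's receive channels rather than a random generator, and the trained model would label each incoming frame to drive the UI.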
Original language | English |
---|---|
Title of host publication | Proceedings SA '17 - SIGGRAPH Asia 2017 Emerging Technologies |
Subtitle of host publication | Bangkok, Thailand, November 27-30, 2017 |
Editors | Takuji Narumi, Borom Tunwattanapong |
Place of Publication | New York, NY, USA |
Publisher | Association for Computing Machinery (ACM) |
Number of pages | 2 |
ISBN (Electronic) | 9781450354042 |
Publication status | Published - 2017 |
Externally published | Yes |
Event | ACM SIGGRAPH Conference and Exhibition on Computer Graphics and Interactive Techniques in Asia 2017, Bangkok, Thailand; duration: 27 Nov 2017 → 30 Nov 2017; conference number: 10th; https://sa2017.siggraph.org/ |
Conference
Conference | ACM SIGGRAPH Conference and Exhibition on Computer Graphics and Interactive Techniques in Asia 2017 |
---|---|
Abbreviated title | SIGGRAPH Asia 2017 |
Country/Territory | Thailand |
City | Bangkok |
Period | 27/11/17 → 30/11/17 |
Internet address | https://sa2017.siggraph.org/ |
Keywords
- Object recognition
- Radar sensing
- Tangible interaction