Tangible UI by object and material classification with radar

Hui Shyong Yeo, Barrett Ens, Aaron Quigley

Research output: Chapter in Book/Report/Conference proceeding › Conference Paper › Other › peer-review

4 Citations (Scopus)

Abstract

Radar signals penetrate, scatter, absorb and reflect energy into proximate objects, and ground-penetrating and aerial radar systems are well established. We describe a highly accurate system that combines a monostatic radar (Google Soli) with supervised machine learning to support object- and material-classification-based UIs. Building on RadarCat techniques, we explore the development of tangible user interfaces without modifying the objects or requiring complex infrastructure. This affords new forms of interaction with digital devices, proximate objects and micro-gestures.
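As a rough illustration of the classification step described in the abstract, the sketch below trains a generic supervised classifier (a random forest, here via scikit-learn) on pre-extracted radar feature vectors. The feature layout, class count and classifier settings are assumptions made for illustration only and do not reproduce the authors' RadarCat pipeline.

```python
# Illustrative sketch only: a supervised classifier over radar feature vectors.
# Feature dimensions, labels and classifier choice are assumptions, not the
# published RadarCat implementation.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Hypothetical data: each row summarises one radar frame (e.g. per-channel
# amplitude statistics); each label names the object or material sensed.
rng = np.random.default_rng(0)
X = rng.normal(size=(600, 64))        # 600 frames, 64 features per frame
y = rng.integers(0, 6, size=600)      # 6 object/material classes

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)

print("held-out accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```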

Original language: English
Title of host publication: Proceeding SA '17 - SIGGRAPH Asia 2017 Emerging Technologies
Subtitle of host publication: Bangkok, Thailand, November 27-30, 2017
Editors: Takuji Narumi, Borom Tunwattanapong
Place of Publication: New York, NY, USA
Publisher: Association for Computing Machinery (ACM)
Number of pages: 2
ISBN (Electronic): 9781450354042
DOIs
Publication status: Published - 2017
Externally published: Yes
Event: ACM SIGGRAPH Conference and Exhibition on Computer Graphics and Interactive Techniques in Asia 2017 - Bangkok, Thailand
Duration: 27 Nov 2017 - 30 Nov 2017
Conference number: 10th
https://sa2017.siggraph.org/

Conference

Conference: ACM SIGGRAPH Conference and Exhibition on Computer Graphics and Interactive Techniques in Asia 2017
Abbreviated title: SIGGRAPH Asia 2017
Country/Territory: Thailand
City: Bangkok
Period: 27/11/17 - 30/11/17
Internet address: https://sa2017.siggraph.org/

Keywords

  • Object recognition
  • Radar sensing
  • Tangible interaction
