Mobile device and intelligent display interaction via scale-invariant image feature matching

Leigh Herbert, Nick Pears, Dan Jackson, Patrick Olivier

    Research output: Chapter in Book/Report/Conference proceeding › Conference Paper › Research › peer-review

    9 Citations (Scopus)

    Abstract

    We present further developments of our system that allows direct interaction between a camera-equipped hand-held device and a remote display. The essence of this system is the ability to estimate a planar projectivity between the remote display and the image of that display captured on the hand-held device. We describe how to achieve this by matching scale-invariant SURF features across the two displays (remote and hand-held). We implement a prototype system and a drawing application and conduct both performance and usability evaluations. User feedback indicates that our system is responsive, accurate and easy to use.
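
    The pipeline sketched in the abstract (scale-invariant feature matching followed by homography estimation) can be illustrated with the following Python/OpenCV snippet. This is a rough sketch under stated assumptions, not the authors' implementation: SURF is taken from the opencv-contrib xfeatures2d module, the image file names are placeholders, and the ratio-test threshold and RANSAC tolerance are arbitrary example values.

    # Illustrative sketch: match SURF features between the content shown on the
    # remote display and the hand-held camera image, then estimate the planar
    # projectivity (3x3 homography) with RANSAC. Not the paper's code.
    import cv2
    import numpy as np

    remote = cv2.imread("remote.png", cv2.IMREAD_GRAYSCALE)      # frame shown on the remote display (placeholder file)
    handheld = cv2.imread("handheld.png", cv2.IMREAD_GRAYSCALE)  # camera view from the hand-held device (placeholder file)

    # SURF detector/descriptor (requires opencv-contrib).
    surf = cv2.xfeatures2d.SURF_create(hessianThreshold=400)
    kp1, des1 = surf.detectAndCompute(remote, None)
    kp2, des2 = surf.detectAndCompute(handheld, None)

    # Ratio-test matching of the float SURF descriptors (L2 norm).
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    matches = matcher.knnMatch(des1, des2, k=2)
    good = [m for m, n in matches if m.distance < 0.7 * n.distance]

    # Homography mapping remote-display coordinates into the hand-held image;
    # its inverse maps a point selected on the hand-held image back onto the
    # remote display, which is what enables direct interaction.
    src = np.float32([kp1[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
    dst = np.float32([kp2[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
    H, mask = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)

    # Example: map the centre of the hand-held image onto the remote display.
    h, w = handheld.shape
    centre = np.float32([[[w / 2.0, h / 2.0]]])
    on_display = cv2.perspectiveTransform(centre, np.linalg.inv(H))
    print(on_display)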

    Original language: English
    Title of host publication: PECCS 2011 - Proceedings of the 1st International Conference on Pervasive and Embedded Computing and Communication Systems
    Pages: 207-214
    Number of pages: 8
    Publication status: Published - 12 Sep 2011
    Event: 1st International Conference on Pervasive and Embedded Computing and Communication Systems, PECCS 2011 - Vilamoura, Algarve, Portugal
    Duration: 5 Mar 2011 - 7 Mar 2011

    Conference

    Conference: 1st International Conference on Pervasive and Embedded Computing and Communication Systems, PECCS 2011
    Country: Portugal
    City: Vilamoura, Algarve
    Period: 5/03/11 - 7/03/11

    Keywords

    • Direct touch interaction
    • Human-computer interaction
    • Mobile device