Relative pose based redundancy removal

Collaborative RGB-D data transmission in mobile visual sensor networks

Xiaoqin Wang, Y. Ahmet Şekercioğlu, Tom Drummond, Vincent Frémont, Enrico Natalizio, Isabelle Fantoni

Research output: Contribution to journal › Article › Research › peer-review

Abstract

In this paper, the Relative Pose based Redundancy Removal (RPRR) scheme is presented, which has been designed for mobile RGB-D sensor networks operating in bandwidth-constrained scenarios. The scheme considers a multiview setting in which pairs of sensors observe the same scene from different viewpoints, and detects the redundant visual and depth information to prevent its transmission, leading to a significant improvement in wireless channel usage efficiency and power savings. We envisage applications in which the environment is static and rapid 3D mapping of an enclosed area of interest is required, such as disaster recovery and support operations after earthquakes or industrial accidents. Experimental results show that wireless channel utilization improves by 250% and battery consumption is halved when the RPRR scheme is used instead of sending the sensor images independently.
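The core idea, deciding which parts of one sensor's RGB-D frame are already covered by another sensor's view given their relative pose, can be illustrated with a short sketch. The Python/NumPy listing below is not the authors' implementation; it is a minimal illustration assuming pinhole intrinsics K, a known relative pose (R, t) mapping points from sensor B's frame into sensor A's frame, metric depth maps, and an illustrative depth-consistency tolerance. Pixels of B whose back-projected points land in A's image at a consistent depth are flagged as redundant and could be withheld from transmission.

    # Minimal sketch (not the paper's implementation) of relative-pose-based
    # redundancy detection between two RGB-D views. All parameter names and the
    # depth tolerance are illustrative assumptions.
    import numpy as np

    def backproject(depth, K):
        """Lift a depth map (H, W) to 3D points (H, W, 3) in the camera frame."""
        H, W = depth.shape
        u, v = np.meshgrid(np.arange(W), np.arange(H))
        x = (u - K[0, 2]) * depth / K[0, 0]
        y = (v - K[1, 2]) * depth / K[1, 1]
        return np.stack([x, y, depth], axis=-1)

    def redundancy_mask(depth_b, depth_a, K, R, t, depth_tol=0.05):
        """Flag pixels of B whose surface points A already observes at a consistent depth."""
        pts_b = backproject(depth_b, K).reshape(-1, 3)   # points in B's frame
        pts_a = pts_b @ R.T + t                          # same points in A's frame
        z_a = np.maximum(pts_a[:, 2], 1e-6)              # guard against division by zero
        proj = (K @ pts_a.T).T
        u = proj[:, 0] / z_a
        v = proj[:, 1] / z_a
        H, W = depth_a.shape
        inside = (pts_a[:, 2] > 0) & (u >= 0) & (u < W) & (v >= 0) & (v < H)
        ui = np.clip(u.astype(int), 0, W - 1)
        vi = np.clip(v.astype(int), 0, H - 1)
        # Redundant: A's depth map already records this surface within the tolerance.
        consistent = np.abs(depth_a[vi, ui] - pts_a[:, 2]) < depth_tol
        return (inside & consistent).reshape(depth_b.shape)

    if __name__ == "__main__":
        # Toy example: two sensors 10 cm apart, both viewing a plane 2 m away.
        K = np.array([[525.0, 0.0, 319.5], [0.0, 525.0, 239.5], [0.0, 0.0, 1.0]])
        depth_a = np.full((480, 640), 2.0)
        depth_b = np.full((480, 640), 2.0)
        R, t = np.eye(3), np.array([0.1, 0.0, 0.0])
        mask = redundancy_mask(depth_b, depth_a, K, R, t)
        print(f"pixels of B redundant given A's view: {mask.mean():.1%}")

In this toy configuration most of B's pixels are already covered by A, so only the non-overlapping strip would need to be sent.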

Original language: English
Article number: 2430
Number of pages: 23
Journal: Sensors
Volume: 18
Issue number: 8
DOI: 10.3390/s18082430
Publication status: Published - 26 Jul 2018

Keywords

  • 3D mapping
  • Collaborative coding
  • Relative pose estimation
  • RGB-D sensors
  • Robotic vision
  • Visual sensors

Cite this

Wang, X., Şekercioğlu, Y. A., Drummond, T., Frémont, V., Natalizio, E., & Fantoni, I. (2018). Relative pose based redundancy removal: Collaborative RGB-D data transmission in mobile visual sensor networks. Sensors, 18(8), 2430. https://doi.org/10.3390/s18082430