Toward multi-stage decoupled visual SLAM system

Mohamed H. Merzban, Mohamed Abdellatif, Hossam Abbas, Salvatore Sessa

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

1 Citation (Scopus)

Abstract

SLAM is defined as the simultaneous estimation of a mobile robot's pose and the structure of the surrounding environment. Currently, there is much interest in visual SLAM, i.e., SLAM with a camera as the main sensor, because the camera is a ubiquitous and affordable sensor. Camera measurements, formed by perspective projection, are highly nonlinear with respect to the estimated states, leading to a complicated nonlinear estimation problem. In this paper, a novel system is proposed that divides the problem into two parts: local and global motion estimation. This division leads to a simple linear estimation system. In the first stage, local motion parameters (acceleration, velocity, angular acceleration and orientation) are estimated in the robot's local frame. The robot position and the scene map are then estimated in the second stage, in the global frame, as global motion parameters. The map is updated at each camera frame and is represented in a relative way to decouple robot pose estimation from map structure estimation. The new system reduces map correction to a linear optimization problem. Simulation results show that the proposed system converges and yields accurate results.
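The paper itself provides no code; the following is a minimal, hypothetical sketch of the two-stage idea summarized in the abstract: a linear filter estimates local motion parameters in the robot frame (stage 1), and the global pose is then accumulated from those estimates while a relative map is corrected by linear least squares (stage 2). All names (LocalMotionFilter, integrate_global_pose, correct_relative_map), the constant-acceleration model, and the accelerometer-like measurement are assumptions for illustration, not the authors' implementation.

# Minimal, hypothetical sketch of a two-stage decoupled estimator (illustration
# only; not the authors' code). Stage 1 runs a linear Kalman filter over local
# motion parameters in the robot frame; stage 2 accumulates the global pose and
# corrects a relative map with a linear least-squares step.
import numpy as np


class LocalMotionFilter:
    """Stage 1: linear Kalman filter over [velocity, acceleration] in the robot frame."""

    def __init__(self, dt, q=1e-3, r=1e-2):
        self.x = np.zeros(6)                                  # state: [v(3), a(3)]
        self.P = np.eye(6)
        self.F = np.eye(6)
        self.F[:3, 3:] = dt * np.eye(3)                       # v <- v + a*dt (constant-acceleration model, assumed)
        self.Q = q * np.eye(6)                                # process noise (assumed)
        self.H = np.hstack([np.zeros((3, 3)), np.eye(3)])     # an accelerometer-like sensor observes a
        self.R = r * np.eye(3)                                # measurement noise (assumed)

    def step(self, accel_meas):
        # Predict with the linear motion model.
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        # Update with the local measurement; everything stays linear.
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ (accel_meas - self.H @ self.x)
        self.P = (np.eye(6) - K @ self.H) @ self.P
        return self.x[:3]                                     # filtered body-frame velocity


def integrate_global_pose(position, R_wb, v_body, dt):
    """Stage 2a: accumulate global position from the filtered local velocity."""
    return position + R_wb @ v_body * dt


def correct_relative_map(rel_landmarks, A, b):
    """Stage 2b: linear least-squares correction of relative landmark offsets.
    A and b stand in for whatever linear constraints the relative-map
    formulation produces; the point is that the correction stays linear."""
    delta, *_ = np.linalg.lstsq(A, b, rcond=None)
    return rel_landmarks + delta.reshape(rel_landmarks.shape)


# Illustrative use on synthetic accelerometer readings (orientation assumed known).
if __name__ == "__main__":
    dt = 0.05
    flt = LocalMotionFilter(dt)
    position, R_wb = np.zeros(3), np.eye(3)
    for accel in 0.1 * np.random.randn(200, 3):
        v_body = flt.step(accel)
        position = integrate_global_pose(position, R_wb, v_body, dt)
    print("final position estimate:", position)

Both stages use only linear models, which mirrors the abstract's claim that decoupling local from global estimation reduces the problem to simple linear estimation.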

Original language: English
Title of host publication: ROSE 2013 - 2013 IEEE International Symposium on Robotic and Sensors Environments, Proceedings
Pages: 172-177
Number of pages: 6
DOI: 10.1109/ROSE.2013.6698438
ISBN (Print): 9781467329385
Publication status: Published - 2013
Externally published: Yes
Event: 2013 11th IEEE International Symposium on Robotic and Sensors Environments, ROSE 2013 - Washington, DC
Duration: 2013 Oct 21 - 2013 Oct 23

Other

Other: 2013 11th IEEE International Symposium on Robotic and Sensors Environments, ROSE 2013
City: Washington, DC
Period: 13/10/21 - 13/10/23

Keywords

  • Graph Theory
  • Inertial Sensors
  • Relative Map
  • Robot Localization
  • Sensor Fusion
  • Visual SLAM

ASJC Scopus subject areas

  • Artificial Intelligence
  • Human-Computer Interaction
  • Electrical and Electronic Engineering

Cite this

Merzban, M. H., Abdellatif, M., Abbas, H., & Sessa, S. (2013). Toward multi-stage decoupled visual SLAM system. In ROSE 2013 - 2013 IEEE International Symposium on Robotic and Sensors Environments, Proceedings (pp. 172-177). [6698438] https://doi.org/10.1109/ROSE.2013.6698438
