Automatic estimation of the position and orientation of the drill to be grasped and manipulated by the disaster response robot based on analyzing depth camera information

Keishi Nishikawa, Jun Ohya, Hiroyuki Ogata, Kenji Hashimoto, Takashi Matsuzawa, Asaki Imai, Shunsuke Kimura, Atsuo Takanishi

Research output: Contribution to journal › Conference article › peer-review

Abstract

Towards the realization of a disaster response robot that can locate and manipulate a drill at an arbitrary position and posture in disaster sites, this paper proposes a method that estimates the position and orientation of the drill to be grasped and manipulated by the robot arm, using information acquired by a depth camera. In this paper's algorithm, first, a conventional method detects the target drill in an RGB image captured by the depth camera, and 3D point cloud data representing the target is generated by combining the detection result with the depth image. Second, our proposed method processes the generated point cloud to estimate the proper position and orientation for grasping the drill. More specifically, a pass-through filter is applied to the 3D point cloud obtained in the first step. The filtered cloud is then divided, and its features are classified so that the chuck and handle are identified. The grasping position is obtained by computing the centroid of the chuck's point cloud, and the grasping orientation is obtained by applying Principal Component Analysis. Experiments were conducted on a simulator. The results show that our method can accurately estimate the proper configuration for autonomous grasping of a normal-type drill.
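The core of the second step (pass-through filtering, centroid for position, PCA for orientation) can be sketched in a few lines of NumPy. This is a minimal illustration on a synthetic point cloud, not the authors' implementation; the filter bounds and the elongated test cloud are assumptions chosen only for demonstration.

```python
import numpy as np

def pass_through_filter(points, axis=2, lo=0.2, hi=1.5):
    """Keep only points whose coordinate along `axis` lies in [lo, hi]."""
    mask = (points[:, axis] >= lo) & (points[:, axis] <= hi)
    return points[mask]

def grasp_pose_from_cloud(points):
    """Estimate a grasp position (centroid) and orientation (principal axis via PCA)."""
    centroid = points.mean(axis=0)
    centered = points - centroid
    cov = centered.T @ centered / len(points)
    # Eigenvectors of the covariance matrix; np.linalg.eigh returns
    # eigenvalues in ascending order, so the last column is the
    # direction of greatest variance (the drill's long axis).
    eigvals, eigvecs = np.linalg.eigh(cov)
    principal_axis = eigvecs[:, -1]
    return centroid, principal_axis

# Synthetic elongated cloud standing in for a segmented drill part,
# centered 0.8 m in front of the camera and elongated along x.
rng = np.random.default_rng(0)
cloud = rng.normal(scale=[0.05, 0.01, 0.01], size=(500, 3)) + [0.0, 0.0, 0.8]
cloud = pass_through_filter(cloud, axis=2, lo=0.2, hi=1.5)
position, axis = grasp_pose_from_cloud(cloud)
```

In the paper's pipeline the cloud fed to this stage would come from projecting the detected drill region of the RGB image into 3D using the depth image, and the centroid would be taken over the points classified as the chuck.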

Original language: English
Article number: 452
Journal: IS&T International Symposium on Electronic Imaging Science and Technology
Volume: 2019
Issue number: 7
DOIs
Publication status: Published - 2019 Jan 13
Event: 2019 Intelligent Robotics and Industrial Applications Using Computer Vision Conference, IRIACV 2019 - Burlingame, United States
Duration: 2019 Jan 13 - 2019 Jan 17

ASJC Scopus subject areas

  • Computer Graphics and Computer-Aided Design
  • Computer Science Applications
  • Human-Computer Interaction
  • Software
  • Electrical and Electronic Engineering
  • Atomic and Molecular Physics, and Optics

