Virtually transparent surgical instruments in endoscopic surgery with augmentation of obscured regions

Yuta Koreeda, Yo Kobayashi, Satoshi Ieiri, Yuya Nishio, Kazuya Kawamura, Satoshi Obata, Ryota Souzaki, Makoto Hashizume, Masakatsu G. Fujie

Research output: Contribution to journal › Article

3 Citations (Scopus)

Abstract

Purpose: We developed and evaluated a visual compensation system that allows surgeons to visualize obscured regions in real time, such that the surgical instrument appears virtually transparent. Methods: The system consists of two endoscopes: a main endoscope to observe the surgical environment, and a supporting endoscope to render the region hidden from view by surgical instruments. The view captured by the supporting endoscope is transformed to simulate the view from the main endoscope, segmented to the shape of the hidden regions, and superimposed to the main endoscope image so that the surgical instruments look transparent. A prototype device was benchmarked for processing time and superimposition rendering error. Then, it was evaluated in a training environment with 22 participants performing a backhand needle driving task with needle exit point error as the criterion. Lastly, we conducted an in vivo study. Results: In the benchmark, the mean processing time was 62.4 ms, which was lower than the processing time accepted in remote surgeries. The mean superimposition error of the superimposed image was 1.4 mm. In the training environment, needle exit point error with the system decreased significantly for experts compared with the condition without the system. This change was not significant for novices. In the in vivo study, our prototype enabled visualization of needle exit points during anastomosis. Conclusion: The benchmark suggests that the implemented system had an acceptable performance, and evaluation in the training environment demonstrated improved surgical task outcomes in expert surgeons. We will conduct a more comprehensive in vivo study in the future.
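The superimposition step in the Methods (warp the supporting view into the main camera's frame, segment the instrument region, then render the supporting view inside that region) can be sketched as a per-pixel composite. This is an illustrative sketch only, not the authors' implementation: it assumes the supporting view has already been warped into the main endoscope's frame (e.g., via a homography from calibration) and that a binary instrument mask is available; the function name is hypothetical.

```python
def composite_transparent(main_img, warped_support, instrument_mask):
    """Make the instrument appear transparent by compositing two views.

    For each pixel, show the warped supporting view where the instrument
    mask is set (the region hidden by the instrument); otherwise keep the
    main endoscope view. Images are row-major 2-D grids of pixel values;
    the mask holds 1 where the instrument occludes the scene, else 0.
    """
    return [
        [s if m else p for p, s, m in zip(prow, srow, mrow)]
        for prow, srow, mrow in zip(main_img, warped_support, instrument_mask)
    ]
```

In the actual system each of the three inputs comes from the real-time pipeline (the warp simulating the main endoscope's viewpoint and the instrument segmentation), so the overall latency is the 62.4 ms processing time reported in the benchmark.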

Original language: English
Pages (from-to): 1-10
Number of pages: 10
Journal: International Journal of Computer Assisted Radiology and Surgery
DOI: 10.1007/s11548-016-1384-5
Publication status: Accepted/In press - 2016 Apr 2

Keywords

  • Augmented reality
  • Computer-assisted surgery
  • Endoscopic surgery
  • Laparoscopic surgery
  • Medical image processing
  • Visualization

ASJC Scopus subject areas

  • Radiology, Nuclear Medicine and Imaging
  • Health Informatics
  • Surgery

Cite this

Virtually transparent surgical instruments in endoscopic surgery with augmentation of obscured regions. / Koreeda, Yuta; Kobayashi, Yo; Ieiri, Satoshi; Nishio, Yuya; Kawamura, Kazuya; Obata, Satoshi; Souzaki, Ryota; Hashizume, Makoto; Fujie, Masakatsu G.

In: International journal of computer assisted radiology and surgery, 02.04.2016, pp. 1-10.

Research output: Contribution to journal › Article

Koreeda, Yuta ; Kobayashi, Yo ; Ieiri, Satoshi ; Nishio, Yuya ; Kawamura, Kazuya ; Obata, Satoshi ; Souzaki, Ryota ; Hashizume, Makoto ; Fujie, Masakatsu G. / Virtually transparent surgical instruments in endoscopic surgery with augmentation of obscured regions. In: International journal of computer assisted radiology and surgery. 2016 ; pp. 1-10.
@article{3de0162b223d437e8bfdc68ad4afac3a,
title = "Virtually transparent surgical instruments in endoscopic surgery with augmentation of obscured regions",
abstract = "Purpose: We developed and evaluated a visual compensation system that allows surgeons to visualize obscured regions in real time, such that the surgical instrument appears virtually transparent. Methods: The system consists of two endoscopes: a main endoscope to observe the surgical environment, and a supporting endoscope to render the region hidden from view by surgical instruments. The view captured by the supporting endoscope is transformed to simulate the view from the main endoscope, segmented to the shape of the hidden regions, and superimposed to the main endoscope image so that the surgical instruments look transparent. A prototype device was benchmarked for processing time and superimposition rendering error. Then, it was evaluated in a training environment with 22 participants performing a backhand needle driving task with needle exit point error as the criterion. Lastly, we conducted an in vivo study. Results: In the benchmark, the mean processing time was 62.4 ms, which was lower than the processing time accepted in remote surgeries. The mean superimposition error of the superimposed image was 1.4 mm. In the training environment, needle exit point error with the system decreased significantly for experts compared with the condition without the system. This change was not significant for novices. In the in vivo study, our prototype enabled visualization of needle exit points during anastomosis. Conclusion: The benchmark suggests that the implemented system had an acceptable performance, and evaluation in the training environment demonstrated improved surgical task outcomes in expert surgeons. We will conduct a more comprehensive in vivo study in the future.",
keywords = "Augmented reality, Computer-assisted surgery, Endoscopic surgery, Laparoscopic surgery, Medical image processing, Visualization",
author = "Koreeda, Yuta and Kobayashi, Yo and Ieiri, Satoshi and Nishio, Yuya and Kawamura, Kazuya and Obata, Satoshi and Souzaki, Ryota and Hashizume, Makoto and Fujie, {Masakatsu G.}",
year = "2016",
month = "4",
day = "2",
doi = "10.1007/s11548-016-1384-5",
language = "English",
pages = "1--10",
journal = "International Journal of Computer Assisted Radiology and Surgery",
issn = "1861-6410",
publisher = "Springer Verlag",
}

TY - JOUR

T1 - Virtually transparent surgical instruments in endoscopic surgery with augmentation of obscured regions

AU - Koreeda, Yuta

AU - Kobayashi, Yo

AU - Ieiri, Satoshi

AU - Nishio, Yuya

AU - Kawamura, Kazuya

AU - Obata, Satoshi

AU - Souzaki, Ryota

AU - Hashizume, Makoto

AU - Fujie, Masakatsu G.

PY - 2016/4/2

Y1 - 2016/4/2

N2 - Purpose: We developed and evaluated a visual compensation system that allows surgeons to visualize obscured regions in real time, such that the surgical instrument appears virtually transparent. Methods: The system consists of two endoscopes: a main endoscope to observe the surgical environment, and a supporting endoscope to render the region hidden from view by surgical instruments. The view captured by the supporting endoscope is transformed to simulate the view from the main endoscope, segmented to the shape of the hidden regions, and superimposed to the main endoscope image so that the surgical instruments look transparent. A prototype device was benchmarked for processing time and superimposition rendering error. Then, it was evaluated in a training environment with 22 participants performing a backhand needle driving task with needle exit point error as the criterion. Lastly, we conducted an in vivo study. Results: In the benchmark, the mean processing time was 62.4 ms, which was lower than the processing time accepted in remote surgeries. The mean superimposition error of the superimposed image was 1.4 mm. In the training environment, needle exit point error with the system decreased significantly for experts compared with the condition without the system. This change was not significant for novices. In the in vivo study, our prototype enabled visualization of needle exit points during anastomosis. Conclusion: The benchmark suggests that the implemented system had an acceptable performance, and evaluation in the training environment demonstrated improved surgical task outcomes in expert surgeons. We will conduct a more comprehensive in vivo study in the future.

AB - Purpose: We developed and evaluated a visual compensation system that allows surgeons to visualize obscured regions in real time, such that the surgical instrument appears virtually transparent. Methods: The system consists of two endoscopes: a main endoscope to observe the surgical environment, and a supporting endoscope to render the region hidden from view by surgical instruments. The view captured by the supporting endoscope is transformed to simulate the view from the main endoscope, segmented to the shape of the hidden regions, and superimposed to the main endoscope image so that the surgical instruments look transparent. A prototype device was benchmarked for processing time and superimposition rendering error. Then, it was evaluated in a training environment with 22 participants performing a backhand needle driving task with needle exit point error as the criterion. Lastly, we conducted an in vivo study. Results: In the benchmark, the mean processing time was 62.4 ms, which was lower than the processing time accepted in remote surgeries. The mean superimposition error of the superimposed image was 1.4 mm. In the training environment, needle exit point error with the system decreased significantly for experts compared with the condition without the system. This change was not significant for novices. In the in vivo study, our prototype enabled visualization of needle exit points during anastomosis. Conclusion: The benchmark suggests that the implemented system had an acceptable performance, and evaluation in the training environment demonstrated improved surgical task outcomes in expert surgeons. We will conduct a more comprehensive in vivo study in the future.

KW - Augmented reality

KW - Computer-assisted surgery

KW - Endoscopic surgery

KW - Laparoscopic surgery

KW - Medical image processing

KW - Visualization

UR - http://www.scopus.com/inward/record.url?scp=84962204989&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84962204989&partnerID=8YFLogxK

U2 - 10.1007/s11548-016-1384-5

DO - 10.1007/s11548-016-1384-5

M3 - Article

C2 - 27038964

AN - SCOPUS:84962204989

SP - 1

EP - 10

JO - International Journal of Computer Assisted Radiology and Surgery

JF - International Journal of Computer Assisted Radiology and Surgery

SN - 1861-6410

ER -