
Virtual reality in GI endoscopy: intuitive zoom for improving diagnostics and training
Alexander Hann, Benjamin M Walter, Niklas Mehlhase, Alexander Meining

Interventional and Experimental Endoscopy (InExEn), Department of Internal Medicine I, Ulm University, Ulm, Germany

Correspondence to Dr Alexander Hann, Interventional and Experimental Endoscopy (InExEn), Department of Internal Medicine I, Ulm University, Ulm 89081, Germany; alexander.hann@uniklinik-ulm.de


Message

Among the many areas that may benefit from virtual reality (VR), GI endoscopy could be an important one. We developed a system, called intuitive zoom, that magnifies the endoscopic image on forward movement of the endoscopist's head, allowing the examiner to zoom into areas of interest during real-time endoscopy. We show that VR can be conveniently incorporated into daily endoscopic work for closely examining operator-defined regions of interest. Additionally, a light intensity-derived three-dimensional (3D) reconstruction of the intestinal mucosa was used to better assess lesions such as polyps using the Paris classification. These new techniques might improve routine endoscopic diagnosis and therapy and open a new world of endoscopic training.

In more detail

Study background

Flexible endoscopy is performed using a two-dimensional (2D) image of the working area. Although this has been regarded as the standard since 1970,1 new technologies that allow stereoscopic vision, and thus improve visual spatial perception, are emerging. VR uses head-mounted displays (HMDs) to generate realistic images that replicate a real environment. Since the introduction of the HMDs Oculus Rift (Oculus VR, California, USA) and HTC VIVE (HTC, New Taipei City, Republic of China) in 2016, the technology has been available at consumer prices. To date, VR has mostly been used for training purposes, for example in surgery.2 3 VR is also used for endoscopic training, although most simulators merely project a virtual endoscopic image onto a flat screen without full immersion of the examiner.4–6 To explore the potential of VR in real-time flexible endoscopy, we developed a setup for performing endoscopic interventions aided by special features that can be easily implemented in VR.

Methods

Endoscopic examinations were performed using a GIF-HQ190 gastroscope and a CF-HQ190I colonoscope (Olympus, Hamburg, Germany). VR was implemented using the HTC VIVE headset together with the VIVE Tracker. Interaction of the user with the devices was programmed using the cross-platform game engine Unity (Unity Technologies, San Francisco, California, USA). The endoscopic stomach training model from Erler Zimmer (Lauf, Germany) was used for the ex vivo simulations.
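As an illustration of how such tracking data can be consumed in Unity, the following C# sketch reads the pose of a tracked device (eg, the VIVE Tracker attached to the endoscope handle) through Unity's generic XR input API and mirrors it onto a scene object. The original setup used the SteamVR plugin; the generic API variant, the device filtering and all names below are illustrative assumptions, not the authors' implementation.

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;

// Sketch: mirroring a tracked physical device (eg, the VIVE Tracker on the
// endoscope handle) onto a scene object via Unity's generic XR input API.
// Illustrative assumption; the published setup used the SteamVR plugin.
public class TrackedEndoscopeHandle : MonoBehaviour
{
    private readonly List<InputDevice> devices = new List<InputDevice>();

    void Update()
    {
        // Collect all currently tracked devices (HMD, controllers, trackers).
        InputDevices.GetDevicesWithCharacteristics(
            InputDeviceCharacteristics.TrackedDevice, devices);

        foreach (InputDevice device in devices)
        {
            if (device.TryGetFeatureValue(CommonUsages.devicePosition, out Vector3 pos) &&
                device.TryGetFeatureValue(CommonUsages.deviceRotation, out Quaternion rot))
            {
                // Move this GameObject (the virtual endoscope handle) to the
                // tracked pose so the VR scene follows the physical endoscope.
                transform.SetPositionAndRotation(pos, rot);
                break; // in practice one would filter for the specific tracker
            }
        }
    }
}
```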

Results

In order to implement VR in real-time endoscopy, the high-resolution endoscopic image was imported into VR. Tracking devices in two corners of the room registered the location of the examiner wearing the HMD (online supplementary figure 1). The precise location of the gastroscope was registered in VR by attaching a controller to the control handle of the endoscope (figure 1). Upper GI endoscopy was performed using a gastroscopy training model that displays various pathologies, including a gastric ulcer and a gastric polyp. A video of the full examination is presented in online supplementary video 1.

To check the integrity of instruments before insertion into the working channel, we provided additional live footage recorded by a camera attached to the front of the HMD. This image is attached in VR to the location of the controller that tracks the endoscope. Thus, when the examiner looks down at the endoscope, he can inspect the forceps and introduce it easily into the working channel (figure 2A).

In GI endoscopy, advancing the endoscope does not necessarily translate into forward movement of the endoscope tip. In these cases, the examiner instinctively moves his head forward to get a clearer view of the image on the screen. In real life, however, this movement has little effect on the perception of the endoscopic image. Using the exact location of the examiner's head, we developed a feature that enlarges the endoscopic image in VR on forward movement of the head. In figure 2B, the examiner focuses on a gastric polyp with the forceps introduced into the working channel. He advances the forceps without moving the endoscope and leans his head slightly forward. This movement results in the so-called intuitive zoom of the endoscopic image presented in figure 2C. After successful biopsy of the polyp, leaning the head backwards shrinks the endoscopic image back to its regular size. Thus, using intuitive zoom, the examiner can zoom into different parts of the live endoscopic image simply by moving his head in the desired direction.

An additional experimental setup even enabled intuitive zoom on the regular 2D monitor by tracking the examiner's head with a VIVE Tracker (online supplementary figure 2). This setup made the HMD unnecessary, rendering the technique more feasible in daily practice. An example of results obtained with this technique is displayed in figure 3. Using this method in daily practice, we experienced benefits during the examination of Barrett's oesophagus and of the pit pattern of colon polyps (online supplementary video 2).

To further enhance the perception of the mucosal surface, we implemented a 3D reconstruction based on light intensity. Assuming that brighter areas are closer to the light source of the endoscope, we were able to generate 3D images from the regular 2D endoscopic image. After generation, we slowly tilted the 3D image to improve the perception of the mucosal surface. This method was especially helpful in classifying polyps using the Paris classification (figure 4).
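A minimal Unity (C#) sketch of the intuitive zoom logic is given below: the quad carrying the live endoscopic video is scaled steplessly according to the forward displacement of the tracked head from a calibrated rest position. Object names, the gain and the clamping range are illustrative assumptions; the published feature may differ in detail.

```csharp
using UnityEngine;

// Minimal sketch of the "intuitive zoom": the quad showing the live
// endoscopic video is enlarged steplessly as the examiner's head moves
// forward from a calibrated rest position. Names and constants are
// illustrative, not the published implementation.
public class IntuitiveZoom : MonoBehaviour
{
    public Transform head;        // HMD pose (or VIVE Tracker in the 2D-monitor setup)
    public Transform videoQuad;   // quad textured with the live endoscopic image
    public float gain = 4f;       // magnification gained per metre of forward movement
    public float maxZoom = 3f;    // upper bound of the magnification

    private Vector3 restPosition; // head position captured at calibration
    private Vector3 baseScale;    // regular scale of the video quad

    void Start()
    {
        // A real system would allow recalibration; here we calibrate once.
        restPosition = head.position;
        baseScale = videoQuad.localScale;
    }

    void Update()
    {
        // Project the head displacement onto the direction towards the screen.
        Vector3 toScreen = (videoQuad.position - restPosition).normalized;
        float forward = Vector3.Dot(head.position - restPosition, toScreen);

        // Leaning forward enlarges the image; leaning back returns it to its
        // regular size. Clamping to [1, maxZoom] yields a stepless zoom.
        float zoom = Mathf.Clamp(1f + gain * Mathf.Max(0f, forward), 1f, maxZoom);
        videoQuad.localScale = baseScale * zoom;
    }
}
```

The same script serves the monitor-only variant: only the source of the head pose changes from the HMD to the head-worn VIVE Tracker.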


Figure 1

Virtual reality controller mounted on a regular gastroscope.

Figure 2

Demonstration of a gastroscopy performed in virtual reality (VR) with intuitive zoom, using an endoscopic stomach training model with a polyp. The inset shows the simultaneously recorded outer view of the examiner. (A) Presentation of the exterior view of the endoscope in VR, allowing the endoscopist to check and introduce the forceps. (B) Regular view of the endoscopic image in VR, with a polyp and the forceps inserted. (C) Intuitive zoom with magnification of the endoscopic image achieved simply by forward movement of the head, while the endoscope remains in place.

Figure 3

Intuitive zoom applied during gastroscopy. (A) Room setup with the monitor on the left and the examiner, wearing the tracking device on his head, on the right. (B) Forward movement of the head results in a stepless magnification of the endoscopic image in the area of interest.

Figure 4

Three-dimensional (3D) reconstruction of the mucosal surface using light intensity. (A) A colon polyp (laterally spreading tumour, granular type) classified as Paris IIa, shown using narrow-band imaging. (B) After 3D reconstruction using light intensity, the elevated borders of the polyp are clearly differentiated from the depressed central parts.

Comments

VR and the use of stereoscopic images are currently on their way to broader application in surgery.7 Especially in neurosurgery, this technology has been applied in planning complex interventions.8 Using VR leads to shorter operative times and increased overall performance.9 10 VR allows for the implementation and visualisation of different types of information, such as spatial parameters, and it represents an interface to robotic systems.11

Transferring these technologies to GI endoscopy is difficult because flexible endoscopes still do not provide stereoscopic images. Additionally, the thin and permanently moving intestinal walls deep inside the body prevent the exact spatial registration required to generate a 3D image. Hence, at present, 3D datasets of the gut can only be acquired by CT. However, several new methods are under scientific investigation, such as structure-from-motion,12 photometric stereo endoscopy13 and structured light,14 and might become available in the near future. In this work, we used VR in flexible GI endoscopy by integrating different image sources into a virtual endoscopy cockpit. Our implementation of a digital magnification of the endoscopic image relative to the position of the examiner is just one possibility of exploiting these new tracking devices. This technique can also integrate electronic chromoendoscopy. Additionally, light intensity information from the regular endoscopic image alone can be transformed into a 3D landscape, helping the examiner understand the surface of polyps. Other image sources that can be implemented into the virtual endoscopy cockpit include previous CT scans, vital signs of the patient during sedation, and so on.
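To make the light-intensity heuristic concrete, the following Unity (C#) sketch builds a simple relief mesh from a single endoscopic still frame, raising brighter (presumably closer) pixels out of the image plane. Grid resolution and height scaling are illustrative assumptions; the actual reconstruction and the slow tilting of the generated image are not reproduced here.

```csharp
using UnityEngine;

// Illustrative sketch of the light-intensity heuristic: brightness of the
// regular 2D endoscopic image is treated as proximity to the light source
// and used as the height of a displaced grid mesh. Not the authors' actual
// implementation; constants and names are assumptions.
[RequireComponent(typeof(MeshFilter))]
public class IntensityReliefMesh : MonoBehaviour
{
    public Texture2D frame;          // readable still frame of the endoscopic image
    public int gridSize = 128;       // vertices per side of the relief grid
    public float heightScale = 0.2f; // how strongly brightness lifts the surface

    void Start()
    {
        var vertices = new Vector3[gridSize * gridSize];
        var uv = new Vector2[gridSize * gridSize];

        for (int y = 0; y < gridSize; y++)
        {
            for (int x = 0; x < gridSize; x++)
            {
                float u = x / (float)(gridSize - 1);
                float v = y / (float)(gridSize - 1);
                // Brighter pixels are assumed closer to the endoscope light
                // and are therefore raised further out of the image plane.
                float h = frame.GetPixelBilinear(u, v).grayscale;
                vertices[y * gridSize + x] = new Vector3(u - 0.5f, h * heightScale, v - 0.5f);
                uv[y * gridSize + x] = new Vector2(u, v);
            }
        }

        // Two triangles per grid cell.
        var triangles = new int[(gridSize - 1) * (gridSize - 1) * 6];
        int t = 0;
        for (int y = 0; y < gridSize - 1; y++)
        {
            for (int x = 0; x < gridSize - 1; x++)
            {
                int i = y * gridSize + x;
                triangles[t++] = i;
                triangles[t++] = i + gridSize;
                triangles[t++] = i + 1;
                triangles[t++] = i + 1;
                triangles[t++] = i + gridSize;
                triangles[t++] = i + gridSize + 1;
            }
        }

        var mesh = new Mesh { vertices = vertices, uv = uv, triangles = triangles };
        mesh.RecalculateNormals();
        GetComponent<MeshFilter>().mesh = mesh;
    }
}
```

Slowly rotating such a mesh in VR, as described in the Results, lets the examiner read the elevation profile of a polyp at a glance.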

The integration of so many information sources into the VR environment creates the need for intuitive interaction with the platform while both hands manipulate the endoscope. We successfully applied voice control to switch between different screens, such as the vital signs and the live endoscopic view. A future perspective might be interaction with a virtual tutor that helps during difficult passages, especially for young endoscopists. On verbal request of the examiner, this tutor might simultaneously demonstrate how to move the endoscope together with the resulting endoscopic view.
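A minimal sketch of such voice interaction, using the Windows-only KeywordRecognizer that ships with Unity, is shown below. The screen objects and command words are illustrative assumptions rather than the exact commands of the described setup.

```csharp
using UnityEngine;
using UnityEngine.Windows.Speech; // Windows-only speech API bundled with Unity

// Sketch: switching between virtual screens by voice while both hands stay
// on the endoscope. Keywords and screen names are illustrative assumptions.
public class VoiceScreenSwitcher : MonoBehaviour
{
    public GameObject endoscopyScreen; // live endoscopic image
    public GameObject vitalsScreen;    // vital signs of the sedated patient

    private KeywordRecognizer recognizer;

    void Start()
    {
        recognizer = new KeywordRecognizer(new[] { "endoscopy", "vital signs" });
        recognizer.OnPhraseRecognized += OnPhrase;
        recognizer.Start();
    }

    private void OnPhrase(PhraseRecognizedEventArgs args)
    {
        // Show only the screen matching the recognised keyword.
        bool showVitals = args.text == "vital signs";
        vitalsScreen.SetActive(showVitals);
        endoscopyScreen.SetActive(!showVitals);
    }

    void OnDestroy()
    {
        if (recognizer != null)
        {
            if (recognizer.IsRunning) recognizer.Stop();
            recognizer.Dispose();
        }
    }
}
```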

In daily work, the intuitive zoom technique proved particularly advantageous in drawing attention to a specific region of the endoscopic image. Accordingly, we realised that we examined regions of Barrett's oesophagus more closely by focusing only on irregular patterns in the image. Another example is the examination of polyps, where we focused more on worrisome pit patterns, fading out other regions with the intuitive zoom. This observation is supported by previous studies demonstrating a benefit of magnification for differentiating polyps during real-time colonoscopy.15 This feature may also improve training by teaching young gastroenterologists to focus on specific regions without distraction from the rest of the endoscopic image. The latter group might also benefit from the 3D reconstruction of the mucosal surface: a 3D volume of a polyp in VR is presumably easier to classify using the Paris classification than a 2D image. Additionally, interaction with 3D polyps in VR might be implemented in a VR endoscopy training platform.

In conclusion, based on our early feasibility data, we think that VR with tracking of the examiner and the endoscope might be helpful for performing flexible endoscopy with current endoscopes and devices. With the further implementation of 3D imaging technology and robotic manipulators, VR will potentially become a relevant interface that helps GI endoscopists perform real endoscopic interventions by taking advantage of the possibilities VR offers.


Footnotes

• Contributors AH, BMW, AM: conception and design. AH, BMW, NM: performed the experiments. AM: provision of study materials. AH, AM: drafting of the article. All authors critically revised the article for important intellectual content and provided final approval of the article.

• Funding This work was supported by the German Research Foundation (DFG) under Grant FOR 1321.

  • Competing interests None declared.

  • Patient consent Not required.

  • Provenance and peer review Not commissioned; internally peer reviewed.