PTU-016 Wireless Capsule Endoscope Localisation Based On Visual Odometry
  1. A Koulaouzidis1,
  2. DK Iakovidis2,
  3. E Spyrou2
  1. 1The Royal Infirmary of Edinburgh, Edinburgh, UK
  2. 2Technological Educational Institute of Central Greece, Lamia, Greece

Abstract

Introduction The localisation of a wireless capsule endoscope (WCE) within the small bowel is typically performed by triangulation of wearable radiofrequency sensors, an approach of low accuracy.1 Only a few approaches to WCE localisation based on visual features have been proposed; these include methods estimating the rotation angle of the capsule2,3 and temporal video segmentation methods.4 We present a WCE localisation method based only on visual information extracted from a conventional WCE recording.

Methods Points of interest (POI) were automatically detected in WCE video frames, the detected POI were matched between consecutive frames, and actual correspondences between subsets of these POI were determined with the random sample consensus (RANSAC) algorithm. The maximally stable extremal regions (MSER) algorithm was evaluated as an alternative to the speeded-up robust features (SURF) algorithm. Based on the scaling and rotation of the content of consecutive WCE frames, the displacement and rotation of the capsule within the GI tract can be estimated. For the ex-vivo experiment, a standard simulated intestinal environment was created: markers were sewn (at set, pre-recorded distances) onto the luminal surface of porcine small bowel, through which a capsule (MiroCam®, IntroMedic Co Ltd, Seoul, Korea) was propelled.
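The core of the pipeline described above, after feature matching, is a RANSAC estimation of the inter-frame transform, from which the scale (related to capsule displacement along the lumen) and rotation angle are read off. The following is a minimal sketch of that estimation step only, not the authors' implementation: it assumes the POI have already been matched (e.g. with SURF), models the inter-frame motion as a 2D similarity transform, and represents points as complex numbers so the transform is q ≈ a·p + b with scale |a| and rotation arg(a).

```python
import numpy as np

def fit_similarity(p, q):
    """Least-squares 2D similarity q ~ a*p + b, with points as complex numbers."""
    pc, qc = p.mean(), q.mean()
    a = np.sum((q - qc) * np.conj(p - pc)) / np.sum(np.abs(p - pc) ** 2)
    b = qc - a * pc
    return a, b

def ransac_similarity(p, q, iters=200, tol=2.0, rng=None):
    """RANSAC over matched POI pairs p -> q (complex arrays).

    Returns the estimated scale, rotation angle in degrees, and the
    boolean inlier mask (the 'actual correspondences').
    """
    rng = rng or np.random.default_rng(0)
    best_inliers = None
    for _ in range(iters):
        # Minimal sample: two point pairs fully determine a similarity.
        idx = rng.choice(len(p), size=2, replace=False)
        a, b = fit_similarity(p[idx], q[idx])
        residuals = np.abs(a * p + b - q)
        inliers = residuals < tol
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    # Refit on the consensus set for the final estimate.
    a, b = fit_similarity(p[best_inliers], q[best_inliers])
    return np.abs(a), np.degrees(np.angle(a)), best_inliers
```

Under this model, the recovered rotation angle corresponds to the capsule's roll between frames, while a scale above or below 1 indicates apparent zooming of the luminal content, which the abstract's method relates to forward or backward displacement.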

Results We conducted comparative experiments using both SURF and MSER features, which indicated the superiority of the former over the latter. We worked on a corpus of 1070 WCE frames (634 indicating forward motion, 436 indicating backward motion). The accuracy using SURF features was 81.5% (87.2% on forward motion, 73.2% on backward motion), while using MSER it was 67.2% (79.8% on forward motion, 48.9% on backward motion). Notably, when using MSER the proposed algorithm often fails to estimate a transform (on 6.7% of frames, versus <0.1% when using SURF) owing to a lack of adequate correspondences between interest points.

Conclusion Visual odometry is a promising technique and – potentially – a feasible alternative to other localisation approaches in WCE.

References

  1. Than TD, et al. A review of localization systems for robotic endoscopic capsules. IEEE Transactions on Biomedical Engineering 2012;59:2387–99

  2. Spyrou E, Iakovidis D. Homography-based orientation estimation for capsule endoscope tracking. In: 2012 IEEE Int. Conf. on Imaging Systems and Techniques (IST). IEEE, 2012:101–105

  3. Mackiewicz M, et al. Wireless capsule endoscopy color video segmentation. IEEE Transactions on Medical Imaging 2008;27:1769–81

  4. Spyrou E, Iakovidis DK. Video-based measurements for wireless capsule endoscope tracking. Measurement Science and Technology 2014;25:015002 (14pp)

Disclosure of Interest None Declared.
