000079654 001__ 79654
000079654 005__ 20210520140800.0
000079654 037__ $$aTESIS-2019-131
000079654 041__ $$aeng
000079654 1001_ $$aAli, Nader Mahmoud
000079654 24500 $$aVisual Monocular SLAM for Minimally Invasive Surgery and its Application to Augmented Reality
000079654 260__ $$aZaragoza$$bUniversidad de Zaragoza, Prensas de la Universidad$$c2018
000079654 300__ $$a157
000079654 4900_ $$aTesis de la Universidad de Zaragoza$$v2019-111$$x2254-7606
000079654 500__ $$aPresented: 19 06 2018
000079654 502__ $$aTesis-Univ. Zaragoza, Instituto de Investigación en Ingeniería de Aragón (I3A), 2018$$bZaragoza, Universidad de Zaragoza$$c2018
000079654 506__ $$aby-nc-nd$$bCreative Commons$$c3.0$$uhttp://creativecommons.org/licenses/by-nc-nd/3.0/es
000079654 520__ $$aRecovering the 3D structure of intra-operative endoscopic images and the relative endoscope camera position are fundamental building blocks towards accurate guidance and navigation in image-guided surgery. They allow augmented reality overlay of pre-operative models, which are readily available from different imaging modalities. This thesis provides a systematic approach for estimating these two pieces of information based on purely visual Simultaneous Localization And Mapping (SLAM). The goal of SLAM is to localize a camera sensor, in real time, within a map (3D reconstruction) of the environment that is built online at the same time. It enables markerless camera tracking using only RGB images from a standard monocular camera.<br />The preliminary work in this thesis presents a sparse SLAM solution for real-time, accurate intra-operative visualization of the patient's pre-operative models over the patient's skin. We propose a non-invasive registration and visualization pipeline that requires minimal interaction from medical staff and runs solely on a commodity Tablet-PC with a built-in camera. Subsequently, we directed our focus to endoscopy, which is very challenging for monocular 3D reconstruction and endoscope camera tracking. We addressed the use of state-of-the-art sparse SLAM and achieved remarkable tracking performance. Our second contribution is thus a pairwise dense reconstruction algorithm that exploits the initial SLAM exploration phase to accurately provide a pairwise dense reconstruction of the surgical scene.<br />A further contribution extends state-of-the-art sparse SLAM with a novel dense multi-view stereo-like approach that performs live dense reconstruction and hence eliminates the wait for the abdominal cavity exploration. We decouple the dense reconstruction from the camera trajectory estimation, resulting in a system that combines the accuracy and robustness of feature-based SLAM with the more complete reconstructions of direct SLAM methods. The proposed system can cope with the challenging lighting conditions and poor or repetitive textures of endoscopy at an affordable time budget using a modern GPU. It has been validated and evaluated on real porcine sequences of abdominal cavity exploration and showed superior performance to other dense SLAM methods in terms of accuracy, density, and computation time. It has also been tested on different indoor sequences and showed promising reconstruction results.<br />The solutions proposed in this thesis have been validated on real porcine in-vivo and ex-vivo sequences from different datasets. They have proved to be fast and require neither external tracking hardware nor significant intervention from medical staff, other than moving the Tablet-PC or the endoscope. They can therefore be integrated easily into the current surgical workflow.
000079654 521__ $$97098$$aPrograma de Doctorado en Ingeniería Biomédica
000079654 6531_ $$arobotics
000079654 6531_ $$acomputer vision
000079654 6531_ $$asurgery
000079654 700__ $$aMartínez Montiel, José María$$edir.
000079654 7102_ $$aUniversidad de Zaragoza$$bInstituto de Investigación en Ingeniería de Aragón (I3A)
000079654 830__ $$9510
000079654 8560_ $$ftdr@unizar.es
000079654 8564_ $$uhttps://zaguan.unizar.es/record/79654/files/TESIS-2019-131.pdf$$zTexto completo (eng)
000079654 909CO $$ooai:zaguan.unizar.es:79654$$pdriver
000079654 909co $$ptesis
000079654 9102_ $$a$$bInstituto de Investigación en Ingeniería de Aragón (I3A)
000079654 980__ $$aTESIS