Dr. André Mewes
Publications
2020

Hatscher, B; Mewes, A; Pannicke, E; Kägebein, U; Wacker, F; Hansen, C; Hensen, B
Touchless scanner control to support MRI-guided interventions Journal Article
In: International Journal of Computer Assisted Radiology and Surgery, vol. 15, no. 3, pp. 545–553, 2020, ISSN: 1861-6429.
@article{hatscher_touchless_2020,
title = {Touchless scanner control to support MRI-guided interventions},
author = {B Hatscher and A Mewes and E Pannicke and U Kägebein and F Wacker and C Hansen and B Hensen},
url = {https://doi.org/10.1007/s11548-019-02058-1},
doi = {10.1007/s11548-019-02058-1},
issn = {1861-6429},
year = {2020},
date = {2020-03-01},
urldate = {2020-03-01},
journal = {International Journal of Computer Assisted Radiology and Surgery},
volume = {15},
number = {3},
pages = {545–553},
abstract = {MRI-guided interventions allow minimally invasive, radiation-free treatment but rely on real-time image data and free slice positioning. Interventional interaction with the data and the MRI scanner is cumbersome due to the diagnostic focus of current systems, confined space and sterile conditions.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}
2019

Mewes, A; Heinrich, F; Kägebein, U; Hensen, B; Wacker, F; Hansen, C
Projector‐based augmented reality system for interventional visualization inside MRI scanners Journal Article
In: The International Journal of Medical Robotics and Computer Assisted Surgery, vol. 15, no. 1, pp. e1950, 2019, ISSN: 1478-5951, 1478-596X.
@article{clive_projectorbased_2019,
title = {Projector‐based augmented reality system for interventional visualization inside MRI scanners},
author = {A Mewes and F Heinrich and U Kägebein and B Hensen and F Wacker and C Hansen},
editor = {L Clive},
url = {https://onlinelibrary.wiley.com/doi/10.1002/rcs.1950},
doi = {10.1002/rcs.1950},
issn = {1478-5951, 1478-596X},
year = {2019},
date = {2019-02-01},
urldate = {2019-02-01},
journal = {The International Journal of Medical Robotics and Computer Assisted Surgery},
volume = {15},
number = {1},
pages = {e1950},
abstract = {Background Navigation support in the interventional MRI is separated from the operating field, which makes it difficult to interpret positions and orientations and to coordinate the necessary hand movements.
Methods We developed a projector-based augmented reality system to enable visual navigation of tracked instruments on pre-planned paths, and the visualization of risk structures directly on the patient inside the MRI bore. To assess the accuracy of the system, a user study with clinicians was carried out in a needle navigation test scenario.
Results The targets were reached with an error of 1.7 ± 0.5 mm, the entry points with an error of 1.7 ± 0.8 mm.
Conclusion The results suggest that the projected augmented reality navigation directly on the patient is accurate enough to target lesions as small as 15 mm, so that the prototype can serve as a platform for current and future research in augmented reality visualization and dynamic registration.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}
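The overlay described in this article ultimately reduces to mapping tracked 3D positions (needle tip, planned path, risk structures) into projector pixel coordinates through the calibrated projector–MRI geometry. The following Python sketch illustrates that mapping with a plain pinhole model; the intrinsic matrix, the MRI-to-projector transform and the example point are placeholder values, not parameters of the published system.

```python
import numpy as np

def project_point(p_mri, T_proj_mri, K):
    """Map a 3D point given in MRI/world coordinates to projector pixels.

    p_mri      : (3,) point in MRI coordinates (mm)
    T_proj_mri : (4, 4) rigid transform from MRI to projector coordinates
                 (obtained from calibration and registration)
    K          : (3, 3) projector intrinsic matrix
    """
    p_h = np.append(p_mri, 1.0)           # homogeneous coordinates
    p_proj = (T_proj_mri @ p_h)[:3]       # point in the projector frame
    uvw = K @ p_proj                      # pinhole projection
    return uvw[:2] / uvw[2]               # pixel coordinates (u, v)

# Placeholder calibration values for illustration only.
K = np.array([[1500.0, 0.0, 960.0],
              [0.0, 1500.0, 540.0],
              [0.0, 0.0, 1.0]])
T_proj_mri = np.eye(4)
T_proj_mri[:3, 3] = [0.0, 0.0, 800.0]     # assumed projector 80 cm above the table

print(project_point(np.array([10.0, -5.0, 50.0]), T_proj_mri, K))
```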

Mewes, A
Projector-based augmented reality and touchless interaction to support MRI-guided interventions Miscellaneous
2019.
@misc{mewes_projector-based_2019,
title = {Projector-based augmented reality and touchless interaction to support MRI-guided interventions},
author = {A Mewes},
url = {https://opendata.uni-halle.de/bitstream/1981185920/32204/1/Mewes_Andre_Dissertation_2019.pdf},
year = {2019},
date = {2019-01-01},
urldate = {2019-01-01},
abstract = {In minimally invasive percutaneous procedures such as thermal tumor ablation, radiologists rely on imaging that provides information about the current position of the ablation applicator and the surrounding risk structures. During MRI-guided interventions, these image data are displayed on a monitor next to the MRI scanner, separating the information from the operating field, which impedes hand-eye coordination, increases the physician's mental workload and aggravates an already problematic ergonomic situation. Moreover, the software for controlling the MRI scanner and providing the image data is designed for diagnostics, i.e. with many functions that are not needed during interventions, and can only be operated with conventional input modalities such as trackball, mouse or keyboard. For this reason, scanner control and interaction with the image data are often delegated to assistants, which introduces an indirection that frequently causes confusion and frustration and disrupts the interventional workflow. This dissertation presents approaches to solving these problems. It introduces the first projector-based augmented reality needle navigation system for use inside the MRI bore to support MRI-guided interventions, as well as a touchless gesture control interface for direct, sterile interventional interaction with medical image data and control of the MRI scanner. The projector-camera system is calibrated with a structured-light approach and registered with the MRI scanner in order to overlay the visual information of two purpose-built needle navigation concepts precisely onto the operating field. The touchless gesture set is designed to be metaphorical and self-explanatory and was evaluated in two different intervention scenarios. The evaluation shows promising results regarding accuracy and usability. Owing to their general design, the systems and concepts presented in this work can not only improve the workflow of MRI-guided percutaneous ablations but can also be transferred to other interventions. Future work should adapt the projection and navigation information to internal and external structures that move with respiration in order to eventually achieve clinical applicability.},
keywords = {},
pubstate = {published},
tppubtype = {misc}
}
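The structured-light calibration mentioned in the dissertation abstract treats the projector as an inverse camera: once the projected patterns have been decoded, each projector pixel can be associated with a known 3D point on the calibration target, and standard camera calibration recovers the projector intrinsics and pose. The OpenCV sketch below shows only that last step under the assumption that the correspondence arrays come from a pattern-decoding stage that is not reproduced here; the subsequent registration to the MRI coordinate system is likewise omitted.

```python
import cv2

def calibrate_projector(object_points, projector_points, proj_size):
    """Calibrate a projector from structured-light correspondences.

    object_points    : list of (N, 3) float32 arrays, 3D points on the
                       calibration target (one array per target pose)
    projector_points : list of (N, 2) float32 arrays, the projector pixels
                       that illuminated those points (decoded from the
                       projected patterns)
    proj_size        : (width, height) of the projector image
    """
    # Treat the projector as an inverse camera and solve for its
    # intrinsics, distortion and per-pose extrinsics.
    rms, K_proj, dist, rvecs, tvecs = cv2.calibrateCamera(
        object_points, projector_points, proj_size, None, None)
    return rms, K_proj, dist, rvecs, tvecs
```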
2018

Mewes, A; Heinrich, F; Hensen, B; Wacker, F; Lawonn, K; Hansen, C
Concepts for augmented reality visualisation to support needle guidance inside the MRI Journal Article
In: Healthcare Technology Letters, vol. 5, no. 5, pp. 172–176, 2018, ISSN: 2053-3713, (_eprint: https://ietresearch.onlinelibrary.wiley.com/doi/pdf/10.1049/htl.2018.5076).
@article{mewes_concepts_2018,
title = {Concepts for augmented reality visualisation to support needle guidance inside the MRI},
author = {A Mewes and F Heinrich and B Hensen and F Wacker and K Lawonn and C Hansen},
url = {https://onlinelibrary.wiley.com/doi/abs/10.1049/htl.2018.5076},
doi = {10.1049/htl.2018.5076},
issn = {2053-3713},
year = {2018},
date = {2018-01-01},
urldate = {2018-01-01},
journal = {Healthcare Technology Letters},
volume = {5},
number = {5},
pages = {172–176},
abstract = {During MRI-guided interventions, navigation support is often separated from the operating field on displays, which impedes the interpretation of positions and orientations of instruments inside the patient's body as well as hand–eye coordination. To overcome these issues projector-based augmented reality can be used to support needle guidance inside the MRI bore directly in the operating field. The authors present two visualisation concepts for needle navigation aids which were compared in an accuracy and usability study with eight participants, four of whom were experienced radiologists. The results show that both concepts are equally accurate ( and ), useful and easy to use, with clear visual feedback about the state and success of the needle puncture. For easier clinical applicability, a dynamic projection on moving surfaces and organ movement tracking are needed. For now, tests with patients with respiratory arrest are feasible.},
note = {_eprint: https://ietresearch.onlinelibrary.wiley.com/doi/pdf/10.1049/htl.2018.5076},
keywords = {},
pubstate = {published},
tppubtype = {article}
}
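The navigation aids compared in this letter visualise how far the tracked needle deviates from the planned path. The quantities behind such feedback, the lateral distance of the needle tip from the planned trajectory and the angle between needle and path, can be computed as in the following sketch (illustrative only, not code from the published system).

```python
import numpy as np

def path_deviation(tip, needle_dir, entry, target):
    """Deviation of a tracked needle from a planned entry-to-target path.

    tip        : (3,) current needle tip position (mm)
    needle_dir : (3,) unit vector along the needle axis
    entry      : (3,) planned skin entry point (mm)
    target     : (3,) planned target point (mm)
    Returns (lateral distance in mm, angular deviation in degrees).
    """
    path_dir = target - entry
    path_dir = path_dir / np.linalg.norm(path_dir)
    # Lateral distance: component of (tip - entry) orthogonal to the path.
    offset = tip - entry
    lateral = np.linalg.norm(offset - np.dot(offset, path_dir) * path_dir)
    # Angle between the needle axis and the planned path.
    cos_a = np.clip(abs(np.dot(needle_dir, path_dir)), 0.0, 1.0)
    angle = np.degrees(np.arccos(cos_a))
    return lateral, angle

print(path_deviation(np.array([1.0, 0.5, 20.0]),
                     np.array([0.0, 0.0, 1.0]),
                     np.array([0.0, 0.0, 0.0]),
                     np.array([0.0, 0.0, 100.0])))
```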

Pannicke, E; Hatscher, B; Hensen, B; Mewes, A; Hansen, C; Wacker, F; Vick, R
MR Compatible and Sterile Gesture Interaction for Interventions Journal Article
In: 2018.
@article{pannicke_mr_2018,
title = {MR Compatible and Sterile Gesture Interaction for Interventions},
author = {E Pannicke and B Hatscher and B Hensen and A Mewes and C Hansen and F Wacker and R Vick},
url = {https://cdn.website-editor.net/ed4a3740370b414595b38f3fc1951e45/files/uploaded/TOC_IMRI2018.pdf},
year = {2018},
date = {2018-01-01},
urldate = {2018-01-01},
abstract = {One of the major limitations during IMRI procedures is the lack of interaction with the scanner within the cabin. Physicians have to use MR-safe control panels, which are often unavailable, unintuitive or do not provide enough input options. Alternatively, interaction tasks are delegated to an assistant outside the scanner cabin verbally or by hand signs. This workaround requires a well-rehearsed team with a high level of experience and is very sensitive to team changes.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}
2017
Mewes, A; Hensen, B; Wacker, F; Hansen, C
Touchless interaction with software in interventional radiology and surgery: a systematic literature review Journal Article
In: International Journal of Computer Assisted Radiology and Surgery, vol. 12, no. 2, pp. 291–305, 2017, ISSN: 1861-6410, 1861-6429.
@article{mewes_touchless_2017,
title = {Touchless interaction with software in interventional radiology and surgery: a systematic literature review},
author = {A Mewes and B Hensen and F Wacker and C Hansen},
url = {http://link.springer.com/10.1007/s11548-016-1480-6},
doi = {10.1007/s11548-016-1480-6},
issn = {1861-6410, 1861-6429},
year = {2017},
date = {2017-02-01},
urldate = {2025-08-14},
journal = {International Journal of Computer Assisted Radiology and Surgery},
volume = {12},
number = {2},
pages = {291–305},
abstract = {Purpose In this article, we systematically examine the current state of research of systems that focus on touchless human-computer interaction in operating rooms and interventional radiology suites. We further discuss the drawbacks of current solutions and underline promising technologies for future development.
Methods A systematic literature search of scientific papers that deal with touchless control of medical software in the immediate environment of the operation room and interventional radiology suite was performed. This includes methods for touchless gesture interaction, voice control and eye tracking.
Results 55 research papers were identified and analyzed in detail including 33 journal publications. Most of the identified literature (62 %) deals with the control of medical image viewers. The others present interaction techniques for laparoscopic assistance (13 %), telerobotic assistance and operating room control (9 % each) as well as for robotic operating room assistance and intraoperative registration (3.5 % each). Only 8 systems (14.5 %) were tested in a real clinical environment, 7 (12.7 %) were not evaluated at all.
Conclusion In the last ten years, many advancements have led to robust touchless interaction approaches. However, only a few have been systematically evaluated in real operating room settings. Further research is required to cope with cur-},
keywords = {},
pubstate = {published},
tppubtype = {article}
}
2016
Mewes, A; Saalfeld, P; Riabikin, O; Skalej, M; Hansen, C
A gesture-controlled projection display for CT-guided interventions Journal Article
In: International Journal of Computer Assisted Radiology and Surgery, vol. 11, no. 1, pp. 157–164, 2016, ISSN: 1861-6410, 1861-6429.
@article{mewes_gesture-controlled_2016,
title = {A gesture-controlled projection display for CT-guided interventions},
author = {A Mewes and P Saalfeld and O Riabikin and M Skalej and C Hansen},
url = {http://link.springer.com/10.1007/s11548-015-1215-0},
doi = {10.1007/s11548-015-1215-0},
issn = {1861-6410, 1861-6429},
year = {2016},
date = {2016-01-01},
urldate = {2025-08-14},
journal = {International Journal of Computer Assisted Radiology and Surgery},
volume = {11},
number = {1},
pages = {157–164},
abstract = {Purpose The interaction with interventional imaging systems within a sterile environment is a challenging task for physicians. Direct physician-machine interaction during an intervention is rather limited because of sterility and workspace restrictions.
Methods We present a gesture-controlled projection display that enables a direct and natural physician-machine interaction during computed tomography (CT)-based interventions. Therefore, a graphical user interface is projected on a radiation shield located in front of the physician. Hand gestures in front of this display are captured and classified using a Leap Motion Controller (LMC). We propose a gesture set to control basic functions of intervention software such as gestures for 2D image exploration, 3D object manipulation and selection. Our methods were evaluated in a clinically oriented user study with 12 participants.
Results The results of the performed user study confirm that the display and the underlying interaction concept are accepted by clinical users. The recognition of the gestures is robust, although there is potential for improvements. The gesture training times are less than 10 minutes, but vary heavily between the participants of the study. The developed gestures are connected logically to the intervention software and intuitive to use.
Conclusion The proposed gesture-controlled projection display counters current thinking, namely, it gives the radiologist complete control of the intervention software. It opens new possibilities for direct physician-machine interaction during CT-based interventions and is well suited to become an integral part of future interventional suites.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}
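Gesture sets like the one evaluated in this study typically map simple hand trajectories to viewer commands, e.g. a horizontal swipe to scroll through slices. The sketch below classifies a swipe from a short history of palm positions; the thresholds and the way positions are obtained from the tracking device are assumptions, not the gesture definitions used in the paper.

```python
from collections import deque

class SwipeDetector:
    """Classify left/right swipes from a stream of palm x-positions (mm)."""

    def __init__(self, window=15, min_travel=80.0):
        self.positions = deque(maxlen=window)  # recent palm x-coordinates
        self.min_travel = min_travel           # required travel in mm

    def update(self, palm_x):
        """Feed one tracking sample; return 'left', 'right' or None."""
        self.positions.append(palm_x)
        if len(self.positions) < self.positions.maxlen:
            return None
        travel = self.positions[-1] - self.positions[0]
        if travel > self.min_travel:
            self.positions.clear()
            return "right"
        if travel < -self.min_travel:
            self.positions.clear()
            return "left"
        return None

detector = SwipeDetector()
for x in range(0, 150, 10):            # simulated rightward hand motion
    gesture = detector.update(float(x))
    if gesture:
        print("scroll slices:", gesture)
```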
Weiß, S; Schnurr, A; Mewes, A; Ho, T
Mobile Augmented Reality and 3D Printing to Involve Patients in Treatment Decisions for Prostate Cancer Journal Article
In: 2016.
@article{weis_mobile_2016,
title = {Mobile Augmented Reality and 3D Printing to Involve Patients in Treatment Decisions for Prostate Cancer},
author = {S Weiß and A Schnurr and A Mewes and T Ho},
url = {https://www.var.ovgu.de/pub/2016_CURAC_Weiss.pdf},
year = {2016},
date = {2016-01-01},
abstract = {For prostate therapy, the involvement of patients in the decision process is of increasing importance. However, transferring knowledge to the patient often proves to be difficult because the concerned organ is not visible and the available image data is too complex to interpret for non-physicians. To improve the situation, our work aims at developing a tool for simple knowledge transfer between physician and patient. By combining an augmented reality (AR) application on a tablet computer with a 3D print of a prostate, details about the status quo are conveyed in a simple to understand manner. Our AR application supports two different interaction paradigms that are compared in a user study with 11 participants. The study aimed at evaluating usability (ISONORM) and intuitive use (QUESI). Our results show that completion of the given tasks was faster on the touch display (mean = 120s},
keywords = {},
pubstate = {published},
tppubtype = {article}
}
2015
Saalfeld, P; Mewes, A; Luz, M; Preim, B; Hansen, C
Comparative Evaluation of Gesture and Touch Input for Medical Software Book Section
In: Pielot, M; Diefenbach, S; Henze, N (Ed.): Mensch und Computer 2015 - Tagungsband, pp. 143–152, DE GRUYTER, 2015, ISBN: 978-3-11-044334-9 978-3-11-044392-9.
@incollection{pielot_comparative_2015,
title = {Comparative Evaluation of Gesture and Touch Input for Medical Software},
author = {P Saalfeld and A Mewes and M Luz and B Preim and C Hansen},
editor = {M Pielot and S Diefenbach and N Henze},
url = {https://www.degruyter.com/document/doi/10.1515/9783110443929-016/html},
doi = {10.1515/9783110443929-016},
isbn = {978-3-11-044334-9 978-3-11-044392-9},
year = {2015},
date = {2015-08-01},
urldate = {2025-08-14},
booktitle = {Mensch und Computer 2015 - Tagungsband},
pages = {143–152},
publisher = {DE GRUYTER},
abstract = {The interaction with medical software during interventions challenges physicians due to the limited space and the necessary sterility. Current input modalities such as touch screen control present a direct, natural interaction which addresses usability aspects but does not consider these challenges. A promising input modality is freehand gesture interaction, which allows sterile input and a possibly larger interaction space. This work compares gesture and touch input regarding the task duration for typical intervention tasks and regarding intuitiveness. A user study with ten medical students shows mostly significantly better results for touch screen interaction. Despite the advantages of freehand gestures, it is debatable whether these can compensate for the better efficiency and usability results of touch screen interaction in the operating room.},
keywords = {},
pubstate = {published},
tppubtype = {incollection}
}
Hettig, J; Mewes, A; Riabikin, O; Skalej, M; Preim, B; Hansen, C
Exploration of 3D Medical Image Data for Interventional Radiology using Myoelectric Gesture Control Journal Article
In: 2015.
@article{hettig_exploration_2015,
title = {Exploration of 3D Medical Image Data for Interventional Radiology using Myoelectric Gesture Control},
author = {J Hettig and A Mewes and O Riabikin and M Skalej and B Preim and C Hansen},
url = {https://www.var.ovgu.de/pub/2015_Hettig_MyoArmband_VCBM.pdf},
year = {2015},
date = {2015-01-01},
abstract = {Human-computer interaction with medical images in a sterile environment is a challenging task. It is often delegated to an assistant or performed directly by the physician with an interaction device wrapped in a sterile plastic sheath. This process is time-consuming and inefficient. To address this challenge, we introduce a gesture-based interface for a medical image viewer that is completely touchlessly controlled by the Myo Gesture Control Armband (Thalmic Labs). Based on a clinical requirement analysis, we propose a minimal gesture set to support basic interaction tasks with radiological images and 3D models. We conducted two user studies and a clinical test to evaluate the interaction device and our new gesture control interface. The evaluation results prove the applicability of our approach and provide an important foundation for future research in physician-machine interaction.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}
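A minimal gesture set such as the one proposed for the Myo armband can be wired to the image viewer as a plain lookup from recognised gesture labels to viewer commands. The labels and actions in this sketch are illustrative assumptions; the paper's actual gesture-to-function mapping may differ.

```python
class ImageViewer:
    """Stand-in for the medical image viewer controlled by gestures."""

    def __init__(self, num_slices=120):
        self.slice_index = 0
        self.num_slices = num_slices

    def next_slice(self):
        self.slice_index = min(self.slice_index + 1, self.num_slices - 1)

    def previous_slice(self):
        self.slice_index = max(self.slice_index - 1, 0)

    def reset_view(self):
        self.slice_index = 0

viewer = ImageViewer()

# Hypothetical mapping from recognised EMG gesture labels to viewer actions.
GESTURE_ACTIONS = {
    "wave_out": viewer.next_slice,
    "wave_in": viewer.previous_slice,
    "double_tap": viewer.reset_view,
}

def on_gesture(label):
    """Dispatch a recognised gesture label to the corresponding command."""
    action = GESTURE_ACTIONS.get(label)
    if action:
        action()

for g in ["wave_out", "wave_out", "wave_in", "unknown"]:
    on_gesture(g)
print("current slice:", viewer.slice_index)   # -> 1
```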
2014
Mewes, A; Adler, S; Rose, G; Hansen, C
Augmented-Reality-Mikroskop - Implementierung einer flexiblen Datenverbindung zwischen CT-Angiographieanlage und Mikroskop Proceedings Article
In: Proceedings of the Annual Meeting of the German Society of Computer- and Robot-Assisted Surgery, pp. 28–31, München, 2014.
@inproceedings{mewes_augmented-reality-mikroskop_2014,
title = {Augmented-Reality-Mikroskop - Implementierung einer flexiblen Datenverbindung zwischen CT-Angiographieanlage und Mikroskop},
author = {A Mewes and S Adler and G Rose and C Hansen},
url = {https://www.var.ovgu.de/pub/Mewes_2014_CURAC.pdf},
year = {2014},
date = {2014-01-01},
booktitle = {Proceedings of the Annual Meeting of the German Society of Computer- and Robot-Assisted Surgery},
pages = {28–31},
address = {München},
abstract = {During a neurosurgical intervention, the surgeon must be able to update diagnostic images. With the neurosurgical microscope used here, diagnostic images can be displayed as virtual models and cross-sectional images in order to avoid shifting the gaze between the patient and external monitors. To enable an intraoperative update of the models, in this work the microscope was connected to a C-arm angiography system via a PACS server. In addition, an efficient workflow is presented that includes, among other things, model generation from the DICOM images using the development environment MeVisLab. Experiments carried out under laboratory conditions yielded a mean model update time of at most 18.5 ± 2.5 min. The deviation of the virtual overlay from real structures in the microscope is essentially in the submillimeter range. With this workflow, the microscope system forms a solid research platform for future projects in the field of neurosurgical navigation.},
keywords = {},
pubstate = {published},
tppubtype = {inproceedings}
}
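The workflow in this paper generates surface models from intraoperatively acquired DICOM images in MeVisLab. As a rough, library-level illustration of the same idea outside MeVisLab, the following sketch stacks a DICOM series into a volume and extracts an isosurface; the file path and the threshold are placeholders, not values from the paper.

```python
import glob
import numpy as np
import pydicom
from skimage import measure

def load_volume(dicom_dir):
    """Stack a single-series DICOM folder into a 3D volume, slices sorted by position."""
    slices = [pydicom.dcmread(f) for f in glob.glob(dicom_dir + "/*.dcm")]
    slices.sort(key=lambda s: float(s.ImagePositionPatient[2]))
    return np.stack([s.pixel_array for s in slices]).astype(np.float32)

def extract_surface(volume, threshold):
    """Extract a triangle mesh at the given intensity threshold (marching cubes)."""
    verts, faces, normals, values = measure.marching_cubes(volume, level=threshold)
    return verts, faces

# Placeholder path and threshold for illustration only.
# volume = load_volume("/data/intraop_series")
# verts, faces = extract_surface(volume, threshold=300.0)
```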
Undated
Saalfeld, P; Mewes, A; Hansen, C; Preim, B
Gaze-Based Annotations: Labels on Demand Journal Article
In: n.d.
@article{saalfeld_gaze-based_201,
title = {Gaze-Based Annotations: Labels on Demand},
author = {P Saalfeld and A Mewes and C Hansen and B Preim},
url = {https://www.var.ovgu.de/pub/Saalfeld_2015_CURAC.pdf},
abstract = {We present an approach that tracks the gaze position of a user to determine a structure of interest in a medical planning model. This structure is automatically and dynamically annotated with an external label. The approach considers aspects from hand-crafted illustrations and interactive applications. In general, labels are simultaneously shown for all structures in interactive applications. However, this leads to visual clutter and is costly regarding computation time. Both aspects could be avoided if annotations are only shown on the user's demand, i.e., only one label at a time. We use an eye tracker to obtain the user's gaze positions and, thus, the structure of interest. We address problems due to imprecise input by smoothing the gaze data and by increasing the selection area around each structure. The prototype was evaluated with an unstructured interview, which confirmed the suitability of our approach, e.g., during interventions where surgeons benefit from sterile interaction.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}
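The approach described in this last entry smooths the noisy gaze signal and enlarges each structure's selection area before deciding which label to show. A compact sketch of those two steps is given below; the smoothing factor, the selection margin and the scene representation are illustrative assumptions rather than parameters from the paper.

```python
import numpy as np

class GazeSelector:
    """Select the structure a user looks at from noisy 2D gaze samples."""

    def __init__(self, structures, alpha=0.2, margin=25.0):
        # structures: {name: (center_xy, radius)} in screen pixels
        self.structures = structures
        self.alpha = alpha            # exponential smoothing factor
        self.margin = margin          # extra selection radius in pixels
        self.smoothed = None

    def update(self, gaze_xy):
        """Feed one gaze sample; return the selected structure name or None."""
        gaze = np.asarray(gaze_xy, dtype=float)
        if self.smoothed is None:
            self.smoothed = gaze
        else:
            self.smoothed = self.alpha * gaze + (1 - self.alpha) * self.smoothed
        best, best_dist = None, float("inf")
        for name, (center, radius) in self.structures.items():
            dist = np.linalg.norm(self.smoothed - np.asarray(center))
            if dist <= radius + self.margin and dist < best_dist:
                best, best_dist = name, dist
        return best

selector = GazeSelector({"liver": ((400.0, 300.0), 60.0),
                         "vessel": ((700.0, 320.0), 30.0)})
for sample in [(410, 295), (405, 310), (398, 305)]:
    label = selector.update(sample)
print("show label for:", label)   # -> liver
```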