Dr. Florian Heinrich
Short Bio
I am a postdoctoral researcher in the Virtual and Augmented Reality Group. After completing my Bachelor’s and Master’s degrees in Computational Visualistics and Computer Science at the University of Magdeburg, I earned my PhD on projector-based augmented reality for needle-based medical interventions. Following a period as a senior researcher at the University of Würzburg, I returned to Magdeburg, where I currently lead the junior research group on Human-Machine Interaction at the Research Campus STIMULATE for medical engineering. In this role, I contribute to interdisciplinary research projects and teaching activities in the field of extended reality.
| Current position | Postdoctoral Researcher & Research Group Leader of the Human-Machine Interaction Group, Research Campus STIMULATE |
|---|---|
| Academic degree | PhD in Computer Science (Dr.-Ing.), Otto von Guericke University Magdeburg |
| Previous position | Senior Research Scientist, Chair of Human-Computer Interaction, University of Würzburg |
| Awards | Faculty Research Award (2024), medvis Award / Karl-Heinz-Höhne Prize (2021), Best Paper Awards and Honorable Mentions at IEEE ISMAR 2025, ACM VRST 2024, HRI 2025 and IEEE VR 2023 |
| Teaching | Virtual and Augmented Reality; Three-Dimensional and Advanced Interaction |
You can also find me here:
Google Scholar | LinkedIn | ORCID | ResearchGate
Research Interests
My research focuses on XR technologies for medical applications. I am particularly interested in novel visualization and interaction concepts for augmented and virtual environments. This includes:
- Visualization methods for medical data in XR
- Projector-based augmented reality and its simulation in virtual reality
- Interaction techniques and interfaces for XR and collaborative robotic systems
- XR-based support for clinical workflows and medical training
- Evaluation of user experience and usability in immersive environments
Publications
2025

Mielke, T; Allgaier, M; Hansen, C; Heinrich, F
Extended Reality Check: Evaluating XR Prototyping for Human-Robot Interaction in Contact-Intensive Tasks Journal Article
In: IEEE Transactions on Visualization and Computer Graphics, vol. 31, no. 11, pp. 10035–10044, 2025.
@article{mielke_extended_2025,
title = {Extended Reality Check: Evaluating XR Prototyping for Human-Robot Interaction in Contact-Intensive Tasks},
author = {T Mielke and M Allgaier and C Hansen and F Heinrich},
doi = {10.1109/TVCG.2025.3616753},
year = {2025},
date = {2025-10-02},
urldate = {2025-10-02},
journal = {IEEE Transactions on Visualization and Computer Graphics},
volume = {31},
issue = {11},
pages = {10035 - 10044},
abstract = {Extended Reality (XR) has the potential to improve efficiency and safety in the user-centered development of human-robot interaction. However, the validity of using XR prototyping for user studies for contact-intensive robotic tasks remains underexplored. These in-contact tasks are particularly relevant due to challenges arising from indirect force perception in robot control. Therefore, in this work, we investigate a representative example of such a task: robotic ultrasound. A user study was conducted to assess the transferability of results from a simulated user study to real-world conditions, comparing two force-assistance approaches. The XR simulation replicates the physical study set-up employing a virtual robotic arm, its control interface, ultrasound imaging, and two force-assistance methods: automation and force visualization. Our results indicate that while differences in force deviation, perceived workload, and trust emerge between real and simulated setups, the overall findings remain consistent. Specifically, partial automation of robot control improves performance and trust while reducing workload, and visual feedback decreases force deviation in both real and simulated conditions. These findings highlight the potential of XR for comparative studies, even in complex robotic tasks.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

Mielke, T; Heinrich, F; Hansen, C
Gesturing Towards Efficient Robot Control: Exploring Sensor Placement and Control Modes for Mid-Air Human-Robot Interaction Proceedings Article
In: 2025 IEEE International Conference on Robotics and Automation (ICRA), 2025.
@inproceedings{mielke_gesturing_2025,
title = {Gesturing Towards Efficient Robot Control: Exploring Sensor Placement and Control Modes for Mid-Air Human-Robot Interaction},
author = {T Mielke and F Heinrich and C Hansen},
doi = {10.1109/ICRA55743.2025.11127519},
year = {2025},
date = {2025-09-02},
urldate = {2025-01-01},
booktitle = {2025 IEEE International Conference on Robotics and Automation (ICRA)},
abstract = {While collaborative robots effectively combine robotic precision with human capabilities, traditional control methods such as button presses or hand guidance can be slow and physically demanding. This has led to an increasing interest in natural user interfaces that integrate hand gesture-based interactions for more intuitive and flexible robot control. Therefore, this paper systematically explores mid-air robot control by comparing position and rate control modes with different state-of-the-art and novel sensor placements. A user study was conducted to evaluate each combination in terms of accuracy, task duration, perceived workload, and physical exertion. Our results indicate that position control is more efficient than rate control. Traditional desk-mounted sensors can provide a good balance between accuracy and comfort. However, robot-mounted sensors are a viable alternative for short-term, accurate control with less spatial requirements. Leg-mounted sensors, while comfortable, pose challenges to hand-eye coordination. Based on these findings, we provide design implications for improving the usability and comfort of mid-air human-robot interaction. Future research should extend this evaluation to a wider range of tasks and environments.},
keywords = {},
pubstate = {published},
tppubtype = {inproceedings}
}

Mielke, T; Heinrich, F; Hansen, C
Enhancing AR-to-Robot Registration Accuracy: A Comparative Study of Marker Detection Algorithms and Registration Parameters Proceedings Article
In: 2025 IEEE International Conference on Robotics and Automation (ICRA), IEEE, 2025.
@inproceedings{mielke_enhancing_2025,
title = {Enhancing AR-to-Robot Registration Accuracy: A Comparative Study of Marker Detection Algorithms and Registration Parameters},
author = {T Mielke and F Heinrich and C Hansen},
doi = {10.1109/ICRA55743.2025.11128039},
year = {2025},
date = {2025-09-02},
urldate = {2025-09-02},
booktitle = {2025 IEEE International Conference on Robotics and Automation (ICRA)},
publisher = {IEEE},
abstract = {Augmented Reality (AR) offers potential for enhancing human-robot collaboration by enabling intuitive interaction and real-time feedback. A crucial aspect of AR-robot integration is accurate spatial registration to align virtual content with the physical robotic workspace. This paper systematically investigates the effects of different tracking techniques and registration parameters on AR-to-robot registration accuracy, focusing on paired-point methods. We evaluate four marker detection algorithms - ARToolkit, Vuforia, ArUco, and retroreflective tracking - analyzing the influence of viewing distance, angle, marker size, point distance, distribution, and quantity. Our results show that ARToolkit provides the highest registration accuracy. While larger markers and positioning registration point centroids close to target locations consistently improved accuracy, other factors such as point distance and quantity were highly dependent on the tracking techniques used. Additionally, we propose an effective refinement method using point cloud registration, significantly improving accuracy by integrating data from points recorded between registration locations. These findings offer practical guidelines for enhancing AR-robot registration, with future work needed to assess the transferability to other AR devices and robots.},
keywords = {},
pubstate = {published},
tppubtype = {inproceedings}
}
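The entry above evaluates paired-point AR-to-robot registration and a point-cloud-based refinement. As a rough illustration of how such a paired-point alignment can be computed, the following sketch estimates a rigid transform between corresponding marker positions with the Kabsch algorithm; it is a minimal example under assumed inputs, not the implementation used in the paper, and all names are placeholders.

```python
import numpy as np

def paired_point_registration(points_ar, points_robot):
    """Estimate the rigid transform (R, t) mapping AR-tracked points onto
    robot-frame points via the Kabsch algorithm (illustrative sketch only)."""
    P = np.asarray(points_ar, dtype=float)      # N x 3, detected marker positions (AR frame)
    Q = np.asarray(points_robot, dtype=float)   # N x 3, corresponding robot-frame positions

    # Center both point sets on their centroids.
    cP, cQ = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cP).T @ (Q - cQ)

    # Optimal rotation from the SVD of the cross-covariance matrix.
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cQ - R @ cP
    return R, t

def fiducial_registration_error(points_ar, points_robot, R, t):
    """Mean distance between transformed AR points and their robot-frame targets."""
    P = np.asarray(points_ar, dtype=float)
    Q = np.asarray(points_robot, dtype=float)
    return float(np.linalg.norm((P @ R.T + t) - Q, axis=1).mean())
```

A refinement step in the spirit of the paper would additionally feed points recorded between the registration locations into an iterative point-cloud alignment (e.g., ICP) initialized with this transform.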

Schreiter, J; Heinrich, F; Hatscher, B; Schott, D; Hansen, C
Multimodal human–computer interaction in interventional radiology and surgery: a systematic literature review Journal Article
In: International Journal of Computer Assisted Radiology and Surgery, vol. 20, no. 4, pp. 807–816, 2025, ISSN: 1861-6429.
@article{schreiter_multimodal_2025,
title = {Multimodal human–computer interaction in interventional radiology and surgery: a systematic literature review},
author = {J Schreiter and F Heinrich and B Hatscher and D Schott and C Hansen},
url = {https://doi.org/10.1007/s11548-024-03263-3},
doi = {10.1007/s11548-024-03263-3},
issn = {1861-6429},
year = {2025},
date = {2025-04-01},
urldate = {2025-04-01},
journal = {International Journal of Computer Assisted Radiology and Surgery},
volume = {20},
number = {4},
pages = {807–816},
abstract = {As technology advances, more research dedicated to medical interactive systems emphasizes the integration of touchless and multimodal interaction (MMI). Particularly in surgical and interventional settings, this approach is advantageous because it maintains sterility and promotes a natural interaction. Past reviews have focused on investigating MMI in terms of technology and interaction with robots. However, none has put particular emphasis on analyzing these kinds of interactions for surgical and interventional scenarios.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

Schwenderling, L; Schotte, M; Joeres, F; Heinrich, F; Hanke, L; Huettl, F; Huber, T; Hansen, C
Teach Me Where to Look: Dual-task Attention Training in Augmented Reality Proceedings Article
In: Proceedings of the Extended Abstracts of the CHI Conference on Human Factors in Computing Systems, pp. 1–8, ACM, Yokohama, Japan, 2025, ISBN: 979-8-4007-1395-8.
@inproceedings{schwenderling_teach_2025,
title = {Teach Me Where to Look: Dual-task Attention Training in Augmented Reality},
author = {L Schwenderling and M Schotte and F Joeres and F Heinrich and L Hanke and F Huettl and T Huber and C Hansen},
url = {https://dl.acm.org/doi/10.1145/3706599.3720198},
doi = {10.1145/3706599.3720198},
isbn = {979-8-4007-1395-8},
year = {2025},
date = {2025-04-01},
urldate = {2025-04-01},
booktitle = {Proceedings of the Extended Abstracts of the CHI Conference on Human Factors in Computing Systems},
pages = {1–8},
publisher = {ACM},
address = {Yokohama Japan},
abstract = {Regular eye contact is essential in medicine to recognize signs of pain. However, it is difficult to remember this during training as attention is tied up in learning. While augmented reality (AR) has shown promising results for medical education, there is no training for attention allocation yet. Therefore, three auditory and three visual attention guidance tools in AR are evaluated for their use in medical dual-task training settings. In expert reviews with six participants in human-computer interaction and medical didactics, advantages, disadvantages, and refinements for the cues were developed. For visual cues, an overt but less occluding cue was preferred for constant visibility of the primary task. A more diegetic cue design was proposed for the auditory cues to use a patient simulation as a reminder of the regular face glance. In general, several cues were found to be suitable for gaze guidance training, requiring only minor changes for improvement.},
keywords = {},
pubstate = {published},
tppubtype = {inproceedings}
}

Mielke, T; Allgaier, M; Schott, D; Hansen, C; Heinrich, F
Virtual Studies, Real Results? Assessing the Impact of Virtualization on Human-Robot Interaction Proceedings Article
In: Proceedings of the Extended Abstracts of the CHI Conference on Human Factors in Computing Systems, pp. 1–8, ACM, Yokohama, Japan, 2025, ISBN: 979-8-4007-1395-8.
@inproceedings{mielke_virtual_2025,
title = {Virtual Studies, Real Results? Assessing the Impact of Virtualization on Human-Robot Interaction},
author = {T Mielke and M Allgaier and D Schott and C Hansen and F Heinrich},
url = {https://dl.acm.org/doi/10.1145/3706599.3719724},
doi = {10.1145/3706599.3719724},
isbn = {979-8-4007-1395-8},
year = {2025},
date = {2025-04-01},
urldate = {2025-04-01},
booktitle = {Proceedings of the Extended Abstracts of the CHI Conference on Human Factors in Computing Systems},
pages = {1–8},
publisher = {ACM},
address = {Yokohama Japan},
abstract = {Extended Reality (XR) shows potential for human-centered evaluation of real-world scenarios and could improve efficiency and safety in robotic research. However, the validity of XR Human-Robot Interaction (HRI) studies remains underexplored. This paper investigates the transferability of HRI studies across virtualization levels for three tasks. Our results indicate XR study validity is task-specific, with task virtualization as a key influencing factor. Partially virtualized settings with virtual tasks and a real robot, as well as fully virtualized setups with a simulated robot, yielded results comparable to real setups for pick-and-place and robotic ultrasound. However, for precision-dependent peg-in-hole, differences were observed between real and virtualized conditions regarding completion time, perceived workload, and ease. Demonstrating the task dependency of XR transferability and comparing virtualization levels, our work takes an important step in assessing XR study validity. Future work should isolate factors affecting transferability and assess relative validity in the absence of absolute validity.},
keywords = {},
pubstate = {published},
tppubtype = {inproceedings}
}

Heinrich, F; Schott, D; Schwenderling, L; Hansen, C
Do You See What I See? Evaluating Relative Depth Judgments Between Real and Virtual Projections Proceedings Article
In: Proceedings of the Extended Abstracts of the CHI Conference on Human Factors in Computing Systems, pp. 1–8, Association for Computing Machinery, New York, NY, USA, 2025, ISBN: 979-8-4007-1395-8.
@inproceedings{heinrich_you_2025,
title = {Do You See What I See? Evaluating Relative Depth Judgments Between Real and Virtual Projections},
author = {F Heinrich and D Schott and L Schwenderling and C Hansen},
url = {https://doi.org/10.1145/3706599.3720157},
doi = {10.1145/3706599.3720157},
isbn = {979-8-4007-1395-8},
year = {2025},
date = {2025-04-01},
urldate = {2025-04-01},
booktitle = {Proceedings of the Extended Abstracts of the CHI Conference on Human Factors in Computing Systems},
pages = {1–8},
publisher = {Association for Computing Machinery},
address = {New York, NY, USA},
series = {CHI EA '25},
abstract = {Projector-based augmented reality (AR) is promising in different domains with less issues in discomfort or shortage of space. However, due to limitations like high costs and cumbersome calibration, this AR modality remains underused. To address this problem, a stereoscopic projector-based AR simulation was implemented for a cost-effective video see-through AR headset. To evaluate the validity of this simulation, a relative depth judgment experiment was conducted to compare this method with a physical projection system. Consistent results suggest that a known interaction effect between visualization and disparity mode could be successfully reproduced using both the physical projection and the virtual simulation. In addition, first findings indicate that there are no significant differences between these projection modalities. The results indicate that other perception-related effects observed for projector-based AR may also be applicable to virtual projection simulations and that future findings determined using only these simulations may also be applicable to real projections.},
keywords = {},
pubstate = {published},
tppubtype = {inproceedings}
}

Mielke, T; Heinrich, F; Hansen, C
SensARy Substitution: Augmented Reality Techniques to Enhance Force Perception in Touchless Robot Control Journal Article
In: IEEE Transactions on Visualization and Computer Graphics, vol. 31, no. 5, pp. 3235–3244, 2025, ISSN: 1941-0506.
@article{mielke_sensary_2025,
title = {SensARy Substitution: Augmented Reality Techniques to Enhance Force Perception in Touchless Robot Control},
author = {T Mielke and F Heinrich and C Hansen},
url = {https://ieeexplore.ieee.org/document/10926846},
doi = {10.1109/TVCG.2025.3549856},
issn = {1941-0506},
year = {2025},
date = {2025-03-14},
urldate = {2025-05-01},
journal = {IEEE Transactions on Visualization and Computer Graphics},
volume = {31},
number = {5},
pages = {3235–3244},
abstract = {The lack of haptic feedback in touchless human-robot interaction is critical in applications such as robotic ultrasound, where force perception is crucial to ensure image quality. Augmented reality (AR) is a promising tool to address this limitation by providing sensory substitution through visual or vibrotactile feedback. The implementation of visual force feedback requires consideration not only of feedback design but also of positioning. Therefore, we implemented two different visualization types at three different positions and investigated the effects of vibrotactile feedback on these approaches. Furthermore, we examined the effects of multimodal feedback compared to visual or vibrotactile output alone. Our results indicate that sensory substitution eases the interaction in contrast to a feedback-less baseline condition, with the presence of visual support reducing average force errors and being subjectively preferred by the participants. However, the more feedback was provided, the longer users needed to complete their tasks. Regarding visualization design, a 2D bar visualization reduced force errors compared to a 3D arrow concept. Additionally, the visualizations being displayed directly on the ultrasound screen were subjectively preferred. With findings regarding feedback modality and visualization design our work represents an important step toward sensory substitution for touchless human-robot interaction.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}
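The SensARy Substitution entry above compares visual force-feedback designs, including a 2D bar that displays the currently applied contact force relative to a target range. Purely as an illustration of such a mapping, the sketch below derives a bar fill and a color state from a measured force; the target window, thresholds, and colors are invented placeholders, not the study's parameters.

```python
def force_bar_state(measured_force_n, target_min_n=5.0, target_max_n=10.0, bar_max_n=20.0):
    """Map a measured contact force (in newtons) to a normalized bar fill and a
    simple status used to color the bar. Thresholds are illustrative only."""
    fill = max(0.0, min(measured_force_n / bar_max_n, 1.0))   # 0..1 bar height
    if measured_force_n < target_min_n:
        status = "too_low"      # e.g. render the bar in blue
    elif measured_force_n > target_max_n:
        status = "too_high"     # e.g. render the bar in red
    else:
        status = "in_range"     # e.g. render the bar in green
    return fill, status

# Example: 12 N applied against a hypothetical 5-10 N target window
print(force_bar_state(12.0))    # -> (0.6, 'too_high')
```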

Schreiter, J; Mielke, T; Georgiades, M; Pech, M; Hansen, C; Heinrich, F
Exploring Interaction Concepts for the Manipulation of a Collaborative Robot: A Comparative Study Proceedings Article
In: Proceedings of the 2025 ACM/IEEE International Conference on Human-Robot Interaction, pp. 55–64, IEEE Press, Melbourne, Australia, 2025.
@inproceedings{schreiter_exploring_2025,
title = {Exploring Interaction Concepts for the Manipulation of a Collaborative Robot: A Comparative Study},
author = {J Schreiter and T Mielke and M Georgiades and M Pech and C Hansen and F Heinrich},
year = {2025},
date = {2025-01-01},
urldate = {2025-01-01},
booktitle = {Proceedings of the 2025 ACM/IEEE International Conference on Human-Robot Interaction},
pages = {55–64},
publisher = {IEEE Press},
address = {Melbourne, Australia},
series = {HRI '25},
abstract = {Robotic systems have the potential to enhance a wide range of domains, such as medical workflows, by automating individual steps of complex processes. However, human-robot interaction (HRI) is of critical importance, as effective collaboration between humans and robots is essential even in highly automated environments. Recent research has predominantly focused on the development of interaction methods rather than systematically comparing existing approaches. Therefore, we conducted a user study (n=20) to compare different HRI concepts for end effector manipulation combined with clutching mechanisms for manipulation activation in an alignment task using the example of robotic ultrasound (US). Manipulation methods included hand-guiding, teleoperation, and touchless interaction, while clutching mechanisms were realized through hand, voice, and foot interaction. The results indicate advantages of hand-guiding for manipulation. While no significant differences were observed between clutching mechanisms, strong evidence suggests comparable performance across these modalities. Notably, significant interaction effects on perceived workload reveal that the optimal clutching mechanism depends on the selected manipulation technique. This work underscores the critical importance of selecting appropriate HRI concepts and understanding the dependencies of manipulation techniques with clutching mechanisms. While our study included the usage of a robotic US, the insights gained are broadly transferable across various domains involving robotic manipulation tasks in human-robot collaboration.},
keywords = {},
pubstate = {published},
tppubtype = {inproceedings}
}
2024

Joeres, F; Zittlau, P; Herbrich, W; Heinrich, F; Rose, G; Hansen, C
Concept development of a cross-reality ecosystem for urban knowledge transfer spaces Proceedings Article
In: 2024 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct), pp. 166–169, 2024, (ISSN: 2771-1110).
@inproceedings{joeres_concept_2024,
title = {Concept development of a cross-reality ecosystem for urban knowledge transfer spaces},
author = {F Joeres and P Zittlau and W Herbrich and F Heinrich and G Rose and C Hansen},
url = {https://ieeexplore.ieee.org/abstract/document/10765174},
doi = {10.1109/ISMAR-Adjunct64951.2024.00043},
year = {2024},
date = {2024-10-01},
urldate = {2024-10-01},
booktitle = {2024 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct)},
pages = {166–169},
abstract = {This paper presents the development of a cross-reality (CR) ecosystem designed for an urban knowledge transfer space (KTS) in a post-industrial urban environment. The project is part of a larger initiative aimed at transforming a former industrial river port into a dynamic KTS, facilitating interactions between scientific, commercial, residential, and cultural stakeholders. Our research explores the potential of multimodal mixed reality (XR) technologies to enhance engagement with the content and stakeholders of the KTS. Through a three-phase process, we identified key stakeholders and their target audiences, selected appropriate XR technologies, and developed initial use cases that integrate web applications, mobile augmented reality (AR), and XR head-mounted displays. The preliminary findings indicate that these technologies can effectively cater to diverse user groups, providing different levels of virtuality and interaction. However, challenges remain, particularly in stakeholder engagement and the evolving nature of the KTS initiative. Ongoing work includes the development of a Web-XR-based prototype, which will be iteratively refined to better meet user needs and adapt to future technological advancements. This research contributes to the understanding of how CR technologies can be employed in urban transformation processes, offering insights into the design of flexible and scalable CR ecosystems.},
note = {ISSN: 2771-1110},
keywords = {},
pubstate = {published},
tppubtype = {inproceedings}
}

Schott, D; Heinrich, F; Kunz, M; Mandel, J; Albrecht, A; Braun-Dullaeus, R; Hansen, C
CardioCoLab: Collaborative Learning of Embryonic Heart Anatomy in Mixed Reality Journal Article
In: Eurographics Workshop on Visual Computing for Biology and Medicine, The Eurographics Association, 2024, ISBN: 9783038682448.
@article{schott_cardiocolab_2024,
title = {CardioCoLab: Collaborative Learning of Embryonic Heart Anatomy in Mixed Reality},
author = {D Schott and F Heinrich and M Kunz and J Mandel and A Albrecht and R Braun-Dullaeus and C Hansen},
url = {https://diglib.eg.org/handle/10.2312/vcbm20241191},
doi = {10.2312/VCBM.20241191},
year = {2024},
date = {2024-01-01},
urldate = {2024-01-01},
journal = {Eurographics Workshop on Visual Computing for Biology and Medicine},
abstract = {The complexity of embryonic heart development presents significant challenges for medical education, particularly in illustrating dynamic morphological changes over short time periods. Traditional teaching methods, such as 2D textbook illustrations and static models, are often insufficient for conveying these intricate processes. To address this gap, we developed a multi-user Mixed Reality (MR) system designed to enhance collaborative learning and interaction with virtual heart models. Building on previous research, we identified the needs of both students and teachers, implementing various interaction and visualization features iteratively. An evaluation with teachers and students (N = 12) demonstrated the system's effectiveness in improving engagement and understanding of embryonic heart development. The study highlights the potential of MR in medical seminar settings as a valuable addition to medical education by enhancing traditional learning methods.},
note = {Artwork Size: 5 pages
Edition: 1191
ISBN: 9783038682448
Publisher: The Eurographics Association},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

Polenz, L; Joeres, F; Hansen, C; Heinrich, F
Simulating projective Augmented Reality Visualizations in Virtual Reality: Is VR a feasible Environment for medical AR Evaluations? Proceedings Article
In: Extended Abstracts of the CHI Conference on Human Factors in Computing Systems, pp. 1–8, Association for Computing Machinery, New York, NY, USA, 2024, ISBN: 979-8-4007-0331-7.
@inproceedings{polenz_simulating_2024,
title = {Simulating projective Augmented Reality Visualizations in Virtual Reality: Is VR a feasible Environment for medical AR Evaluations?},
author = {L Polenz and F Joeres and C Hansen and F Heinrich},
url = {https://doi.org/10.1145/3613905.3650843},
doi = {10.1145/3613905.3650843},
isbn = {979-8-4007-0331-7},
year = {2024},
date = {2024-01-01},
urldate = {2024-01-01},
booktitle = {Extended Abstracts of the CHI Conference on Human Factors in Computing Systems},
pages = {1–8},
publisher = {Association for Computing Machinery},
address = {New York, NY, USA},
series = {CHI EA '24},
abstract = {Augmented Reality (AR) has demonstrated potential in medical applications, such as enhancing surgical navigation. However, evaluating medical AR visualizations entails high costs and effort to provide suitable hardware solutions. This is particularly crucial in projective AR, as these systems require several error-prone calibration and registration steps. This work investigates the suitability of Virtual Reality (VR) as a cost-effective and controlled study environment for evaluating projective AR visualizations. A virtual twin of a real laboratory environment was created, and a user study comparing two needle navigation visualizations was conducted. The study simulated identical experiments in both AR and VR to assess if similar results would emerge. Our findings indicate that both AR and VR experiments exhibited comparable effects in terms of performance and workload of both needle insertion visualizations. This study serves as a preliminary step in demonstrating the feasibility of using VR as an evaluation environment for projective AR visualizations.},
keywords = {},
pubstate = {published},
tppubtype = {inproceedings}
}

Schott, D; Kunz, M; Heinrich, F; Mandel, J; Albrecht, A; Braun-Dullaeus, R; Hansen, C
Stand Alone or Stay Together: An In-situ Experiment of Mixed-Reality Applications in Embryonic Anatomy Education Proceedings Article
In: Proceedings of the 30th ACM Symposium on Virtual Reality Software and Technology, pp. 1–11, Association for Computing Machinery, New York, NY, USA, 2024, ISBN: 979-8-4007-0535-9.
@inproceedings{schott_stand_2024,
title = {Stand Alone or Stay Together: An In-situ Experiment of Mixed-Reality Applications in Embryonic Anatomy Education},
author = {D Schott and M Kunz and F Heinrich and J Mandel and A Albrecht and R Braun-Dullaeus and C Hansen},
url = {https://dl.acm.org/doi/10.1145/3641825.3687706},
doi = {10.1145/3641825.3687706},
isbn = {979-8-4007-0535-9},
year = {2024},
date = {2024-01-01},
urldate = {2024-01-01},
booktitle = {Proceedings of the 30th ACM Symposium on Virtual Reality Software and Technology},
pages = {1–11},
publisher = {Association for Computing Machinery},
address = {New York, NY, USA},
series = {VRST '24},
abstract = {Where traditional media and methods reach their limits in anatomy education, mixed-reality (MR) environments can provide effective learning support because of their high interactivity and spatial visualization capabilities. However, the underlying design and pedagogical requirements are as diverse as the technologies themselves. This paper examines the effectiveness of individual- and collaborative learning environments for anatomy education, using embryonic heart development as an example. Both applications deliver the same content using identical visualizations and hardware but differ in interactivity and pedagogical approach. The environments were evaluated in a user study with medical students (n = 90) during their examination phase, assessing usability, user experience, social interaction/co-presence, cognitive load, and personal preference. Additionally, we conducted a knowledge test before and after an MR learning session to determine educational effects compared to a conventional anatomy seminar. Results indicate that the individual learning environment was generally preferred. However, no significant difference in learning effectiveness could be shown between the conventional approach and the MR applications. This suggests that both can effectively complement traditional seminars despite their different natures. Our study contributes to understanding how different MR settings could be tailored for anatomical education.},
keywords = {},
pubstate = {published},
tppubtype = {inproceedings}
}
2023

Heinrich, F; Bornemann, K; Polenz, L; Lawonn, K; Hansen, C
Clutch & Grasp: Activation gestures and grip styles for device-based interaction in medical spatial augmented reality Journal Article
In: International Journal of Human-Computer Studies, vol. 180, pp. 103117, 2023, ISSN: 1071-5819.
@article{heinrich_clutch_2023,
title = {Clutch & Grasp: Activation gestures and grip styles for device-based interaction in medical spatial augmented reality},
author = {F Heinrich and K Bornemann and L Polenz and K Lawonn and C Hansen},
url = {https://www.sciencedirect.com/science/article/pii/S107158192300126X},
doi = {10.1016/j.ijhcs.2023.103117},
issn = {1071-5819},
year = {2023},
date = {2023-12-01},
urldate = {2023-12-01},
journal = {International Journal of Human-Computer Studies},
volume = {180},
pages = {103117},
abstract = {Presenting medical volume data using augmented reality (AR) can facilitate the identification of anatomical structures, the perception of their spatial relations and the development of mental maps compared to more commonly used monitors. However, interaction methods explored in these conventional settings may not be applicable in AR environments, or perform differently. In terms of mode activation, gestural interaction was shown to be a viable, touchless alternative to traditional input devices, which is desirable in sterile medical use cases. Therefore, we present a user study (n = 21) comparing hand and foot gestures with voice commands for the activation of interaction modes within a projector-based, spatial AR prototype to visualize medical volume data. Interaction itself was performed via hand movements captured by a data glove. Consistent, statistically significant results across measured variables suggest advantages of voice commands. In addition, a second experiment (n = 17) compared the hand-based interaction with two motion-sensitive devices held in power and in precision grip respectively. All modes were activated using voice commands. No considerable differences between tested grip styles could be determined. The findings suggest that the choice of preferable interaction devices is user and use case dependent.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

Schott, D; Heinrich, F; Stallmeister, L; Moritz, J; Hensen, B; Hansen, C
Is this the vReal Life? Manipulating Visual Fidelity of Immersive Environments for Medical Task Simulation Proceedings Article
In: 2023 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), pp. 1171–1180, IEEE, Sydney, Australia, 2023, ISBN: 979-8-3503-2838-7.
@inproceedings{schott_is_2023,
title = {Is this the vReal Life? Manipulating Visual Fidelity of Immersive Environments for Medical Task Simulation},
author = {D Schott and F Heinrich and L Stallmeister and J Moritz and B Hensen and C Hansen},
url = {https://ieeexplore.ieee.org/document/10316533/},
doi = {10.1109/ISMAR59233.2023.00134},
isbn = {979-8-3503-2838-7},
year = {2023},
date = {2023-10-01},
urldate = {2023-10-01},
booktitle = {2023 IEEE International Symposium on Mixed and Augmented Reality (ISMAR)},
pages = {1171–1180},
publisher = {IEEE},
address = {Sydney, Australia},
abstract = {Recent developments and research advances contribute to an ever-increasing trend towards quality levels close to what we experience in reality. In this work, we investigate how different degrees of these quality characteristics affect user performance, qualia of user experience (UX), and sense of presence in an example medical task. To this end, a two-way within-subjects design user study was conducted, in which three different levels of visual fidelity were compared. In addition, two different interaction modalities were considered: (1) the use of conventional VR controllers and (2) natural hand interaction using 3D-printed, spatially-registered replicas of medical devices, to interact with their virtual representations. Consistent results indicate that higher degrees of visual fidelity evoke a higher sense of presence and UX. However, user performance was less affected. Moreover, no differences were detected between both interaction modalities for the examined task. Future work should investigate the discovered interaction effects between quality levels and interaction modalities in more detail and examine whether these results can be reproduced in tasks that require more precision. This work provides insights into the implications to consider when studying interactions in VR and paves the way for investigations into early phases of medical product development and workflow analysis.},
keywords = {},
pubstate = {published},
tppubtype = {inproceedings}
}

Huettl, F; Heinrich, F; Boedecker, C; Vradelis, L; Ludt, A; Kneist, W; Lang, H; Hansen, C; Huber, T
Real-Time Augmented Reality Annotation for Surgical Education during Laparoscopic Surgery: Results from a Single-Center Randomized Controlled Trial and Future Aspects Journal Article
In: Journal of the American College of Surgeons, vol. 237, no. 2, pp. 292, 2023, ISSN: 1879-1190.
@article{huettl_real-time_2023,
title = {Real-Time Augmented Reality Annotation for Surgical Education during Laparoscopic Surgery: Results from a Single-Center Randomized Controlled Trial and Future Aspects},
author = {F Huettl and F Heinrich and C Boedecker and L Vradelis and A Ludt and W Kneist and H Lang and C Hansen and T Huber},
url = {https://journals.lww.com/journalacs/abstract/2023/08000/real_time_augmented_reality_annotation_for.20.aspx},
doi = {10.1097/XCS.0000000000000712},
issn = {1879-1190},
year = {2023},
date = {2023-08-01},
urldate = {2023-08-01},
journal = {Journal of the American College of Surgeons},
volume = {237},
number = {2},
pages = {292},
abstract = {Background: We developed an interactive augmented reality tool (HoloPointer) that enables real-time annotation on a laparoscopy monitor for intraoperative guidance. This application operates exclusively via verbal commands and head movements to ensure a sterile workflow. Study design: Purpose of this randomized controlled clinical trial was to evaluate the integration of this new technology into the operating room. This prospective single-center study included 32 elective laparoscopic cholecystectomies (29 surgical teams, 15 trainees, 13 trainers). Primary objectives and assessment measures were the HoloPointer's influence on surgical performance (subjective assessment, global operative assessment of laparoscopic skills - GOALS, and Critical View of Safety - CVS). Secondary objectives and outcome variables were its influence on operation time, quality of assistance (5-point Likert scale), and user-friendliness (System Usability Scale - SUS, 0-100 points). Results: Gestural corrections were reduced by 59.4% (4.6 SD 8.1 vs. 1.9 SD 4.7; p > 0.05) and verbal corrections by 36.1% (17.8 SD 12.9 vs. 11.4 SD 8.1; p > 0.05). Subjective surgical performance could be improved by 84.6% of participants. No statistically significant differences were observed for objective parameters GOALS, CVS and operation time. In the SUS, the application achieved an average score of 72.5 SD 16.3 (good user-friendliness). Of the participants, 69.2% wanted to use the HoloPointer more frequently. Conclusion: The majority of trainees had improved their surgical performance using the HoloPointer in elective laparoscopic cholecystectomies, and the rate of classic but potentially misleading corrections was noticeably reduced. The HoloPointer has the potential to improve education in minimally invasive surgery.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

Schwenderling, L; Kleinau, A; Herbrich, W; Kasireddy, H; Heinrich, F; Hansen, C
Activation modes for gesture-based interaction with a magic lens in AR anatomy visualisation Journal Article
In: Computer Methods in Biomechanics and Biomedical Engineering: Imaging & Visualization, vol. 11, no. 4, pp. 1243–1250, 2023, ISSN: 2168-1163, (Publisher: Taylor & Francis).
@article{schwenderling_activation_2023,
title = {Activation modes for gesture-based interaction with a magic lens in AR anatomy visualisation},
author = {L Schwenderling and A Kleinau and W Herbrich and H Kasireddy and F Heinrich and C Hansen},
url = {https://doi.org/10.1080/21681163.2022.2157749},
doi = {10.1080/21681163.2022.2157749},
issn = {2168-1163},
year = {2023},
date = {2023-07-01},
urldate = {2023-07-01},
journal = {Computer Methods in Biomechanics and Biomedical Engineering: Imaging & Visualization},
volume = {11},
number = {4},
pages = {1243–1250},
abstract = {Learning human anatomy is key for health-related education and often requires expensive and time-consuming cadaver dissection courses. Augmented reality (AR) for the representation of spatially registered 3D models can be used as a low-cost and flexible alternative. However, suitable visualisation and interaction approaches are needed to display multilayered anatomy data. This paper features a spherical volumetric AR Magic Lens controlled by mid-air hand gestures to explore the human anatomy on a phantom. Defining how gestures control associated actions is important for intuitive interaction. Therefore, two gesture activation modes were investigated in a user study (n = 24). Performing the gestures once to toggle actions showed a higher interaction count since an additional stop gesture was used. Holding the gestures was favoured in the qualitative feedback. Both modes showed similar performance in terms of accuracy and task completion time. Overall, direct gesture manipulation of a magic lens for anatomy visualisation is, thus, recommended.},
note = {Publisher: Taylor & Francis},
keywords = {},
pubstate = {published},
tppubtype = {article}
}
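The magic-lens entry above contrasts two gesture activation modes: performing a gesture once to toggle an action on and off (with a separate stop gesture) versus holding the gesture for as long as the action should run. A minimal state-machine sketch of the two modes, with hypothetical gesture names that are not taken from the paper:

```python
class ToggleActivation:
    """Performing the gesture once switches the action on; a dedicated stop
    gesture switches it off again (illustrative placeholder names)."""
    def __init__(self):
        self.active = False

    def on_gesture(self, gesture):
        if gesture == "activate":
            self.active = True
        elif gesture == "stop":
            self.active = False
        return self.active


class HoldActivation:
    """The action runs only while the activation gesture is currently detected."""
    def __init__(self):
        self.active = False

    def on_frame(self, gesture_detected: bool):
        self.active = gesture_detected
        return self.active
```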

Schott, D; Kunz, M; Wunderling, T; Heinrich, F; Braun-Dullaeus, R; Hansen, C
CardioGenesis4D: Interactive Morphological Transitions of Embryonic Heart Development in a Virtual Learning Environment Journal Article
In: IEEE Transactions on Visualization and Computer Graphics, vol. 29, no. 5, pp. 2615–2625, 2023, ISSN: 1941-0506.
@article{schott_cardiogenesis4d_2023,
title = {CardioGenesis4D: Interactive Morphological Transitions of Embryonic Heart Development in a Virtual Learning Environment},
author = {D Schott and M Kunz and T Wunderling and F Heinrich and R Braun-Dullaeus and C Hansen},
url = {https://ieeexplore.ieee.org/document/10049681},
doi = {10.1109/TVCG.2023.3247110},
issn = {1941-0506},
year = {2023},
date = {2023-05-01},
urldate = {2023-05-01},
journal = {IEEE Transactions on Visualization and Computer Graphics},
volume = {29},
number = {5},
pages = {2615–2625},
abstract = {In the embryonic human heart, complex dynamic shape changes take place in a short period of time on a microscopic scale, making this development difficult to visualize. However, spatial understanding of these processes is essential for students and future cardiologists to properly diagnose and treat congenital heart defects. Following a user centered approach, the most crucial embryological stages were identified and translated into a virtual reality learning environment (VRLE) to enable the understanding of the morphological transitions of these stages through advanced interactions. To address individual learning types, we implemented different features and evaluated the application regarding usability, perceived task load, and sense of presence in a user study. We also assessed spatial awareness and knowledge gain, and finally obtained feedback from domain experts. Overall, students and professionals rated the application positively. To minimize distraction from interactive learning content, such VRLEs should consider features for different learning types, allow for gradual habituation, and at the same time provide enough playful stimuli. Our work previews how VR can be integrated into a cardiac embryology education curriculum.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}
2022

Schwenderling, L; Heinrich, F; Hansen, C
Augmented reality visualization of automated path planning for percutaneous interventions: a phantom study Journal Article
In: International Journal of Computer Assisted Radiology and Surgery, vol. 17, no. 11, pp. 2071–2079, 2022, ISSN: 1861-6429.
@article{schwenderling_augmented_2022,
title = {Augmented reality visualization of automated path planning for percutaneous interventions: a phantom study},
author = {L Schwenderling and F Heinrich and C Hansen},
url = {https://doi.org/10.1007/s11548-022-02690-4},
doi = {10.1007/s11548-022-02690-4},
issn = {1861-6429},
year = {2022},
date = {2022-11-01},
urldate = {2022-11-01},
journal = {International Journal of Computer Assisted Radiology and Surgery},
volume = {17},
number = {11},
pages = {2071–2079},
abstract = {Insertion point identification is a major challenge for percutaneous interventions. Planning in 2D slice image data is time-consuming and inefficient. Automated path planning can help to overcome these challenges. However, the setup of the intervention room is difficult to consider. In addition, transferring the insertion point to the skin is often prone to error. Therefore, a visualization for an automated path planning was implemented.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

Schreiter, J; Schott, D; Schwenderling, L; Hansen, C; Heinrich, F; Joeres, F
AR-Supported Supervision of Conditional Autonomous Robots: Considerations for Pedicle Screw Placement in the Future Journal Article
In: Journal of Imaging, vol. 8, no. 10, pp. 255, 2022, ISSN: 2313-433X, (Publisher: Multidisciplinary Digital Publishing Institute).
@article{schreiter_ar-supported_2022,
title = {AR-Supported Supervision of Conditional Autonomous Robots: Considerations for Pedicle Screw Placement in the Future},
author = {J Schreiter and D Schott and L Schwenderling and C Hansen and F Heinrich and F Joeres},
url = {https://www.mdpi.com/2313-433X/8/10/255},
doi = {10.3390/jimaging8100255},
issn = {2313-433X},
year = {2022},
date = {2022-10-01},
urldate = {2022-10-01},
journal = {Journal of Imaging},
volume = {8},
number = {10},
pages = {255},
abstract = {Robotic assistance is applied in orthopedic interventions for pedicle screw placement (PSP). While current robots do not act autonomously, they are expected to have higher autonomy under surgeon supervision in the mid-term. Augmented reality (AR) is promising to support this supervision and to enable human–robot interaction (HRI). To outline a futuristic scenario for robotic PSP, the current workflow was analyzed through literature review and expert discussion. Based on this, a hypothetical workflow of the intervention was developed, which additionally contains the analysis of the necessary information exchange between human and robot. A video see-through AR prototype was designed and implemented. A robotic arm with an orthopedic drill mock-up simulated the robotic assistance. The AR prototype included a user interface to enable HRI. The interface provides data to facilitate understanding of the robot’s ”intentions”, e.g., patient-specific CT images, the current workflow phase, or the next planned robot motion. Two-dimensional and three-dimensional visualization illustrated patient-specific medical data and the drilling process. The findings of this work contribute a valuable approach in terms of addressing future clinical needs and highlighting the importance of AR support for HRI.},
note = {Publisher: Multidisciplinary Digital Publishing Institute},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

Schott, D; Heinrich, F; Stallmeister, L; Hansen, C
Exploring object and multi-target instrument tracking for AR-guided interventions Journal Article
In: Current Directions in Biomedical Engineering, vol. 8, no. 1, pp. 74–77, 2022, ISSN: 2364-5504, (Publisher: De Gruyter).
@article{schott_exploring_2022,
title = {Exploring object and multi-target instrument tracking for AR-guided interventions},
author = {D Schott and F Heinrich and L Stallmeister and C Hansen},
url = {https://www.degruyterbrill.com/document/doi/10.1515/cdbme-2022-0019/html},
doi = {10.1515/cdbme-2022-0019},
issn = {2364-5504},
year = {2022},
date = {2022-07-01},
urldate = {2022-07-01},
journal = {Current Directions in Biomedical Engineering},
volume = {8},
number = {1},
pages = {74–77},
abstract = {The rapid development of available hardware and software for computer-assisted or augmented reality (AR) guided interventions creates a need for fast and inexpensive prototyping environments. However, intraoperative tracking systems in particular represent a high cost threshold. Therefore, this work presents a low-cost tracking method based on a conventional RGB camera. Here, a combined approach of multiple image targets and 3D object target recognition is implemented. The system is evaluated with a systematic accuracy assessment analyzing a total of 385 3D positions. On average, a deviation of 15.69 ± 9.95 mm was measured. In addition, a prototypical AR-based needle navigation visualization was developed using Microsoft HoloLens 2. This system’s feasibility and usability was evaluated positively in a pilot study (n=3).},
note = {Publisher: De Gruyter},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

Schott, D; Heinrich, F; Labsch, D; Hensen, B; Hansen, C
Towards multimodal interaction for needle-based procedures in a virtual radiology suite Journal Article
In: Current Directions in Biomedical Engineering, vol. 8, no. 1, pp. 70–73, 2022, ISSN: 2364-5504, (Publisher: De Gruyter).
@article{schott_towards_2022,
title = {Towards multimodal interaction for needle-based procedures in a virtual radiology suite},
author = {D Schott and F Heinrich and D Labsch and B Hensen and C Hansen},
url = {https://www.degruyterbrill.com/document/doi/10.1515/cdbme-2022-0018/html},
doi = {10.1515/cdbme-2022-0018},
issn = {2364-5504},
year = {2022},
date = {2022-07-01},
urldate = {2022-07-01},
journal = {Current Directions in Biomedical Engineering},
volume = {8},
number = {1},
pages = {70–73},
abstract = {Touchless interaction is popular in the medical domain because it maintains sterility and ensures physicians’ autonomy. Evaluating these technologies, however, proves difficult due to technical and human hurdles. Virtual reality leaves these limitations behind and allows for the exploration of promising concepts by simulating an environment and the interactions that take place within it. We present a virtual radiology suite in the context of needle-based MR interventions to evaluate touchless interactions. Hand and foot inputs were implemented on a custom interface and evaluated in a user study (n = 16). Results show that activating the system and manipulating values was faster with foot input. However, multimodal interaction is preferable because it is less demanding.},
note = {Publisher: De Gruyter},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

Heinrich, F; Schwenderling, L; Joeres, F; Hansen, C
2D versus 3D: A Comparison of Needle Navigation Concepts between Augmented Reality Display Devices Proceedings Article
In: 2022 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), pp. 260–269, 2022, (ISSN: 2642-5254).
@inproceedings{heinrich_2d_2022,
title = {2D versus 3D: A Comparison of Needle Navigation Concepts between Augmented Reality Display Devices},
author = {F Heinrich and L Schwenderling and F Joeres and C Hansen},
url = {https://ieeexplore.ieee.org/document/9756753},
doi = {10.1109/VR51125.2022.00045},
year = {2022},
date = {2022-03-01},
urldate = {2022-03-01},
booktitle = {2022 IEEE Conference on Virtual Reality and 3D User Interfaces (VR)},
pages = {260–269},
abstract = {Surgical procedures requiring needle navigation assistance suffer from complicated hand-eye coordination and are mentally demanding. Augmented reality (AR) can help overcome these issues. How-ever, only an insufficient amount of fundamental research has focused on the design and hardware selection of such AR needle navigation systems. This work contributes to this research area by presenting a user study (n=24) comparing three state-of-the-art navigation concepts displayed by an optical see-through head-mounted display and a stereoscopic projection system. A two-dimensional glyph visualization resulted in higher targeting accuracy but required more needle insertion time. In contrast, punctures guided by a three-dimensional see-through vision concept were less accurate but faster and were favored in a qualitative interview. The third concept, a static representation of the correctly positioned needle, showed too high target errors for clinical accuracy needs. This concept per-formed worse when displayed by the projection system. Besides that, no meaningful differences between the evaluated AR display devices were detected. User preferences and use case restrictions, e.g., sterility requirements, seem to be more crucial selection criteria. Future work should focus on improving the accuracy of the see-through vision concept. Until then, the glyph visualization is recommended.},
note = {ISSN: 2642-5254},
keywords = {},
pubstate = {published},
tppubtype = {inproceedings}
}

Chheang, V; Heinrich, F; Joeres, F; Saalfeld, P; Preim, B; Hansen, C
Group WiM: A Group Navigation Technique for Collaborative Virtual Reality Environments Proceedings Article
In: 2022 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), pp. 556–557, 2022.
@inproceedings{chheang_group_2022,
title = {Group WiM: A Group Navigation Technique for Collaborative Virtual Reality Environments},
author = {V Chheang and F Heinrich and F Joeres and P Saalfeld and B Preim and C Hansen},
url = {https://ieeexplore.ieee.org/document/9757426},
doi = {10.1109/VRW55335.2022.00129},
year = {2022},
date = {2022-03-01},
urldate = {2022-03-01},
booktitle = {2022 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW)},
pages = {556–557},
abstract = {In this work, we present a group World-in-Miniature (WiM) navigation technique that allows a guide to navigate a team in collaborative virtual reality (VR) environments. We evaluated the usability, discomfort, and user performance of the proposed technique compared to state-of-the-art group teleportation in a user study (n = 21). The results show that the proposed technique induces less discomfort for the guide and has slight usability advantages. Additionally, the group WiM technique seems superior in regards to task completion time for obstructed target destinations. However, it performs similarly to the group teleportation technique in direct line of sight cases. The group WiM technique provides potential benefits for effective group navigation in complex virtual environments and harder-to-reach target locations.},
keywords = {},
pubstate = {published},
tppubtype = {inproceedings}
}
2021

Heinrich, F; Schwenderling, L; Streuber, M; Bornemann, K; Lawonn, K; Hansen, C
Effects of Surface Visualizations on Depth Perception in Projective Augmented Reality Proceedings Article
In: 2021 IEEE 2nd International Conference on Human-Machine Systems (ICHMS), pp. 1–6, 2021.
@inproceedings{heinrich_effects_2021,
title = {Effects of Surface Visualizations on Depth Perception in Projective Augmented Reality},
author = {F Heinrich and L Schwenderling and M Streuber and K Bornemann and K Lawonn and C Hansen},
url = {https://ieeexplore.ieee.org/document/9582452},
doi = {10.1109/ICHMS53169.2021.9582452},
year = {2021},
date = {2021-09-01},
urldate = {2021-09-01},
booktitle = {2021 IEEE 2nd International Conference on Human-Machine Systems (ICHMS)},
pages = {1–6},
abstract = {Depth perception is a common issue in augmented reality (AR). Projective AR, where the spatial relations between the projection surface and displayed virtual contents need to be represented properly, is particularly affected. This is crucial in the medical domain, e.g., for the distances between the patient’s skin and projected inner anatomical structures, but not much research was conducted in this context before. To this end, this work investigates the applicability of surface visualization techniques to support the perception of spatial relations in projective AR. Four methods previously explored in different domains were combined with the projection of inner anatomical structures on a human torso phantom. They were evaluated in a comparative user study (n=21) with respect to a distance estimation and a sorting task. Measures included Task completion time, accuracy, total Head movement and Confidence of the participants. Consistent results across variables show advantages of more occluding surface visualizations for the distance estimation task. Opposite results were obtained for the sorting task. This suggests that the amount of needed surface preservation depends on the use case and individual occlusion compromises need to be explored in future work.},
keywords = {},
pubstate = {published},
tppubtype = {inproceedings}
}

Schwenderling, L; Hansen, C; Heinrich, F
AR visualization of automated access path planning for percutaneous interventions Journal Article
In: Current Directions in Biomedical Engineering, vol. 7, no. 1, pp. 48–52, 2021, ISSN: 2364-5504, (Publisher: De Gruyter).
@article{schwenderling_ar_2021,
title = {AR visualization of automated access path planning for percutaneous interventions},
author = {L Schwenderling and C Hansen and F Heinrich},
url = {https://www.degruyterbrill.com/document/doi/10.1515/cdbme-2021-1011/html},
doi = {10.1515/cdbme-2021-1011},
issn = {2364-5504},
year = {2021},
date = {2021-08-01},
urldate = {2021-08-01},
journal = {Current Directions in Biomedical Engineering},
volume = {7},
number = {1},
pages = {48–52},
abstract = {Minimally invasive interventions, e.g., percutaneous needle interventions, have many advantages compared to traditional surgery. However, they may require complex and time-consuming planning with experience-dependent success. Automated access path planning is faster and more consistent but individual preferences and situational circumstances are not considered. To this end, displaying the path planning results directly on the patient’s skin, using projector-based augmented reality (AR), was investigated. A constraint-based path planning was implemented to evaluate the quality of every path, taking into account risk structures and path length. A visualization was developed to display the results on the skin and to allow for path selection. The choice of the path followed by a navigated insertion was evaluated in a pilot study (n=5), considering four levels of the visualization with different amounts of displayed insertion points. Participants stated that they preferred to have multiple potential puncture points displayed. However, the results for the considered variables show only small differences. Overall, it has been shown that projectorbased AR visualization of automated access path planning is possible and enables individual, situation-adapted insertion point selection. More research is required to further explore optimal display of paths.},
note = {Publisher: De Gruyter},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

Heinrich, F; Apilla, V; Lawonn, K; Hansen, C; Preim, B; Meuschke, M
Estimating depth information of vascular models: A comparative user study between a virtual reality and a desktop application Journal Article
In: Computers & Graphics, vol. 98, pp. 210–217, 2021, ISSN: 0097-8493.
@article{heinrich_estimating_2021,
title = {Estimating depth information of vascular models: A comparative user study between a virtual reality and a desktop application},
author = {F Heinrich and V Apilla and K Lawonn and C Hansen and B Preim and M Meuschke},
url = {https://www.sciencedirect.com/science/article/pii/S0097849321001138},
doi = {10.1016/j.cag.2021.05.014},
issn = {0097-8493},
year = {2021},
date = {2021-08-01},
urldate = {2021-08-01},
journal = {Computers & Graphics},
volume = {98},
pages = {210–217},
abstract = {Vascular structures are assessed, e.g., in tumor surgery to understand the influence of a planned resection on the vascular supply and venous drainage. The understanding of complex branching vascular structures may benefit from immersive virtual reality (VR) visualization. Therefore, the estimation of distance, depth and shape information is a crucial task to support diagnosis and therapy decisions. Depending on the visualization techniques used, perceptual issues can influence this process and may thus lead to false conclusions. Many studies were carried out to study depth perception for different variants of vessel visualization. However, these studies are restricted to desktop applications. Since VR exhibits specific perceptual problems, we aim at an understanding of the appropriateness of vessel visualization techniques in VR. Therefore, this paper presents a user study that investigates the effects of three commonly used visualization techniques on depth perception. The set of visualization techniques comprises Phong shading, pseudo-chromadepth and fog shading. An immersive VR setup of the study using a head-mounted display (HMD) was compared to a traditional desktop setup. Results suggest that depth judgments are less error-prone and more certain in VR than in desktop environments. Moreover, depth-enhancing visualization techniques had greater effects in the desktop study compared to the VR study.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

Joeres, F; Heinrich, F; Schott, D; Hansen, C
Towards natural 3D interaction for laparoscopic augmented reality registration Journal Article
In: Computer Methods in Biomechanics and Biomedical Engineering: Imaging & Visualization, vol. 9, no. 4, pp. 384–391, 2021, ISSN: 2168-1163, (Publisher: Taylor & Francis _eprint: https://doi.org/10.1080/21681163.2020.1834877).
@article{joeres_towards_2021,
title = {Towards natural 3D interaction for laparoscopic augmented reality registration},
author = {F Joeres and F Heinrich and D Schott and C Hansen},
url = {https://doi.org/10.1080/21681163.2020.1834877},
doi = {10.1080/21681163.2020.1834877},
issn = {2168-1163},
year = {2021},
date = {2021-07-01},
urldate = {2021-07-01},
journal = {Computer Methods in Biomechanics and Biomedical Engineering: Imaging & Visualization},
volume = {9},
number = {4},
pages = {384–391},
abstract = {Augmented reality (AR) is a widely researched route for navigation support in laparoscopic surgery. Accurate registration is a crucial component for such AR systems. We introduce two methods for interactive registration that aim to be minimally invasive to the workflow and to mimic natural manipulation of 3D objects. The methods utilise spatially tracked laparoscopic tools to manipulate the virtual 3D content. We comparatively evaluated the methods against a reference, landmark-based registration method in a user study with 12 participants. We tested the methods for registration accuracy, time, and subjective usability perception. Our methods did not outperform the reference method on these parameters but showed promising results. The results indicate that our methods present no finalised solutions but that one of them is a promising approach for which we identified concrete improvement measures to be implemented in future research.},
note = {Publisher: Taylor & Francis
_eprint: https://doi.org/10.1080/21681163.2020.1834877},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

Heinrich, F; Huettl, F; Schmidt, G; Paschold, M; Kneist, W; Huber, T; Hansen, C
HoloPointer: a virtual augmented reality pointer for laparoscopic surgery training Journal Article
In: International Journal of Computer Assisted Radiology and Surgery, vol. 16, no. 1, pp. 161–168, 2021, ISSN: 1861-6410, 1861-6429.
@article{heinrich_holopointer_2021,
title = {HoloPointer: a virtual augmented reality pointer for laparoscopic surgery training},
author = {F Heinrich and F Huettl and G Schmidt and M Paschold and W Kneist and T Huber and C Hansen},
url = {https://link.springer.com/10.1007/s11548-020-02272-2},
doi = {10.1007/s11548-020-02272-2},
issn = {1861-6410, 1861-6429},
year = {2021},
date = {2021-01-01},
urldate = {2021-01-01},
journal = {International Journal of Computer Assisted Radiology and Surgery},
volume = {16},
number = {1},
pages = {161–168},
abstract = {Purpose In laparoscopic surgery training, experts guide novice physicians to desired instrument positions or indicate relevant areas of interest. These instructions are usually given via verbal communication or using physical pointing devices. To facilitate a sterile work flow and to improve training, new guiding methods are needed. This work proposes to use optical see-through augmented reality to visualize an interactive virtual pointer on the laparoscopic monitor.
Methods After an interdisciplinary development, the pointer’s applicability and feasibility for training was evaluated and it was compared to a standard condition based on verbal and gestural communication only. In this study, ten surgical trainees were guided by an experienced trainer during cholecystectomies on a laparoscopic training simulator. All trainees completed a virtual cholecystectomy with and without the interactive virtual pointer in alternating order. Measures included procedure time, economy of movement and error rates.
Results Results of standardized variables revealed significantly improved economy of movement (p = 0.047) and error rates (p = 0.047), as well as an overall improved user performance (Total z-score; p = 0.031) in conditions using the proposed method.
Conclusion The proposed HoloPointer is a feasible and applicable tool for laparoscopic surgery training. It improved objective performance metrics without prolongation of the task completion time in this pre-clinical setup.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}
2020

Heinrich, F; Schwenderling, L; Joeres, F; Lawonn, K; Hansen, C
Comparison of Augmented Reality Display Techniques to Support Medical Needle Insertion Journal Article
In: IEEE Transactions on Visualization and Computer Graphics, vol. 26, no. 12, pp. 3568–3575, 2020, ISSN: 1941-0506.
@article{heinrich_comparison_2020,
title = {Comparison of Augmented Reality Display Techniques to Support Medical Needle Insertion},
author = {F Heinrich and L Schwenderling and F Joeres and K Lawonn and C Hansen},
url = {https://ieeexplore.ieee.org/abstract/document/9211732},
doi = {10.1109/TVCG.2020.3023637},
issn = {1941-0506},
year = {2020},
date = {2020-12-01},
urldate = {2020-12-01},
journal = {IEEE Transactions on Visualization and Computer Graphics},
volume = {26},
number = {12},
pages = {3568–3575},
abstract = {Augmented reality (AR) may be a useful technique to overcome issues of conventionally used navigation systems supporting medical needle insertions, like increased mental workload and complicated hand-eye coordination. Previous research primarily focused on the development of AR navigation systems designed for specific displaying devices, but differences between employed methods have not been investigated before. To this end, a user study involving a needle insertion task was conducted comparing different AR display techniques with a monitor-based approach as baseline condition for the visualization of navigation information. A video see-through stationary display, an optical see-through head-mounted display and a spatial AR projector-camera-system were investigated in this comparison. Results suggest advantages of using projected navigation information in terms of lower task completion time, lower angular deviation and affirmative subjective participant feedback. Techniques requiring the intermediate view on screens, i.e. the stationary display and the baseline condition, showed less favorable results. Thus, benefits of providing AR navigation information compared to a conventionally used method could be identified. Significant objective measures results, as well as an identification of advantages and disadvantages of individual display techniques contribute to the development and design of improved needle navigation systems.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

Bornemann, K; Heinrich, F; Lawonn, K; Hansen, C
Exploration of medical volume data in projective augmented reality: an interactive demonstration Proceedings Article
In: Proceedings of Mensch und Computer 2020, pp. 507–509, Association for Computing Machinery, New York, NY, USA, 2020, ISBN: 978-1-4503-7540-5.
@inproceedings{bornemann_exploration_2020,
title = {Exploration of medical volume data in projective augmented reality: an interactive demonstration},
author = {K Bornemann and F Heinrich and K Lawonn and C Hansen},
url = {https://doi.org/10.1145/3404983.3410415},
doi = {10.1145/3404983.3410415},
isbn = {978-1-4503-7540-5},
year = {2020},
date = {2020-09-01},
urldate = {2020-09-01},
booktitle = {Proceedings of Mensch und Computer 2020},
pages = {507–509},
publisher = {Association for Computing Machinery},
address = {New York, NY, USA},
series = {MuC '20},
abstract = {This article describes the implementation and usage of a prototype to explore medical volume data in projective augmented reality (AR). To allow real-time rendering of the data, a volume ray casting algorithm is implemented on the GPU. Furthermore, the volume data is distorted to be superimposed correctly from the user's perspective. To allow exploration of the dataset, the user can use an HTC Vive Controller to show or hide structures with either clipping or windowing. In the future, the proposed prototype might be used for diagnosis, planning and computer assisted surgery. First tests indicate that this setup increases spatial understanding of the volume data compared to conventional computer monitors.},
keywords = {},
pubstate = {published},
tppubtype = {inproceedings}
}

Heinrich, F; Bornemann, K; Lawonn, K; Hansen, C
Interacting with Medical Volume Data in Projective Augmented Reality Book Section
In: Martel, A; Abolmaesumi, P; Stoyanov, D; Mateus, D; Zuluaga, M; Zhou, S; Racoceanu, D; Joskowicz, Leo (Ed.): Medical Image Computing and Computer Assisted Intervention – MICCAI 2020, vol. 12263, pp. 429–439, Springer International Publishing, Cham, 2020, ISBN: 978-3-030-59715-3 978-3-030-59716-0, (Series Title: Lecture Notes in Computer Science).
@incollection{martel_interacting_2020,
title = {Interacting with Medical Volume Data in Projective Augmented Reality},
author = {F Heinrich and K Bornemann and K Lawonn and C Hansen},
editor = {A Martel and P Abolmaesumi and D Stoyanov and D Mateus and M Zuluaga and S Zhou and D Racoceanu and Leo Joskowicz},
url = {https://link.springer.com/10.1007/978-3-030-59716-0_41},
doi = {10.1007/978-3-030-59716-0_41},
isbn = {978-3-030-59715-3 978-3-030-59716-0},
year = {2020},
date = {2020-01-01},
urldate = {2020-01-01},
booktitle = {Medical Image Computing and Computer Assisted Intervention – MICCAI 2020},
volume = {12263},
pages = {429–439},
publisher = {Springer International Publishing},
address = {Cham},
abstract = {Medical volume data is usually explored on monoscopic monitors. Displaying this data in three-dimensional space facilitates the development of mental maps and the identification of anatomical structures and their spatial relations. Using augmented reality (AR) may further enhance these effects by spatially aligning the volume data with the patient. However, conventional interaction methods, e.g. mouse and keyboard, may not be applicable in this environment. Appropriate interaction techniques are needed to naturally and intuitively manipulate the image data. To this end, a user study comparing four gestural interaction techniques with respect to both clipping and windowing tasks was conducted. Image data was directly displayed on a phantom using stereoscopic projective AR and direct volume visualization. Participants were able to complete both tasks with all interaction techniques with respectively similar clipping accuracy and windowing efficiency. However, results suggest advantages of gestures based on motion-sensitive devices in terms of reduced task completion time and less subjective workload. This work presents an important first step towards a surgical AR visualization system enabling intuitive exploration of volume data. Yet, more research is required to assess the interaction techniques’ applicability for intraoperative use.},
note = {Series Title: Lecture Notes in Computer Science},
keywords = {},
pubstate = {published},
tppubtype = {incollection}
}
2019

Heinrich, F; Schmidt, G; Jungmann, F; Hansen, C
Augmented Reality Visualisation Concepts to Support Intraoperative Distance Estimation Proceedings Article
In: Proceedings of the 25th ACM Symposium on Virtual Reality Software and Technology, pp. 1–2, Association for Computing Machinery, New York, NY, USA, 2019, ISBN: 978-1-4503-7001-1.
@inproceedings{heinrich_augmented_2019,
title = {Augmented Reality Visualisation Concepts to Support Intraoperative Distance Estimation},
author = {F Heinrich and G Schmidt and F Jungmann and C Hansen},
url = {https://doi.org/10.1145/3359996.3364818},
doi = {10.1145/3359996.3364818},
isbn = {978-1-4503-7001-1},
year = {2019},
date = {2019-11-01},
urldate = {2019-11-01},
booktitle = {Proceedings of the 25th ACM Symposium on Virtual Reality Software and Technology},
pages = {1–2},
publisher = {Association for Computing Machinery},
address = {New York, NY, USA},
series = {VRST '19},
abstract = {The estimation of distances and spatial relations between surgical instruments and surrounding anatomical structures is a challenging task for clinicians in image-guided surgery. Using augmented reality (AR), navigation aids can be displayed directly at the intervention site to support the assessment of distances and reduce the risk of damage to healthy tissue. To this end, four distance-encoding visualisation concepts were developed using a head-mounted optical see-through AR setup and evaluated by conducting a comparison study. Results suggest the general advantage of the proposed methods compared to a blank visualisation providing no additional information. Using a Distance Sensor concept signalising the proximity of nearby structures resulted in the least time the instrument was located below 5mm to surrounding risk structures and yielded the least amount of collisions with them.},
keywords = {},
pubstate = {published},
tppubtype = {inproceedings}
}

Heinrich, F; Bornemann, K; Lawonn, K; Hansen, C
Depth Perception in Projective Augmented Reality: An Evaluation of Advanced Visualization Techniques Proceedings Article
In: Proceedings of the 25th ACM Symposium on Virtual Reality Software and Technology, pp. 1–11, Association for Computing Machinery, New York, NY, USA, 2019, ISBN: 978-1-4503-7001-1.
@inproceedings{heinrich_depth_2019,
title = {Depth Perception in Projective Augmented Reality: An Evaluation of Advanced Visualization Techniques},
author = {F Heinrich and K Bornemann and K Lawonn and C Hansen},
url = {https://doi.org/10.1145/3359996.3364245},
doi = {10.1145/3359996.3364245},
isbn = {978-1-4503-7001-1},
year = {2019},
date = {2019-11-01},
urldate = {2019-11-01},
booktitle = {Proceedings of the 25th ACM Symposium on Virtual Reality Software and Technology},
pages = {1–11},
publisher = {Association for Computing Machinery},
address = {New York, NY, USA},
series = {VRST '19},
abstract = {Augmented reality (AR) is a promising tool to convey useful information at the place where it is needed. However, perceptual issues with augmented reality visualizations affect the estimation of distances and depth and thus can lead to critically wrong assumptions. These issues have been successfully investigated for video see-through modalities. Moreover, advanced visualization methods encoding depth information by displaying additional depth cues were developed. In this work, state-of-the-art visualization concepts were adopted for a projective AR setup. We conducted a user study to assess the concepts’ suitability to convey depth information. Participants were asked to sort virtual cubes by using the provided depth cues. The investigated visualization concepts consisted of conventional Phong shading, a virtual mirror, depth-encoding silhouettes, pseudo-chromadepth rendering and an illustrative visualization using supporting line depth cues. Besides different concepts, we altered between a monoscopic and a stereoscopic display mode to examine the effects of stereopsis. Consistent results across variables show a clear ranking of examined concepts. The supporting lines approach and the pseudo-chromadepth rendering performed best. Stereopsis was shown to provide significant advantages for depth perception, while the current visualization technique had only little effect on investigated measures in this condition. However, similar results were achieved using the supporting lines and the pseudo-chromadepth concepts in a monoscopic setup. Our study showed the suitability of advanced visualization concepts for the rendering of virtual content in projective AR. Specific depth estimation results contribute to the future design and development of applications for these systems.},
keywords = {},
pubstate = {published},
tppubtype = {inproceedings}
}

Heinrich, F; Schwenderling, L; Becker, M; Skalej, M; Hansen, C
HoloInjection: augmented reality support for CT-guided spinal needle injections Journal Article
In: Healthcare Technology Letters, vol. 6, no. 6, pp. 165–171, 2019, (Publisher: The Institution of Engineering and Technology).
@article{heinrich_holoinjection_2019,
title = {HoloInjection: augmented reality support for CT-guided spinal needle injections},
author = {F Heinrich and L Schwenderling and M Becker and M Skalej and C Hansen},
url = {https://digital-library.theiet.org/doi/10.1049/htl.2019.0062},
doi = {10.1049/htl.2019.0062},
year = {2019},
date = {2019-11-01},
urldate = {2019-11-01},
journal = {Healthcare Technology Letters},
volume = {6},
number = {6},
pages = {165–171},
abstract = {The correct placement of needles is decisive for the success of many minimally-invasive interventions and therapies. These needle insertions are usually only guided by radiological imaging and can benefit from additional navigation support. Augmented reality (AR) is a promising tool to conveniently provide needed information and may thus overcome the limitations of existing approaches. To this end, a prototypical AR application was developed to guide the insertion of needles to spinal targets using the mixed reality glasses Microsoft HoloLens. An attempt was made to measure the system's registration accuracy, and three guidance visualisation concepts were evaluated concerning achievable in-plane and out-of-plane needle orientation errors in a comparison study. Results suggested high registration accuracy and showed that the AR prototype is suitable for reducing out-of-plane orientation errors. Limitations, like comparatively high in-plane orientation errors, effects of the viewing position and missing image slices indicate potential for improvement that needs to be addressed before transferring the application to clinical trials.},
note = {Publisher: The Institution of Engineering and Technology},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

Heinrich, F; Joeres, F; Lawonn, K; Hansen, C
Comparison of Projective Augmented Reality Concepts to Support Medical Needle Insertion Journal Article
In: IEEE Transactions on Visualization and Computer Graphics, vol. 25, no. 6, pp. 2157–2167, 2019, ISSN: 1077-2626, 1941-0506, 2160-9306.
@article{heinrich_comparison_2019,
title = {Comparison of Projective Augmented Reality Concepts to Support Medical Needle Insertion},
author = {F Heinrich and F Joeres and K Lawonn and C Hansen},
url = {https://ieeexplore.ieee.org/document/8667734/},
doi = {10.1109/TVCG.2019.2903942},
issn = {1077-2626, 1941-0506, 2160-9306},
year = {2019},
date = {2019-06-01},
urldate = {2019-06-01},
journal = {IEEE Transactions on Visualization and Computer Graphics},
volume = {25},
number = {6},
pages = {2157–2167},
abstract = {Augmented reality (AR) is a promising tool to improve instrument navigation in needle-based interventions. Limited research has been conducted regarding suitable navigation visualizations. In this work, three navigation concepts based on existing approaches were compared in a user study using a projective AR setup. Each concept was implemented with three different scales for accuracy-to-color mapping and two methods of navigation indicator scaling. Participants were asked to perform simulated needle insertion tasks with each of the resulting 18 prototypes. Insertion angle and insertion depth accuracies were measured and analyzed, as well as task completion time and participants’ subjectively perceived task difficulty. Results show a clear ranking of visualization concepts across variables. Less consistent results were obtained for the color and indicator scaling factors. Results suggest that logarithmic indicator scaling achieved better accuracy, but participants perceived it to be more difficult than linear scaling. With specific results for angle and depth accuracy, our study contributes to the future composition of improved navigation support and systems for precise needle insertion or similar applications.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

Heinrich, F; Schmidt, G; Bornemann, K; Roethe, A; Essayed, W; Hansen, C
Visualization concepts to improve spatial perception for instrument navigation in image-guided surgery Proceedings Article
In: Fei, Baowei; Linte, Cristian A. (Ed.): Medical Imaging 2019: Image-Guided Procedures, Robotic Interventions, and Modeling, pp. 77, SPIE, San Diego, United States, 2019, ISBN: 978-1-5106-2549-5 978-1-5106-2550-1.
@inproceedings{heinrich_visualization_2019,
title = {Visualization concepts to improve spatial perception for instrument navigation in image-guided surgery},
author = {F Heinrich and G Schmidt and K Bornemann and A Roethe and W Essayed and C Hansen},
editor = {Baowei Fei and Cristian A. Linte},
url = {https://www.spiedigitallibrary.org/conference-proceedings-of-spie/10951/2512761/Visualization-concepts-to-improve-spatial-perception-for-instrument-navigation-in/10.1117/12.2512761.full},
doi = {10.1117/12.2512761},
isbn = {978-1-5106-2549-5 978-1-5106-2550-1},
year = {2019},
date = {2019-03-01},
urldate = {2019-03-01},
booktitle = {Medical Imaging 2019: Image-Guided Procedures, Robotic Interventions, and Modeling},
pages = {77},
publisher = {SPIE},
address = {San Diego, United States},
abstract = {Image-guided surgery near anatomical or functional risk structures poses a challenging task for surgeons. To this end, surgical navigation systems that visualize the spatial relation between patient anatomy (represented by 3D images) and surgical instruments have been described. The provided 3D visualizations of these navigation systems are often complex and thus might increase the mental effort for surgeons. Therefore, an appropriate intraoperative visualization of spatial relations between surgical instruments and risk structures poses a pressing need. We propose three visualization methods to improve spatial perception in navigated surgery. A pointer ray encodes the distance between a tracked instrument tip and risk structures along the tool’s main axis. A side-looking radar visualizes the distance between the instrument tip and nearby structures by a ray rotating around the tool. Virtual lighthouses visualize the distances between the instrument tip and predefined anatomical landmarks as color-coded lights flashing between the instrument tip and the landmarks. Our methods aim to encode distance information with low visual complexity. To evaluate our concepts’ usefulness, we conducted a user study with 16 participants. During the study, the participants were asked to insert a pointer tool into a virtual target inside a phantom without touching nearby risk structures or boundaries. Results showed that our concepts were perceived as useful and suitable to improve distance assessment and spatial awareness of risk structures and surgical instruments. Participants were able to safely maneuver the instrument while our navigation cues increased participant confidence of successful avoidance of risk structures.},
keywords = {},
pubstate = {published},
tppubtype = {inproceedings}
}

Mewes, A; Heinrich, F; Kägebein, U; Hensen, B; Wacker, F; Hansen, C
Projector‐based augmented reality system for interventional visualization inside MRI scanners Journal Article
In: The International Journal of Medical Robotics and Computer Assisted Surgery, vol. 15, no. 1, pp. e1950, 2019, ISSN: 1478-5951, 1478-596X.
@article{clive_projectorbased_2019,
title = {Projector‐based augmented reality system for interventional visualization inside MRI scanners},
author = {A Mewes and F Heinrich and U Kägebein and B Hensen and F Wacker and C Hansen},
editor = {L Clive},
url = {https://onlinelibrary.wiley.com/doi/10.1002/rcs.1950},
doi = {10.1002/rcs.1950},
issn = {1478-5951, 1478-596X},
year = {2019},
date = {2019-02-01},
urldate = {2019-02-01},
journal = {The International Journal of Medical Robotics and Computer Assisted Surgery},
volume = {15},
number = {1},
pages = {e1950},
abstract = {Background Navigation support in the interventional MRI is separated from the operating field, which makes it difficult to interpret positions and orientations and to coordinate the necessary hand movements.
Methods We developed a projector-based augmented reality system to enable visual navigation of tracked instruments on pre-planned paths, and the visualization of risk structures directly on the patient inside the MRI bore. To assess the accuracy of the system, a user study with clinicians was carried out in a needle navigation test scenario.
Results The targets were reached with an error of 1.7 ± 0.5 mm, the entry points with an error of 1.7 ± 0.8 mm.
Conclusion The results suggest that the projected augmented reality navigation directly on the patient is accurate enough to target lesions with a size of down to 15 mm, so that the prototype can serve as a platform for current and future research in augmented reality visualization and dynamic registration.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

Heinrich, F; Joeres, F; Lawonn, K; Hansen, C
Effects of Accuracy-to-Colour Mapping Scales on Needle Navigation Aids visualised by Projective Augmented Reality Journal Article
In: 2019.
@article{heinrich_eects_2019,
title = {Effects of Accuracy-to-Colour Mapping Scales on Needle Navigation Aids visualised by Projective Augmented Reality},
author = {F Heinrich and F Joeres and K Lawonn and C Hansen},
year = {2019},
date = {2019-01-01},
urldate = {2019-01-01},
abstract = {Instrument navigation in needle-based interventions can benefit from augmented reality (AR) visualisation. Design aspects of these visualisations have been investigated to a limited degree. This work examined colour-specific parameters for AR instrument navigation, that have not been successfully researched before. Three different mapping methods to encode accuracy information to colour and two colour scales varying different colour channels were evaluated in a user study. Angular and depth accuracy of inserted needles were measured and task difficulty was subjectively rated. Result trends indicate benefits of mapping accuracy to discrete colours based on thresholds and using single hue colour scales that vary in the luminance or saturation channel. Yet, more research is required to validate the exposed indications. This work can constitute a valuable basis for this.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}
2018

Mewes, A; Heinrich, F; Hensen, B; Wacker, F; Lawonn, K; Hansen, C
Concepts for augmented reality visualisation to support needle guidance inside the MRI Journal Article
In: Healthcare Technology Letters, vol. 5, no. 5, pp. 172–176, 2018, ISSN: 2053-3713, (_eprint: https://ietresearch.onlinelibrary.wiley.com/doi/pdf/10.1049/htl.2018.5076).
@article{mewes_concepts_2018,
title = {Concepts for augmented reality visualisation to support needle guidance inside the MRI},
author = {A Mewes and F Heinrich and B Hensen and F Wacker and K Lawonn and C Hansen},
url = {https://onlinelibrary.wiley.com/doi/abs/10.1049/htl.2018.5076},
doi = {10.1049/htl.2018.5076},
issn = {2053-3713},
year = {2018},
date = {2018-01-01},
urldate = {2018-01-01},
journal = {Healthcare Technology Letters},
volume = {5},
number = {5},
pages = {172–176},
abstract = {During MRI-guided interventions, navigation support is often separated from the operating field on displays, which impedes the interpretation of positions and orientations of instruments inside the patient's body as well as hand–eye coordination. To overcome these issues projector-based augmented reality can be used to support needle guidance inside the MRI bore directly in the operating field. The authors present two visualisation concepts for needle navigation aids which were compared in an accuracy and usability study with eight participants, four of whom were experienced radiologists. The results show that both concepts are equally accurate, useful and easy to use, with clear visual feedback about the state and success of the needle puncture. For easier clinical applicability, a dynamic projection on moving surfaces and organ movement tracking are needed. For now, tests with patients with respiratory arrest are feasible.},
note = {_eprint: https://ietresearch.onlinelibrary.wiley.com/doi/pdf/10.1049/htl.2018.5076},
keywords = {},
pubstate = {published},
tppubtype = {article}
}

Heinrich, F; Rohde, S; Huber, T; Paschold, M; Kneist, W; Lang, H; Preim, B; Hansen, C
VR-basierte Interaktion mit 3D-Organmodellen zur Planung und Simulation laparoskopischer Eingriffe Journal Article
In: 2018.
@article{heinrich_vr-basierte_2018-1,
title = {VR-basierte Interaktion mit 3D-Organmodellen zur Planung und Simulation laparoskopischer Eingriffe},
author = {F Heinrich and S Rohde and T Huber and M Paschold and W Kneist and H Lang and B Preim and C Hansen},
url = {https://www.var.ovgu.de/pub/2018_Heinrich_CURAC.pdf},
year = {2018},
date = {2018-01-01},
urldate = {2018-01-01},
abstract = {Planning surgical interventions is a decisive step in the treatment of diseases such as liver metastases. Numerous applications for the planning and abstract simulation of such interventions have therefore been developed to support surgeons. However, these quickly reach various limits, such as insufficient interaction and visualization capabilities. This work presents a software prototype based on current virtual reality technology that is intended to offer a new approach in this area and initially focuses on laparoscopic liver resections. Through the use of suitable input devices and the ability to load patient-specific data models into the virtual environment, the prototype provides novel interaction possibilities for intervention planning as well as individual training opportunities. A first qualitative user study demonstrates the usefulness and usability of the application and confirms that the developed prototype forms a sound basis for future work. Further development of the approach presented here could help to better exploit the potential of existing planning and simulation methods in the future.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}