Successful Completion of Student Projects

In cooperation with the Chair of Visualization (Prof. Bernhard Preim, Dr. Patrick Saalfeld) and the research campus STIMULATE (Prof. Rose), a number of student projects in the field of AR/VR were recently completed successfully. The projects were funded by the Faculty of Computer Science of the Otto-von-Guericke-University Magdeburg with funds from the Hochschulpakt 2020. A brief overview of the projects is given below.

AR-supported balance training

In this project, an AR application for balance training was implemented. The application can be used with stroke patients or people with vertigo to playfully improve their sense of balance. The MediBalanceBoard (MediTECH Electronic GmbH, Magdeburg) is used to move a virtual ball inside a wooden box.
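As a rough illustration of the underlying control loop, the Python sketch below maps board tilt to ball movement inside the box. The MediBalanceBoard's real interface is not documented in this overview, so read_board_tilt, the gain value, and the box dimensions are purely assumptions.

```python
import math

# Hypothetical tilt reading from the MediBalanceBoard; the real SDK is not
# documented here, so this function is a stand-in.
def read_board_tilt():
    return 4.0, -2.5  # (pitch, roll) in degrees, placeholder sample

class BallInBox:
    """Moves a virtual ball inside the wooden box according to board tilt."""

    def __init__(self, half_width=0.5, half_depth=0.5, gain=0.5):
        self.x = self.z = 0.0          # ball position in meters
        self.half_width = half_width   # box extents (assumed values)
        self.half_depth = half_depth
        self.gain = gain               # tilt-to-velocity scaling (assumed)

    def update(self, pitch_deg, roll_deg, dt):
        # Tilting the board moves the ball toward the lower edge.
        self.x += self.gain * math.radians(roll_deg) * dt
        self.z += self.gain * math.radians(pitch_deg) * dt
        # Clamp so the ball stays inside the box walls.
        self.x = max(-self.half_width, min(self.half_width, self.x))
        self.z = max(-self.half_depth, min(self.half_depth, self.z))

box = BallInBox()
pitch, roll = read_board_tilt()
box.update(pitch, roll, dt=1 / 60)     # one 60 Hz frame
print(f"ball position: ({box.x:.4f}, {box.z:.4f}) m")
```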

With the MediBalanceBoard, the user moves a virtual ball in a wooden box to a target position. The ball and the wooden box are visualized using HoloLens AR glasses.

Multimodal interaction with AR content

Direct interaction in the medical context makes the user independent of assistants and allows precise input. In interventional scenarios, however, the operator must handle medical instruments at the same time, which can place an additional mental burden on the user. This project investigated the influence of different input modalities for AR interaction on a primary task.
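Evaluating such a study hinges on relating secondary-task events to primary-task performance over time. The Python sketch below shows one plausible way to log this per trial; the names (TrialLog, log_frame, log_selection) and the deviation metric are assumptions, not the project's actual instrumentation.

```python
import time
from dataclasses import dataclass, field

@dataclass
class TrialLog:
    """Records primary-task deviation and secondary-task selections so the
    influence of an input modality on the primary task can be analyzed."""
    modality: str
    beam_deviation: list = field(default_factory=list)   # per-frame error
    selection_times: list = field(default_factory=list)  # timestamps

    def log_frame(self, deviation: float):
        # Primary task: deviation of the guided beam from its target path.
        self.beam_deviation.append(deviation)

    def log_selection(self):
        # Secondary task: the participant selected a ball from the ring.
        self.selection_times.append(time.monotonic())

    def mean_deviation(self) -> float:
        return sum(self.beam_deviation) / max(1, len(self.beam_deviation))

log = TrialLog(modality="hand gesture")
log.log_frame(0.12)
log.log_selection()
print(log.modality, round(log.mean_deviation(), 3))
```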

Left: Study investigating the influence of a selection task (ball ring) on a primary task (beam) in AR. The selection task represents interaction with medical image data in abstract form; the primary task represents the guidance of a medical instrument. Right: A user navigates through AR image data with hand gestures.

Hands-free navigation in radiological image data

Radiological interventions require navigating medical instruments without a direct line of sight, based on real-time image data. This calls for occasional adjustments of the magnification factor or the visible image section, which normally means interrupting the current task or instructing an assistant. To provide direct control, this project explored hands-free interaction with radiological image data via the user interface of an angiography system. For this purpose, different input methods such as speech, head movement, and body posture were recorded and combined into suitable interaction patterns.
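One plausible interaction pattern combines the recorded modalities as follows: a speech command arms a mode, and head movement then adjusts the armed parameter continuously. The Python sketch below illustrates this idea; the class, the command vocabulary, and the gain factors are assumptions rather than the project's actual design.

```python
from dataclasses import dataclass

@dataclass
class HeadPose:
    yaw: float    # degrees, turning left/right
    pitch: float  # degrees, nodding up/down

class HandsFreeImageControl:
    """Speech arms a mode; head movement then adjusts it continuously."""
    MODES = {"zoom", "pan"}

    def __init__(self):
        self.mode = None
        self.zoom = 1.0
        self.pan_x = self.pan_y = 0.0

    def on_speech(self, command: str):
        # e.g. the user says "zoom" or "pan" to arm a mode, "stop" to release.
        if command in self.MODES:
            self.mode = command
        elif command == "stop":
            self.mode = None

    def on_head_pose(self, pose: HeadPose):
        if self.mode == "zoom":
            # Nodding changes the magnification factor.
            self.zoom = max(0.25, self.zoom * (1.0 + 0.01 * pose.pitch))
        elif self.mode == "pan":
            # Turning the head shifts the visible image section.
            self.pan_x += 0.5 * pose.yaw
            self.pan_y += 0.5 * pose.pitch

ctrl = HandsFreeImageControl()
ctrl.on_speech("zoom")
ctrl.on_head_pose(HeadPose(yaw=0.0, pitch=2.0))
print(f"zoom factor: {ctrl.zoom:.2f}")
```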

Left: Sketches of potential interaction methods and processes. Right: User study with software prototype.

VR multi-user conference room for liver surgery planning

In this project, a virtual conference room in which several users (physicians) simultaneously plan a liver intervention was implemented as a prototype. Multiple users can interact with an enlarged three-dimensional liver model as well as the corresponding 2D slice images.
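Conceptually, such a multi-user room boils down to a piece of shared state that every participant can read and modify. The following Python sketch outlines this idea with a simple last-writer-wins policy; the class names and update methods are assumptions and stand in for whatever networking layer the prototype actually used.

```python
from dataclasses import dataclass, field

@dataclass
class ModelTransform:
    position: tuple = (0.0, 0.0, 0.0)
    rotation: tuple = (0.0, 0.0, 0.0)   # Euler angles in degrees
    scale: float = 1.0

@dataclass
class SharedRoom:
    """State every participant sees: one 3D liver model plus the index
    of the currently displayed 2D slice image."""
    model: ModelTransform = field(default_factory=ModelTransform)
    slice_index: int = 0
    participants: set = field(default_factory=set)

    def join(self, user: str):
        self.participants.add(user)

    def set_scale(self, user: str, scale: float):
        # Any participant may enlarge the shared model; last writer wins.
        # In a real system this change would be broadcast to all clients.
        self.model.scale = scale

    def set_slice(self, user: str, index: int):
        self.slice_index = index

room = SharedRoom()
room.join("surgeon_a")
room.join("surgeon_b")
room.set_scale("surgeon_a", 3.0)   # enlarge the liver model
room.set_slice("surgeon_b", 42)    # browse the 2D slice images
print(room.model.scale, room.slice_index)
```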

Virtual conference room for planning liver interventions.