Trauma Anatomy

Trauma Anatomy is a virtual reality simulator designed to train medical students and healthcare professionals.

To implement this project, we used the following technologies and tools:

  • Unity 3D
  • Mixed Reality Toolkit
  • Acer Windows Mixed Reality Headset with Motion Controllers

The project is built with the Unity 3D engine. To simplify working with motion controllers, we used the Mixed Reality Toolkit asset.

We have developed a universal character controller for virtual reality which includes the following functions:

  • Motion with the joystick. The user moves with the thumbstick on the controller; the movement direction is derived from the camera's view, taking only its rotation around the Y-axis into account (see the first sketch after this list).
  • Rotation. The position and rotation of the game camera are set by the VR headset, so artificial turning has to be applied to an object higher in the hierarchy, i.e. the camera's parent. This raises a problem: if the user has walked away from the starting point in the real world, rotating the parent around its own axis makes the camera orbit the parent's pivot, so its position changes. The solution was to rotate the parent around the child's axis instead, namely the vertical axis passing through the game camera (see the second sketch after this list).
  • Teleportation. A ray is cast from the controller, and the normal of the surface it hits is checked to determine whether the surface is horizontal. If it is, the hit point is accepted as a teleport destination. If the surface is vertical, a second ray is cast downward from a point offset slightly back from the hit, and the floor it detects is used instead (see the third sketch after this list).
  • Collision detection. A collider placed directly on the game camera cannot restrain it from passing through walls, so we placed the collider on the parent while keeping its position in sync with the camera. When users move in the real world and press into a virtual wall, the collider on the parent pushes them back (see the fourth sketch after this list).
  • Interactive objects. Treating patients involves interaction with various medical instruments. Users can pick up objects with either hand, pass them from hand to hand, throw them away, and interact with the patient's body.
  • Hand models and animations. Since the simulator uses human hand models, many animations are needed to visualize holding and using various tools. For this purpose, we developed an animator controller with a number of sub-states for each non-standard object.
  • Interaction with the NPC. To interact with the NPC, we developed a voice command recognition algorithm based on the Windows.Speech API (see the last sketch after this list).
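
The joystick motion might look roughly like the C# sketch below. The `head` reference, the `speed` value, and a `Move` method receiving the thumbstick vector are assumptions for illustration rather than the project's actual code.

```csharp
using UnityEngine;

// Sketch of camera-relative joystick locomotion (hypothetical names).
public class JoystickLocomotion : MonoBehaviour
{
    [SerializeField] private Transform head;   // the VR camera
    [SerializeField] private float speed = 2f; // metres per second

    // 'input' is the thumbstick value in [-1, 1] on each axis,
    // read from the platform's input API.
    public void Move(Vector2 input)
    {
        // Project the camera's forward and right vectors onto the horizontal
        // plane so only the rotation around the Y-axis affects the direction.
        Vector3 forward = Vector3.ProjectOnPlane(head.forward, Vector3.up).normalized;
        Vector3 right = Vector3.ProjectOnPlane(head.right, Vector3.up).normalized;

        Vector3 direction = forward * input.y + right * input.x;
        transform.position += direction * speed * Time.deltaTime;
    }
}
```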
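
The rotation fix can be sketched as follows, assuming the script lives on the rig (the camera's parent) and `head` points at the game camera; rotating around the camera's position keeps the viewpoint in place while the rig turns.

```csharp
using UnityEngine;

// Sketch of turning the parent rig around the child camera's vertical axis.
public class RigRotation : MonoBehaviour
{
    [SerializeField] private Transform head; // the VR camera (child)

    public void Turn(float degrees)
    {
        // Rotating the parent around its own pivot would swing the camera
        // along an arc; rotating around the camera's position avoids that.
        transform.RotateAround(head.position, Vector3.up, degrees);
    }
}
```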
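
The teleport target check could look like the sketch below; the distances, the slope threshold, and the `TryGetTeleportPoint` signature are illustrative assumptions.

```csharp
using UnityEngine;

// Sketch of validating a teleport destination by surface normal.
public class TeleportPointer : MonoBehaviour
{
    [SerializeField] private float maxDistance = 15f;   // pointer ray length
    [SerializeField] private float backOffset = 0.3f;   // step back from walls
    [SerializeField] private float maxSlopeAngle = 10f; // "horizontal" tolerance

    public bool TryGetTeleportPoint(Transform controller, out Vector3 point)
    {
        point = Vector3.zero;
        if (!Physics.Raycast(controller.position, controller.forward,
                             out RaycastHit hit, maxDistance))
            return false;

        // Horizontal surface: the normal is close to world up.
        if (Vector3.Angle(hit.normal, Vector3.up) <= maxSlopeAngle)
        {
            point = hit.point;
            return true;
        }

        // Vertical surface: step back from the hit point and cast a second
        // ray straight down to find the floor in front of the wall.
        Vector3 start = hit.point - controller.forward * backOffset + Vector3.up * 0.1f;
        if (Physics.Raycast(start, Vector3.down, out RaycastHit floorHit, 5f) &&
            Vector3.Angle(floorHit.normal, Vector3.up) <= maxSlopeAngle)
        {
            point = floorHit.point;
            return true;
        }
        return false;
    }
}
```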
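
One way to keep the parent's collider under the camera is sketched below, assuming a CapsuleCollider and a Rigidbody on the rig; the field names are hypothetical.

```csharp
using UnityEngine;

// Sketch: the capsule collider lives on the rig (parent) and follows
// the camera's horizontal position so walls can push the rig back.
[RequireComponent(typeof(CapsuleCollider), typeof(Rigidbody))]
public class BodyCollider : MonoBehaviour
{
    [SerializeField] private Transform head; // the VR camera (child)
    private CapsuleCollider capsule;

    private void Awake()
    {
        capsule = GetComponent<CapsuleCollider>();
    }

    private void FixedUpdate()
    {
        // Keep the capsule centred under the headset, preserving its height.
        Vector3 local = transform.InverseTransformPoint(head.position);
        capsule.center = new Vector3(local.x, capsule.center.y, local.z);
    }
}
```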
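
Voice command recognition in Unity is commonly built on the KeywordRecognizer from the UnityEngine.Windows.Speech namespace; the sketch below shows that general pattern with placeholder phrases rather than the project's real command set.

```csharp
using UnityEngine;
using UnityEngine.Windows.Speech;

// Sketch of keyword-based voice commands for NPC interaction.
public class VoiceCommands : MonoBehaviour
{
    private KeywordRecognizer recognizer;

    private void Start()
    {
        // Placeholder phrases, not the simulator's actual commands.
        string[] keywords = { "open your mouth", "raise your arm", "lie down" };
        recognizer = new KeywordRecognizer(keywords);
        recognizer.OnPhraseRecognized += OnPhraseRecognized;
        recognizer.Start();
    }

    private void OnPhraseRecognized(PhraseRecognizedEventArgs args)
    {
        Debug.Log("Recognized command: " + args.text);
        // Dispatch the recognized command to the NPC behaviour here.
    }

    private void OnDestroy()
    {
        if (recognizer != null && recognizer.IsRunning)
            recognizer.Stop();
        recognizer?.Dispose();
    }
}
```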
 
