Trauma Anatomy

Trauma Anatomy is a virtual reality simulator designed to train medical students and healthcare professionals.

To implement this project, we used the following technologies and tools:

  • Unity 3D
  • Mixed Reality Toolkit
  • Acer Windows Mixed Reality Headset with Motion Controllers

The project is built on the Unity 3D engine. To simplify working with motion controllers, we used the Mixed Reality Toolkit asset.

We have developed a universal character controller for virtual reality that includes the following features:

  • Motion with the joystick. The joystick on the controller moves the player relative to the camera's view direction; the movement vector is projected onto the horizontal plane, so only the camera's rotation around the Y-axis is taken into account.
  • Rotation. The position and rotation of the game camera are driven by the virtual reality headset, so manual rotation has to be applied to an object higher in the hierarchy. This raises a problem: if the user has walked away from the starting point in the real world, rotating the parent around its own axis swings the game camera in an arc around the parent's pivot. The solution is to rotate the parent around the child's axis, namely the vertical axis passing through the game camera.
  • Teleportation. A ray is cast from the controller, and the surface the ray hits is checked via its normal to determine whether it is horizontal. If it is, the hit point is accepted as a teleport destination. If the surface is vertical, a second ray is cast straight down from a point offset slightly back from the hit to find the floor.
  • Collision detection. A collider placed on the game camera itself will not stop it from passing through walls, so we decided to place the collider on the parent, with its position continuously updated to match the camera's position. When a user walks into a virtual wall in the real world, the collider on the parent pushes the whole rig back.
  • Interactive objects. Treating patients involves interaction with various medical instruments. Users can pick up objects with either hand, pass them from hand to hand, throw them away, and interact with the patient's body.
  • Hand models and animations. Since the simulator uses human hand models, many animations are needed to visualize holding and using various tools. For this purpose, we developed an animator controller with a number of sub-states for each non-standard object.
  • Interaction with the NPC. To interact with the NPC, we developed a voice command recognition algorithm based on the Windows.Speech API.
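
The rotation fix described above can be sketched in Unity C#. This is an illustrative implementation, not the project's actual code; the class name, the `head` reference, and the 45° snap angle are assumptions:

```csharp
using UnityEngine;

// Attached to the rig parent; `head` is the game camera driven by the headset.
// Rotating the parent around the vertical axis through the camera keeps the
// user's viewpoint in place instead of swinging it around the parent's pivot.
public class RigRotation : MonoBehaviour
{
    [SerializeField] private Transform head;        // game camera (child of this rig)
    [SerializeField] private float snapAngle = 45f; // degrees per snap turn (illustrative)

    public void SnapTurn(float direction) // direction: negative = left, positive = right
    {
        // Rotate the whole rig around the child's (camera's) vertical axis,
        // not around the parent's own pivot.
        transform.RotateAround(head.position, Vector3.up, snapAngle * Mathf.Sign(direction));
    }
}
```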
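
The teleportation logic (normal check plus a fallback downward ray) could look roughly like this; distances, the slope threshold, and all names are illustrative assumptions:

```csharp
using UnityEngine;

// Teleport target selection: a ray from the controller, a normal check for
// horizontal surfaces, and a second downward ray when a wall is hit.
public class TeleportAim : MonoBehaviour
{
    [SerializeField] private float maxDistance = 15f;  // max ray length (illustrative)
    [SerializeField] private float wallOffset = 0.3f;  // step back from the wall before the down-ray
    [SerializeField] private float maxSlope = 30f;     // steeper surfaces are not "horizontal"

    // Returns true and sets `point` when a valid teleport destination is found.
    public bool TryGetDestination(Transform controller, out Vector3 point)
    {
        point = Vector3.zero;
        if (!Physics.Raycast(controller.position, controller.forward, out RaycastHit hit, maxDistance))
            return false;

        // Horizontal enough? Accept the hit point directly.
        if (Vector3.Angle(hit.normal, Vector3.up) <= maxSlope)
        {
            point = hit.point;
            return true;
        }

        // Vertical surface: step back along the ray and cast straight down to find the floor.
        Vector3 back = hit.point - controller.forward * wallOffset;
        if (Physics.Raycast(back, Vector3.down, out RaycastHit floorHit, maxDistance) &&
            Vector3.Angle(floorHit.normal, Vector3.up) <= maxSlope)
        {
            point = floorHit.point;
            return true;
        }
        return false;
    }
}
```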
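
The parent-collider trick for wall collisions can be sketched as follows, assuming a capsule collider and a kinematic-free rigidbody on the rig parent (again, names are illustrative):

```csharp
using UnityEngine;

// Keeps the capsule collider on the rig parent aligned with the camera, so
// walking into a virtual wall in the real world pushes the whole rig back.
[RequireComponent(typeof(CapsuleCollider))]
public class BodyCollider : MonoBehaviour
{
    [SerializeField] private Transform head; // game camera (child of this rig)
    private CapsuleCollider capsule;

    private void Awake()
    {
        capsule = GetComponent<CapsuleCollider>();
    }

    private void FixedUpdate()
    {
        // Move the collider (not the rig) under the camera's horizontal position,
        // expressed in the parent's local space.
        Vector3 local = transform.InverseTransformPoint(head.position);
        capsule.center = new Vector3(local.x, capsule.center.y, local.z);
    }
}
```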
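
Voice commands via Windows.Speech are exposed in Unity through `UnityEngine.Windows.Speech.KeywordRecognizer`. A minimal sketch follows; the specific phrases and handlers are illustrative, not the project's actual command set:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using UnityEngine;
using UnityEngine.Windows.Speech;

// Recognizes a fixed set of spoken phrases and dispatches them to handlers.
public class VoiceCommands : MonoBehaviour
{
    private KeywordRecognizer recognizer;
    private readonly Dictionary<string, Action> commands = new Dictionary<string, Action>();

    private void Start()
    {
        // Example phrases only; the real command set would map to NPC behaviors.
        commands["check pulse"] = () => Debug.Log("NPC: checking pulse");
        commands["apply bandage"] = () => Debug.Log("NPC: applying bandage");

        recognizer = new KeywordRecognizer(commands.Keys.ToArray());
        recognizer.OnPhraseRecognized += OnPhraseRecognized;
        recognizer.Start();
    }

    private void OnPhraseRecognized(PhraseRecognizedEventArgs args)
    {
        if (commands.TryGetValue(args.text, out Action action))
            action();
    }

    private void OnDestroy()
    {
        if (recognizer != null)
        {
            if (recognizer.IsRunning)
                recognizer.Stop();
            recognizer.Dispose();
        }
    }
}
```

Note that `KeywordRecognizer` is available only on Windows platforms, which matches the project's target hardware (a Windows Mixed Reality headset).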
 

Testimonials

I've ordered a small MVP from VironIT, and they managed to do it under a tight deadline and with good quality. Now we are building an extended version of our MVP; I hope it will also be delivered on time.

– Jules MJD Maessen, Founder & Sales at VERO Visuals

I've been working with VironIT for almost a year. We have built several software solutions and platforms together. VironIT's performance is unbelievably high. Working with VironIT is easy. I strongly recommend this company.

– Janis Kondrats, CTO at Anatomy Next

We needed to create a proof of concept of Blockchain-based software for our investor, and to speed up the development we placed an order with VironIT. We liked the performance and communication level at all steps. I think I'll get back to you after we get funded. Thanks.

– Denis Tolstashov, CEO at Wimix LLC

Working with their talented iOS professionals is an absolute pleasure. We have an enjoyable time engaging in dialogue to solve complex user experience issues. Consistent communication and pace of delivery are what I appreciate most about working with this group.

– Eric Gardner, Founder & Managing Partner at Baltic Partners