For hundreds of years, medical students have dissected cadavers to learn about the inner workings of the human body. Today, however, virtual reality is emerging as a new standard for medical training and education: students can put on a headset while a teacher guides them through a lesson on a virtual human subject.
One healthcare startup decided to build a "Google Maps for the human body" using 3D and augmented reality technologies. It entrusted us with developing VR versions of two of its applications: Skull Anatomy and Head & Neck.
The Head & Neck app is a viewer for bones and muscles of the head and neck areas. To facilitate further exploration of the head, the app allows users to hide selected pieces or make them semi-transparent.
The Skull Anatomy application allows one to explore the skull in unprecedented detail, with the ability to move, rotate, scale, expand, fade, and hide the model of the skull. This application has two modes: View and Learn. View mode is similar to Head & Neck in the functions it offers; users can tap on any skull part to select it and get a summary of it, as well as select multiple skull parts to get a list of them. Learn mode provides more extensive functionality than View mode: each bone can be explored through reference information and easy-to-understand colored models.
Who is the Anatomy app for?
Virtual reality medical software is designed for a range of users, from medical and allied-health students to educators, healthcare professionals, patients, artists, and curious minds. It not only helps students grasp the challenging subject of anatomy, but it is also easily understood by individuals without a medical background. It is an advanced learning tool that could complement any anatomy curriculum and help people visualize and explore anatomy.
The application delivers accurate visual and textual information, immediate response time, and intuitive navigation. It satisfies the highest standards of medical and scientific accuracy. All anatomical definitions and clinical correlations are written by professors of anatomy and medical professionals.
To meet this challenge, we used the following technologies and tools:
- Unity 3D
- Mixed Reality Toolkit
- Acer Windows Mixed Reality Headset with Motion Controllers
The project is built with the Unity 3D engine. To simplify working with motion controllers, we used the Mixed Reality Toolkit asset, which can be downloaded from its GitHub repository.
Also, we strongly recommend reading through the contents of this link, which describes how to prepare your work environment for using Mixed Reality Toolkit-Unity in your Unity 3D project.
Features that have been implemented:
- Simple drag, resize, and rotate
- Explore the skull in unprecedented detail
There are, of course, many ways one could design the controller logic; we chose the design pattern called State Machine.
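To make the idea concrete, here is a minimal sketch of a table-driven state machine for controller input. The state and event names (Dragging, Scaling, GripDown, and so on) are illustrative assumptions, not the app's actual states; in the real project the transitions would be driven by Mixed Reality Toolkit input events.

```csharp
using System;
using System.Collections.Generic;

// Hypothetical controller states and input events for illustration only.
public enum ControllerState { Idle, Dragging, Scaling }
public enum ControllerEvent { GripDown, GripUp, BothGripsDown, OneGripUp }

// A minimal table-driven state machine: transitions are looked up
// by the (current state, incoming event) pair.
public class ControllerStateMachine
{
    private readonly Dictionary<(ControllerState, ControllerEvent), ControllerState> transitions =
        new Dictionary<(ControllerState, ControllerEvent), ControllerState>
        {
            { (ControllerState.Idle,     ControllerEvent.GripDown),      ControllerState.Dragging },
            { (ControllerState.Dragging, ControllerEvent.GripUp),        ControllerState.Idle },
            { (ControllerState.Dragging, ControllerEvent.BothGripsDown), ControllerState.Scaling },
            { (ControllerState.Scaling,  ControllerEvent.OneGripUp),     ControllerState.Dragging },
        };

    public ControllerState Current { get; private set; } = ControllerState.Idle;

    // Apply an event; unknown (state, event) pairs leave the state unchanged.
    public void Handle(ControllerEvent e)
    {
        if (transitions.TryGetValue((Current, e), out var next))
            Current = next;
    }
}
```

Keeping the transitions in a table like this makes it easy to add new gestures (for example, a rotation state) without touching the dispatch logic.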
To generate descriptions for skull parts, we created a ScriptableObject asset that allows one to display detailed information about the currently selected Object using the Inspector window (sometimes referred to as “the Inspector”).
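A ScriptableObject of this kind might look like the following sketch. The class and field names are assumptions for illustration; the real asset's schema is not shown in this article.

```csharp
using UnityEngine;

// A minimal ScriptableObject holding a skull part's description.
// Creating an asset from this class lets the text be edited directly
// in the Inspector window, separate from any scene object.
[CreateAssetMenu(fileName = "BoneDescription", menuName = "Anatomy/Bone Description")]
public class BoneDescription : ScriptableObject
{
    public string boneName;      // e.g. "Frontal bone" (hypothetical example)

    [TextArea(3, 10)]
    public string summary;       // detailed text shown when the part is selected
}
```

A component on the selected skull part can then hold a reference to its `BoneDescription` asset and feed `summary` into the UI text when the part is tapped.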
To display the text description, we have used World Space Canvas, which can be freely positioned and rotated in the Scene. You can put a Canvas on any wall, floor, ceiling, or slanted surface (or hanging freely in the air). Just use the normal Translate and Rotate tools in the toolbar.
Now let's look at how bone selection works. Each skull part is a separate 3D object: you can tap on a bone to select it and get its name and summary.
To get the name of the bone, we applied the following approach: we painted each skull part's texture with a unique color. When the controller's ray hits a pixel, we read its color and compare it against an array of color-name pairs.
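The color-matching step can be sketched as follows. In Unity the sampled color would come from `Texture2D.GetPixel` at the raycast hit's UV coordinates (`hit.textureCoord`); here we model colors as plain RGB values to show just the lookup, and the palette entries are hypothetical examples, not the app's actual color coding.

```csharp
using System;
using System.Collections.Generic;

// Plain RGB color, standing in for the pixel sampled from the lookup texture.
public struct Rgb
{
    public byte R, G, B;
    public Rgb(byte r, byte g, byte b) { R = r; G = g; B = b; }
}

public static class BoneLookup
{
    // Hypothetical palette: each skull part was painted a unique color.
    private static readonly List<(Rgb Color, string Name)> Palette = new List<(Rgb, string)>
    {
        (new Rgb(255, 0, 0), "Frontal bone"),
        (new Rgb(0, 255, 0), "Parietal bone"),
        (new Rgb(0, 0, 255), "Occipital bone"),
    };

    // Return the bone whose palette color is nearest to the sampled pixel;
    // nearest-match (rather than exact equality) tolerates texture filtering.
    public static string NameForColor(Rgb sampled)
    {
        string best = null;
        int bestDist = int.MaxValue;
        foreach (var (color, name) in Palette)
        {
            int dr = color.R - sampled.R, dg = color.G - sampled.G, db = color.B - sampled.B;
            int dist = dr * dr + dg * dg + db * db;
            if (dist < bestDist) { bestDist = dist; best = name; }
        }
        return best;
    }
}
```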
Inside the application, we provide a high-performance teleportation implementation that lets you jump longer distances instantly. Teleportation is a type of locomotion that allows the player to move around a VR environment with minimal discomfort: the user points the controller at a location they'd like to move to and initiates the teleportation action, and they are transitioned to that location via a rapid animation tuned to maintain user comfort.
To implement teleportation, we used a Line Renderer component. It takes an array of two or more points in 3D space and draws a straight line between each one, so a single Line Renderer can be used to draw anything from a simple straight line to a complex spiral. The line is always continuous; if you need to draw two or more completely separate lines, you should use multiple GameObjects, as each GameObject can have only one Line Renderer.
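A pointer-plus-teleport component along these lines could look like the sketch below. This is a minimal assumption-laden illustration: the field names, the `Fire1` button binding, and the instant position change are ours, while the actual app would use Mixed Reality Toolkit input events and a comfort animation.

```csharp
using UnityEngine;

// Sketch: draw a two-point pointer line from the controller and
// teleport the player rig to the hit point on button press.
public class TeleportPointer : MonoBehaviour
{
    public LineRenderer line;      // pointer line from controller to target
    public Transform player;       // rig root moved on teleport
    public float maxDistance = 10f;

    private Vector3? target;

    void Update()
    {
        // Cast a ray from the controller and draw the line to the hit point.
        if (Physics.Raycast(transform.position, transform.forward, out RaycastHit hit, maxDistance))
        {
            line.positionCount = 2;
            line.SetPosition(0, transform.position);
            line.SetPosition(1, hit.point);
            target = hit.point;
        }
        else
        {
            line.positionCount = 0;   // hide the line when nothing is hit
            target = null;
        }

        // Hypothetical trigger binding; an instant jump stands in for the
        // rapid comfort-tuned transition described above.
        if (target.HasValue && Input.GetButtonDown("Fire1"))
            player.position = target.Value;
    }
}
```

In practice you would also validate the hit surface (teleporting only onto the floor, not onto walls or the skull model itself).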
We compiled the following list of useful links during our research:
- Mixed Reality Toolkit-Unity. GitHub repository
- Unity creation engine
- The Inspector window
- Line Renderer
Where else can non-gaming applications for virtual reality be used?
Virtual and augmented reality are already in use for commercial gaming applications. A 2018 analyst forecast projects a steady increase in VR device shipments, estimated at around 20 million units.
But what about industries beyond gaming? A few markets are already starting to use this new technology, and more will likely follow as AR and VR mature and become cheaper for businesses to implement. Here are some areas where AR and VR may be applied outside gaming:
- Travel and tourism
- Mechanical engineering and industrial design
- Architecture and civil engineering
- Mental health