Interactive virtual anatomical model using a projector and sensors

A research group at Gifu University is developing a virtual anatomical model. It is currently being used as a medical teaching resource to educate users about cerebral nerves.

“This object has a sensor that detects its position and orientation, so when you move it like this, the picture moves with it. First of all, you can use this capability to bring the picture and the object together. When you look from that side, I think things appear quite distorted, but to someone looking through this window, which has another sensor, it appears as if there’s a virtual object within this object. To someone else it looks distorted, but seen through my eyes, the projected image stays in line with the moving 3D object.”
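The article doesn’t describe the group’s actual tracking or rendering code, but the core idea — the sensor reports the object’s pose, and the virtual model is transformed by that pose so the projected picture follows the physical object — can be sketched as below. The function names, the point data, and the single yaw angle standing in for full 3-DOF orientation are all illustrative assumptions, not the system’s real interface.

```python
import math

def pose_matrix(x, y, z, yaw):
    """Build a 4x4 rigid transform from a sensed position and yaw angle.

    A real tracker reports full 3-DOF orientation; a single yaw angle is
    an assumption made here to keep the sketch short.
    """
    c, s = math.cos(yaw), math.sin(yaw)
    return [
        [c,   -s,  0.0, x],
        [s,    c,  0.0, y],
        [0.0, 0.0, 1.0, z],
        [0.0, 0.0, 0.0, 1.0],
    ]

def transform(points, m):
    """Apply the sensed pose to the virtual model's vertices, so whatever
    is rendered from these vertices moves with the physical object."""
    out = []
    for px, py, pz in points:
        v = (px, py, pz, 1.0)
        out.append(tuple(sum(m[r][k] * v[k] for k in range(4))
                         for r in range(3)))
    return out

# Toy "virtual model" in the object's local frame (two vertices).
model = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]

# Sensor reading: the object moved to (2, 1, 0) and rotated 90 degrees.
m = pose_matrix(2.0, 1.0, 0.0, math.pi / 2)
print(transform(model, m))
```

Each new sensor reading produces a fresh matrix, and re-rendering with it is what keeps the projection locked to the object as it moves.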

Virtual anatomical model simulation enables users to easily understand 3D positions and structures. Consequently, it’s thought to be an effective way to learn about the structure of complicated, multi-branched facial nerves and the movements of expressive muscles.

“For example, when you try to display a straight line, projecting it onto a curved surface makes the line appear curved. We use a computer to solve the inverse relationship — a technique we call curvature compensation — so that a straight line is obtained even on a curved surface.”

“This system only requires objects, sensors, and a projector, so we think it could be commercialized quickly. Also, because it uses real objects, it provides natural haptic feedback without any special equipment. When you manipulate objects, the graphics move along with you, rather than you acting in empty space with a mouse and keyboard or hand sensors. This system can be used by anyone, with no need to learn special skills.”

