Game Console Technology in the Operating Room

In the future, a surgeon will need only a hand movement to browse through a patient's x-ray images during an operation. Until now, operating the monitors has been difficult because surgical team members are not allowed to touch any objects other than surgical instruments, due to the risk of infection. That's why experts from Siemens Healthcare and the company's global research unit, Corporate Technology, have adapted Microsoft's Xbox technology to enable surgeons to use hand and arm movements to change images, zoom in on details, and rotate three-dimensional images.

The system, which is not yet offered commercially, could be used in the future to provide support during minimally invasive operations, in which a catheter inserted through a small incision is used to perform the surgical procedure. The surgeon must have access to all the details of the patient’s anatomy during the entire operation. This is why monitors in the operating theater display cross-sectional x-ray images or three-dimensional x-ray pictures. Now the gesture control system enables the surgeon to switch back and forth between individual images or views.

The basis of the system is provided by the Xbox game console’s Kinect technology, which projects a dot pattern of infrared light into the room and records the scene with a camera from another angle. Three-dimensional objects or people distort the captured pattern, making it possible to calculate the distance between the light source and every point in the camera image. Out of these three-dimensional clouds made up of points, the software identifies the people in the room and recognizes their movements.
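The depth calculation described above amounts to triangulation: because the infrared projector and the camera sit a known distance apart, the apparent shift (disparity) of each projected dot encodes how far away the surface is. The sketch below illustrates this with a simplified pinhole-camera model; the function name and the focal-length and baseline values are illustrative assumptions, not parameters from the Siemens system.

```python
import numpy as np

def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Triangulate depth from the pixel shift of a projected dot.

    Simplified pinhole model (an assumption for illustration):
    a dot shifted by `disparity_px` pixels between the projector's
    expected position and the camera image lies at
    depth = focal_length * baseline / disparity.
    """
    disparity_px = np.asarray(disparity_px, dtype=float)
    return focal_px * baseline_m / disparity_px

# Hypothetical values: 580 px focal length, 7.5 cm projector-camera baseline
print(depth_from_disparity(29.0, 580.0, 0.075))  # → 1.5 (metres)
```

Repeating this for every dot in the camera image yields the three-dimensional point cloud from which the software then segments people and tracks their movements.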

However, translating hand movements into exact commands is no simple task. With this in mind, the experts from Corporate Technology developed algorithms that first identify the surgeon's hands within the point cloud. To this end, the algorithms analyze only the zone in front of the individual user, where the hands of the person operating the system are expected to appear. Once the system has distinguished the hands, it registers their movements without becoming "confused" by the presence of other people in the room. Experts from Siemens Healthcare created the link between the gesture recognition and the display of the x-ray images. They defined the required commands; for example, spreading the hands apart enlarges a section of an image. Following successful prototype tests, the technology will be tested in two hospitals under semi-realistic conditions, in other words, without patients.
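The two ideas in this paragraph, restricting attention to a zone in front of the user and mapping hand spread to magnification, can be sketched as follows. This is a minimal illustration under assumed coordinates and thresholds (all function names, the interaction-zone box, and the reference spread of 30 cm are hypothetical), not the actual Siemens algorithms.

```python
import numpy as np

def points_in_interaction_zone(points, zone_min, zone_max):
    """Keep only 3-D points inside the box in front of the user where
    the hands are expected; points from other people in the room fall
    outside the box and are ignored."""
    points = np.asarray(points, dtype=float)
    mask = np.all((points >= zone_min) & (points <= zone_max), axis=1)
    return points[mask]

def zoom_factor(left_hand, right_hand, ref_spread_m=0.3):
    """Map the distance between the two detected hands to a zoom factor:
    hands twice the reference spread apart give 2x magnification."""
    spread = np.linalg.norm(np.asarray(right_hand) - np.asarray(left_hand))
    return spread / ref_spread_m

# Hypothetical zone: 1 m wide, 1 m tall, 0.5-1.2 m in front of the camera
zone_min = np.array([-0.5, -0.5, 0.5])
zone_max = np.array([0.5, 0.5, 1.2])
cloud = np.array([[0.1, 0.0, 0.8],    # a hand inside the zone
                  [2.0, 0.0, 0.8]])   # another person, outside the zone
print(points_in_interaction_zone(cloud, zone_min, zone_max))

# Hands 60 cm apart with a 30 cm reference spread
print(zoom_factor([0.0, 0.0, 1.0], [0.6, 0.0, 1.0]))  # → 2.0
```

In a real pipeline the filtered points would then be clustered into left and right hand before the spread is measured; the sketch skips that step for brevity.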
