Hand gesture guided robot-assisted surgery based on a direct augmented reality interface
Published in Computer Methods and Programs in Biomedicine, 2014
Recommended citation: R. Wen, W.-L. Tay, B. P. Nguyen, C.-B. Chng, and C.-K. Chui, "Hand gesture guided robot-assisted surgery based on a direct augmented reality interface," Computer Methods and Programs in Biomedicine, vol. 116, no. 2, pp. 68-80, 2014.
Radiofrequency (RF) ablation is a good alternative to hepatic resection for the treatment of liver tumors. However, accurate needle insertion requires precise hand-eye coordination and is further complicated by the difficulty of RF needle navigation. This paper proposes a cooperative surgical robot system, guided by hand gestures and supported by an augmented reality (AR)-based surgical field, for robot-assisted percutaneous treatment. It establishes a robot-assisted natural AR guidance mechanism that combines the advantages of three elements: AR visual guidance information, the surgeon's experience, and the accuracy of robotic surgery. A projector-based AR environment is directly overlaid on the patient to display preoperative and intraoperative information, while a mobile surgical robot system executes specified RF needle insertion plans. Natural hand gestures serve as an intuitive and robust method for interacting with both the AR system and the surgical robot. The proposed system was evaluated on a mannequin model. Experimental results demonstrated that hand gesture guidance was able to effectively guide the surgical robot, and the robot-assisted implementation was found to improve the accuracy of needle insertion. This human–robot cooperative mechanism is a promising approach for precise transcutaneous ablation therapy.