
Validation of Markerless Motion Tracking for Quantification of Surgical Skills

Biomedical Engineering

Ashton Abaya, Tahmid Anwar, Omar Harraz, Zachary Emmanuel Miranda, and Thao Tran

Abstract

The quantification of surgical skill is critical for assessing the effectiveness of surgical training and for optimizing the performance of skilled surgeons. Hand motion tracking in simulated surgery is emerging as a promising tool for extracting motion features that quantify surgical skill. Because traditional motion tracking systems require wearing physical markers, which can restrict natural movement, markerless motion tracking is a potential avenue for tracking hand motion during simulated or real surgeries. Our project develops a real-time algorithm, built on Google's MediaPipe, for tracking markerless 3D hand trajectories, which is validated against the marker-based PhaseSpace system. The markerless data were recorded with an Intel RealSense Depth Camera (D435), while the PhaseSpace system used 10 infrared (IR) cameras to track fingertip markers. The MediaPipe-based algorithm processes pre-recorded video files to generate 3D fingertip trajectories, whereas PhaseSpace directly outputs 3D coordinates from marker movements. For validation, the 3D fingertip trajectories from both systems were compared by plotting the coordinate data and computing Euclidean distances between corresponding points. This analysis evaluates the accuracy and feasibility of using markerless systems in surgical simulation.
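The comparison described above, per-point Euclidean distance between corresponding samples of the two 3D fingertip trajectories, can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes both trajectories have already been time-synchronized and expressed in the same coordinate frame, and all variable names and values are hypothetical.

```python
import numpy as np

def trajectory_error(markerless, marker_based):
    """Per-point Euclidean distances between two N x 3 trajectories,
    plus the mean distance as a simple accuracy summary.

    Assumes the trajectories are time-aligned and share a coordinate
    frame; in practice the data would first need synchronization and
    registration between the RealSense and PhaseSpace frames.
    """
    markerless = np.asarray(markerless, dtype=float)
    marker_based = np.asarray(marker_based, dtype=float)
    # Row-wise Euclidean norm of the coordinate differences
    distances = np.linalg.norm(markerless - marker_based, axis=1)
    return distances, distances.mean()

# Toy trajectories (illustrative values, units arbitrary)
mediapipe_xyz = np.array([[0.0, 0.0, 0.0], [1.0, 1.0, 1.0]])
phasespace_xyz = np.array([[0.0, 0.0, 0.0], [1.0, 1.0, 4.0]])

dists, mean_err = trajectory_error(mediapipe_xyz, phasespace_xyz)
print(dists)     # per-point distances: [0. 3.]
print(mean_err)  # mean error: 1.5
```

In practice the mean, median, and maximum of these distances (and their variation across trials) would together characterize how closely the markerless trajectories match the marker-based reference.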

