Transformer Based Tissue Classification in Robotic Needle Biopsy

Fanxin Wang*, Yikun Cheng, Sudipta S Mukherjee, Rohit Bhargava, Thenkurussi Kesavadas

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference Proceeding › peer-review

Abstract

Image-guided minimally invasive robotic surgery is commonly employed for tasks such as needle biopsies or localized therapies. However, the nonlinear deformation of various tissue types makes precise needle-tip placement difficult, particularly when surgeons rely on low-fidelity biopsy imaging systems. In this paper, we introduce a method to classify needle biopsy interventions and identify tissue types based on a comprehensive needle-tissue contact model that incorporates both position and force parameters. We trained a transformer model on a dataset collected from a previously developed robotics platform, consisting of synthetic and porcine tissue from various locations (liver, kidney, heart, belly, hock) annotated with interaction phases (pre-puncture, puncture, post-puncture, neutral). The model achieves a classification accuracy of 0.93. The proposed method can assist surgeons in identifying transitions between tissue types, improving tissue awareness during needle insertion.
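The abstract does not give implementation details, but the described approach — a transformer classifying tissue type from needle position and force signals — can be sketched as follows. This is a minimal, hypothetical PyTorch sketch: the window length, feature layout, layer sizes, and sampling rate are assumptions for illustration, not the authors' configuration.

```python
# Hypothetical sketch of a transformer encoder classifying needle-tissue
# interaction windows from position/force time series. Dimensions, window
# length, and sampling rate are illustrative assumptions.
import torch
import torch.nn as nn

# Tissue locations and interaction phases as listed in the abstract.
TISSUE_CLASSES = ["synthetic", "liver", "kidney", "heart", "belly", "hock"]
PHASES = ["neutral", "pre-puncture", "puncture", "post-puncture"]


class NeedleInteractionTransformer(nn.Module):
    def __init__(self, n_features=2, d_model=64, n_heads=4, n_layers=2,
                 max_len=512, n_classes=len(TISSUE_CLASSES)):
        super().__init__()
        # Project raw (position, force) samples into the model dimension.
        self.input_proj = nn.Linear(n_features, d_model)
        # Learned positional embedding over the time axis.
        self.pos_embed = nn.Parameter(torch.zeros(1, max_len, d_model))
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=n_layers)
        # Pool over time, then classify the whole window.
        self.head = nn.Linear(d_model, n_classes)

    def forward(self, x):
        # x: (batch, time, n_features), e.g. needle-tip position and axial force
        h = self.input_proj(x) + self.pos_embed[:, : x.size(1)]
        h = self.encoder(h)
        return self.head(h.mean(dim=1))  # logits over tissue classes


# Example: classify a batch of 1-second windows sampled at an assumed 100 Hz.
model = NeedleInteractionTransformer()
windows = torch.randn(8, 100, 2)   # 8 windows x 100 samples x (position, force)
logits = model(windows)            # shape (8, len(TISSUE_CLASSES))
predictions = logits.argmax(dim=-1)
```

A second head (or a separate model of the same shape with `n_classes=len(PHASES)`) could predict the interaction phase labels; the abstract does not specify whether tissue type and phase are predicted jointly.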
Original language: English
Title of host publication: 2024 IEEE International Conference on Systems, Man, and Cybernetics (SMC 2024)
Publication status: Accepted/In press - 2024

