Breast-conserving surgery is a well-accepted breast cancer treatment. However, accurately localizing the tumor during surgery remains challenging, and the guidance provided by current methods is one-dimensional distance information, which is indirect and unintuitive. These limitations contribute to a high re-excision rate and prolonged surgical time. To address them, we have developed a fiber-delivered optoacoustic guide (OG), which mimics a traditional localization guide wire and is preoperatively placed in the tumor mass, together with an augmented reality (AR) system that provides real-time visualization of the tumor location with sub-millimeter variance. Through a nano-composite light-diffusion sphere and light-absorbing layer formed on the tip of an optical fiber, the OG creates an omnidirectional acoustic source inside the tumor mass under pulsed laser excitation. The generated optoacoustic signal has a high dynamic range (~58 dB) and spreads over a wide apex angle of 320 degrees. An acoustic radar with three ultrasound transducers is then attached to the breast skin and triangulates the location of the OG tip. With the AR system sensing the location of the acoustic radar, the position of the OG tip inside the tumor relative to the AR display is calculated and rendered. This gives surgeons direct visual feedback of the tumor location, easing intraoperative surgical planning and saving surgical time. A proof-of-concept experiment using a tablet and a stereo-vision camera demonstrates 0.25 mm tracking variance.
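The triangulation step described above, locating the OG tip from the ranges measured by the three skin-mounted transducers, can be sketched as a range-based least-squares localization. The geometry, units, and function names below are illustrative assumptions for a minimal sketch, not the authors' implementation; in particular, the initial guess is assumed to lie below the skin plane to select the physically meaningful of the two mirror-symmetric solutions.

```python
import numpy as np

def trilaterate(sensors, dists, x0, iters=50):
    """Estimate a source position from ranges to known sensors via Gauss-Newton.

    sensors: (3, 3) array of transducer positions.
    dists:   (3,) measured ranges (e.g. acoustic time-of-flight x speed of sound).
    x0:      initial guess; it disambiguates the mirror solution across the
             sensor plane, so it should lie on the tissue side of the skin.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        diff = x - sensors                   # vectors from each sensor to the estimate
        r = np.linalg.norm(diff, axis=1)     # ranges predicted by the current estimate
        J = diff / r[:, None]                # Jacobian of the ranges w.r.t. x
        # Gauss-Newton step: minimize the sum of squared range residuals
        dx, *_ = np.linalg.lstsq(J, dists - r, rcond=None)
        x = x + dx
    return x

# Hypothetical geometry (units: mm): three transducers on the skin plane z = 0
sensors = np.array([[0.0, 0.0, 0.0], [40.0, 0.0, 0.0], [20.0, 35.0, 0.0]])
tip = np.array([18.0, 12.0, -25.0])              # assumed true OG tip, below the skin
dists = np.linalg.norm(sensors - tip, axis=1)    # noise-free simulated ranges
est = trilaterate(sensors, dists, x0=[20.0, 15.0, -10.0])
```

On noise-free ranges the iteration recovers the simulated tip position; with real time-of-flight measurements, more than three transducers or a regularized solver would be needed to average out noise.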