Electroporation
Biomedical engineering
In vivo
Materials science
Irreversible electroporation
Medicine
Chemistry
Biology
Biochemistry
Gene
Biotechnology
Authors
Kenneth N. Aycock, Sabrina N. Campelo, Zaid S. Salameh, Joshua M.K. Davis, David A. Iannitti, Iain H. McKillop, Rafael V. Davalos
Identifier
DOI:10.1109/tbme.2024.3468159
Abstract
Irreversible electroporation (IRE) is a minimally thermal tissue ablation modality used to treat solid tumors adjacent to critical structures. Widespread clinical adoption of IRE has been limited due to complicated anesthetic management requirements and technical demands associated with placing multiple needle electrodes in anatomically challenging environments. High-frequency irreversible electroporation (H-FIRE) delivered using a novel single-insertion bipolar probe system could potentially overcome these limitations, but ablation volumes have remained small using this approach. While H-FIRE is minimally thermal in mode of action, high voltages or multiple pulse trains can lead to unwanted Joule heating. In this work, we improve the H-FIRE waveform design to increase the safe operating voltage using a single-insertion bipolar probe before electrical arcing occurs. By uniformly increasing interphase and interpulse delays, we achieved higher maximum operating voltages for all pulse lengths. Additionally, increasing pulse length led to higher operating voltages up to a certain delay length (25 μs), after which shorter pulses enabled higher voltages. We then delivered novel H-FIRE waveforms via an actively cooled single-insertion bipolar probe in swine liver in vivo to determine the upper limits to ablation volume possible using a single-needle H-FIRE device. Ablations up to 4.62 ± 0.12 cm × 1.83 ± 0.05 cm were generated in 5 minutes without a requirement for cardiac synchronization during treatment. Ablations were minimally thermal, easily visualized with ultrasound, and stimulated an immune response 24 hours post-H-FIRE delivery. These data suggest H-FIRE can rapidly produce clinically relevant, minimally thermal ablations with a more user-friendly electrode design.