Medicine
Imaging phantom
Stent
Lumen (anatomy)
Image quality
Coronary artery
Nuclear medicine
Radiology
Coronary stent
Stenosis
Cardiac imaging
Biomedical engineering
Restenosis
Artery
Surgery
Artificial intelligence
Image (mathematics)
Computer science
Authors
Arwed Elias Michael,Denise Schoenbeck,Jendrik Becker-Assmann,Julius Henning Niehoff,Thomas Flohr,Bernhard Schmidt,Christoph Panknin,Matthias Baer‐Beck,Tilman Hickethier,David Maintz,Alexander C. Bunck,Jan Borggrefe,Marcus Wiemer,Volker Rudolph,Jan Robert Kroeger
Identifier
DOI:10.1016/j.ejrad.2023.110983
Abstract
Imaging stents and in-stent stenosis remains a challenge in coronary computed tomography angiography (CCTA). Compared with conventional computed tomography, photon-counting CT (PCCT) offers decisive clinical advantages, including low-dose, ultra-high-resolution imaging of the coronary arteries. This work investigates image quality in CCTA using clinically established reconstruction kernels and kernels optimized for cardiac stent imaging in PCCT, for in-vitro stent imaging in both the 400 μm standard resolution mode (SRM) and the 200 μm ultra-high resolution mode (UHR).

Based on experimental scans, vascular reconstruction kernels (Bv56, Bv64, Bv72) were optimized. In an established phantom, 10 different coronary stents with 3 mm diameter were scanned on the first clinically available PCCT. Scans were reconstructed with the clinically established and the optimized kernels. Four readers measured the visible stent lumen, performed ROI-based density measurements, and rated image quality.

Regarding the visible stent lumen, UHR is significantly superior to SRM (p < 0.001). At all levels, the optimized kernels are superior to the clinically established kernels (p < 0.001). One optimized kernel showed a significant reduction of noise compared with the clinically established kernels. Overall image quality is improved with the optimized kernels.

In this phantom study, PCCT UHR with kernels optimized for stent imaging significantly improves the ability to assess the in-stent lumen of small cardiac stents. We recommend using UHR with an optimized sharp vascular reconstruction kernel (Bv72uo) for imaging of cardiac stents.
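The ROI-based density measurements mentioned above can be illustrated with a minimal sketch: mean attenuation (in HU) and image noise (standard deviation) inside a circular ROI placed in the stent lumen of a single CT slice. The function name, ROI placement, and the synthetic data below are illustrative assumptions, not the authors' actual analysis pipeline.

```python
# Minimal sketch (assumption): ROI-based mean HU and noise measurement
# on a single CT slice stored as a 2-D NumPy array of Hounsfield units.
# ROI center/radius and the synthetic slice are hypothetical examples.
import numpy as np

def roi_density_stats(slice_hu: np.ndarray, center_rc: tuple, radius_px: float):
    """Return (mean HU, standard deviation) inside a circular ROI.

    slice_hu  : 2-D array of HU values (rows, cols)
    center_rc : ROI center as (row, col) in pixel coordinates
    radius_px : ROI radius in pixels
    """
    rows, cols = np.indices(slice_hu.shape)
    mask = (rows - center_rc[0]) ** 2 + (cols - center_rc[1]) ** 2 <= radius_px ** 2
    roi = slice_hu[mask]
    return float(roi.mean()), float(roi.std())

# Example with synthetic data: a noisy contrast-filled "lumen" of ~300 HU.
rng = np.random.default_rng(0)
phantom_slice = rng.normal(loc=300.0, scale=25.0, size=(512, 512))
mean_hu, noise_sd = roi_density_stats(phantom_slice, center_rc=(256, 256), radius_px=10)
print(f"ROI mean: {mean_hu:.1f} HU, noise (SD): {noise_sd:.1f} HU")
```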