Medicine
Interquartile range
Total anomalous pulmonary venous connection
Surgery
Hazard ratio
Vein
Pulmonary vein
Cardiology
Internal medicine
Confidence interval
Ablation
Authors
Aditya Sengupta, Kimberlee Gauvreau, Aditya K. Kaza, Christopher W. Baird, David N. Schidlow, Pedro J. del Nido, Meena Nathan
Identifier
DOI: 10.1016/j.athoracsur.2022.05.058
Abstract
Outcomes after total anomalous pulmonary venous connection (TAPVC) repair remain suboptimal due to recurrent pulmonary vein (PV) obstruction requiring reinterventions. We sought to develop a clinical prediction rule for PV reintervention after TAPVC repair.

Data from consecutive patients who underwent TAPVC repair at a single institution from January 1980 to January 2020 were retrospectively reviewed after Institutional Review Board approval. The primary outcome was postdischarge (late) unplanned PV surgical or transcatheter reintervention. Echocardiographic criteria were used to assess PV residual lesion severity at discharge (class 1: no residua; class 2: minor residua; class 3: major residua). Competing risk models were used to develop a weighted risk score for late reintervention.

Of 437 patients who met entry criteria, there were 81 (18.5%) reinterventions at a median follow-up of 15.6 (interquartile range, 5.5-22.2) years. On univariable analysis, minor and major PV residua, age, single-ventricle physiology, infracardiac and mixed TAPVC, and preoperative obstruction were associated with late reintervention (all P < .05). The final risk prediction model included PV residua (class 2: subdistribution hazard ratio [SHR], 4.8; 95% CI, 2.8-8.1; P < .001; class 3: SHR, 6.4; 95% CI, 3.5-11.7; P < .001), age <1 year (SHR, 3.3; 95% CI, 1.3-8.5; P = .014), and preoperative obstruction (SHR, 1.8; 95% CI, 1.1-2.8; P = .015). A risk score comprising PV residua (class 2 or 3: 3 points), age (neonate or infant: 2 points), and obstruction (1 point) was formulated. Higher risk scores were significantly associated with worse freedom from reintervention (P < .001).

A risk prediction model of late reintervention may guide prognostication of high-risk patients after TAPVC repair.
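The weighted risk score described above is simple additive arithmetic over three dichotomized predictors. The following is a minimal sketch of that calculation, assuming the point values stated in the abstract (PV residua class 2 or 3: 3 points; age under 1 year: 2 points; preoperative obstruction: 1 point); the function and parameter names are illustrative, not from the paper.

```python
def tapvc_reintervention_score(residua_class: int,
                               age_under_1yr: bool,
                               preop_obstruction: bool) -> int:
    """Return the weighted risk score (0-6) for late PV reintervention.

    residua_class: discharge echocardiographic class
        (1 = no residua, 2 = minor residua, 3 = major residua).
    age_under_1yr: True if the patient was a neonate or infant at repair.
    preop_obstruction: True if preoperative PV obstruction was present.
    """
    score = 0
    if residua_class in (2, 3):   # minor or major PV residua: 3 points
        score += 3
    if age_under_1yr:             # neonate or infant at repair: 2 points
        score += 2
    if preop_obstruction:         # preoperative obstruction: 1 point
        score += 1
    return score


# Example: an infant with minor residua and preoperative obstruction
# scores 3 + 2 + 1 = 6, the highest-risk stratum.
print(tapvc_reintervention_score(residua_class=2,
                                 age_under_1yr=True,
                                 preop_obstruction=True))  # -> 6
```

Per the abstract, higher totals on this 0-6 scale were associated with worse freedom from late reintervention; how the authors stratified specific score values into risk groups is not detailed here.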