Medicine
Linezolid
Logistic regression
Internal medicine
Platelet
Shock (circulatory)
Retrospective cohort study
Risk factor
Genetics
Biology
Bacteria
Vancomycin
Staphylococcus aureus
Authors
Xiaonian Han, Jinping Wang, Xin Zan, Lirong Peng, Xiaojing Nie
Identifier
DOI:10.1007/s11096-021-01342-y
Abstract
Background Previous reports on risk factors for linezolid-induced thrombocytopenia have been insufficient, often due to variability in study design and population, and some factors have not yet been studied.
Aim The aims of this study were to determine potential risk factors for linezolid-induced thrombocytopenia and to analyze the influencing factors under different thrombocytopenia definitions.
Method This retrospective study involved patients who received intravenous linezolid for ≥ 1 day between January 1, 2015 and January 1, 2021. Their demographic and clinical data were extracted from electronic medical records. Thrombocytopenia was defined as: (1) a platelet count < 100 × 10⁹/L together with a decrease of 25% or more from the baseline platelet count (criterion 1); (2) a platelet count decrease of 25% or more from baseline (criterion 2). Risk factors were determined by binary logistic regression analysis.
Results This study included 320 patients. Binary logistic regression analysis indicated that baseline platelet count (p < 0.001), linezolid therapy duration (p = 0.001), and shock (requiring vasoactive medications) (p = 0.019) were independent risk factors for criterion-1 thrombocytopenia, while linezolid therapy duration (p < 0.001) and shock (p = 0.015) were independent risk factors for criterion-2 thrombocytopenia. There was also a significant correlation between shock and early-onset thrombocytopenia (p = 0.005 and p = 0.019 for criterion 1 and criterion 2, respectively).
Conclusion Linezolid therapy duration and shock were risk factors common to both thrombocytopenia definitions; shock was correlated with early-onset thrombocytopenia. Platelet count should be monitored during linezolid therapy, especially during long-duration therapy and in patients with shock.
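The two thrombocytopenia definitions in the abstract are simple threshold rules on baseline and follow-up platelet counts. A minimal sketch of how they could be encoded is shown below; the function names are hypothetical, counts are assumed to be in units of ×10⁹/L, and this is an illustration of the stated criteria, not the authors' analysis code.

```python
def meets_criterion_1(baseline: float, current: float) -> bool:
    """Criterion 1: platelet count < 100 x 10^9/L AND a drop of
    25% or more from the baseline platelet count."""
    drop_fraction = (baseline - current) / baseline
    return current < 100 and drop_fraction >= 0.25


def meets_criterion_2(baseline: float, current: float) -> bool:
    """Criterion 2: a drop of 25% or more from baseline,
    regardless of the absolute platelet count."""
    return (baseline - current) / baseline >= 0.25
```

Note that criterion 2 is strictly broader than criterion 1: every patient flagged by criterion 1 is also flagged by criterion 2, which is why the study examines the two definitions separately.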