Keywords
Doping
Materials science
Electrochemistry
Cathode
Electrical conductivity
Lithium
Ion
Valence (chemistry)
Hydrothermal circulation
Nanotechnology
Analytical chemistry
Chemical engineering
Chemistry
Electrode
Optoelectronics
Physical chemistry
Medicine
Engineering
Chromatography
Endocrinology
Organic chemistry
Authors
Jun Cong,Shaohua Luo,Pengyu Li,Xin Yan,Lixiong Qian,Shengxue Yan
Identifier
DOI:10.1016/j.apsusc.2023.158646
Abstract
Na3V2(PO4)3 (NVP) has become a widely studied cathode material for sodium-ion batteries (SIBs) because of its unique structural advantages. However, its poor electronic conductivity and sluggish ion diffusion fundamentally limit its practical development. To address these defects, a series of Li+-doped Na3−xLixV2(PO4)3/C cathode materials for SIBs is synthesized by a hydrothermal-assisted sol–gel method. The cell volume decreases steadily as the Li+ doping level increases, consistent with the expected doping effect. Moreover, as the lithium content increases, Li+ preferentially occupies the Na(2) site and then the Na(1) site. First-principles calculations show that the introduction of Li+ significantly enhances the conductivity of the material system and weakens electron localization. TEM analysis of the (1 1 6) crystal plane shows that Li doping reduces the interplanar spacing. V XANES characterization shows that the change of V valence during charge–discharge of Na2.8Li0.2V2(PO4)3/C is consistent with that of NVP, and ex-situ XRD in the high-voltage charged state reveals no conventional phase transition. These facts indicate that lithium doping yields a stable NVP solid solution. Electrochemical tests show that Na2.8Li0.2V2(PO4)3/C delivers a reversible capacity of 116.9 mAh/g (theoretical capacity 117 mAh/g) with a capacity retention of 99.82 % after 500 cycles. GITT results show that Li+ doping significantly increases the Na+ diffusion coefficient of the material. This excellent electrochemical performance is attributed to Li+ doping activating the Na+ at the Na(2) site and improving the reaction efficiency.
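The 117 mAh/g theoretical capacity quoted above follows from the usual two-electron (two-Na+) V3+/V4+ reaction of Na3V2(PO4)3, and GITT diffusion coefficients are normally evaluated with the Weppner–Huggins relation. The minimal Python sketch below reproduces the capacity figure and illustrates the GITT formula; the GITT inputs (pulse duration, mass, molar volume, electrode area, voltage steps) are hypothetical placeholder values, not data from the paper.

```python
import math

# Sanity check of the ~117 mAh/g theoretical capacity of Na3V2(PO4)3,
# assuming the usual two-Na+ (two-electron) V3+/V4+ reaction.
F = 96485.0                                            # Faraday constant, C/mol
M_Na, M_V, M_P, M_O = 22.990, 50.942, 30.974, 15.999   # molar masses, g/mol
M_NVP = 3 * M_Na + 2 * M_V + 3 * (M_P + 4 * M_O)       # Na3V2(PO4)3, ~455.8 g/mol
n = 2                                                  # electrons per formula unit

Q_theory = n * F / (3.6 * M_NVP)                       # gravimetric capacity, mAh/g
print(f"Theoretical capacity of NVP ≈ {Q_theory:.1f} mAh/g")  # ≈ 117.6 mAh/g


def gitt_diffusivity(tau, m_B, V_M, M_B, S, dE_s, dE_tau):
    """Chemical diffusion coefficient (cm^2/s) from a single GITT titration step,
    via the Weppner-Huggins relation
    D = (4 / (pi * tau)) * (m_B * V_M / (M_B * S))**2 * (dE_s / dE_tau)**2.

    tau:    current-pulse duration (s)
    m_B:    active-material mass (g)
    V_M:    molar volume (cm^3/mol)
    M_B:    molar mass (g/mol)
    S:      electrode-electrolyte contact area (cm^2)
    dE_s:   steady-state voltage change after relaxation (V)
    dE_tau: voltage change during the pulse, IR drop excluded (V)
    """
    return (4.0 / (math.pi * tau)) * (m_B * V_M / (M_B * S)) ** 2 * (dE_s / dE_tau) ** 2


# Hypothetical GITT step (placeholder numbers, for illustration only):
D_Na = gitt_diffusivity(tau=600, m_B=0.005, V_M=240.0, M_B=M_NVP,
                        S=1.13, dE_s=0.008, dE_tau=0.030)
print(f"Example GITT diffusion coefficient ≈ {D_Na:.2e} cm^2/s")
```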