Materials science
Cathode
Lattice
Piezoelectricity
Surface modification
Voltage
Chemical engineering
Oxygen
Doping
High voltage
Chemical physics
Ion
Optoelectronics
Composite material
Electrical engineering
Physical chemistry
Organic chemistry
Acoustics
Engineering
Physics
Chemistry
Authors
Yizhen Huang,Dan Su,Lu Zheng,Guangchang Yang,Shuo Li,Juantao Jiang,Qichang Pan,Sijiang Hu,Yaohao Li,Qingyu Li,Hongqiang Wang,Fenghua Zheng,Xing Ou
Identifier
DOI:10.1016/j.ensm.2024.103678
Abstract
Ni-rich cathode materials cycled above 4.5 V can achieve a high specific capacity of over 200 mAh·g⁻¹. However, redox-induced migration of oxygen anions from the bulk to the surface under high voltage tends to cause the loss of lattice oxygen, leading to severe degradation of long-term cycling stability. Here, a well-designed surface modification strategy based on a piezoelectric material is proposed to suppress the lattice oxygen evolution of Ni-rich cathode materials. Specifically, LiGaO2 is employed as the surface modification material, constructing a lattice-compatible piezoelectric layer on the surface of LiNi0.8Co0.1Mn0.1O2 (NCM). This layer promotes strong interfacial interaction and thereby eliminates interfacial strain. Meanwhile, the introduction of LiGaO2 triggers the piezoelectric effect and spontaneously induces a built-in electric field, which effectively suppresses lattice oxygen evolution during high-voltage cycling. Furthermore, sub-surface Ga doping mitigates the migration of Ni ions in the bulk, maintaining the stability of transition metal-oxygen bonds in the deeply charged state. As expected, the modified Ni-rich cathode exhibits impressive cycling stability even under harsh conditions, with high capacity retentions of 78.2 % at a high rate of 5 C after 350 cycles and 75.6 % at a high voltage of 4.7 V after 200 cycles at 1 C. In terms of practicality, it also delivers extraordinary capacity retention of 95.7 % in a pouch-type full battery at 0.5 C after 450 cycles. This innovative strategy offers new insight into suppressing lattice oxygen evolution under high-voltage cycling by manipulating the surface chemistry of Ni-rich cathode materials.
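The capacity-retention figures quoted above (e.g. 78.2 % after 350 cycles) are simply the ratio of the remaining specific capacity to the initial specific capacity. A minimal sketch of that arithmetic, using hypothetical capacity values chosen only for illustration (the paper reports the retention percentages, not these raw numbers):

```python
def capacity_retention(initial_mAh_g: float, final_mAh_g: float) -> float:
    """Capacity retention as a percentage of the initial specific capacity."""
    return 100.0 * final_mAh_g / initial_mAh_g

# Hypothetical example: a cathode delivering 200 mAh/g initially and
# 156.4 mAh/g after cycling corresponds to 78.2 % retention.
print(round(capacity_retention(200.0, 156.4), 1))
```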