Materials science
Grain boundary
Raman spectroscopy
Sintering
Ionic conductivity
Ceramic
Amorphous solid
Electrolyte
Phonon
Fast ion conductor
Ion
Isothermal process
Ionic bonding
Electrical conductivity
Analytical Chemistry (journal)
Chemical physics
Condensed matter physics
Composite material
Electrode
Crystallography
Thermodynamics
Physical chemistry
Chemistry
Microstructure
Optics
Physics
Organic chemistry
Chromatography
Authors
Sayan Ghosh, C. Sudarshan, C. Sudakar
Abstract
Lithium ions shuttle between electrodes through the ceramic solid electrolyte, crossing grain-boundary regions, in a solid-state Li-ion battery. This work demonstrates how phonon vibrations are altered by sintering conditions and how grain boundaries (GBs) can be exploited to enhance the ionic conductivity of solid electrolytes. GB-engineered Li1.3Al0.3Ti1.7(PO4)3 (LATP) ceramics are prepared by a sol-gel process and sintered under different conditions, viz., spark plasma sintering (SPS) and conventional isothermal sintering (CIS). The former exhibits GB regions with amorphous characteristics, whereas the latter shows sharp boundaries between crystalline grains. The LATP-SPS ceramic shows two orders of magnitude higher ionic conductivity (σ = 1.02 × 10−5 S/cm at 300 K and 100 Hz) than LATP-CIS. We investigate the interrelation between lattice vibrations and lithium-ion migration by monitoring the changes in the vibrational mode characteristics of the LATP ceramics through temperature-dependent Raman spectroscopy. Raman modes of LATP-SPS exhibit a higher Raman shift (∼2 cm−1 at 123 K) than those of the LATP-CIS pellet, owing to an increased defect density, preferentially from the grain-boundary regions. Most of the vibrational modes undergo a red shift (∼10 cm−1) with increasing temperature, except for the O–P–O bending mode [A1g(3)], which exhibits a blue shift (∼3 cm−1). These observations correlate with interstitial ionic migration in the LATP framework. The force constants of the observed Raman modes suggest that lithium-ion migration is assisted significantly by dynamic structural changes of the (PO4)3− sublattice. Anharmonicities observed from temperature-dependent changes in the Raman profiles are explained using three-phonon and four-phonon scattering processes, which lower the migration barrier and hence contribute to the higher ionic conductivity.
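The connection the abstract draws between Raman shifts and force constants can be made explicit with the standard harmonic-oscillator relation; the sketch below is a generic textbook approximation, not necessarily the authors' exact treatment, and the reduced mass μ of the P–O oscillator and the illustrative mode frequency are assumptions made here for clarity.

\[
\tilde{\nu} = \frac{1}{2\pi c}\sqrt{\frac{k}{\mu}}
\quad\Longrightarrow\quad
k = \mu\,(2\pi c\,\tilde{\nu})^{2},
\qquad
\frac{\Delta k}{k} \approx \frac{2\,\Delta\tilde{\nu}}{\tilde{\nu}} .
\]

For illustration, if a PO4-related stretching mode sits near ∼1000 cm−1 (an assumed, typical value), the observed ∼10 cm−1 red shift corresponds to roughly a 2% softening of the effective force constant, which is the sense in which dynamic changes of the (PO4)3− sublattice can assist lithium-ion migration.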
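The three- and four-phonon contributions to the temperature dependence of a mode frequency are commonly modeled with a Balkanski-type expression; the form below is that standard model with fit parameters A and B, given as a plausible sketch rather than the authors' specific fitting function, and the closing Arrhenius relation is the usual way a lowered migration barrier E_a maps onto a higher ionic conductivity.

\[
\omega(T) = \omega_0
+ A\!\left[1 + \frac{2}{e^{x}-1}\right]
+ B\!\left[1 + \frac{3}{e^{y}-1} + \frac{3}{\left(e^{y}-1\right)^{2}}\right],
\qquad
x = \frac{\hbar\omega_0}{2k_B T},\;\;
y = \frac{\hbar\omega_0}{3k_B T},
\]

where the A term describes three-phonon (cubic anharmonic) decay and the B term four-phonon (quartic) decay. The ionic conductivity then follows the thermally activated form

\[
\sigma T = \sigma_0 \exp\!\left(-\frac{E_a}{k_B T}\right),
\]

so any anharmonicity-driven reduction of E_a appears directly as a higher σ at a given temperature.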