Materials science
Ferroelectricity
Microstructure
Piezoelectricity
Phase transition
Dielectric
Piezoelectric coefficient
Condensed matter physics
Electric field
Hysteresis
Analytical Chemistry (journal)
Mineralogy
Crystallography
Composite materials
Chemistry
Physics
Quantum mechanics
Optoelectronics
Chromatography
Authors
S. Murakami, Dawei Wang, Ali Mostaed, Amir Khesro, Antonio Feteira, Derek C. Sinclair, Zhongming Fan, Xiaoli Tan, Ian M. Reaney
Abstract
The relationship between the piezoelectric properties and the structure/microstructure of 0.05Bi(Mg2/3Nb1/3)O3-(0.95-x)BaTiO3-xBiFeO3 (BBFT, x = 0.55, 0.60, 0.63, 0.65, 0.70, and 0.75) ceramics has been investigated. Scanning electron microscopy revealed a homogeneous microstructure for x < 0.75, but there was evidence of a core-shell cation distribution for x = 0.75, which could be suppressed in part by quenching from the sintering temperature. X-ray diffraction (XRD) suggested a gradual structural transition from pseudocubic to rhombohedral for 0.63 < x < 0.70, characterized by the coexistence of phases. The temperature dependence of relative permittivity, polarization-electric field hysteresis loops, and bipolar strain-electric field curves revealed that BBFT transformed from relaxor-like to ferroelectric behavior with increasing x, consistent with changes in the phase assemblage and domain structure. The largest strain was 0.41% for x = 0.63 at 10 kV/mm. The largest effective piezoelectric coefficient (d33*) was 544 pm/V for x = 0.63 at 5 kV/mm, but the largest Berlincourt d33 (148 pC/N) was obtained for x = 0.70. We propose that d33* is optimized at the point of crossover from relaxor to ferroelectric behavior, which facilitates a macroscopic field-induced transition to a ferroelectric state, whereas d33 is optimized in the ferroelectric, rhombohedral phase. Unipolar strain was measured as a function of temperature for x = 0.63, with a strain of 0.30% achieved at 175°C, accompanied by significantly lower hysteresis than in room-temperature measurements. The potential of BBFT compositions for use as high-strain actuators is demonstrated by the fabrication of a prototype multilayer, which achieved 3 μm displacement at 150°C.
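For readers unfamiliar with the large-signal figure of merit quoted above: d33* is conventionally computed as the maximum strain divided by the maximum applied field, d33* = S_max / E_max. A minimal Python sketch of that arithmetic, using the values reported in the abstract (the function name and the back-calculated strain at 5 kV/mm are illustrative, not taken from the paper):

```python
def effective_d33(strain_percent: float, field_kv_per_mm: float) -> float:
    """Large-signal effective piezoelectric coefficient d33* = S_max / E_max.

    Takes peak strain in % and peak field in kV/mm; returns d33* in pm/V.
    """
    strain = strain_percent / 100.0         # convert % to dimensionless strain
    field_v_per_m = field_kv_per_mm * 1e6   # 1 kV/mm = 1e6 V/m
    return strain / field_v_per_m * 1e12    # m/V -> pm/V

# x = 0.63 at 10 kV/mm: 0.41% strain gives d33* of about 410 pm/V
print(effective_d33(0.41, 10.0))
# The reported 544 pm/V at 5 kV/mm implies roughly 0.27% strain
# (back-calculated here; the abstract quotes only the d33* value):
print(effective_d33(0.272, 5.0))
```

This arithmetic also shows why d33* can be larger at the lower field: the strain-field response is nonlinear and hysteretic, so the ratio S_max/E_max is not constant across drive fields.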