DRAM
Computer science
Genetic algorithm
Transistor
Voltage
Electronic engineering
Algorithm
Electrical engineering
Machine learning
Computer hardware
Engineering
Authors
Jun Hui Park,Jung Nam Kim,Seonhaeng Lee,Gang-Jun Kim,Namhyun Lee,Rock‐Hyun Baek,Dae Hwan Kim,Changhyun Kim,Myounggon Kang,Yoon Kim
Source
Journal: IEEE Access
[Institute of Electrical and Electronics Engineers]
Date: 2024-01-01
Volume/Pages: 12: 23881-23886
Citations: 1
Identifier
DOI:10.1109/access.2024.3357241
Abstract
Accurate current-voltage (I-V) modeling based on the Berkeley short-channel insulated-gate field-effect transistor model (BSIM) is pivotal for integrated circuit simulation. However, the current BSIM model does not support the buried-channel-array transistor (BCAT), which is the structure of the state-of-the-art commercial dynamic random access memory (DRAM) cell transistor. In this work, we propose an intelligent I-V modeling technique that combines a genetic algorithm (GA) and deep learning (DL). This hybrid technique facilitates both optimization of BSIM parameters and accurate I-V modeling, even for devices not originally supported by BSIM. Additionally, we extended the application of DL to model one of the principal degradation mechanisms of transistors, hot-carrier degradation (HCD). The successful modeling of I-V characteristics and device degradation demonstrates that devices not supported by BSIM can be accurately modeled for integrated circuit simulations.
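The abstract describes using a genetic algorithm to fit compact-model parameters to measured I-V data. The paper's actual GA/DL pipeline and BSIM parameter set are not given here, so the sketch below is only a minimal, hypothetical illustration of the general idea: a GA recovers the parameters of a toy square-law drain-current model (a stand-in for a BSIM evaluation) by minimizing the squared error against a target I-V curve. All parameter names, bounds, and GA settings are illustrative assumptions, not the authors' method.

```python
import random

# Toy drain-current model (square-law saturation): I_D = k * (V_GS - V_TH)^2.
# This stands in for a full BSIM evaluation; k and V_TH are the parameters
# the GA must recover. All values here are illustrative.
def drain_current(vgs, k, vth):
    return k * max(vgs - vth, 0.0) ** 2

TRUE_K, TRUE_VTH = 2e-4, 0.45  # "measured" device parameters to recover
VGS_POINTS = [0.6 + 0.05 * i for i in range(15)]
TARGET = [drain_current(v, TRUE_K, TRUE_VTH) for v in VGS_POINTS]

def fitness(params):
    """Negative sum-of-squared-errors against the target I-V curve."""
    k, vth = params
    err = sum((drain_current(v, k, vth) - t) ** 2
              for v, t in zip(VGS_POINTS, TARGET))
    return -err  # higher is better

def evolve(pop_size=60, generations=200, seed=0):
    rng = random.Random(seed)
    # Random initial population within plausible parameter bounds.
    pop = [(rng.uniform(1e-5, 1e-3), rng.uniform(0.1, 1.0))
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: pop_size // 4]  # elitism: keep the best quarter
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = rng.sample(elite, 2)
            child = [(x + y) / 2 for x, y in zip(a, b)]  # blend crossover
            if rng.random() < 0.5:                       # Gaussian mutation
                i = rng.randrange(2)
                child[i] *= 1 + rng.gauss(0, 0.1)
            children.append(tuple(child))
        pop = elite + children
    return max(pop, key=fitness)

best_k, best_vth = evolve()
print(best_k, best_vth)
```

In the paper's setting, `drain_current` would be replaced by a BSIM simulation (or, per the abstract, a deep-learning surrogate of the device), and the chromosome would hold the BSIM parameter set rather than two toy constants.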