Keywords
MOSFET, Transconductance, Transistor, Semiconductor device modeling, Threshold voltage, Electronic engineering, Sigma, Logic gate, Voltage, Conductance, Artificial neural network, Physics, Current (fluid), Electrical engineering, Optoelectronics, Computer science, CMOS chip, Engineering, Artificial intelligence, Quantum mechanics, Condensed matter physics
Author(s)
Ming-Yen Kao, Hei Kam, Chenming Hu
Source
Journal: IEEE Electron Device Letters
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Date: 2022-04-18
Volume/Issue: 43 (6): 974-977
Citations: 52
Identifier
DOI: 10.1109/led.2022.3168243
Abstract
In this work, we propose using deep learning to improve the accuracy of the partially physics-based conventional MOSFET current-voltage model. The benefits of having some physics-driven features in the model are discussed. Using a portion of the Berkeley Short-channel IGFET Common-Multi-Gate (BSIM-CMG) model, the industry-standard FinFET and GAAFET compact model, as the physics model, together with a 3-layer neural network with 6 neurons per layer, the resultant model accurately predicts the I-V characteristics, output conductance, and transconductance of a TCAD-simulated gate-all-around transistor (GAAFET), with outstanding 3-sigma errors of 1.3%, 4.1%, and 2.9%, respectively. Implications for circuit simulation are also discussed.
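The sketch below illustrates the general idea described in the abstract: a small neural network (3 layers, 6 neurons per layer, as stated above) that corrects the output of a physics-based drain-current model, with transconductance and output conductance obtainable by differentiation. It is not the authors' implementation: BSIM-CMG is not reproduced here, so a simplified square-law expression (physics_id) stands in for the physics core, and the multiplicative-correction coupling, the class name PhysicsAugmentedIV, the synthetic "reference" data, and all hyperparameters are illustrative assumptions.

# Hedged sketch of a physics-augmented I-V model (assumptions noted above).
import torch
import torch.nn as nn

def physics_id(vgs: torch.Tensor, vds: torch.Tensor,
               vth: float = 0.3, k: float = 1e-3) -> torch.Tensor:
    """Simplified square-law drain current used as a physics prior
    (a stand-in for the BSIM-CMG core used in the paper)."""
    vov = torch.clamp(vgs - vth, min=0.0)       # overdrive voltage
    lin = k * (vov * vds - 0.5 * vds ** 2)      # triode region
    sat = 0.5 * k * vov ** 2                    # saturation region
    return torch.where(vds < vov, lin, sat)

class PhysicsAugmentedIV(nn.Module):
    """3-layer MLP, 6 neurons per layer, learning a multiplicative
    correction factor applied to the physics prior."""
    def __init__(self, hidden: int = 6):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(2, hidden), nn.Tanh(),
            nn.Linear(hidden, hidden), nn.Tanh(),
            nn.Linear(hidden, hidden), nn.Tanh(),
            nn.Linear(hidden, 1),
        )

    def forward(self, vgs: torch.Tensor, vds: torch.Tensor) -> torch.Tensor:
        x = torch.stack([vgs, vds], dim=-1)
        # exp(...) keeps the correction positive and close to 1 at initialization
        correction = torch.exp(self.net(x)).squeeze(-1)
        return physics_id(vgs, vds) * correction

if __name__ == "__main__":
    torch.manual_seed(0)
    vgs = torch.rand(2000) * 0.8   # gate-bias samples, 0..0.8 V
    vds = torch.rand(2000) * 0.8   # drain-bias samples, 0..0.8 V
    # Synthetic "reference" currents: the physics prior distorted by a smooth
    # factor, playing the role of the paper's TCAD-simulated GAAFET data.
    id_ref = physics_id(vgs, vds) * (1.0 + 0.2 * torch.sin(4 * vgs) * vds)

    model = PhysicsAugmentedIV()
    opt = torch.optim.Adam(model.parameters(), lr=1e-2)
    for step in range(2000):
        opt.zero_grad()
        loss = nn.functional.mse_loss(model(vgs, vds), id_ref)
        loss.backward()
        opt.step()
    print(f"final MSE: {loss.item():.3e}")

    # Transconductance gm = dId/dVgs and output conductance gds = dId/dVds
    # follow from automatic differentiation of the combined model.
    vgs_t = torch.tensor([0.6], requires_grad=True)
    vds_t = torch.tensor([0.4], requires_grad=True)
    gm, gds = torch.autograd.grad(model(vgs_t, vds_t).sum(), (vgs_t, vds_t))
    print(f"gm = {gm.item():.3e} S, gds = {gds.item():.3e} S")

Because the network only rescales the physics prior, the model inherits qualitatively correct bias dependence even where training data are sparse, which is one plausible reading of the "benefits of having some physics-driven features" mentioned in the abstract.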