Nonlinear system
Function (biology)
Control theory (sociology)
Computer science
Physics
Cell biology
Biology
Artificial intelligence
Quantum mechanics
Control (management)
Source
Journal: Cornell University - arXiv
Date: 2024-03-28
Identifier
DOI: 10.48550/arxiv.2403.19896
Abstract
A simply implemented activation function with even cubic nonlinearity is introduced that increases the accuracy of neural networks without substantial additional computational resources. This is partially enabled through an apparent tradeoff between convergence and accuracy. The activation function generalizes the standard ReLU function by introducing additional degrees of freedom through optimizable parameters that enable the degree of nonlinearity to be adjusted. The associated accuracy enhancement is quantified in the context of the MNIST digit data set through a comparison with standard techniques.
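The abstract does not spell out the exact functional form, so the following is only a minimal sketch of the general idea: ReLU augmented with a learnable cubic term whose coefficient is trained alongside the network weights, so that the degree of nonlinearity is itself optimizable. The class name `CubicReLU` and the parameter `alpha` are names introduced here for illustration, not taken from the paper, and the precise form used in arXiv:2403.19896 may differ.

```python
import torch
import torch.nn as nn

class CubicReLU(nn.Module):
    """Sketch of a ReLU generalized by a learnable cubic term.

    alpha is an optimizable parameter controlling the degree of
    nonlinearity; alpha = 0 recovers the standard ReLU exactly,
    consistent with the claim that the activation generalizes ReLU.
    (Hypothetical form; the paper's exact definition may differ.)
    """

    def __init__(self, alpha_init: float = 0.0):
        super().__init__()
        # Learnable coefficient of the cubic term, updated by the
        # optimizer along with the ordinary network weights.
        self.alpha = nn.Parameter(torch.tensor(alpha_init))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Standard ReLU plus a trainable cubic correction.
        return torch.relu(x) + self.alpha * x.pow(3)

# Drop-in usage in a small MNIST-style classifier (28x28 inputs, 10 classes):
model = nn.Sequential(
    nn.Flatten(),
    nn.Linear(28 * 28, 128),
    CubicReLU(),          # replaces nn.ReLU()
    nn.Linear(128, 10),
)
```

Initializing `alpha` at zero means training starts from the plain-ReLU baseline and the network only adopts extra nonlinearity where the optimizer finds it useful, which is one plausible way the accuracy gain could come at little additional computational cost.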