Keywords
Hyperbolic functions
Sigmoid function
Activation function
Generalization
Tangent
Artificial neural network
Computer science
Mathematics
Applied mathematics
Artificial intelligence
Mathematical analysis
Geometry
Authors
Arvind Kumar, Sartaj Singh Sodhi
Source
Journal: Advances in Intelligent Systems and Computing
Date: 2023-01-01
Pages: 369-392
Identifier
DOI: 10.1007/978-981-99-0550-8_30
Abstract
A number of activation functions (AFs) are used in neural networks. Among them, the hyperbolic tangent (TanH) and log sigmoid are commonly used AFs, and TanH generally performs better than log sigmoid. However, neither log sigmoid nor TanH gives better results across all numbers of hidden neurons or nodes. For this reason, we present six modified TanH AFs obtained from a generalization of the TanH AF. When log sigmoid and TanH do not give satisfactory results, the proposed modified TanH AFs may achieve better ones. In some situations a modified TanH gives the same results as TanH, so the modified AFs can also be used to verify TanH results. All of these AFs are as powerful as log sigmoid and TanH. Like log sigmoid and TanH, each of the modified TanH AFs has four properties: a bounded range, zero-centeredness, continuous differentiability, and a smooth S-shape. Because of these properties, all of the modified TanH AFs can be used to solve nonlinear problems. We evaluate these AFs on seven datasets. First, we check performance on the iris dataset (150 samples) using the SCG, LM, and BR training algorithms. We then test on the cancer (699 samples), glass (214 samples), body fat (252 samples), chemical (498 samples), wine (178 samples), and ovarian (216 samples) datasets using the SCG training algorithm to further confirm the results.
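The abstract does not give the closed form of the six modified TanH AFs, but the four stated properties constrain what such a generalization can look like. The sketch below is a hypothetical illustration, not the authors' method: it assumes a scaled family f(x) = a * tanh(b * x), where the parameters a and b are illustrative choices of my own, and it checks the four properties numerically.

```python
import numpy as np

def generalized_tanh(x, a=1.0, b=1.0):
    """Hypothetical generalization of TanH: f(x) = a * tanh(b * x).

    For a, b > 0 this form satisfies the four properties stated in the
    abstract: bounded range (-a, a), zero-centered output, continuous
    differentiability, and a smooth S-shape. It is only an illustrative
    sketch; the paper's six modified TanH AFs are not reproduced here.
    """
    return a * np.tanh(b * x)

def generalized_tanh_derivative(x, a=1.0, b=1.0):
    """Derivative f'(x) = a * b * (1 - tanh(b * x)^2), as used in backprop."""
    t = np.tanh(b * x)
    return a * b * (1.0 - t * t)

# Quick numerical sanity check of the stated properties for one
# (hypothetical) parameter choice.
x = np.linspace(-10.0, 10.0, 1001)
y = generalized_tanh(x, a=2.0, b=0.5)
assert np.all(np.abs(y) < 2.0)                        # bounded range (-a, a)
assert abs(generalized_tanh(0.0, 2.0, 0.5)) < 1e-12   # zero-centered
```

Standard TanH is recovered at a = b = 1; varying a changes the saturation bound and varying b changes the slope at the origin, which is one plausible axis along which modified TanH variants could differ.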