Keywords
Hyperparameter, Outlier, Computer science, Artificial intelligence, Image (mathematics), Hinge loss, Function, Receiver operating characteristic, Pattern recognition, Binary classification, Machine learning, Class, Binary number, Deep learning, Mathematics, Support vector machine, Arithmetic, Biology, Evolutionary biology
Authors
Jie Du, Yanhong Zhou, Peng Liu, Chi-Man Vong, Tianfu Wang
Source
Journal: IEEE Transactions on Neural Networks and Learning Systems
[Institute of Electrical and Electronics Engineers]
Date: 2023-06-01
Volume/Issue: 34 (6): 3234-3240
Citations: 24
Identifier
DOI: 10.1109/tnnls.2021.3110885
Abstract
Current state-of-the-art class-imbalanced loss functions for deep models require exhaustive hyperparameter tuning to achieve high performance, resulting in low training efficiency and impracticality for nonexpert users. To tackle this issue, a parameter-free loss (PF-loss) function is proposed, which works for both binary and multiclass imbalanced deep learning in image classification tasks. PF-loss provides three advantages: 1) training time is significantly reduced because no hyperparameter tuning is required; 2) it dynamically pays more attention to minority classes (rather than to outliers, as existing loss functions do) with no hyperparameters in the loss function; and 3) higher accuracy can be achieved since it adapts to the changes of the data distribution in each mini-batch, instead of relying on the fixed hyperparameters of existing methods during training, especially when the data are highly skewed. Experimental results on several classical image datasets with different imbalance ratios (IR, up to 200) show that PF-loss reduces training time to as little as 1/148 of that of the compared state-of-the-art losses, while achieving comparable or even higher accuracy in terms of both the G-mean and the area under the receiver operating characteristic (ROC) curve (AUC) metrics, especially when the data are highly skewed.