Computer science
Constraint (computer-aided design)
Curse of dimensionality
Task (project management)
Regularization (linguistics)
Feature selection
Feature (linguistics)
Dimensionality reduction
Artificial intelligence
Machine learning
Vitality
Data mining
Pattern recognition (psychology)
Mathematics
Management
Economics
Geometry
Philosophy
Linguistics
Physics
Quantum mechanics
Authors
Yang Zhang,Jie Shi,Hong Zhao
Identifier
DOI:10.1016/j.eswa.2024.124588
Abstract
Multi-task feature selection (MTFS) has proven effective in mitigating the curse of dimensionality in large-scale classification. Many existing MTFS methods assume that all tasks are learned concurrently in a static environment, ignoring the dynamism of tasks in real-world scenarios. In practice, however, new tasks emerge dynamically, so this static assumption is insufficient. In this paper, we construct a dynamic multi-task feature selection framework that achieves feature reduction for constantly arriving new tasks. First, we relax the traditional label mapping from hard labels to soft labels: unlike the conventional rigid mapping, the resulting flexible loss function replaces the direct mapping strategy with an indirect one. Second, we use an orthogonal regularization term to enforce independence between new and old tasks, ensuring that the features selected for new tasks differ from those of prior tasks. Finally, we integrate the flexible loss and the orthogonal regularization term into the dynamic multi-task feature selection framework. Our method outperforms nine other advanced feature selection methods in both effectiveness and efficiency across six datasets. For example, the accuracy (ACC) of our method is almost 1% higher than that of the next-best method on the large-scale SUN dataset.
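To make the two components described in the abstract concrete, below is a minimal NumPy sketch of how such an objective could be assembled. This is not the paper's implementation: the ε-dragging-style soft-label construction (targets relaxed to Y + B ⊙ E), the ℓ2,1 row-sparsity term, and the trade-off weights `alpha` and `beta` are common choices in the MTFS literature, inferred from the abstract rather than taken from it.

```python
import numpy as np

def dynamic_mtfs_objective(X, Y, W_new, W_old, E, B, alpha, beta):
    """Evaluate a sketch of a dynamic MTFS objective for a newly arrived task.

    X      : (n, d) data matrix of the new task
    Y      : (n, c_new) one-hot hard labels
    W_new  : (d, c_new) projection/selection matrix being learned
    W_old  : (d, c_old) fixed projection matrix from prior tasks
    E      : (n, c_new) nonnegative learnable margins (soft-label slack)
    B      : (n, c_new) dragging directions, +1 for the true class, -1 otherwise
    """
    # Flexible (soft-label) loss: instead of regressing directly onto the
    # rigid hard labels Y, regress onto relaxed targets Y + B * E, turning
    # the direct mapping into an indirect one.
    soft_targets = Y + B * E
    flexible_loss = np.linalg.norm(X @ W_new - soft_targets, "fro") ** 2

    # Orthogonal regularization: penalize overlap between the old-task and
    # new-task projections so the features selected for the new task are
    # pushed away from those already used by prior tasks.
    ortho_reg = np.linalg.norm(W_old.T @ W_new, "fro") ** 2

    # l2,1-norm on W_new induces row sparsity for feature selection
    # (a standard MTFS choice; an assumption here, not stated in the abstract).
    row_sparsity = np.sum(np.sqrt(np.sum(W_new ** 2, axis=1)))

    return flexible_loss + alpha * ortho_reg + beta * row_sparsity


# Toy usage with random data; all sizes are illustrative.
rng = np.random.default_rng(0)
n, d, c_old, c_new = 100, 50, 5, 3
X = rng.standard_normal((n, d))
Y = np.eye(c_new)[rng.integers(0, c_new, n)]   # one-hot hard labels
B = 2 * Y - 1                                  # drag true class up, others down
E = np.zeros((n, c_new))                       # margins (optimized in practice)
W_old = rng.standard_normal((d, c_old))
W_new = rng.standard_normal((d, c_new))
print(dynamic_mtfs_objective(X, Y, W_new, W_old, E, B, alpha=1.0, beta=0.1))
```

In a full method, W_new and E would be optimized alternately (E under a nonnegativity constraint), and features would be ranked by the row norms of the learned W_new; this sketch only evaluates the objective once.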