Keywords
Cluster analysis, Computer science, Theoretical computer science, Clustering coefficient, Embedding, Graph embedding, Graph, Artificial intelligence, Cluster (spacecraft), Pattern recognition (psychology), Data mining, Topology (electrical circuits), Mathematics, Combinatorics, Programming language
Authors
Huiling Xu, Wei Xia, Quanxue Gao, Jungong Han, Xinbo Gao
Identifier
DOI:10.1016/j.neunet.2021.05.008
Abstract
Towards exploring the topological structure of data, numerous graph embedding clustering methods have been developed in recent years; however, none of them takes the cluster-specificity distribution of the node representations into account, resulting in suboptimal clustering performance. Moreover, most existing graph embedding clustering methods perform node representation learning and clustering in two separate steps, which increases the instability of their performance. Additionally, few of them simultaneously take node attribute reconstruction and graph structure reconstruction into account, which degrades the capability of graph learning. In this work, we integrate node representation learning and clustering into a unified framework and propose a new deep graph attention auto-encoder for node clustering that attempts to learn more favorable node representations by leveraging a self-attention mechanism and node attribute reconstruction. Meanwhile, a cluster-specificity distribution constraint, measured by the ℓ1,2-norm, is employed so that node representations within the same cluster share a common distribution in the dimension space, while representations of different clusters have different distributions in the intrinsic dimensions. Extensive experimental results reveal that our proposed method is superior to several state-of-the-art methods in terms of clustering performance.