Autoencoder
Cluster analysis
Dual (grammatical number)
Artificial intelligence
Computer science
Pattern recognition (psychology)
Statistical physics
Mathematics
Deep learning
Physics
Art
Literature
Authors
Lin Yang, Wentao Fan, Nizar Bouguila
Source
Journal: IEEE Transactions on Neural Networks and Learning Systems
[Institute of Electrical and Electronics Engineers]
Date: 2023-09-01
Volume/Issue: 34 (9): 6303-6312
Citations: 12
Identifier
DOI: 10.1109/tnnls.2021.3135460
Abstract
In recent years, clustering methods based on deep generative models have received great attention in various unsupervised applications, due to their capability to learn promising latent embeddings from the original data. This article proposes a novel clustering method based on the variational autoencoder (VAE) with spherical latent embeddings. The merits of our clustering method can be summarized as follows. First, instead of adopting the Gaussian mixture model (GMM) as the prior over the latent space, as in a variety of existing VAE-based deep clustering methods, our method deploys a von Mises-Fisher mixture model prior, leading to spherical latent embeddings that can explicitly control the balance between the capacity of the decoder and the utilization of the latent embedding in a principled way. Second, a dual VAE structure is leveraged to impose a reconstruction constraint on the latent embedding and its noise counterpart, which embeds the input data into a hyperspherical latent space for clustering. Third, an augmented loss function is proposed to enhance the robustness of our model, training it in a self-supervised manner through mutual guidance between the original data and their augmented counterparts. The effectiveness of the proposed deep generative clustering method is validated through comparisons with state-of-the-art deep clustering methods on benchmark datasets. The source code of the proposed model is available at https://github.com/fwt-team/DSVAE.
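As a rough illustration of the spherical-latent idea described in the abstract (not the authors' implementation, which lives in the linked DSVAE repository), one can sketch how encoder outputs are projected onto the unit hypersphere and how cluster responsibilities under a von Mises-Fisher mixture reduce to a softmax over concentration-scaled cosine similarities when the concentration parameter is shared across components. All function and variable names below are illustrative assumptions:

```python
import numpy as np

def normalize(z, eps=1e-8):
    # Project latent embeddings onto the unit hypersphere
    # (the spherical latent space used in place of a Gaussian one).
    return z / (np.linalg.norm(z, axis=-1, keepdims=True) + eps)

def vmf_mixture_responsibilities(z, mus, kappa, weights):
    # For a vMF mixture with a single shared concentration kappa, the
    # normalizing constants cancel across components, so the posterior
    # responsibilities are a softmax over kappa-scaled cosine similarities
    # plus the log mixture weights.
    z = normalize(z)
    mus = normalize(mus)
    logits = kappa * z @ mus.T + np.log(weights)   # shape (N, K)
    logits -= logits.max(axis=1, keepdims=True)    # numerical stability
    resp = np.exp(logits)
    return resp / resp.sum(axis=1, keepdims=True)

rng = np.random.default_rng(0)
z = rng.normal(size=(5, 16))       # toy stand-in for encoder outputs
mus = rng.normal(size=(3, 16))     # toy cluster mean directions
resp = vmf_mixture_responsibilities(z, mus, kappa=10.0,
                                    weights=np.ones(3) / 3)
print(resp.shape)                  # one responsibility row per embedding
```

Larger kappa concentrates each row of `resp` onto the nearest mean direction, which is how the prior's concentration trades off decoder capacity against latent-space utilization in the spherical setting.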