Word2vec
Computer science
Artificial intelligence
Natural language processing
Convolutional neural network
Word (group theory)
Artificial neural network
Natural language
Vector space
Variety (cybernetics)
Layer (electronics)
Computation
Algorithm
Embedding
Mathematics
Chemistry
Geometry
Organic chemistry
Authors
Konda Sai Varshitha,Chinni Guna Kumari,Muppala Hasvitha,Shaik Fiza,K Amarendra,Venubabu Rachapudi
Identifier
DOI:10.1109/iccmc56507.2023.10083608
Abstract
Convolutional neural networks (CNN) are multi-layer neural networks used to learn hierarchical data properties. In recent times, CNN has achieved remarkable advances in the architecture and computation of Natural Language Processing (NLP). The Word2vec technique introduces word embeddings, which are used to improve the performance of a variety of NLP applications. It is a well-known technique for learning word embeddings, which are dense representations of words in a lower-dimensional vector space. Two prominent approaches for learning these embeddings are Continuous Bag-of-Words (CBOW) and Skip-Gram.
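The abstract contrasts the two Word2vec training objectives: Skip-Gram predicts each surrounding context word from the center word, while CBOW predicts the center word from its surrounding context. A minimal sketch of how the two schemes frame training examples from a window over text (the toy corpus and window size are illustrative assumptions, not taken from the paper):

```python
def skipgram_pairs(tokens, window=2):
    """Skip-Gram: one (center, context) pair per context word in the window."""
    pairs = []
    for i, center in enumerate(tokens):
        for j in range(max(0, i - window), min(len(tokens), i + window + 1)):
            if j != i:
                pairs.append((center, tokens[j]))
    return pairs

def cbow_examples(tokens, window=2):
    """CBOW: one (context_words, center) example per position."""
    examples = []
    for i, center in enumerate(tokens):
        context = [tokens[j]
                   for j in range(max(0, i - window), min(len(tokens), i + window + 1))
                   if j != i]
        if context:
            examples.append((context, center))
    return examples

corpus = "natural language processing with word embeddings".split()
print(skipgram_pairs(corpus)[:3])
# → [('natural', 'language'), ('natural', 'processing'), ('language', 'natural')]
print(cbow_examples(corpus)[0])
# → (['language', 'processing'], 'natural')
```

A full Word2vec implementation would feed these examples into a shallow neural network (with techniques such as negative sampling) to learn the dense vectors; the sketch above shows only how the two objectives slice the same text differently.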