Perplexity
Computer science
Recurrent neural network
Multiplicative function
Byte
Character (mathematics)
Word (group theory)
Language model
Sequence (biology)
Artificial intelligence
Deep learning
Artificial neural network
Natural language processing
Mathematics
Genetics
Biology
Operating system
Mathematical analysis
Geometry
Authors
Ben Krause, Iain Murray, Steve Renals, Liang Lu
Source
Venue: Cornell University - arXiv
Date: 2016-01-01
Citations: 95
Identifier
DOI: 10.48550/arxiv.1609.07959
Abstract
We introduce multiplicative LSTM (mLSTM), a recurrent neural network architecture for sequence modelling that combines the long short-term memory (LSTM) and multiplicative recurrent neural network architectures. mLSTM is characterised by its ability to have different recurrent transition functions for each possible input, which we argue makes it more expressive for autoregressive density estimation. We demonstrate empirically that mLSTM outperforms standard LSTM and its deep variants for a range of character-level language modelling tasks. In this version of the paper, we regularise mLSTM to achieve 1.27 bits/char on text8 and 1.24 bits/char on Hutter Prize. We also apply a purely byte-level mLSTM on the WikiText-2 dataset to achieve a character-level entropy of 1.26 bits/char, corresponding to a word-level perplexity of 88.8, which is comparable to word-level LSTMs regularised in similar ways on the same task.
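The abstract's claim that mLSTM has "different recurrent transition functions for each possible input" comes from routing the hidden-to-hidden transition through an input-dependent elementwise product: the current input modulates the recurrent contribution before the usual LSTM gating. Below is a minimal NumPy sketch of one mLSTM step in that spirit; the weight names, dimensions, and omission of bias terms are our own illustrative choices, not the authors' code.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def mlstm_step(x, h_prev, c_prev, W):
    """One multiplicative LSTM step (illustrative sketch, biases omitted).

    x      : (n_in,)  current input (e.g. a one-hot character or byte)
    h_prev : (n_hid,) previous hidden state
    c_prev : (n_hid,) previous cell state
    W      : dict of weight matrices (hypothetical names)
    """
    # Multiplicative intermediate state: the input scales the recurrent
    # contribution elementwise, so each input effectively selects its own
    # hidden-to-hidden transition.
    m = (W["Wmx"] @ x) * (W["Wmh"] @ h_prev)

    # Standard LSTM gating, with m taking the place of h_prev.
    i = sigmoid(W["Wix"] @ x + W["Wim"] @ m)   # input gate
    f = sigmoid(W["Wfx"] @ x + W["Wfm"] @ m)   # forget gate
    o = sigmoid(W["Wox"] @ x + W["Wom"] @ m)   # output gate
    u = np.tanh(W["Wux"] @ x + W["Wum"] @ m)   # candidate update

    c = f * c_prev + i * u
    h = o * np.tanh(c)
    return h, c

# Toy usage: byte-level input (256 symbols); the hidden size is arbitrary.
rng = np.random.default_rng(0)
n_in, n_hid = 256, 512
W = {k: rng.normal(scale=0.01,
                   size=(n_hid, n_in if k.endswith("x") else n_hid))
     for k in ["Wmx", "Wmh", "Wix", "Wim", "Wfx", "Wfm",
               "Wox", "Wom", "Wux", "Wum"]}
h, c = mlstm_step(np.eye(n_in)[65], np.zeros(n_hid), np.zeros(n_hid), W)
```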
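The abstract's correspondence between 1.26 bits/char and a word-level perplexity of 88.8 follows the standard conversion between character-level and word-level metrics. A worked version, where L is the average word length in characters (including the separating space); the value of L below is implied by the two quoted numbers rather than stated in the abstract:

```latex
\mathrm{PPL}_{\mathrm{word}} = 2^{\,\mathrm{bpc} \cdot L},
\qquad
L = \frac{\log_2 \mathrm{PPL}_{\mathrm{word}}}{\mathrm{bpc}}
  = \frac{\log_2 88.8}{1.26}
  \approx \frac{6.47}{1.26}
  \approx 5.14 \ \text{chars/word}
```

This is why a purely byte-level model can be compared directly against word-level LSTMs on WikiText-2: both metrics measure the same total code length for the corpus, just normalised by different units.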