Computer science
Series (stratigraphy)
Time series
Representation (politics)
Hyperparameter
Text generation
Data modeling
Artificial neural network
Data mining
Artificial intelligence
Machine learning
Database
Paleontology
Politics
Political science
Law
Biology
Authors
Yi Li,Yuxuan Gao,Jianyi Cai,Guoxiang Zheng,Hanlin Shi,Xiping Liu
Identifier
DOI:10.1109/ijcnn54540.2023.10191421
Abstract
Data-to-text generation takes structured data as input and produces text that sufficiently describes the data as output. Recently, it has received considerable attention from both research and industry. However, time series, a critical form of data, has been less discussed in this domain. This paper proposes Repr2Seq, a data-to-text generation model for time series. To better capture the structure and core information of a time series, Repr2Seq obtains representation vectors using time series representation learning methods, which are then fed into a neural network-based model to generate text sequences. To demonstrate the effectiveness of Repr2Seq, a dataset consisting of stock price series and corresponding comments is proposed. Experiments show that Repr2Seq achieves significant improvement over standard approaches and leads to satisfactory results. We also conduct experiments to investigate the effect of hyperparameters on the model and observe performance improvements across various settings.
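The abstract describes the Repr2Seq pipeline only at a high level: a time series is first mapped to a representation vector, which then conditions a neural text generator. The sketch below is a minimal illustration of that two-stage idea, assuming a PyTorch-style encoder-decoder; the actual representation-learning method, architecture, and vocabulary used in the paper are not given here, so all module names and sizes are illustrative assumptions.

```python
# Illustrative sketch of a representation-to-sequence pipeline in the spirit
# of Repr2Seq (NOT the paper's implementation). A stand-in encoder produces a
# fixed-size representation of the series, which initializes an autoregressive
# text decoder.
import torch
import torch.nn as nn


class SeriesEncoder(nn.Module):
    """Stand-in for a time series representation learner:
    maps a (batch, length, 1) series to a representation vector."""
    def __init__(self, hidden_dim=128):
        super().__init__()
        self.rnn = nn.GRU(input_size=1, hidden_size=hidden_dim, batch_first=True)

    def forward(self, series):
        _, h = self.rnn(series)      # h: (1, batch, hidden_dim)
        return h.squeeze(0)          # (batch, hidden_dim)


class TextDecoder(nn.Module):
    """Autoregressive decoder conditioned on the series representation."""
    def __init__(self, vocab_size=1000, hidden_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden_dim)
        self.rnn = nn.GRU(hidden_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, tokens, repr_vec):
        # The representation vector serves as the decoder's initial hidden state.
        h0 = repr_vec.unsqueeze(0)   # (1, batch, hidden_dim)
        x = self.embed(tokens)       # (batch, seq_len, hidden_dim)
        out, _ = self.rnn(x, h0)
        return self.out(out)         # (batch, seq_len, vocab_size)


if __name__ == "__main__":
    encoder, decoder = SeriesEncoder(), TextDecoder()
    prices = torch.randn(4, 60, 1)            # e.g. 60 daily stock prices per sample
    tokens = torch.randint(0, 1000, (4, 20))  # target comment tokens (teacher forcing)
    logits = decoder(tokens, encoder(prices))
    print(logits.shape)                       # torch.Size([4, 20, 1000])
```

In this sketch the GRU encoder merely stands in for whatever representation learning method is used; the key design point conveyed by the abstract is that the generator consumes a learned representation of the series rather than the raw values.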