A temporal convolutional recurrent autoencoder based framework for compressing time series data

Keywords: Autoencoder, Recurrent neural network, Computer science, Deep learning, Convolutional neural network, Artificial intelligence, Encoder, Time series, Series (stratigraphy), Data compression, Pattern recognition (psychology), Artificial neural network, Algorithm, Machine learning, Biology, Operating system, Paleontology
Authors
Zhong Zheng, Zijun Zhang
Source
Journal: Applied Soft Computing [Elsevier BV]
Volume 147: 110797. Cited by: 3
Identifier
DOI: 10.1016/j.asoc.2023.110797
Abstract

The sharply growing volume of time series data, driven by recent advances in sensing technology, poses emerging challenges to data transfer speed, storage, and the associated energy consumption. To tackle the overwhelming volume of time series data in transmission and storage, time series compression, which encodes a time series into a smaller representation while enabling faithful restoration with minimal reconstruction error, has attracted significant attention. Numerous methods have been developed, and recent deep learning approaches with minimal assumptions on data characteristics, such as recurrent autoencoders, have proven competitive. Yet capturing long-term dependencies remains a significant challenge in time series compression that calls for further development. In response, this paper proposes a temporal convolutional recurrent autoencoder framework for more effective time series compression. First, two autoencoder modules are developed: a temporal convolutional network encoder with a recurrent neural network decoder (TCN-RNN), and a temporal convolutional network encoder with an attention-assisted recurrent neural network decoder (TCN-ARNN). The TCN-RNN employs a single recurrent neural network decoder to reconstruct the time series in reverse order. In contrast, the TCN-ARNN uses two recurrent neural networks to reconstruct the time series in forward and reverse order in parallel. In addition, a timestep-wise attention network is developed to combine the forward and reverse reconstructions into the final reconstruction with adaptive weights. Finally, a model selection procedure is developed to adaptively choose between the TCN-RNN and the TCN-ARNN based on their reconstruction performance on the validation dataset. Computational experiments on five datasets show that the proposed temporal convolutional recurrent autoencoder outperforms state-of-the-art benchmark models, achieving lower reconstruction errors at the same compression ratio, with an improvement of up to 45.14% in the average mean squared error. The results indicate the promising potential of the proposed temporal convolutional recurrent autoencoder for time series compression in applications involving long time series data.
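To make the architecture described in the abstract concrete, the following is a minimal PyTorch sketch of the two autoencoder variants. It assumes dilated 1-D convolutions for the TCN encoder, GRU decoders, and a sigmoid-gated timestep-wise blend for the attention network; all layer sizes, class names, and the exact fusion form are illustrative assumptions rather than the authors' reported configuration. The paper's model selection step would simply keep whichever variant attains the lower validation reconstruction error.

```python
# Hedged sketch only: layer sizes, GRU cells, and the attention form are
# assumptions for illustration, not the configuration reported in the paper.
import torch
import torch.nn as nn

class TCNEncoder(nn.Module):
    """Stack of dilated 1-D convolutions mapping a (batch, length, features)
    series to a fixed-size code (the compressed representation)."""
    def __init__(self, n_features, hidden=64, code_size=16, levels=3):
        super().__init__()
        layers, ch_in = [], n_features
        for i in range(levels):
            layers += [nn.Conv1d(ch_in, hidden, kernel_size=3,
                                 dilation=2 ** i, padding=2 ** i),
                       nn.ReLU()]
            ch_in = hidden
        self.tcn = nn.Sequential(*layers)
        self.to_code = nn.Linear(hidden, code_size)

    def forward(self, x):                      # x: (batch, length, features)
        h = self.tcn(x.transpose(1, 2))        # (batch, hidden, length)
        return self.to_code(h[:, :, -1])       # code from the last timestep

class RNNDecoder(nn.Module):
    """GRU decoder unrolling the code back into a sequence; with
    reverse=True it emits the reconstruction in reverse time order."""
    def __init__(self, code_size, n_features, hidden=64, reverse=True):
        super().__init__()
        self.reverse = reverse
        self.rnn = nn.GRU(code_size, hidden, batch_first=True)
        self.out = nn.Linear(hidden, n_features)

    def forward(self, code, length):
        inp = code.unsqueeze(1).repeat(1, length, 1)   # feed code at every step
        h, _ = self.rnn(inp)
        y = self.out(h)
        return torch.flip(y, dims=[1]) if self.reverse else y

class TCNRNNAutoencoder(nn.Module):
    """TCN-RNN variant: a single decoder reconstructing in reverse order."""
    def __init__(self, n_features, code_size=16):
        super().__init__()
        self.enc = TCNEncoder(n_features, code_size=code_size)
        self.dec = RNNDecoder(code_size, n_features, reverse=True)

    def forward(self, x):
        return self.dec(self.enc(x), x.size(1))

class TimestepAttention(nn.Module):
    """Timestep-wise attention: a weight in (0, 1) per timestep blends the
    forward and reverse reconstructions (illustrative gating form)."""
    def __init__(self, n_features, hidden=32):
        super().__init__()
        self.score = nn.Sequential(nn.Linear(2 * n_features, hidden), nn.Tanh(),
                                   nn.Linear(hidden, 1), nn.Sigmoid())

    def forward(self, y_fwd, y_rev):
        w = self.score(torch.cat([y_fwd, y_rev], dim=-1))  # (batch, length, 1)
        return w * y_fwd + (1.0 - w) * y_rev

class TCNARNNAutoencoder(nn.Module):
    """TCN-ARNN variant: forward and reverse decoders fused by attention."""
    def __init__(self, n_features, code_size=16):
        super().__init__()
        self.enc = TCNEncoder(n_features, code_size=code_size)
        self.dec_fwd = RNNDecoder(code_size, n_features, reverse=False)
        self.dec_rev = RNNDecoder(code_size, n_features, reverse=True)
        self.attn = TimestepAttention(n_features)

    def forward(self, x):
        code = self.enc(x)
        return self.attn(self.dec_fwd(code, x.size(1)),
                         self.dec_rev(code, x.size(1)))

if __name__ == "__main__":
    x = torch.randn(8, 120, 3)        # 8 series, 120 timesteps, 3 features
    # Model selection (per the abstract): keep the variant with the lower
    # reconstruction error on a held-out validation set.
    for model in (TCNRNNAutoencoder(3), TCNARNNAutoencoder(3)):
        loss = nn.functional.mse_loss(model(x), x)
        print(type(model).__name__, float(loss))
```

With a compression ratio set by the ratio of input length times feature count to the code size, training would minimize the mean squared reconstruction error, matching the evaluation metric quoted in the abstract.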