Dynamic time warping
Differentiable function
Series (stratigraphy)
Algorithm
Computer science
Dynamic programming
Dimension (graph theory)
Quadratic equation
Time series
Regularization (linguistics)
Function (biology)
Time complexity
Euclidean distance
Mathematics
Pattern recognition (psychology)
Artificial intelligence
Machine learning
Combinatorics
Geometry
Paleontology
Mathematical analysis
Biology
Evolutionary biology
Authors
Marco Cuturi, Mathieu Blondel
Source
Journal: Cornell University - arXiv
Date: 2017-01-01
Cited by: 9
Identifier
DOI: 10.48550/arxiv.1703.01541
Abstract
We propose in this paper a differentiable learning loss between time series, building upon the celebrated dynamic time warping (DTW) discrepancy. Unlike the Euclidean distance, DTW can compare time series of variable size and is robust to shifts or dilatations across the time dimension. To compute DTW, one typically solves a minimal-cost alignment problem between two time series using dynamic programming. Our work takes advantage of a smoothed formulation of DTW, called soft-DTW, that computes the soft-minimum of all alignment costs. We show in this paper that soft-DTW is a differentiable loss function, and that both its value and gradient can be computed with quadratic time/space complexity (DTW has quadratic time but linear space complexity). We show that this regularization is particularly well suited to average and cluster time series under the DTW geometry, a task for which our proposal significantly outperforms existing baselines. Next, we propose to tune the parameters of a machine that outputs time series by minimizing its fit with ground-truth labels in a soft-DTW sense.
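To make the abstract's recurrence concrete, the following is a minimal sketch (not the authors' reference implementation) of the soft-DTW forward pass on 1-D sequences: the usual DTW dynamic program, but with the hard minimum over the three predecessor cells replaced by the soft-minimum soft-min_γ(a, b, c) = -γ log(e^{-a/γ} + e^{-b/γ} + e^{-c/γ}). The squared-difference ground cost and the function names are assumptions made for illustration; as γ → 0 the value tends to the classic DTW discrepancy.

```python
import math

def softmin(a, b, c, gamma):
    # Smoothed minimum: -gamma * log(sum(exp(-x / gamma))).
    # Subtracting the hard minimum first keeps the exponentials stable.
    m = min(a, b, c)
    s = sum(math.exp(-(x - m) / gamma) for x in (a, b, c))
    return m - gamma * math.log(s)

def soft_dtw(x, y, gamma=1.0):
    """Soft-DTW discrepancy between 1-D sequences x and y with a
    squared-difference ground cost. Quadratic time and space: the full
    table R is kept, which is what also enables the gradient pass."""
    n, m = len(x), len(y)
    inf = float("inf")
    # R[i][j] holds the soft-min alignment cost of prefixes x[:i], y[:j].
    R = [[inf] * (m + 1) for _ in range(n + 1)]
    R[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = (x[i - 1] - y[j - 1]) ** 2
            # Three admissible moves: match (diagonal), or advance one series.
            R[i][j] = cost + softmin(R[i - 1][j - 1], R[i - 1][j], R[i][j - 1], gamma)
    return R[n][m]
```

Because soft-min is smooth, this value is differentiable in the inputs, which is what lets it serve as a training loss; note that unlike hard DTW it can be slightly negative, since soft-min_γ lies below the true minimum.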