A paper published at Pattern Recognition

Created
December 1, 2022
Tags
PaperComputer Vision
Updated
December 1, 2022

We are pleased to announce that our paper “Deep Attentive Time Warping” has been accepted to Pattern Recognition.

Shinnosuke Matsuo, Xiaomeng Wu, Gantugs Atarsaikhan, Akisato Kimura, Kunio Kashino, Brian Kenji Iwana, Seiichi Uchida, “Deep attentive time warping,” Pattern Recognition, 2022.

Time-series similarity underpins tasks such as classification and verification, yet measuring it well remains difficult for two reasons: intra-class variance (e.g., nonlinear time distortions) and the fact that the best similarity measure differs from task to task. Dynamic Time Warping (DTW) was proposed to deal with time distortion. However, excessive warping can reduce discriminative power, so there is a trade-off between robustness to temporal distortion and discriminative power. In addition, since DTW has no learnable parameters, it cannot adapt its warping to each task.
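To make the baseline concrete, here is a minimal NumPy implementation of classic DTW (not the paper's code): a dynamic-programming table accumulates pointwise costs while allowing one series to stretch or compress against the other.

```python
import numpy as np

def dtw(x, y):
    """Classic dynamic time warping distance between two 1-D sequences."""
    n, m = len(x), len(y)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(x[i - 1] - y[j - 1])
            # Each cell extends the cheapest of: match, insertion, deletion.
            D[i, j] = cost + min(D[i - 1, j - 1], D[i - 1, j], D[i, j - 1])
    return D[n, m]

# b is a nonlinearly time-distorted copy of a (some steps repeated),
# so DTW finds a perfect alignment with zero cost:
a = [0, 1, 2, 3, 2, 1, 0]
b = [0, 0, 1, 2, 3, 2, 1, 1, 0]
print(dtw(a, b))  # → 0.0
```

Note there is nothing to learn here: the cost function and the warping rules are fixed, which is exactly the limitation the paper targets.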


We propose a neural network-based time warping model that learns a warping adapted to the target task. We use an attention model, which realizes an explicit time warping mechanism with greater and more adaptive time invariance. The attention model outputs an attention weight matrix that represents the correspondence between each pair of time steps of the two time series. The model is trained with metric learning and thereby learns the optimal correspondence.
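The idea of an attention matrix as a soft, learnable warping can be sketched as follows. This is an illustrative stand-in, not the paper's architecture: the paper learns features with a deep network, while here we take raw frames as features and build a row-stochastic correspondence matrix with a softmax.

```python
import numpy as np

def attention_warping_matrix(f_x, f_y):
    """Soft correspondence between every pair of time steps.

    f_x: (Tx, d) features of series x; f_y: (Ty, d) features of y.
    In the paper these features come from a trained network; using
    the raw values here is a simplifying assumption.
    Returns a (Tx, Ty) matrix whose rows sum to 1.
    """
    scores = f_x @ f_y.T                        # pairwise similarity
    scores -= scores.max(axis=1, keepdims=True)  # numerical stability
    A = np.exp(scores)
    A /= A.sum(axis=1, keepdims=True)            # softmax over y's steps
    return A

def soft_warp(A, f_y):
    """Warp y onto x's time axis as an attention-weighted average."""
    return A @ f_y
```

A distance such as the norm of `f_x - soft_warp(A, f_y)` can then be driven down for same-class pairs and up for different-class pairs, which is how metric learning shapes the correspondence end to end.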


In addition, we propose a pre-training scheme guided by DTW. This pre-training imposes temporal constraints on our model, which improve accuracy and stabilize the subsequent metric learning. Moreover, the proposed method can be plugged into existing models that use DTW, improving the accuracy of conventional deep neural network models built on it.

To demonstrate the effectiveness of the proposed method, we conducted experiments in two scenarios: stand-alone and plug-in. In the stand-alone scenario, we performed classification on the 85 datasets of the UCR Time Series Archive and 3 Unipen datasets, outperforming DTW on most of them. In the plug-in scenario, we conducted signature verification on MCYT-100 and confirmed that state-of-the-art accuracy was achieved.

A pre-proof version can be found at https://doi.org/10.1016/j.patcog.2022.109201, and the source code is now available on GitHub: https://github.com/matsuo-shinnosuke/deep-attentive-time-warping.