张浩彬 (2023-03-27 15:40):
#paper 10.1109/ijcnn52387.2021.9533426 Self-Supervised Pre-training for Time Series Classification. One of the few transfer-learning papers for time series: it uses DTW-computed distances to build a pretext task, constructing positive and negative samples for self-supervised learning, with a Transformer as the encoder. The novelty is somewhat limited.
Self-Supervised Pre-training for Time Series Classification
Abstract:
Recently, significant progress has been made in time series classification with deep learning. However, deep learning models for time series classification generally suffer from expensive computation and the difficulty of labeling data. In this work, we study self-supervised time series pre-training to overcome these challenges. In contrast to existing work, we focus on universal, unlabeled time series pre-training. To this end, we propose a novel end-to-end neural network architecture based on self-attention, which is suited to capturing long-term dependencies and extracting features from diverse time series. We then propose two self-supervised pretext tasks tailored to time series data: Denoising and Similarity Discrimination based on DTW (Dynamic Time Warping). Finally, we carry out extensive experiments on 85 time series datasets (also known as UCR2015 [2]). Empirical results show that the time series model augmented with our proposed self-supervised pretext tasks achieves state-of-the-art / highly competitive results.
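A minimal sketch of the DTW-based similarity-discrimination pretext task described in the abstract, assuming univariate series stored as NumPy arrays: for each anchor series, the DTW-nearest series is treated as a positive and the DTW-farthest as a negative, yielding triplets for a contrastive-style objective. The names dtw_distance and build_triplets are hypothetical, not the authors' code.

# Sketch only; not the paper's implementation.
import numpy as np

def dtw_distance(a: np.ndarray, b: np.ndarray) -> float:
    """Classic O(len(a) * len(b)) dynamic-programming DTW distance."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                 cost[i, j - 1],      # deletion
                                 cost[i - 1, j - 1])  # match
    return float(cost[n, m])

def build_triplets(series: list[np.ndarray]) -> list[tuple[int, int, int]]:
    """For each anchor, pick the DTW-nearest series as positive and the
    DTW-farthest as negative, returning (anchor, positive, negative)
    index triplets for a similarity-discrimination loss."""
    n = len(series)
    dist = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            dist[i, j] = dist[j, i] = dtw_distance(series[i], series[j])
    triplets = []
    for i in range(n):
        others = [j for j in range(n) if j != i]
        pos = min(others, key=lambda j: dist[i, j])  # most similar under DTW
        neg = max(others, key=lambda j: dist[i, j])  # least similar under DTW
        triplets.append((i, pos, neg))
    return triplets

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    data = [rng.standard_normal(64) for _ in range(8)]
    print(build_triplets(data)[:3])

The Denoising pretext task would be set up analogously: corrupt each series with noise and train the self-attention encoder to reconstruct the clean input, so no labels are needed for either task.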