张浩彬 (2022-08-11 16:10):
#paper 10.48550/arXiv.1901.10738 Unsupervised Scalable Representation Learning for Multivariate Time Series
The paper's key ideas: positive/negative sample construction, a triplet loss, and causal dilated convolutions. Applicability: this unsupervised model handles variable-length series, and works for both short and long sequences.
1. Positive/negative sample construction: for a given series, pick a random length and extract a subseries (the anchor). Within that anchor, randomly sample a subseries as the positive; randomly sample a subseries from a different series as the negative.
2. A modified triplet loss.
3. Exponentially dilated causal convolutions as the feature extractor, replacing the usual RNN/LSTM.
Results show it outperforms existing unsupervised methods and is no worse than supervised ones.
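The sampling scheme and loss in points 1–2 can be sketched in plain Python. This is a minimal illustration, not the authors' implementation: it assumes univariate series stored as Python lists, and the function names (`sample_triplet`, `triplet_loss`) are mine. The loss follows the paper's general form, -log σ(z_ref·z_pos) - Σ_k log σ(-z_ref·z_neg_k), computed on already-encoded representations.

```python
import math
import random

def sample_triplet(series_list, i, rng=random):
    """Time-based sampling (a sketch): anchor is a random subseries of
    series i, the positive is a random subseries of the anchor, and the
    negative is a random subseries of another series."""
    s = series_list[i]
    a_len = rng.randint(2, len(s))              # anchor length (>= 2)
    a_start = rng.randint(0, len(s) - a_len)
    anchor = s[a_start:a_start + a_len]
    p_len = rng.randint(1, a_len)               # positive inside anchor
    p_start = rng.randint(0, a_len - p_len)
    positive = anchor[p_start:p_start + p_len]
    j = rng.choice([k for k in range(len(series_list)) if k != i])
    t = series_list[j]                          # negative from other series
    n_len = rng.randint(1, len(t))
    n_start = rng.randint(0, len(t) - n_len)
    negative = t[n_start:n_start + n_len]
    return anchor, positive, negative

def triplet_loss(z_ref, z_pos, z_negs):
    """Paper-style loss on encoded vectors:
    -log σ(z_ref·z_pos) - Σ_k log σ(-z_ref·z_neg_k)."""
    dot = lambda u, v: sum(a * b for a, b in zip(u, v))
    sigmoid = lambda x: 1.0 / (1.0 + math.exp(-x))
    loss = -math.log(sigmoid(dot(z_ref, z_pos)))
    for z_neg in z_negs:
        loss += -math.log(sigmoid(-dot(z_ref, z_neg)))
    return loss
```

Note the design: because both the anchor and the positive have random lengths, the encoder is forced to produce length-independent representations, which is what makes the method usable on variable-length series.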
Unsupervised Scalable Representation Learning for Multivariate Time Series
Abstract:
Time series constitute a challenging data type for machine learning algorithms, due to their highly variable lengths and sparse labeling in practice. In this paper, we tackle this challenge by proposing an unsupervised method to learn universal embeddings of time series. Unlike previous works, it is scalable with respect to their length and we demonstrate the quality, transferability and practicability of the learned representations with thorough experiments and comparisons. To this end, we combine an encoder based on causal dilated convolutions with a novel triplet loss employing time-based negative sampling, obtaining general-purpose representations for variable length and multivariate time series.
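The causal dilated convolutions mentioned in the abstract can be illustrated with a minimal pure-Python 1-D convolution. This is a hedged sketch of the operation itself, not the paper's encoder; the function name and the convention that `w[0]` weights the current timestep are my own choices. Causality means output `t` only sees inputs at `t, t-d, t-2d, ...` (implicit zero left-padding), and stacking layers with dilations 1, 2, 4, ... grows the receptive field exponentially with depth.

```python
def dilated_causal_conv1d(x, w, dilation):
    """Causal 1-D convolution with dilation d: out[t] depends only on
    x[t], x[t-d], x[t-2d], ... (positions before t=0 are treated as 0)."""
    k = len(w)
    out = []
    for t in range(len(x)):
        acc = 0.0
        for i in range(k):
            idx = t - i * dilation
            if idx >= 0:        # implicit zero padding on the left
                acc += w[i] * x[idx]
        out.append(acc)
    return out
```

For example, with kernel `[1.0, 1.0]` and dilation 2, each output sums the current value and the value two steps back, so no future timestep ever leaks into the representation of the past.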