
    Citation: HUANG Kaiyuan, LUO Na. Improved TimeGAN Based on Attention for Time Series Prediction Method with Few Shot[J]. Journal of East China University of Science and Technology (Natural Science Edition), 2023, 49(6): 890-899. DOI: 10.14135/j.cnki.1006-3080.20220723001

    Improved TimeGAN Based on Attention for Time Series Prediction Method with Few Shot

    • Abstract: The superior performance of deep learning in time series prediction largely benefits from the availability of a large number of training samples. In practice, however, data are often difficult to collect, which prevents accurate modeling. To address the few-shot problem in time series prediction, this paper proposes a data augmentation network (ATCLSTM-TimeGAN) that is based on the attention mechanism and fuses a temporal convolutional network with a long short-term memory network. A Soft-Attention mechanism is incorporated into the Time-series Generative Adversarial Network (TimeGAN) to address its loss of dynamic information. Because the generator's input is a sequence of random vectors, a temporal convolution structure is combined with a Self-Attention mechanism so that the distribution of the generated values in the normalized range corresponds to the distribution of the real data, yielding better data generation performance. To verify the authenticity and usefulness of the generated data, this paper compares the distribution differences of the data produced by different data augmentation methods as well as the predictive performance obtained when the synthetic data are used for prediction; the usefulness of the synthetic data is further verified on a real process dataset. The experimental results show that, compared with other data augmentation methods, ATCLSTM-TimeGAN covers the distribution of the original data better and effectively reduces the prediction error under few-shot conditions.
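    The abstract states that a Soft-Attention mechanism is added to TimeGAN so that dynamic information across time steps is not lost. The snippet below is a minimal, illustrative PyTorch sketch of one common form of soft attention over LSTM hidden states; all class names, layer sizes, and the additive scoring function are assumptions made for illustration and are not taken from the paper.

```python
# Illustrative sketch only: additive soft attention over LSTM hidden states,
# used here as a stand-in for the attention component described in the abstract.
import torch
import torch.nn as nn


class SoftAttention(nn.Module):
    """Scores every time step of a sequence and returns their weighted sum."""

    def __init__(self, hidden_dim: int):
        super().__init__()
        self.score = nn.Sequential(
            nn.Linear(hidden_dim, hidden_dim),
            nn.Tanh(),
            nn.Linear(hidden_dim, 1),
        )

    def forward(self, h: torch.Tensor) -> torch.Tensor:
        # h: (batch, seq_len, hidden_dim) -- all LSTM hidden states
        weights = torch.softmax(self.score(h), dim=1)   # (batch, seq_len, 1)
        context = (weights * h).sum(dim=1)              # (batch, hidden_dim)
        return context


class AttnLSTMEncoder(nn.Module):
    """LSTM encoder whose summary vector is an attention-weighted sum of all
    time steps, so information from earlier steps is not discarded."""

    def __init__(self, input_dim: int, hidden_dim: int):
        super().__init__()
        self.lstm = nn.LSTM(input_dim, hidden_dim, batch_first=True)
        self.attn = SoftAttention(hidden_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h, _ = self.lstm(x)   # (batch, seq_len, hidden_dim)
        return self.attn(h)   # (batch, hidden_dim)


if __name__ == "__main__":
    x = torch.randn(8, 24, 5)  # 8 sequences, 24 time steps, 5 features
    enc = AttnLSTMEncoder(input_dim=5, hidden_dim=32)
    print(enc(x).shape)        # torch.Size([8, 32])
```

    Weighting every hidden state, rather than keeping only the final one, is the usual way soft attention preserves dynamics from early time steps. The paper's generator additionally combines a temporal convolutional structure with Self-Attention, which is not shown in this sketch.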

       
