Multi-scale Chronological Aware Transformer (MCAformer) for Long-term Time Series Forecasting
Time series data exhibit inherent temporal dependencies: the influence between time steps is unidirectional and weakens as the distance between them grows. However, commonly used models such as Transformers and MLPs often struggle to capture this "chronological order" in time series data.
To address this challenge, this paper proposes a novel model called the Multi-scale Chronological Aware Transformer (MCAformer). The key innovations of this model are:
- It incorporates a unidirectional time decay matrix into the traditional attention mechanism to explicitly introduce the temporal ordering of the time series.
- It leverages a multi-head attention approach, where each head uses a different time decay coefficient to construct its unidirectional time decay matrix. This allows the model to capture time series patterns at different time scales.
- It employs a simple gated linear unit to adaptively adjust the weights of different variables, effectively modeling the correlations between them.
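The core mechanism above can be illustrated with a minimal sketch. The function name `decay_masked_attention`, the exponential form `gamma**(i-j)`, and the per-head coefficient values are assumptions for illustration; the paper itself defines the exact construction of the decay matrix.

```python
import numpy as np

def decay_masked_attention(Q, K, V, gamma):
    """Single-head attention with a unidirectional time decay matrix.

    Q, K, V: arrays of shape (T, d). gamma: decay coefficient in (0, 1).
    Hypothetical sketch: D[i, j] = gamma**(i - j) for j <= i, else 0,
    so attention is causal and decays with temporal distance.
    """
    T, d = Q.shape
    scores = Q @ K.T / np.sqrt(d)
    # Unidirectional decay mask: zero above the diagonal, exponential decay below.
    dist = np.arange(T)[:, None] - np.arange(T)[None, :]
    D = np.where(dist >= 0, gamma ** np.maximum(dist, 0), 0.0)
    # Softmax over keys, then reweight by the decay matrix and renormalize.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights * D
    weights = weights / (weights.sum(axis=-1, keepdims=True) + 1e-9)
    return weights @ V

# Multi-scale behavior: each head gets its own decay coefficient, so a head
# with gamma close to 1 retains long-range context while a small gamma
# focuses on recent steps. (Example coefficients are illustrative.)
def multi_scale_attention(Q, K, V, gammas=(0.99, 0.9, 0.5)):
    return np.concatenate(
        [decay_masked_attention(Q, K, V, g) for g in gammas], axis=-1
    )
```

Because the decay matrix is strictly lower-triangular plus the diagonal, each output step depends only on past and present inputs, which is how the sketch enforces the unidirectional ordering described above.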
Experiments on public datasets demonstrate that the proposed MCAformer achieves strong performance, outperforming conventional time series models.