Accurate traffic forecasting is important for enabling intelligent transportation systems in a smart city. This problem is challenging due to the complicated spatial, short-term temporal, and long-term periodical dependencies. Existing approaches have considered these factors in modeling. Most solutions apply CNN, or its extension Graph Convolution Networks (GCN), to model the spatial correlation. However, the convolution operator may not adequately model the non-Euclidean pair-wise correlations. In this paper, we propose a novel Attention-based Periodic-Temporal neural Network (APTN), an end-to-end solution for traffic forecasting that captures spatial, short-term, and long-term periodical dependencies. APTN first uses an encoder attention mechanism to model both the spatial and periodical dependencies. Our model can capture these dependencies more easily because every node attends to all other nodes in the network, which brings a regularization effect to the model and avoids overfitting between nodes. Then, a temporal attention is applied to select relevant encoder hidden states across all time steps. We evaluate our proposed model using real-world traffic datasets and observe consistent improvements over state-of-the-art baselines.
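The abstract describes two attention steps: a spatial step in which every node attends to all other nodes, and a temporal step that weights encoder hidden states across time. The following minimal NumPy sketch illustrates these two generic mechanisms under assumed shapes and a standard scaled dot-product formulation; it is not the authors' APTN implementation, and the function names, projections, and dimensions are illustrative assumptions only.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def spatial_attention(node_feats):
    """Every node attends to all other nodes (node_feats: [N, d])."""
    scores = node_feats @ node_feats.T / np.sqrt(node_feats.shape[1])
    weights = softmax(scores, axis=-1)   # [N, N] pairwise attention weights
    return weights @ node_feats          # per-node context, [N, d]

def temporal_attention(hidden_states, query):
    """Weight encoder hidden states across time steps (hidden_states: [T, h])."""
    scores = hidden_states @ query / np.sqrt(hidden_states.shape[1])
    weights = softmax(scores)            # [T] weights over time steps
    return weights @ hidden_states       # weighted summary, [h]

# Toy usage: 5 road-network nodes, 12 past time steps, feature dimension 8.
rng = np.random.default_rng(0)
nodes = rng.normal(size=(5, 8))
spatial_ctx = spatial_attention(nodes)
encoder_hidden = rng.normal(size=(12, 8))
summary = temporal_attention(encoder_hidden, query=encoder_hidden[-1])
```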


    Title:

    A Spatial–Temporal Attention Approach for Traffic Prediction


    Contributors:
    Shi, Xiaoming (author) / Qi, Heng (author) / Shen, Yanming (author) / Wu, Genze (author) / Yin, Baocai (author)


    Publication date:

    1 August 2021


    Format / Extent:

    2047304 bytes


    Media type:

    Journal article


    Format:

    Electronic resource


    Language:

    English




    STATNet: Spatial-temporal attention in the traffic prediction

    Moghadas, Seyed Mohamad / Gheibi, Amin / Alahi, Alexander | TIBKAT | 2022

    Free access

    Spatial-temporal Traffic Congestion Prediction Based on Attention Mechanism

    Pu, Shilin / Chu, Liang / Zhang, Yuanjian et al. | IEEE | 2021