With detailed sensor and visual data from automobiles, a data-driven model can learn to classify crash-related events during a drive. We propose a neural network model that accepts time-series vehicle sensor data and forward-facing video as input and learns to classify crash-related events and their types. Specifically, we introduce a novel recurrent neural network structure, the denoising gated recurrent unit with decay, to handle time-series automobile sensor data with missing values and noise. Our model detects crash and near-crash events based on a large set of time-series data collected from naturalistic driving behavior, and it further classifies those events as involving pedestrians, a vehicle in front, or a vehicle on either side. The effectiveness of our model is evaluated on more than two thousand 30-second clips of naturalistic driving data. The results show that the model, comprising a sensory encoder with the denoising gated recurrent unit with decay, a visual encoder, and an attention mechanism, outperforms the gated recurrent unit with decay, gated CNN, and other baselines not only in event classification but also in event-type classification.
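The denoising unit described in the abstract builds on the GRU-with-decay family, in which a stale sensor reading is decayed toward an empirical feature mean according to how long ago it was last observed, before entering the recurrent update. The authors' exact denoising formulation is not reproduced here; the following is a minimal PyTorch sketch of the standard GRU-D-style input decay (Che et al., 2018), with the class name GRUDCell and the buffer x_mean chosen for illustration only.

```python
# Minimal sketch of a GRU cell with input decay for missing sensor values,
# in the spirit of GRU-D (Che et al., 2018). This is NOT the paper's exact
# denoising variant; names and the mean-imputation target are assumptions.
import torch
import torch.nn as nn


class GRUDCell(nn.Module):
    """GRU cell that decays stale inputs toward the feature mean.

    x_t:     current observation          (batch, input_size)
    m_t:     binary mask, 1 = observed    (batch, input_size)
    delta_t: time since last observation  (batch, input_size)
    x_last:  last observed value per feature
    """

    def __init__(self, input_size: int, hidden_size: int):
        super().__init__()
        self.gru = nn.GRUCell(input_size, hidden_size)
        # Learned per-feature decay rates.
        self.decay = nn.Linear(input_size, input_size)
        # Empirical feature means used as the decay target (illustrative).
        self.register_buffer("x_mean", torch.zeros(input_size))

    def forward(self, x_t, m_t, delta_t, x_last, h):
        # gamma in (0, 1]: the longer a feature has been missing,
        # the more its last value decays toward the mean.
        gamma = torch.exp(-torch.relu(self.decay(delta_t)))
        x_hat = gamma * x_last + (1.0 - gamma) * self.x_mean
        # Use the real value where observed, the decayed estimate otherwise.
        x_t = m_t * x_t + (1.0 - m_t) * x_hat
        return self.gru(x_t, h)


# Usage sketch: one step over a batch of 4 sequences with 8 sensor channels.
cell = GRUDCell(input_size=8, hidden_size=32)
h = torch.zeros(4, 32)
x = torch.randn(4, 8)
m = (torch.rand(4, 8) > 0.3).float()   # ~30% of readings missing
delta = torch.rand(4, 8)               # time since each channel was observed
x_last = torch.randn(4, 8)
h = cell(x, m, delta, x_last, h)
```

The design choice worth noting is that missingness is handled inside the recurrent cell rather than by offline imputation, so the decay rates are trained jointly with the event classifier.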



    Title :

    Denoising Recurrent Neural Networks for Classifying Crash-Related Events


    Contributors:
    Park, Sungjoon (author) / Seonwoo, Yeon (author) / Kim, Jiseon (author) / Kim, Jooyeon (author) / Oh, Alice (author)


    Publication date :

    2020-07-01


    Size :

    2029811 bytes




    Type of media :

    Article (Journal)


    Type of material :

    Electronic Resource


    Language :

    English





    Applying Few-Shot Learning in Classifying Pedestrian Crash Typing

    Weng, Yanmo / Das, Subasish / Paal, Stephanie German | Transportation Research Record | 2023


    Real-Time Crash Risk Prediction using Long Short-Term Memory Recurrent Neural Network

    Yuan, Jinghui / Abdel-Aty, Mohamed / Gong, Yaobang et al. | Transportation Research Record | 2019