Vulnerable road user safety is of paramount importance as transport moves towards fully autonomous driving. The research question posed by this work is how a computer can be trained to see and perceive a pedestrian’s movement. This work presents a dual network architecture, trained in tandem, that classifies the behaviour of a pedestrian from a single image with no prior context. The most successful network achieved a classification accuracy of 94.3% when labelling images by behaviour, demonstrating a novel data fusion method that combines pedestrian images with human poses. A network with these capabilities is important for the future of transport: it will allow vehicles to correctly perceive the intention of pedestrians crossing the street, and will ultimately lead to fewer pedestrian casualties on our roads.
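The abstract describes two networks trained in tandem that fuse a pedestrian image with a human pose before classifying behaviour. As a minimal, hypothetical sketch of that kind of image/pose fusion (the paper’s actual layer sizes, pose format, fusion point, and behaviour classes are not given in this record, so the class name, the 17-keypoint COCO-style pose, and the four output classes below are all assumptions), a dual-branch PyTorch model might look like:

```python
import torch
import torch.nn as nn

class DualBranchPedestrianNet(nn.Module):
    """Hypothetical sketch of a dual-network fusion classifier: one
    branch encodes the pedestrian image crop, the other encodes 2D
    pose keypoints; their features are fused to classify behaviour.
    Layer sizes are illustrative, not taken from the paper."""

    def __init__(self, num_keypoints=17, num_classes=4):
        super().__init__()
        # Image branch: a small CNN encoder over an RGB crop.
        self.image_branch = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, stride=2, padding=1),
            nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),          # -> (batch, 64)
        )
        # Pose branch: an MLP over flattened (x, y) keypoint coordinates.
        self.pose_branch = nn.Sequential(
            nn.Linear(num_keypoints * 2, 64),
            nn.ReLU(),
            nn.Linear(64, 64),
            nn.ReLU(),
        )
        # Fusion head: concatenate both feature vectors, then classify.
        self.classifier = nn.Sequential(
            nn.Linear(64 + 64, 64),
            nn.ReLU(),
            nn.Linear(64, num_classes),
        )

    def forward(self, image, pose):
        img_feat = self.image_branch(image)    # (batch, 64)
        pose_feat = self.pose_branch(pose)     # (batch, 64)
        fused = torch.cat([img_feat, pose_feat], dim=1)
        return self.classifier(fused)          # behaviour logits

# Single forward pass on dummy data: one 128x64 pedestrian crop plus
# 17 keypoints, classified from a single image with no prior context.
model = DualBranchPedestrianNet()
image = torch.randn(1, 3, 128, 64)
pose = torch.randn(1, 17 * 2)
logits = model(image, pose)
print(logits.shape)  # torch.Size([1, 4])
```

Training both branches jointly against a single classification loss is one plausible reading of "trained in tandem"; concatenation is only one of several possible fusion strategies.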


    Title:

    Classification of a Pedestrian’s Behaviour Using Dual Deep Neural Networks


    Additional title:

    Advances in Intelligent Systems and Computing


    Contributors:

    Conference:

    Science and Information Conference 2020; London, United Kingdom; July 16-17, 2020


    Publication date:

    2020-07-04


    Size:

    17 pages


    Type of media:

    Article/Chapter (Book)


    Type of material:

    Electronic Resource


    Language:

    English




    Similar titles:

    The paradox of pedestrian's risk aversion

    Hacohen, Shlomi / Shoval, Shraga / Shvalb, Nir | Elsevier | 2020


    Pedestrian's needs matter: Examining Manila's walking environment

    Mateo-Babiano, Iderlina | Online Contents | 2016


    Influence of built environment on pedestrian's crossing decision

    Granié, Marie-Axelle / Brenac, Thierry / Montel, Marie-Claude et al. | Elsevier | 2014


    Evaluating Pedestrian's Choice of Some Specific Facilities in Dhaka

    Rahaman, K.R. / Ohmori, N. / Harata, N. et al. | British Library Conference Proceedings | 2006