Traffic signs are designed with legibility in mind, and humans can identify them easily. For computer systems, however, identifying traffic signs remains a challenging problem. Image-processing and machine-learning algorithms continue to improve with the aim of solving it better. However, as the number of traffic signs grows dramatically, labelling a large amount of training data becomes costly. Therefore, building an efficient, high-quality traffic sign recognition (TSR) model from a small amount of labelled traffic sign data in an Internet-of-Things-based (IoT-based) transport system has become an urgent research goal. Here, the authors propose a novel semi-supervised learning approach that combines global and local features for TSR in an IoT-based transport system. In their approach, histograms of oriented gradients (HOG), colour histograms (CH), and edge features (EF) are used to build different feature spaces. Meanwhile, a fusion feature space is constructed on the unlabelled samples to alleviate the differences between the individual feature spaces. Extensive evaluations on a collection of signs from the German Traffic Sign Recognition Benchmark (GTSRB) dataset show that the proposed approach outperforms competing methods and provides a potential solution for practical applications.
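The abstract does not give implementation details, but a minimal sketch of how the three feature views (HOG, CH, EF) and a simple fused view could be extracted from a traffic-sign image might look as follows; the image size, bin counts, and concatenation-based fusion are illustrative assumptions, not the authors' exact method.

    # Illustrative sketch only: HOG, colour histogram, and edge features
    # extracted with scikit-image/NumPy; parameters are assumptions.
    import numpy as np
    from skimage import color, feature, transform

    def hog_features(rgb):
        gray = transform.resize(color.rgb2gray(rgb), (32, 32))
        return feature.hog(gray, orientations=9,
                           pixels_per_cell=(8, 8), cells_per_block=(2, 2))

    def colour_histogram(rgb, bins=16):
        # Per-channel histogram, normalised so images of different sizes compare.
        hists = [np.histogram(rgb[..., c], bins=bins, range=(0, 255))[0]
                 for c in range(3)]
        h = np.concatenate(hists).astype(float)
        return h / (h.sum() + 1e-9)

    def edge_features(rgb):
        gray = transform.resize(color.rgb2gray(rgb), (32, 32))
        return feature.canny(gray).astype(float).ravel()  # binary edge map

    def fusion_features(rgb):
        # Assumed fusion space: simple concatenation of the three views.
        return np.concatenate([hog_features(rgb),
                               colour_histogram(rgb),
                               edge_features(rgb)])

In a semi-supervised setting, such separate views are typically used to train view-specific classifiers on the labelled data, while the fused view on unlabelled samples helps reconcile their predictions; the concatenation above is only one plausible way to form that fusion space.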

