Decision trees are attractive classifiers due to their high execution speed. But trees derived with traditional methods often cannot be grown to arbitrary complexity because of possible loss of generalization accuracy on unseen data. The limitation on complexity usually means suboptimal accuracy on training data. Following the principles of stochastic modeling, we propose a method to construct tree-based classifiers whose capacity can be arbitrarily expanded for increases in accuracy on both training and unseen data. The essence of the method is to build multiple trees in randomly selected subspaces of the feature space. Trees in different subspaces generalize their classification in complementary ways, and their combined classification can be monotonically improved. The validity of the method is demonstrated through experiments on the recognition of handwritten digits.
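
The random subspace construction described in the abstract can be illustrated with a short sketch: each tree is grown on a randomly drawn subset of the feature dimensions, and the forest classifies by combining the per-tree outputs. The code below is an illustrative reconstruction, not the paper's implementation; it assumes scikit-learn's DecisionTreeClassifier as the base learner, and the class and parameter names (RandomSubspaceForest, n_trees, subspace_dim) are hypothetical.

# Illustrative sketch of a random-subspace decision forest (assumed names,
# not taken from the paper): each tree trains on a random subset of the
# features, and the forest averages the trees' class-probability estimates.
import numpy as np
from sklearn.tree import DecisionTreeClassifier


class RandomSubspaceForest:
    def __init__(self, n_trees=50, subspace_dim=None, random_state=0):
        self.n_trees = n_trees            # number of trees in the forest
        self.subspace_dim = subspace_dim  # features per tree (default: half)
        self.rng = np.random.default_rng(random_state)

    def fit(self, X, y):
        n_features = X.shape[1]
        d = self.subspace_dim or max(1, n_features // 2)
        self.classes_ = np.unique(y)
        self.trees_, self.subspaces_ = [], []
        for _ in range(self.n_trees):
            # pick a random subspace of the feature space for this tree
            subspace = self.rng.choice(n_features, size=d, replace=False)
            tree = DecisionTreeClassifier().fit(X[:, subspace], y)
            self.trees_.append(tree)
            self.subspaces_.append(subspace)
        return self

    def predict(self, X):
        # combine the trees by averaging their class-probability estimates
        votes = np.zeros((X.shape[0], len(self.classes_)))
        for tree, subspace in zip(self.trees_, self.subspaces_):
            votes += tree.predict_proba(X[:, subspace])
        return self.classes_[np.argmax(votes, axis=1)]

Averaging the trees' probability estimates is one simple way to combine them; majority voting over the trees' predicted labels would illustrate the same idea.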





    Title:

    Random decision forests


    Contributors:
    Tin Kam Ho (author)


    Publication date:

    01.01.1995


    Format / Extent:

    598273 bytes


    Media type:

    Conference paper


    Format:

    Electronic resource


    Language:

    English



    Random Decision Forests

    Ho, T. K. | British Library Conference Proceedings | 1995


    Efficient Gaussian process classification using random decision forests

    Fröhlich, B. / Rodner, E. / Kemmler, M. et al. | British Library Online Contents | 2011


    Large-scale Gaussian process classification using random decision forests

    Fröhlich, B. / Rodner, E. / Kemmler, M. et al. | British Library Online Contents | 2012


    Predicting incident duration using random forests

    Hamad, Khaled / Al-Ruzouq, Rami / Zeiada, Waleed et al. | Taylor & Francis Verlag | 2020


    Travel Time Reliability Prediction Using Random Forests

    Zhao, Mo / Zhang, Xiaoxiao / Appiah, Justice et al. | Transportation Research Record | 2023