In this paper, we develop a general classification framework called Kullback-Leibler Boosting, or KLBoosting. KLBoosting has the following properties. First, classification is based on the sum of histogram divergences along corresponding global and discriminating linear features. Second, these linear features, called KL features, are iteratively learned by maximizing the projected Kullback-Leibler divergence in a boosting manner. Third, the coefficients used to combine the histogram divergences are learned by minimizing the recognition error each time a new feature is added to the classifier. This contrasts with conventional AdaBoost, where the coefficients are set empirically. Because of these properties, the KLBoosting classifier generalizes very well. Moreover, to apply KLBoosting to high-dimensional image spaces, we propose a data-driven Kullback-Leibler Analysis (KLA) approach to find KL features for image objects (e.g., face patches). Promising experimental results on face detection demonstrate the effectiveness of KLBoosting.
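The core quantity in this framework is the Kullback-Leibler divergence between the class-conditional histograms of the data after projection onto a candidate linear feature. The sketch below illustrates that criterion in NumPy; the function name, the binning scheme, the smoothing constant, and the use of the symmetrized divergence are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def projected_kl_divergence(w, pos, neg, n_bins=32, eps=1e-6):
    """Symmetrized KL divergence between class-conditional histograms of
    samples projected onto a candidate linear feature w.

    pos, neg: (n_samples, n_dims) arrays of positive / negative examples.
    n_bins and eps are illustrative choices, not taken from the paper.
    """
    proj_pos = pos @ w
    proj_neg = neg @ w
    # Shared bin edges so the two histograms are directly comparable.
    edges = np.linspace(
        min(proj_pos.min(), proj_neg.min()),
        max(proj_pos.max(), proj_neg.max()),
        n_bins + 1,
    )
    p, _ = np.histogram(proj_pos, bins=edges)
    q, _ = np.histogram(proj_neg, bins=edges)
    # Smooth the counts and normalize to probability distributions.
    p = (p + eps) / (p + eps).sum()
    q = (q + eps) / (q + eps).sum()
    # Symmetrized KL divergence; feature learning would maximize this
    # score over candidate features w.
    return np.sum(p * np.log(p / q)) + np.sum(q * np.log(q / p))
```

Greedily selecting the feature w that maximizes this score and repeating as features are added would give a boosting-style iteration of the kind the abstract describes.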





    Title:

    Kullback-Leibler boosting


    Contributors:
    Ce Liu (author) / Heung-Yeung Shum (author)


    Publication date:

    01.01.2003


    Format / extent:

    1019028 bytes




    Media type:

    Conference paper


    Format:

    Electronic resource


    Language:

    English



    Similar titles:

    Kullback-Leibler Boosting

    Liu, C. / Shum, H.-Y. / IEEE | British Library Conference Proceedings | 2003


    Kullback-Leibler Approach to Gaussian Mixture Reduction

    Runnalls, A.R. | Online Contents | 2007



    PCA and Kullback-Leibler Divergence-Based FDD Methods

    Chen, Hongtian / Jiang, Bin / Lu, Ningyun et al. | Springer Verlag | 2020


    Ellipticity and Circularity Measuring via Kullback–Leibler Divergence

    Misztal, K. | British Library Online Contents | 2016