The problem of distributed fusion of Gaussian mixture models (GMMs) provided by local multiple model (MM) estimators is addressed in this article. Taking GMMs, instead of combined Gaussian assumed probability density functions (pdfs), as the output of local MM estimators retains more detailed (internal) information about the local estimates, but the accompanying challenge is fusing the GMMs. To this end, a distributed fusion framework for GMMs under the minimum forward Kullback–Leibler (KL) divergence sum criterion is proposed first. Then, because the KL divergence between GMMs is not analytically tractable, two suboptimal distributed fusion algorithms are developed within this framework. Both fusion algorithms have closed forms. Numerical examples verify their effectiveness in terms of both computational efficiency and estimation accuracy.
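The record's abstract references two closed-form fusion algorithms that are not reproduced here. Purely as an illustration of the intractability point the abstract raises, the Python sketch below estimates the forward KL divergence between two GMMs by Monte Carlo sampling, since no closed form exists; all names (gmm_logpdf, gmm_sample, kl_mc) and the example mixtures are hypothetical and not taken from the paper.

    import numpy as np
    from scipy.stats import multivariate_normal

    def gmm_logpdf(x, weights, means, covs):
        # Log-density of a Gaussian mixture at points x of shape (n, d),
        # using a log-sum-exp over components for numerical stability.
        comp = np.stack([np.log(w) + multivariate_normal.logpdf(x, mean=m, cov=c)
                         for w, m, c in zip(weights, means, covs)])
        mx = comp.max(axis=0)
        return mx + np.log(np.exp(comp - mx).sum(axis=0))

    def gmm_sample(n, weights, means, covs, rng):
        # Draw n samples: pick a component by weight, then sample that Gaussian.
        idx = rng.choice(len(weights), size=n, p=weights)
        return np.stack([rng.multivariate_normal(means[i], covs[i]) for i in idx])

    def kl_mc(p, q, n=50_000, seed=0):
        # Monte Carlo estimate of the forward KL divergence D(p || q),
        # i.e. E_p[log p(x) - log q(x)], with samples drawn from p.
        rng = np.random.default_rng(seed)
        x = gmm_sample(n, *p, rng)
        return float(np.mean(gmm_logpdf(x, *p) - gmm_logpdf(x, *q)))

    # Two hypothetical 2-D mixtures standing in for local MM estimator outputs.
    p = ([0.6, 0.4], [np.zeros(2), np.ones(2)], [np.eye(2), 0.5 * np.eye(2)])
    q = ([0.5, 0.5], [np.zeros(2), 2.0 * np.ones(2)], [np.eye(2), np.eye(2)])
    print(kl_mc(p, q))  # positive here; shrinks as q approaches p

Sampling-based estimates like this are expensive and noisy, which is precisely why closed-form (if suboptimal) fusion rules are attractive for online distributed estimation.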


    Title:

    Distributed Fusion of Multiple Model Estimators Using Minimum Forward Kullback–Leibler Divergence Sum


    Contributors:
    Wei, Zheng (author) / Duan, Zhansheng (author) / Hanebeck, Uwe D. (author)


    Publication date:

    2024-06-01


    Format / Extent:

    1079774 bytes


    Media type:

    Journal article


    Format:

    Electronic resource


    Language:

    English



    Similar items:

    PCA and Kullback-Leibler Divergence-Based FDD Methods

    Chen, Hongtian / Jiang, Bin / Lu, Ningyun et al. | Springer Verlag | 2020


    Ellipticity and Circularity Measuring via Kullback–Leibler Divergence

    Misztal, K. | British Library Online Contents | 2016


    Kullback-Leibler boosting

    Liu, Ce / Shum, Heung-Yeung | IEEE | 2003



    Kullback-Leibler Boosting

    Liu, C. / Shum, H.-Y. | British Library Conference Proceedings | 2003