The Davidon-Fletcher-Powell (DFP) optimization algorithm, usually used for nonlinear least squares, is presented and combined with the standard backpropagation (SBP) algorithm, yielding a new fast training algorithm for multilayer perceptrons (MLPs), denoted DFP/SBP. The new algorithm is tested on several function approximation problems. The number of iterations it requires to converge is less than 40% of that required by the SBP algorithm, and it is less sensitive to the choice of initial weights and setup parameters. The DFP/SBP algorithm is much more efficient than either of the other techniques when the network contains no more than a few hundred weights.
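As a rough illustration of the DFP building block referred to in the abstract, the sketch below applies the standard DFP rank-two inverse-Hessian update to a small quadratic problem in Python/NumPy. It is not the authors' DFP/SBP hybrid for MLP training; the fixed step size, the test objective, and the name dfp_minimize are assumptions made only for this example.

    import numpy as np

    def dfp_minimize(grad, x0, alpha=0.5, iters=500, tol=1e-10):
        """Quasi-Newton descent using the DFP inverse-Hessian update (illustrative sketch)."""
        x = np.asarray(x0, dtype=float)
        H = np.eye(x.size)                 # initial inverse-Hessian approximation
        g = grad(x)
        for _ in range(iters):
            d = -H @ g                     # quasi-Newton search direction
            x_new = x + alpha * d          # fixed step in place of a line search (assumption)
            g_new = grad(x_new)
            s, y = x_new - x, g_new - g    # parameter change and gradient change
            x, g = x_new, g_new
            if np.linalg.norm(g) < tol or s @ y <= 1e-12:
                break                      # converged, or curvature condition too weak to update
            # DFP rank-two update of the inverse-Hessian approximation
            H += np.outer(s, s) / (s @ y) - (H @ np.outer(y, y) @ H) / (y @ H @ y)
        return x

    # Example: minimize f(w) = 0.5 w^T A w - b^T w, whose gradient is A w - b
    A = np.array([[3.0, 0.5], [0.5, 1.0]])
    b = np.array([1.0, -2.0])
    w = dfp_minimize(lambda w: A @ w - b, x0=np.zeros(2))
    print(w, np.linalg.solve(A, b))        # the two vectors should nearly coincide

In the paper's setting, the gradient passed to such an update would come from backpropagation through the MLP, which is where the DFP and SBP components meet.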


    Title :

    A new fast multilayer perceptron training procedure based on the Davidon Fletcher Powell algorithm


    Contributors:
    Abid, S. (author) / Fnaiech, F. (author)


    Publication date :

    2003-01-01


    Size :

    276381 bytes





    Type of media :

    Conference paper


    Type of material :

    Electronic Resource


    Language :

    English