Feedforward neural networks (FNNs) are widely used to identify the relation between a given set of input patterns and the desired output patterns. By the universal approximation theorem, a single-hidden-layer FNN is sufficient for the outputs to approximate the corresponding desired outputs arbitrarily closely, so we consider a single-hidden-layer FNN. In practice, an error function is set up to measure the performance of the FNN. As the error function is nonlinear, an iterative process, the learning algorithm, is defined to obtain the optimal choice of the connection weights, which sets up a numerical optimization problem. In this paper, we consider a new error function defined on the hidden layer and propose a new learning algorithm, based on least-squares methods, that converges rapidly. We compare our method with the classic learning algorithms and discuss the convergence of these algorithms.
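Since the abstract only outlines the approach, the following is a minimal sketch, in Python with NumPy, of the general idea it describes: a single-hidden-layer network whose hidden-to-output weights are obtained by a linear least-squares solve against the hidden-layer activations, while the input-to-hidden weights are refined iteratively on the squared error. The function name train_slfn_least_squares, the sigmoid hidden units, and all hyperparameters are illustrative assumptions; this is not the paper's specific hidden-layer error function or algorithm.

    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def train_slfn_least_squares(X, Y, n_hidden=20, n_iters=100, lr=0.1, seed=0):
        # Hypothetical sketch: output weights via least squares on the hidden
        # activations, hidden weights via gradient steps on the squared error.
        rng = np.random.default_rng(seed)
        n_in = X.shape[1]
        W1 = rng.normal(scale=0.5, size=(n_in, n_hidden))   # input-to-hidden weights
        b1 = np.zeros(n_hidden)                              # hidden biases

        for _ in range(n_iters):
            H = sigmoid(X @ W1 + b1)                         # hidden-layer activations
            # Linear least-squares solve for the hidden-to-output weights: H @ W2 ~= Y
            W2, *_ = np.linalg.lstsq(H, Y, rcond=None)
            # Gradient step on the hidden-layer weights for the squared output error
            E = H @ W2 - Y                                   # output error
            dH = (E @ W2.T) * H * (1.0 - H)                  # backprop through the sigmoid
            W1 -= lr * X.T @ dH / X.shape[0]
            b1 -= lr * dH.mean(axis=0)

        return W1, b1, W2

    # Usage: approximate y = sin(x) on [0, 2*pi]
    X = np.linspace(0, 2 * np.pi, 200).reshape(-1, 1)
    Y = np.sin(X)
    W1, b1, W2 = train_slfn_least_squares(X, Y, n_hidden=15)
    pred = sigmoid(X @ W1 + b1) @ W2
    print("RMSE:", np.sqrt(np.mean((pred - Y) ** 2)))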


    Title:
    New Error Function for Single Hidden Layer Feedforward Neural Networks

    Contributors:

    Publication date:
    2008-05-01

    Size:
    194129 bytes

    Type of media:
    Conference paper

    Type of material:
    Electronic Resource

    Language:
    English



    Predicting Telephone Traffic Congestion Using Multi Layer Feedforward Neural Networks

    Markus, E.D. / Okereke, O.U. / Agee, John T. | Trans Tech Publications | 2011


    Two Order Training Algorithm for Multi-layer Feedforward Neural Networks

    Shen, C. / Chang, D. / Shu, Y. | British Library Online Contents | 1997



    Hidden-layer Redundancy Method of RBF Neural Networks

    Xu, L.-q. / Hu, D.-c. | British Library Online Contents | 2001