Electronic neural networks are made to learn faster by use of terminal teacher forcing. This method of supervised learning involves the addition of teacher-forcing functions to the excitations fed as inputs to the output neurons. Initially, the teacher-forcing functions are strong enough to force the outputs to the desired values; subsequently, these functions decay with time. When learning is successfully completed, the terminal teacher forcing vanishes, and the dynamics of the neural network become equivalent to those of a conventional neural network. A simulated neural network with terminal teacher forcing learned to produce a close approximation of a circular trajectory in 400 iterations.
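To make the scheme concrete, the following Python snippet is a minimal sketch of a decaying teacher-forcing term applied to the output neurons of a small discrete-time recurrent network. It is not the original implementation: the network structure (fixed random hidden and feedback weights), the delta-rule update of the output weights, the exponential decay of the forcing gain, and all names (lam0, tau, W_h, W_f, W_o) are assumptions introduced here for illustration.

```python
# Minimal sketch of terminal teacher forcing on a small recurrent network.
# All sizes, constants, and the learning rule below are illustrative
# assumptions, not the scheme used in the original hardware or simulation.
import numpy as np

rng = np.random.default_rng(0)

T = 100                       # time steps per trajectory
n_hidden, n_out = 16, 2
W_h = rng.normal(0, 0.3, (n_hidden, n_hidden))   # hidden recurrence (fixed)
W_f = rng.normal(0, 0.3, (n_hidden, n_out))      # output-to-hidden feedback (fixed)
W_o = rng.normal(0, 0.3, (n_out, n_hidden))      # trained output weights
lr = 0.05
lam0, tau = 1.0, 30.0         # initial forcing strength and decay constant (assumed)

# Desired output: one loop of a unit circle.
phase = np.linspace(0.0, 2.0 * np.pi, T)
target = np.stack([np.cos(phase), np.sin(phase)], axis=1)   # shape (T, 2)

def run_trajectory(train=True):
    """One pass over the trajectory; returns the mean squared output error."""
    global W_o
    h = np.zeros(n_hidden)
    y = target[0].copy()
    sq_err = 0.0
    for k in range(T):
        h = np.tanh(W_h @ h + W_f @ y)       # hidden excitation with output feedback
        y_net = W_o @ h                      # network's own output
        err = target[k] - y_net
        sq_err += float(err @ err)
        # Teacher-forcing term added to the output-neuron excitation:
        # strong at first, decaying with time, and proportional to the error,
        # so it vanishes once the network reproduces the desired trajectory.
        lam = lam0 * np.exp(-k / tau) if train else 0.0
        y = y_net + lam * err
        if train:
            W_o += lr * np.outer(err, h)     # simple delta-rule update (assumed)
    return sq_err / T

for it in range(400):                        # 400 iterations, as in the abstract
    run_trajectory(train=True)
print(f"MSE with forcing removed: {run_trajectory(train=False):.4f}")
```

Because the forcing term is proportional to the output error and its gain decays within each run, it fades automatically as the trained network reproduces the circle, leaving the conventional network dynamics described in the abstract.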



    Title:
    Accelerating Learning By Neural Networks

    Contributors:

    Published in:

    Publication date:
    1992-11-01

    Type of media:
    Miscellaneous

    Type of material:
    No indication

    Language:
    English




    Similar titles:

    Accelerating Simulation Speed and Efficiency Using Graph Neural Networks

    Cook, Kyle / Langel, Chris / Vellakal, Madhu et al. | AIAA | 2025


    Accelerating Inference In Long Short-Term Memory Neural Networks

    Mealey, Thomas / Taha, Tarek M. | IEEE | 2018



    Accelerating Numerical Simulations of Supercritical Fluid Flows using Deep Neural Networks

    Milan, Petro Junior / Wang, Xingjian / Hickey, Jean-Pierre et al. | AIAA | 2020


    Accelerating Numerical Simulations of Supercritical Fluid Flows using Deep Neural Networks

    Milan, Petro Junior / Wang, Xingjian / Hickey, Jean-Pierre et al. | TIBKAT | 2020