Neural networks must capture mathematical relationships in order to learn various tasks. They approximate these relationships implicitly and therefore often fail to generalize well. The recently proposed Neural Arithmetic Logic Unit (NALU) is a neural architecture whose units can explicitly represent mathematical relationships, allowing it to learn operations such as addition, subtraction, or multiplication. Although NALUs have been shown to perform well on various downstream tasks, an in-depth analysis reveals practical shortcomings in their design, such as the inability to multiply or divide negative input values and training-stability issues in deeper networks. We address these issues and propose an improved model architecture. We evaluate our model empirically in various settings, from learning basic arithmetic operations to more complex functions. Our experiments indicate that our model resolves the stability issues and outperforms the original NALU model in terms of arithmetic precision and convergence.
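
To make the architecture and its sign limitation concrete, below is a minimal PyTorch sketch of the original NALU cell as formulated by Trask et al. (cited under the similar items below). Class and variable names are illustrative assumptions, not the authors' released code. The multiplicative path operates in log space on |x|, so the sign of negative operands is discarded, which is the design shortcoming the abstract refers to.

```python
import torch
import torch.nn as nn

class NALUCell(nn.Module):
    """Minimal single-layer NALU following the Trask et al. formulation.

    One shared weight matrix W drives two paths: an additive path
    (add/subtract) and a multiplicative path that works in log space
    (multiply/divide). Taking log(|x| + eps) discards the sign of the
    inputs, so products of negative values cannot be represented.
    """

    def __init__(self, in_dim: int, out_dim: int, eps: float = 1e-7):
        super().__init__()
        self.eps = eps
        # W is composed from two unconstrained matrices so its entries
        # are pushed toward {-1, 0, 1} during training.
        self.W_hat = nn.Parameter(torch.empty(out_dim, in_dim))
        self.M_hat = nn.Parameter(torch.empty(out_dim, in_dim))
        self.G = nn.Parameter(torch.empty(out_dim, in_dim))
        for p in (self.W_hat, self.M_hat, self.G):
            nn.init.xavier_uniform_(p)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        W = torch.tanh(self.W_hat) * torch.sigmoid(self.M_hat)
        a = x @ W.t()                                         # additive path
        m = torch.exp(torch.log(x.abs() + self.eps) @ W.t())  # multiplicative path (log space)
        g = torch.sigmoid(x @ self.G.t())                     # learned gate between paths
        return g * a + (1.0 - g) * m

# The sign loss in a nutshell: even a perfectly trained multiplicative
# path computes exp(log|-2| + log|3|) = 6, so -2 * 3 can never yield -6.
cell = NALUCell(in_dim=2, out_dim=1)
out = cell(torch.tensor([[-2.0, 3.0]]))
```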





    Title:

    iNALU: Improved Neural Arithmetic Logic Unit


    Contributors:

    Schlör, D / Ring, M / Hotho, A

    Publication date:

    2020-01-01



    Type of media:

    Article (Journal)


    Type of material:

    Electronic Resource


    Language:

    English


    Classification:

    DDC: 004 / 629




    Similar items:

    Neural arithmetic logic units

    Trask, A / Hill, F / Reed, SE et al. | BASE | 2019

    ARITHMETIC SYSTEM, AND ARITHMETIC UNIT

    MORISHIMA KENTA | European Patent Office | 2020

    ARITHMETIC UNIT AND ARITHMETIC METHOD

    UJITOKO YUSUKE / HOTTA YUUKI | European Patent Office | 2020

    ARITHMETIC UNIT

    INAGAKI TAKAHIRO / IIDA TAKAAKI | European Patent Office | 2021

    ARITHMETIC UNIT

    INAGAKI TAKAHIRO / MORIMOTO TOMOAKI / KADOSAKI SHIRO | European Patent Office | 2021
