This thesis deals with the creation of a model for abstractive text summarization. For this purpose, recurrent neural networks are used to generate accurate summaries of given texts in correct English and in the right context. We combine a recurrent neural network with hierarchical attention, followed by Long Short-Term Memory (LSTM) networks, to build an auto-encoder structure. This work presents an extensible approach to automatic text summarization that can serve as a basis for further research. Abstractive text summarization is still in its infancy, and many open research directions remain to be explored.
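The abstract only names the building blocks; as a rough, illustrative sketch (not the thesis's actual model), the following PyTorch code wires up an LSTM encoder-decoder with a plain dot-product attention layer standing in for the hierarchical attention described above. The class name, dimensions, and the dot-product scoring are assumptions made purely for this example.

import torch
import torch.nn as nn

class Seq2SeqSummarizer(nn.Module):
    def __init__(self, vocab_size=10000, emb_dim=128, hid_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.encoder = nn.LSTM(emb_dim, hid_dim, batch_first=True)
        self.decoder = nn.LSTM(emb_dim, hid_dim, batch_first=True)
        self.out = nn.Linear(hid_dim * 2, vocab_size)  # decoder state + context -> vocabulary logits

    def forward(self, src, tgt):
        enc_out, state = self.encoder(self.embed(src))           # encode the source text
        dec_out, _ = self.decoder(self.embed(tgt), state)        # decode the (teacher-forced) summary
        scores = torch.einsum('bth,bsh->bts', dec_out, enc_out)  # dot-product attention scores
        weights = torch.softmax(scores, dim=-1)                  # attention weights over source positions
        context = torch.bmm(weights, enc_out)                    # weighted sum of encoder states
        return self.out(torch.cat([dec_out, context], dim=-1))   # logits for each summary position

model = Seq2SeqSummarizer()
src = torch.randint(0, 10000, (2, 40))   # batch of 2 source texts, 40 dummy token ids each
tgt = torch.randint(0, 10000, (2, 12))   # corresponding summary tokens
print(model(src, tgt).shape)             # torch.Size([2, 12, 10000])

In the setting described by the abstract, the single dot-product scoring step would be replaced by a hierarchical (word- and sentence-level) attention mechanism, and the model would be trained to reproduce reference summaries token by token.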


    Title:

    Deep recurrent neural networks for abstractive text summarization


    Contributors:

    Publication date:

    2018-05-08


    Type of media:

    Theses


    Type of material:

    Electronic Resource


    Language:

    English




    A LSTM based Deep Learning Model for Text Summarization

    Vijaya Saraswathi, R / Chunchu, Ravi Varma / Kunchala, Sushma et al. | IEEE | 2022


    Multi-view 3D face reconstruction with deep recurrent neural networks

    Dou, Pengfei / Kakadiaris, Ioannis A. | British Library Online Contents | 2018


    New baseline correction algorithm for text-line recognition with bidirectional recurrent neural networks

    Morillot, O. / Likforman-Sulem, L. / Grosicki, E. | British Library Online Contents | 2013



    Deep Learning Methods for Vessel Trajectory Prediction Based on Recurrent Neural Networks

    Capobianco, Samuele / Millefiori, Leonardo M. / Forti, Nicola et al. | IEEE | 2021