Abstract
There is a growing demand for advanced text generation technology, which has prompted a surge of interest in efficient text generative models based on deep learning. In this paper, a long short-term memory (LSTM) network was selected as the text generator and used as the decoder to produce paraphrased text. BERT (Bidirectional Encoder Representations from Transformers) served as the feature extractor; its embedding layer combines three parts: token, segment, and position embeddings. The BLEU (Bilingual Evaluation Understudy) score was used to measure the similarity between the real text and the generated text. For text paraphrase generation, the BLEU score was 0.98 with the LSTM network, 0.89 with the Transformer network, 0.97 with the Transformer-based sequence-to-sequence model, and 0.96 with the Transformer's general text generative model, so the LSTM network performed best among the compared methods. This work meets modern technical requirements for abstract generation and further improves the readability and completeness of generated abstracts.
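The abstract's comparison rests on BLEU, which scores a generated sentence by its modified n-gram precision against a reference, discounted by a brevity penalty. The following is a minimal self-contained sketch of sentence-level BLEU with uniform n-gram weights; it is an illustration of the metric in general, not the paper's exact evaluation code, and the simple smoothing constant is an assumption.

```python
from collections import Counter
import math

def bleu(reference, hypothesis, max_n=4):
    """Sentence-level BLEU: geometric mean of modified 1..max_n-gram
    precisions, scaled by a brevity penalty. Simplified sketch."""
    ref, hyp = reference.split(), hypothesis.split()
    if not hyp:
        return 0.0
    log_precisions = []
    for n in range(1, max_n + 1):
        ref_ngrams = Counter(tuple(ref[i:i + n]) for i in range(len(ref) - n + 1))
        hyp_ngrams = Counter(tuple(hyp[i:i + n]) for i in range(len(hyp) - n + 1))
        # Clip each hypothesis n-gram count by its count in the reference
        overlap = sum(min(c, ref_ngrams[g]) for g, c in hyp_ngrams.items())
        total = max(sum(hyp_ngrams.values()), 1)
        # Tiny floor avoids log(0) when a higher-order n-gram never matches
        log_precisions.append(math.log(max(overlap, 1e-9) / total))
    # Brevity penalty punishes hypotheses shorter than the reference
    bp = 1.0 if len(hyp) >= len(ref) else math.exp(1 - len(ref) / len(hyp))
    return bp * math.exp(sum(log_precisions) / max_n)
```

An identical hypothesis and reference yield a score of 1.0, and scores fall toward 0 as the overlap in higher-order n-grams shrinks, which is why values like 0.98 versus 0.89 indicate a meaningful gap in fidelity to the reference text.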
