Neural Natural Language Generation: A Survey on Multilinguality, Multimodality, Controllability and Learning

Erkut Erdem
Menekse Kuyu
Semih Yagcioglu
Anette Frank
Letitia Parcalabescu
Barbara Plank
Andrii Babii
Oleksii Turuta
Aykut Erdem
Iacer Calixto
Elena Lloret
Elena-Simona Apostol
Ciprian-Octavian Truică
Branislava Šandrih
Sanda Martinčić-Ipšić
Gábor Berend
Albert Gatt
Gražina Korvel

Abstract

Developing artificial learning systems that can understand and generate natural language has been one of the long-standing goals of artificial intelligence. Recent decades have witnessed impressive progress on both of these problems, giving rise to a new family of approaches. In particular, advances in deep learning over the past few years have led to neural approaches to natural language generation (NLG). These methods combine generative language learning techniques with neural network-based frameworks. With a wide range of applications in natural language processing, neural NLG (NNLG) is a new and fast-growing field of research. In this state-of-the-art report, we provide a comprehensive investigation of recent developments and applications of NNLG from a multidimensional view, covering critical perspectives such as multimodality, multilinguality, controllability and learning strategies. We summarize the fundamental building blocks of NNLG approaches from these aspects and provide detailed reviews of commonly used preprocessing steps and basic neural architectures. The report also covers seminal applications of these NNLG models, including machine translation, description generation, automatic speech recognition, abstractive summarization, text simplification, question answering and generation, and dialogue generation. Finally, we conclude with a thorough discussion of the reviewed frameworks and point out open research directions.
