Generating Natural Language from Logic Expressions with Structural Representation

Xin Wu, Yi Cai, Zetao Lian, Ho Fung Leung, Tao Wang*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

3 Citations (Scopus)

Abstract

Incorporating logical reasoning into deep neural networks (DNNs) is an important challenge in machine learning. In this article, we study the problem of converting logic expressions into natural language. In particular, given a sequential logic expression, the goal is to generate its corresponding natural sentence. Since the information in a logic expression often has a hierarchical structure, a sequence-to-sequence baseline struggles to capture the full dependencies between words and hence often generates incorrect sentences. To alleviate this problem, we propose a model to convert Structural Logic Expressions into Natural Language (SLEtoNL). SLEtoNL converts sequential logic expressions into a structural representation and leverages structural encoders to capture the dependencies between nodes. Quantitative and qualitative analyses demonstrate that our proposed method outperforms a seq2seq model based on the sequential representation, and outperforms strong pretrained language models (e.g., T5, BART, GPT-3) by a large margin (28.6 in BLEU-3) in out-of-distribution evaluation.
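The core preprocessing step the abstract describes is turning a flat token sequence into a tree before encoding. The sketch below illustrates this idea only; it is not the authors' code, and the prefix grammar with parenthesized, comma-separated arguments is an assumption made for illustration. The resulting tree is the kind of structural representation a Tree-LSTM or GCN encoder would consume.

```python
# Illustrative sketch (not the paper's implementation): parse a
# sequential logic expression into a tree, i.e., the "structural
# representation" that SLEtoNL's structural encoders operate on.
# Assumed grammar: prefix operators with parenthesized arguments,
# e.g., "and ( p , or ( q , r ) )".

from dataclasses import dataclass, field
from typing import List, Tuple


@dataclass
class Node:
    label: str                       # operator or atom, e.g. "and", "p"
    children: List["Node"] = field(default_factory=list)


def parse(tokens: List[str], pos: int = 0) -> Tuple[Node, int]:
    """Recursive-descent parse; returns the subtree rooted at pos
    and the index of the first token after that subtree."""
    node = Node(tokens[pos])
    pos += 1
    if pos < len(tokens) and tokens[pos] == "(":
        pos += 1                     # consume "("
        while tokens[pos] != ")":
            child, pos = parse(tokens, pos)
            node.children.append(child)
            if tokens[pos] == ",":
                pos += 1             # consume argument separator
        pos += 1                     # consume ")"
    return node, pos


tree, _ = parse("and ( p , or ( q , r ) )".split())
print(tree.label, [c.label for c in tree.children])  # and ['p', 'or']
```

Once the expression is a tree, parent-child edges make the hierarchical dependencies explicit, which is exactly what a purely sequential encoder has to infer indirectly from token order.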

Original language: English
Pages (from-to): 1499-1510
Number of pages: 12
Journal: IEEE/ACM Transactions on Audio, Speech, and Language Processing
Volume: 31
DOIs
Publication status: Published - 2023

Keywords

  • Graph convolutional networks
  • Logic expressions
  • Natural language generation
  • Tree-LSTM
