
Think Before You Speak: Explicitly Generating Implicit Commonsense Knowledge for Response Generation

Abstract
Implicit knowledge, such as common sense, is key to fluid human conversations. Current neural response generation (RG) models are trained to generate responses directly, omitting unstated implicit knowledge. In this paper, we present Think-Before-Speaking (TBS), a generative approach to first externalize implicit commonsense knowledge (think) and use this knowledge to generate responses (speak). We argue that externalizing implicit knowledge allows more efficient learning, produces more informative responses, and enables more explainable models. We analyze different choices to collect knowledge-aligned dialogues, represent implicit knowledge, and transition between knowledge and dialogues. Empirical results show TBS models outperform end-to-end and knowledge-augmented RG baselines on most automatic metrics and generate more informative, specific, and commonsense-following responses, as evaluated by human annotators. TBS also generates knowledge that makes sense and is relevant to the dialogue around 85% of the time.
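The abstract describes a two-stage pipeline: given a dialogue history, the model first generates a natural-language commonsense statement (think) and then generates the response conditioned on both the history and that statement (speak). The sketch below shows, under stated assumptions, how such a pipeline could be wired with an off-the-shelf causal language model from Hugging Face transformers; the <knowledge> and <response> markers, the GPT-2 backbone, and the generate_continuation helper are illustrative choices rather than the authors' released implementation, and a TBS-style model would need to be fine-tuned on knowledge-aligned dialogues before its outputs are meaningful.

# Minimal sketch of a "think, then speak" decoding loop. Not the paper's code:
# the special markers, prompt layout, and GPT-2 backbone are assumptions.
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")


def generate_continuation(prompt: str, max_new_tokens: int = 40) -> str:
    """Greedily continue `prompt` and return only the newly generated text."""
    inputs = tokenizer(prompt, return_tensors="pt")
    output_ids = model.generate(
        **inputs,
        max_new_tokens=max_new_tokens,
        do_sample=False,
        pad_token_id=tokenizer.eos_token_id,
    )
    new_ids = output_ids[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_ids, skip_special_tokens=True).strip()


dialogue_history = "A: I stayed up all night finishing the report.\nB:"

# Step 1 ("think"): externalize implicit commonsense knowledge for the context.
knowledge = generate_continuation(dialogue_history + " <knowledge>")

# Step 2 ("speak"): condition the response on both the history and the knowledge.
response = generate_continuation(
    dialogue_history + " <knowledge> " + knowledge + " <response>"
)

print("knowledge:", knowledge)
print("response:", response)

One appeal of representing the knowledge as ordinary text, as in this sketch, is that a single decoder can handle both the think and speak stages.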

Publisher = "Association for Computational Linguistics",Ībstract = "Implicit knowledge, such as common sense, is key to fluid human conversations.
Anthology ID: 2022.acl-long.88
Volume: Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month: May
Year: 2022
Address: Dublin, Ireland
Venue: ACL
Publisher: Association for Computational Linguistics
Pages: 1237–1252
DOI: 10.18653/v1/2022.acl-long.88
Bibkey: zhou-etal-2022-think
Software: 2022.acl-long.88.software.zip
Data: ConceptNet

Cite (ACL): Pei Zhou, Karthik Gopalakrishnan, Behnam Hedayatnia, Seokhwan Kim, Jay Pujara, Xiang Ren, Yang Liu, and Dilek Hakkani-Tur. 2022. Think Before You Speak: Explicitly Generating Implicit Commonsense Knowledge for Response Generation. In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 1237–1252, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal): Think Before You Speak: Explicitly Generating Implicit Commonsense Knowledge for Response Generation (Zhou et al., ACL 2022)


