Abstract
Incorporating external graph knowledge into neural chatbot models has been
proven effective for enhancing dialogue generation. However, in conventional
graph neural networks (GNNs), message passing on a graph is independent of the text, so the hidden space of graph representations differs from that of the text. This training regime of existing models therefore leads to a semantic
gap between graph knowledge and text. In this study, we propose a novel
framework for knowledge graph enhanced dialogue generation. We dynamically
construct a multi-hop knowledge graph with pseudo nodes to involve the language
model in feature aggregation within the graph at all steps. To avoid the
semantic biases caused by learning on vanilla subgraphs, the proposed framework
applies hierarchical graph attention to aggregate graph features on pseudo
nodes and then obtain a global representation. As a result, the framework can better
utilise the heterogeneous features from both the post and external graph
knowledge. Extensive experiments demonstrate that our framework outperforms
state-of-the-art (SOTA) baselines on dialogue generation. Further analysis also
shows that our representation learning framework can bridge the semantic gap by fusing the representations of text and graph knowledge. Moreover, the
language model also learns to better select knowledge triples for a more informative response by exploiting subgraph patterns within our feature
aggregation process. Our code and resources are available at
https://github.com/tangg555/SaBART.