Improving End-to-End Memory Networks with Unified Weight Tying
Published:
Fei Liu, Trevor Cohn and Timothy Baldwin (2017) Improving End-to-End Memory Networks with Unified Weight Tying. In Proceedings of the Australasian Language Technology Association Workshop 2017, Brisbane, Australia, pp. 16–24. Best Poster Award
@inproceedings{Liu+:2017,
  author    = {Liu, Fei and Cohn, Trevor and Baldwin, Timothy},
  title     = {Improving End-to-End Memory Networks with Unified Weight Tying},
  booktitle = {Proceedings of the Australasian Language Technology Association Workshop 2017},
  year      = {2017},
  address   = {Brisbane, Australia},
  pages     = {16--24}
}
Abstract
Answering questions while reasoning over multiple supporting facts has long been a goal of artificial intelligence. Recently, remarkable advances have been made, focusing on reasoning over natural language-based stories. In particular, end-to-end memory networks (N2Ns) have achieved state-of-the-art results over such tasks. However, N2Ns are limited by the necessity to choose between two weight tying schemes, neither of which performs consistently well over all tasks. We propose a unified model that generalises weight tying and, in doing so, makes the model more expressive. The proposed model achieves uniformly high performance, improving on the best results for memory network-based models on the bAbI dataset, and achieving competitive results on Dialog bAbI.
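For context on the two weight tying schemes the abstract contrasts: in the original end-to-end memory network formulation (Sukhbaatar et al., 2015), "adjacent" tying sets each hop's input embedding equal to the previous hop's output embedding (A^{k+1} = C^k), while "layer-wise" (RNN-like) tying shares a single A and C across all hops and inserts a linear map H between hops. A minimal NumPy sketch of one forward pass under each scheme follows; all dimensions, variable names, and the random bag-of-words story are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
vocab, dim, hops, sents = 20, 8, 3, 5

def attend(u, story, A, C):
    # One memory read: embed memories as keys (A) and values (C),
    # attend with the controller state u, return the read vector o.
    m = story @ A                       # memory keys   (sents, dim)
    c = story @ C                       # memory values (sents, dim)
    s = m @ u
    p = np.exp(s - s.max())             # stable softmax over memories
    p /= p.sum()
    return p @ c

story = rng.normal(size=(sents, vocab))  # toy bag-of-words sentence vectors
u0 = rng.normal(size=dim)                # initial controller (question) state

# Adjacent tying: A^{k+1} = C^k, so K hops need only K+1 distinct matrices.
mats = [rng.normal(size=(vocab, dim)) for _ in range(hops + 1)]
u_adj = u0
for k in range(hops):
    u_adj = u_adj + attend(u_adj, story, mats[k], mats[k + 1])

# Layer-wise (RNN-like) tying: one shared A and C, with a linear map H
# applied to the controller state between hops.
A = rng.normal(size=(vocab, dim))
C = rng.normal(size=(vocab, dim))
H = rng.normal(size=(dim, dim))
u_lw = u0
for _ in range(hops):
    u_lw = H @ u_lw + attend(u_lw, story, A, C)
```

The sketch makes the trade-off concrete: adjacent tying grows the parameter count with the number of hops, whereas layer-wise tying keeps it constant but constrains every hop to reuse the same embeddings, which is why neither choice dominates across all tasks.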