Experimental Research on Encoder-Decoder Architectures with Attention for Chatbots
Abstract
Chatbots aim to automatically hold a conversation between a human and a computer. While there is a long track record of research on rule-based and retrieval-based approaches, generation-based approaches are emerging as a promising alternative, since they can respond at inference time to queries that were not seen during development or training. In this paper, we offer an experimental view of how recent advances in related areas such as machine translation can be adopted for chatbots. In particular, we compare how alternative encoder-decoder deep learning architectures perform in the context of chatbots. Our research concludes that a fully attention-based architecture is able to outperform the recurrent neural network baseline system.
Keywords
Chatbot, encoder-decoder, attention mechanisms
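To make the attention mechanisms referenced above concrete, the following is a minimal, illustrative sketch of scaled dot-product attention, the core operation of fully attention-based encoder-decoder models. It is not the paper's implementation; all names and the pure-Python style are assumptions for readability.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def attention(query, keys, values):
    """Scaled dot-product attention for a single query (illustrative sketch).

    query:  list of floats, length d_k
    keys:   list of key vectors, each length d_k
    values: list of value vectors, one per key
    Returns the context vector and the attention weights.
    """
    d_k = len(query)
    # Similarity of the query to each key, scaled by sqrt(d_k)
    scores = [sum(q * k for q, k in zip(query, key)) / math.sqrt(d_k)
              for key in keys]
    weights = softmax(scores)
    # Context vector: attention-weighted sum of the value vectors
    d_v = len(values[0])
    context = [sum(w * v[i] for w, v in zip(weights, values))
               for i in range(d_v)]
    return context, weights

# Toy example: the query is most similar to the first key,
# so the first value receives the larger attention weight.
context, weights = attention([1.0, 0.0],
                             [[1.0, 0.0], [0.0, 1.0]],
                             [[1.0, 0.0], [0.0, 1.0]])
```

In a full encoder-decoder model this operation is applied at every decoder step, with decoder states as queries and encoder states as keys and values, which is what lets the model attend to unseen input tokens at inference time.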