Multilingual question-answering system applied to conversational agents
In EGC 2020, vol. RNTI-E-36, pp. 333-340
Abstract
Language models such as BERT are an effective way to solve complex NLP tasks like Question Answering. However, datasets are currently mostly in English, which makes it difficult to assess progress in other languages. Fortunately, BERT has recently been pre-trained on over a hundred languages and shows a good ability for zero-shot transfer from one language to another. In this paper, we show that multilingual BERT, trained to solve the question-answering task in English, is then able to generalize to French and Japanese. We also introduce our use case: Kate, a human resources chatbot that answers questions from users in multiple languages based on intranet pages.