Delayed interaction in the Transformer encoder for efficient open-domain question answering
In EGC 2021, vol. RNTI-E-37, pp. 133-144
Abstract
Transformer-based language models such as BERT suffer from high computational complexity in open-domain question answering. In this paper, we modify their architecture to manage computations more efficiently. The resulting variants are competitive with the original models and yield a significant speedup on both GPU and CPU.
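The core idea of delayed interaction can be sketched as follows: the lower encoder layers process the question and each passage independently (so passage representations can be precomputed and cached offline), and only the upper layers attend jointly over the concatenated sequence. The sketch below is illustrative only, assuming a toy single-head attention with a shared projection matrix; the function names, the `split` parameter, and the weight layout are hypothetical and do not correspond to the authors' actual implementation.

```python
import numpy as np

def attention(x, w):
    # Toy single-head self-attention: one shared projection w
    # plays the role of the query/key/value matrices.
    q, k, v = x @ w, x @ w, x @ w
    scores = q @ k.T / np.sqrt(x.shape[-1])
    p = np.exp(scores - scores.max(axis=-1, keepdims=True))
    p /= p.sum(axis=-1, keepdims=True)
    return p @ v

def delayed_interaction_encoder(question, passage, weights, split):
    # Lower layers: question and passage are encoded independently,
    # so the passage side can be precomputed once per document.
    for w in weights[:split]:
        question = attention(question, w)
        passage = attention(passage, w)
    # Upper layers: joint encoding over the concatenated sequence,
    # where question tokens finally interact with passage tokens.
    x = np.concatenate([question, passage], axis=0)
    for w in weights[split:]:
        x = attention(x, w)
    return x
```

Because the independent lower layers dominate the cost over a large passage collection, caching their outputs is what produces the speedup the abstract reports.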