Transformer networks have recently been applied successfully to a wide range of NLP tasks. Surprisingly, they have never been employed for query suggestion, although their sequence-to-sequence architecture makes them particularly appealing for …
Neural architectures based on self-attention, such as Transformers, have recently attracted interest from the research community and have obtained significant improvements over the state of the art in several tasks. We explore how Transformers can be adapted …
Unsupervised relation extraction aims to extract relations between entities in text. Previous unsupervised approaches are either generative or discriminative. In a supervised setting, discriminative approaches, such as deep neural network …
Zero-Shot Learning (ZSL) aims at classifying unlabeled objects by leveraging auxiliary knowledge, such as semantic representations. A limitation of previous approaches is that only intrinsic proper...
ABSTRACT. Neural architectures based on attention, such as the Transformer, have recently attracted interest from the scientific community and have enabled significant progress over the state of the art in several …