Using BERT and BART for Query Suggestion

Transformer networks have recently been applied successfully to a very wide range of NLP tasks. Surprisingly, they have never been employed for query suggestion, although their sequence-to-sequence architecture makes them particularly appealing for …

Self-Attention Architectures for Answer-Agnostic Neural Question Generation

Neural architectures based on self-attention, such as Transformers, have recently attracted interest from the research community and obtained significant improvements over the state of the art in several tasks. We explore how Transformers can be adapted …

Unsupervised Information Extraction: Regularizing Discriminative Approaches with Relation Distribution Losses

Unsupervised relation extraction aims at extracting relations between entities in text. Previous unsupervised approaches are either generative or discriminative. In a supervised setting, discriminative approaches, such as deep neural network …

Context-Aware Zero-Shot Learning for Object Recognition

Zero-Shot Learning (ZSL) aims at classifying unlabeled objects by leveraging auxiliary knowledge, such as semantic representations. A limitation of previous approaches is that only intrinsic properties …

A Multimodal Model for Learning Sentence Representations that Preserves Visual Semantics

Answers Unite! Unsupervised Metrics for Reinforced Summarization Models

An Architecture Based on Attention Mechanisms: The Case of Neural Question Generation

ABSTRACT. Neural architectures based on attention, such as the Transformer, have recently attracted interest from the scientific community and have achieved significant progress over the state of the art in several …

Incorporating Visual Semantics into Sentence Representations within a Grounded Space

Multimodal Word Representation Learning Using Visual Context

Gaussian Representations for Collaborative Filtering