Using BERT and BART for Query Suggestion

Abstract

Transformer networks have recently been applied successfully to a wide range of NLP tasks. Surprisingly, they have never been employed for query suggestion, although their sequence-to-sequence architecture makes them particularly appealing for this task. Query suggestion requires modeling user behavior during complex search sessions in order to output useful next queries that help users complete their intent. We show that pre-trained transformer networks achieve strong performance for query suggestion on a large corpus of search logs, are more robust to noise, and handle complex queries better.
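
As a rough illustration of the sequence-to-sequence setup the abstract describes, the sketch below uses the Hugging Face transformers library to condition a pre-trained BART model on the concatenated previous queries of a session and to decode a candidate next query. The checkpoint name, session separator, and decoding parameters are illustrative assumptions, not the paper's exact configuration, and in practice the model would first be fine-tuned on (session prefix, next query) pairs mined from search logs.

```python
# Minimal sketch (not the paper's exact setup): a pre-trained BART model
# used as a sequence-to-sequence query suggester. The checkpoint, session
# separator, and decoding settings are illustrative assumptions.
from transformers import BartForConditionalGeneration, BartTokenizer

tokenizer = BartTokenizer.from_pretrained("facebook/bart-base")
model = BartForConditionalGeneration.from_pretrained("facebook/bart-base")

# A search session: previous queries joined by an assumed separator token.
session = "cheap flights paris </s> paris hotels near louvre"
inputs = tokenizer(session, return_tensors="pt", truncation=True)

# Decode a candidate next query with beam search; fine-tuning on search-log
# sessions would be required before the suggestions are actually useful.
output_ids = model.generate(
    inputs["input_ids"],
    num_beams=4,
    max_length=20,
    early_stopping=True,
)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```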

Publication
Joint Conference of the Information Retrieval Communities in Europe