DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter, Victor Sanh, et al., arXiv, 2019.