Recent Language Models > RoBERTa: A Robustly Optimized BERT Pretraining Approach, Yinhan Liu, et al., arXiv preprint, 2019.