TinyBERT: Distilling BERT for Natural Language Understanding, Xiaoqi Jiao, et al., ICLR, 2020.