
BERT

BERT, which stands for Bidirectional Encoder Representations from Transformers, is a language representation model. Unlike recent language representation models, BERT is designed to pre-train deep bidirectional representations by jointly conditioning on both left and right context in all layers.
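The bidirectional conditioning above comes from BERT's masked-language-model pre-training objective: some input tokens are hidden, and the model must predict them from the surrounding context on both sides. The following is a minimal, illustrative sketch of that masking step in plain Python (not code from the repository; the function name and probability are illustrative, though 15% matches the rate described in the BERT paper):

```python
import random

def mask_tokens(tokens, mask_prob=0.15, mask_token="[MASK]", seed=0):
    """Randomly replace tokens with a mask symbol, as in masked-LM pre-training.

    Because a masked position can fall anywhere in the sequence, predicting it
    requires using both the left and the right context (bidirectional
    conditioning), rather than only the tokens before it.
    """
    rng = random.Random(seed)
    masked, targets = [], []
    for tok in tokens:
        if rng.random() < mask_prob:
            masked.append(mask_token)
            targets.append(tok)   # the model is trained to recover this token
        else:
            masked.append(tok)
            targets.append(None)  # no prediction loss at unmasked positions
    return masked, targets

sentence = "the quick brown fox jumps over the lazy dog".split()
masked, targets = mask_tokens(sentence)
```

In real pre-training the masking is applied over subword pieces from a tokenizer, and a fraction of selected positions are kept or replaced with random tokens instead of `[MASK]`; this sketch only shows the core idea of hiding positions and recovering them from full context.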

GitHub package (39.9k stars, archived)