# PyTorch Pretrained Bert
If you are reading this, congratulations: you passed the second test but failed the first. Please clone the [huggingface/transformers](https://github.com/huggingface/transformers) repo instead. Enjoy!
This repository contains an op-for-op PyTorch reimplementation of [Google's TensorFlow repository for the BERT model](https://github.com/google-research/bert) that was released together with the paper [BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding](https://arxiv.org/abs/1810.04805) by Jacob Devlin, Ming-Wei Chang, Kenton Lee and Kristina Toutanova.
This implementation comes with [Google's pre-trained models](https://github.com/google-research/bert), examples, notebooks, and a command-line interface to load any pre-trained TensorFlow checkpoint for BERT.
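As a rough illustration of the workflow this package supports, here is a minimal sketch that loads a pre-trained BERT and extracts hidden states. It assumes the historical `pytorch_pretrained_bert` API, in which `BertModel`'s forward pass returns the encoded layers of every encoder block plus a pooled `[CLS]` output; in the current `transformers` library the equivalent classes are imported from `transformers` instead.

```python
import torch
from pytorch_pretrained_bert import BertTokenizer, BertModel

# Load the pre-trained tokenizer and model weights
# ('bert-base-uncased' is downloaded and cached on first use).
tokenizer = BertTokenizer.from_pretrained('bert-base-uncased')
model = BertModel.from_pretrained('bert-base-uncased')
model.eval()

# Tokenize a sentence, including BERT's special boundary tokens.
text = "[CLS] Who was Jim Henson ? [SEP] Jim Henson was a puppeteer [SEP]"
tokenized_text = tokenizer.tokenize(text)
indexed_tokens = tokenizer.convert_tokens_to_ids(tokenized_text)
tokens_tensor = torch.tensor([indexed_tokens])

# Run BERT; this version of the API returns the hidden states of all
# encoder layers along with a pooled representation of the [CLS] token.
with torch.no_grad():
    encoded_layers, pooled_output = model(tokens_tensor)

# 12 layers for bert-base; each of shape [1, sequence_length, 768].
print(len(encoded_layers), encoded_layers[-1].shape)
```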