# BERT-uncased-2L-768H

This is a converted PyTorch checkpoint for an uncased BERT model with 2 layers and a hidden size of 768, trained from scratch. See [Google BERT](https://github.com/google-research/bert) for details.
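
Below is a minimal loading sketch using the Hugging Face `transformers` library; the model path is a placeholder (an assumption, not part of this repo's documentation) and should be replaced with this checkpoint's actual model id or local directory.

```python
from transformers import BertTokenizer, BertModel

# Placeholder path (assumption): point this at the checkpoint's actual
# Hugging Face model id or a local directory containing the converted weights.
model_name = "path/to/bert-uncased-2L-768H"

tokenizer = BertTokenizer.from_pretrained(model_name)
model = BertModel.from_pretrained(model_name)

# Encode a sample sentence and run a forward pass.
inputs = tokenizer("Hello, world!", return_tensors="pt")
outputs = model(**inputs)

# Final-layer hidden states: shape (batch_size, sequence_length, 768).
print(outputs.last_hidden_state.shape)
```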