If you get a "missing tokenizer" error, also run these lines when loading the model:
`from transformers import BertTokenizer`

`tokenizer = BertTokenizer.from_pretrained('bert-base-uncased')`