This is a dummy model that can be used for testing. It was created as follows in a Python shell:

```python
import transformers

# Minimal DistilBERT configuration: tiny dimensions, two sentiment labels.
config = transformers.DistilBertConfig(
    vocab_size=5,
    n_layers=1,
    n_heads=1,
    dim=1,
    hidden_dim=4 * 1,
    num_labels=2,
    id2label={0: "negative", 1: "positive"},
    label2id={"negative": 0, "positive": 1},
)
model = transformers.DistilBertForSequenceClassification(config)
tokenizer = transformers.DistilBertTokenizer("/tmp/empty_vocab.txt", model_max_length=512)

# Write config.json, the model weights, and the tokenizer files to the current directory.
config.save_pretrained(".")
model.save_pretrained(".")
tokenizer.save_pretrained(".")
```
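
A minimal smoke-test sketch for the saved files is shown below. It assumes the snippet above was run in the current working directory; token ids are constructed by hand (any ids below `vocab_size=5` work) rather than going through the tokenizer, since the vocab file here is only a placeholder.

```python
import torch
import transformers

# Load the dummy classifier saved above from the current directory (assumption:
# the files were written to ".").
model = transformers.DistilBertForSequenceClassification.from_pretrained(".")

# Hand-built input: any ids in [0, vocab_size) are valid; vocab_size is 5 here.
input_ids = torch.tensor([[0, 1, 2, 3, 4]])

with torch.no_grad():
    logits = model(input_ids=input_ids).logits

print(logits.shape)  # torch.Size([1, 2]) -> one sequence, two labels
```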