vocab: add vocab and tokenizer configuration for fine-tuned model (Faust I and II German GPT-2)
Files changed:
- merges.txt (+0 -0)
- tokenizer_config.json (+1 -0)
- vocab.json (+0 -0)
merges.txt (ADDED)
The diff for this file is too large to render.
tokenizer_config.json (ADDED)
@@ -0,0 +1 @@
+{"special_tokens_map_file": null, "full_tokenizer_file": null}
vocab.json (ADDED)
The diff for this file is too large to render.
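For context on the files this commit adds: for a GPT-2 style tokenizer, `vocab.json` maps token strings to ids and `merges.txt` lists byte-pair-encoding (BPE) merge rules in priority order. The sketch below shows how those two files work together, using a tiny hypothetical vocabulary and merge list (the real files hold roughly 50k entries each); it is a minimal illustration of greedy BPE, not the exact implementation used by the library.

```python
import json  # the real vocab.json would be loaded with json.load

# Hypothetical miniature stand-ins for vocab.json and merges.txt.
vocab = {"F": 0, "a": 1, "u": 2, "s": 3, "t": 4,
         "au": 5, "aus": 6, "Faus": 7, "Faust": 8}
merges = [("a", "u"), ("au", "s"), ("F", "aus"), ("Faus", "t")]

def bpe(word, merges):
    """Greedy BPE: repeatedly apply the highest-priority merge rule."""
    symbols = list(word)
    ranks = {pair: i for i, pair in enumerate(merges)}
    while len(symbols) > 1:
        # Rank every adjacent pair; lower rank = earlier merge rule.
        pairs = [(ranks.get((symbols[i], symbols[i + 1]), float("inf")), i)
                 for i in range(len(symbols) - 1)]
        best_rank, i = min(pairs)
        if best_rank == float("inf"):
            break  # no applicable merge rule left
        symbols[i:i + 2] = [symbols[i] + symbols[i + 1]]
    return symbols

tokens = bpe("Faust", merges)
ids = [vocab[t] for t in tokens]
print(tokens, ids)  # ['Faust'] [8]
```

With the full files, the same mechanism turns German text from Faust into subword ids that the fine-tuned GPT-2 model consumes; `tokenizer_config.json` only records tokenizer metadata (here, that no separate special-tokens map or serialized tokenizer file is bundled).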