tokenizer trained on 128k - not sure if it'll be better for RoPE or not
converted_tokenizer_128k/tokenizer.model
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:c0ddbb01d0aee1f4c6e65c47638924bfc736f69a14d854938ca96035f9f776a1
+size 2038397
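For context, the file added above is not the tokenizer itself but a Git LFS pointer: a small plain-text stub of `key value` lines (`version`, `oid`, `size`) that stands in for the real 2 MB `tokenizer.model` blob. A minimal sketch of reading such a pointer (the parser function and variable names here are illustrative, not part of Git LFS tooling):

```python
def parse_lfs_pointer(text: str) -> dict:
    """Split a Git LFS pointer file into its key/value fields."""
    fields = {}
    for line in text.strip().splitlines():
        # Each pointer line is "<key> <value>"; split on the first space.
        key, _, value = line.partition(" ")
        fields[key] = value
    return fields

# Pointer contents copied from the diff above.
pointer = """\
version https://git-lfs.github.com/spec/v1
oid sha256:c0ddbb01d0aee1f4c6e65c47638924bfc736f69a14d854938ca96035f9f776a1
size 2038397
"""

info = parse_lfs_pointer(pointer)
print(info["size"])  # byte size of the real tokenizer.model blob
print(info["oid"])   # sha256-prefixed object id used to fetch the blob
```

The `oid` is the SHA-256 of the actual blob, which LFS-aware clients use to fetch the real file on checkout; a plain `git clone` without LFS leaves only this stub on disk.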
converted_tokenizer_128k/tokenizer.vocab
ADDED
The diff for this file is too large to render.