Update README.md
README.md CHANGED
@@ -12,7 +12,7 @@ tags:
 license: apache-2.0
 ---
 
-
+This pre-trained model is a work in progress! Model weight downloads will be available in the future.
 
 A 6.8 billion parameter pre-trained model for the Japanese language, based on EleutherAI's Mesh Transformer JAX, with a model structure similar to their GPT-J-6B pre-trained model.
 