WIP warning
README.md
CHANGED
@@ -10,6 +10,8 @@ tags:
 license: apache-2.0
 ---
 
+The pre-trained model is work in progress!
+
 A 6.8 billion parameter pre-trained model for Japanese language, based on EleutherAI's Mesh Transformer JAX, that has a similar model structure to their GPT-J-6B pre-trained model.
 
 EleutherAIによるMesh Transformer JAXをコードベースとした、GPT-J-6Bに似たストラクチャと約68.7億パラメータを持つ日本語pre-trainedモデルです。