Update README.md
README.md
# japanese-gpt2-medium-unidic

This is a medium-sized Japanese GPT-2 model that uses a BERT-like tokenizer.

A reversed version is published [here](https://huggingface.co/okazaki-lab/japanese-reversed-gpt2-medium-unidic/).
# How to use

The model depends on [PyTorch](https://pytorch.org/), [fugashi](https://github.com/polm/fugashi) with [unidic-lite](https://github.com/polm/unidic-lite), and [Hugging Face Transformers](https://github.com/huggingface/transformers).

```sh
pip install transformers
```
```python
from transformers import AutoTokenizer, AutoModelForCausalLM
import torch

tokenizer = AutoTokenizer.from_pretrained('okazaki-lab/japanese-gpt2-medium-unidic')
model = AutoModelForCausalLM.from_pretrained('okazaki-lab/japanese-gpt2-medium-unidic')

text = '今日はいい天気なので、'
```
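The snippet above stops after defining the prompt. A minimal sketch of generating a continuation with the loaded model might look like the following; the sampling parameters (`max_length`, `top_p`) are illustrative assumptions, not values from this model card:

```python
from transformers import AutoTokenizer, AutoModelForCausalLM
import torch

tokenizer = AutoTokenizer.from_pretrained('okazaki-lab/japanese-gpt2-medium-unidic')
model = AutoModelForCausalLM.from_pretrained('okazaki-lab/japanese-gpt2-medium-unidic')
model.eval()

text = '今日はいい天気なので、'
# The BERT-like tokenizer would normally add [CLS]/[SEP]; for left-to-right
# generation we encode the raw prompt without special tokens.
input_ids = tokenizer.encode(text, return_tensors='pt', add_special_tokens=False)

with torch.no_grad():
    output = model.generate(
        input_ids,
        max_length=50,          # total length cap, prompt included (assumed value)
        do_sample=True,         # sample instead of greedy decoding
        top_p=0.95,             # nucleus sampling (assumed value)
        pad_token_id=tokenizer.pad_token_id,
    )

generated = tokenizer.decode(output[0], skip_special_tokens=True)
print(generated)
```

Because `do_sample=True`, each run produces a different continuation of the prompt.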