---
license: mit
language:
- ja
library_name: transformers
pipeline_tag: text-generation
tags:
- gpt_neox
- gpt-neox
- japanese
inference:
  parameters:
    max_new_tokens: 32
    do_sample: false
    repetition_penalty: 1.1
---
# stockmark/gpt-neox-japanese-1.4b
This repository provides a GPT-NeoX based model with 1.4B parameters, pre-trained on a Japanese corpus of about 20B tokens. The model was developed by [Stockmark Inc.](https://stockmark.co.jp/).
## How to use
```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
# Use torch.bfloat16 on GPUs that support it (e.g. A100) and torch.float16 on older-generation GPUs
torch_dtype = torch.bfloat16 if torch.cuda.is_available() and hasattr(torch.cuda, "is_bf16_supported") and torch.cuda.is_bf16_supported() else torch.float16
model = AutoModelForCausalLM.from_pretrained("stockmark/gpt-neox-japanese-1.4b", device_map="auto", torch_dtype=torch_dtype)
tokenizer = AutoTokenizer.from_pretrained("stockmark/gpt-neox-japanese-1.4b")
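# The prompt "自然言語処理は" means "Natural language processing is ..."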
inputs = tokenizer("自然言語処理は", return_tensors="pt").to(model.device)
with torch.no_grad():
    tokens = model.generate(
        **inputs,
        max_new_tokens=128,
        repetition_penalty=1.1
    )
output = tokenizer.decode(tokens[0], skip_special_tokens=True)
print(output)
```
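The example above uses greedy decoding (the default when `do_sample` is not set). For more varied text, sampling can be enabled instead. The sketch below assumes the `model`, `tokenizer`, and `inputs` defined above; the `temperature` and `top_p` values are illustrative choices, not settings recommended by Stockmark.

```python
# Sampling-based generation (alternative to the greedy decoding above).
# temperature and top_p are illustrative values, not tuned recommendations.
with torch.no_grad():
    tokens = model.generate(
        **inputs,
        max_new_tokens=128,
        do_sample=True,      # sample instead of always taking the argmax token
        temperature=0.8,     # illustrative
        top_p=0.95,          # illustrative nucleus-sampling threshold
        repetition_penalty=1.1,
    )
print(tokenizer.decode(tokens[0], skip_special_tokens=True))
```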
## Examples
- LoRA tuning: https://huggingface.co/stockmark/gpt-neox-japanese-1.4b/blob/main/notebooks/LoRA.ipynb
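The notebook above contains the full recipe. As a quick orientation, here is a minimal sketch of attaching LoRA adapters with the `peft` library; the rank, alpha, and dropout values are illustrative assumptions and may differ from the notebook (`query_key_value` is the attention projection module in GPT-NeoX).

```python
# Minimal LoRA setup with peft (illustrative; see the notebook for the full recipe).
from peft import LoraConfig, TaskType, get_peft_model
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained("stockmark/gpt-neox-japanese-1.4b")

lora_config = LoraConfig(
    task_type=TaskType.CAUSAL_LM,
    r=8,                                 # illustrative rank
    lora_alpha=16,                       # illustrative scaling factor
    lora_dropout=0.05,                   # illustrative dropout
    target_modules=["query_key_value"],  # attention projection in GPT-NeoX
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # only the LoRA adapters are trainable
```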
## Training dataset
- Japanese Web Corpus (ja): 8.6B tokens (This dataset will not be released.)
- Wikipedia (ja): 0.88B tokens
- CC100 (ja): 10.5B tokens
## Training settings
- Trained using HuggingFace Trainer and DeepSpeed (ZeRO-2)
- 8 A100 GPUs (40GB) at ABCI
- Mixed Precision (BF16)
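For reference, a minimal sketch of how such a setup is typically wired together with the HuggingFace Trainer; the batch sizes and other hyperparameters below are illustrative assumptions, not the values used to train this model.

```python
# Sketch of a Trainer + DeepSpeed ZeRO-2 + BF16 setup as described above.
# All hyperparameter values are illustrative assumptions.
from transformers import TrainingArguments

ds_config = {
    "zero_optimization": {"stage": 2},         # ZeRO-2: shard optimizer state and gradients
    "bf16": {"enabled": True},                 # BF16 mixed precision
    "train_micro_batch_size_per_gpu": "auto",  # inherit from TrainingArguments
    "gradient_accumulation_steps": "auto",
}

training_args = TrainingArguments(
    output_dir="output",
    per_device_train_batch_size=4,  # illustrative
    gradient_accumulation_steps=8,  # illustrative
    bf16=True,
    deepspeed=ds_config,  # Trainer accepts a dict or a path to a JSON config
)
```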
## License
[The MIT license](https://opensource.org/licenses/MIT)
## Developed by
[Stockmark Inc.](https://stockmark.co.jp/)
## Author
[Takahiro Omi](https://huggingface.co/omitakahiro)