fujiki committed
Commit 1393148
1 Parent(s): 322776f

Update README.md

Files changed (1)
  1. README.md +5 -5
README.md CHANGED
@@ -8,11 +8,11 @@ pipeline_tag: text-generation
 license: apache-2.0
 ---
 
-# Japanese Mistral-7B-v0.1 Instruct
+# Japanese StableLM Instruct Gamma
 
 ## Model Description
 
-This is a 7B-parameter decoder-only Japanese language model fine-tuned on instruction-following datasets, built on top of the base model [Japanese Mistral-7B-v0.1 Base](https://huggingface.co/stabilityai/japanese-mistral-7b-v0.1-base).
+This is a 7B-parameter decoder-only Japanese language model fine-tuned on instruction-following datasets, built on top of the base model [Japanese StableLM Base Gamma](https://huggingface.co/stabilityai/japanese-stablelm-base-gamma).
 
 ## Usage
 
@@ -22,9 +22,9 @@ This is a 7B-parameter decoder-only Japanese language model fine-tuned on instru
 import torch
 from transformers import LlamaTokenizer, AutoModelForCausalLM
 
-tokenizer = AutoTokenizer.from_pretrained("stabilityai/japanese-mistral-7b-v0.1-instruct")
+tokenizer = AutoTokenizer.from_pretrained("stabilityai/japanese-stablelm-instruct-gamma")
 model = AutoModelForCausalLM.from_pretrained(
-  "stabilityai/japanese-mistral-7b-v0.1-instruct",
+  "stabilityai/japanese-stablelm-instruct-gamma",
   trust_remote_code=True,
   torch_dtype="auto",
 )
@@ -73,7 +73,7 @@ print(out)
 ## Model Details
 
 * **Developed by**: [Stability AI](https://stability.ai/)
-* **Model type**: `Japanese Mistral-7B-v0.1 Instruct` model is an auto-regressive language model based on the transformer decoder architecture.
+* **Model type**: `Japanese StableLM Instruct Gamma` model is an auto-regressive language model based on the transformer decoder architecture.
 * **Language(s)**: Japanese
 * **License**: This model is licensed under [Apache License, Version 2.0](https://www.apache.org/licenses/LICENSE-2.0).
 * **Contact**: For questions and comments about the model, please email `lm@stability.ai`
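
Note that the usage snippet recorded in this diff imports `LlamaTokenizer` but then calls `AutoTokenizer`, which would raise a `NameError` if copied verbatim. Below is a minimal, self-consistent sketch of the intended usage after the rename; the repo id is taken from the updated README in this diff, while the prompt text and generation settings are illustrative assumptions, not the prompt template or defaults the model card prescribes.

```python
# Sketch of loading and querying the renamed model. Repo id comes from the
# updated README; prompt and sampling parameters are placeholders.
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = "stabilityai/japanese-stablelm-instruct-gamma"

# AutoTokenizer resolves the concrete tokenizer class (e.g. LlamaTokenizer)
# from the repo's configuration, avoiding the import mismatch above.
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    trust_remote_code=True,
    torch_dtype="auto",
)
model.eval()

prompt = "日本の首都はどこですか？"  # "What is the capital of Japan?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

with torch.no_grad():
    tokens = model.generate(
        **inputs,
        max_new_tokens=64,
        do_sample=True,
        temperature=0.7,
    )

# Decode only the newly generated tokens, skipping the echoed prompt.
out = tokenizer.decode(
    tokens[0][inputs["input_ids"].shape[-1]:],
    skip_special_tokens=True,
)
print(out)
```

Using `AutoTokenizer` keeps the snippet independent of whether the repository's underlying tokenizer is `LlamaTokenizer` or another class, since `transformers` selects it from the model's configuration at load time.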