doberst committed
Commit c4160e7
Parent: 6e16d34

Update README.md

Files changed (1): README.md (+5, -8)
README.md CHANGED
@@ -1,5 +1,5 @@
 ---
-license: apache-2.0
+license: cc-by-sa-4.0
 ---
 
 # Model Card for Model ID
@@ -34,7 +34,7 @@ For test run results (and good indicator of target use cases), please see the fi
 - **Model type:** StableLM-7B
 - **Language(s) (NLP):** English
 - **License:** Apache 2.0
-- **Finetuned from model:** StableLM-7B
+- **Finetuned from model:** StableLM-Base-Alpha-7B-v2
 
 ## Uses
 
@@ -72,8 +72,8 @@ Any model can provide inaccurate or incomplete information, and should be used i
 The fastest way to get started with BLING is through direct import in transformers:
 
 from transformers import AutoTokenizer, AutoModelForCausalLM
-tokenizer = AutoTokenizer.from_pretrained("dragon-llama-7b-0.1")
-model = AutoModelForCausalLM.from_pretrained("dragon-llama-7b-0.1")
+tokenizer = AutoTokenizer.from_pretrained("dragon-stable-lm-7b-v1")
+model = AutoModelForCausalLM.from_pretrained("dragon-stable-lm-7b-v1")
 
 
 The BLING model was fine-tuned with a simple "\<human> and \<bot> wrapper", so to get the best results, wrap inference entries as:
@@ -95,7 +95,4 @@ my_prompt = {{text_passage}} + "\n" + {{question/instruction}}
 
 Darren Oberst & llmware team
 
-Please reach out anytime if you are interested in this project!
-
-
-
+Please reach out anytime if you are interested in this project!
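
For reference, here is the updated quick-start assembled into a runnable example. This is a minimal sketch under two assumptions the diff does not confirm: that the repo id "dragon-stable-lm-7b-v1" resolves on the Hugging Face Hub as written (it may need an org prefix such as "llmware/"), and that the <human>/<bot> wrapper takes the common "<human>: ...\n<bot>:" form, which the README names but does not spell out in this diff:

    # Minimal sketch based on the updated README. Assumptions (not confirmed
    # by this diff): the repo id resolves on the Hub as written, and the
    # wrapper has the form "<human>: ...\n<bot>:".
    from transformers import AutoTokenizer, AutoModelForCausalLM

    tokenizer = AutoTokenizer.from_pretrained("dragon-stable-lm-7b-v1")
    model = AutoModelForCausalLM.from_pretrained("dragon-stable-lm-7b-v1")

    # Per the README: my_prompt = {{text_passage}} + "\n" + {{question/instruction}}
    text_passage = "The lease term is 36 months, beginning on January 1, 2024."
    question = "What is the length of the lease?"
    my_prompt = text_passage + "\n" + question

    # Wrap the inference entry with the <human>/<bot> markers.
    full_prompt = "<human>: " + my_prompt + "\n" + "<bot>:"

    inputs = tokenizer(full_prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=100)

    # Decode only the newly generated tokens, skipping the prompt.
    answer = tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True)
    print(answer)

Slicing the generated ids past the prompt length returns just the model's completion rather than echoing the wrapped prompt back to the caller.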