Pankaj Mathur committed
Commit
911bb0b
1 Parent(s): 4ddf772

Update README.md

Files changed (1)
  1. README.md +2 -2
README.md CHANGED
@@ -20,7 +20,7 @@ We build explain tuned [WizardLM dataset ~70K](https://github.com/nlpxucan/Wizar
 
 We leverage all of the 15 system instructions provided in Orca Research Paper. to generate custom datasets, in contrast to vanilla instruction tuning approaches used by original datasets.
 
-This helps student model aka [wizardlm_alpaca_dolly_orca_open_llama_13b](https://huggingface.co/psmathur/wizardlm_alpaca_dolly_orca_open_llama_13b) to learn ***thought*** process from teacher model, which is ChatGPT (gpt-3.5-turbo-0301 version).
+This helps student model aka this model to learn ***thought*** process from teacher model, which is ChatGPT (gpt-3.5-turbo-0301 version).
 
 Please see below example usage how the **System** prompt is added before each **instruction**.
 
@@ -55,7 +55,7 @@ import torch
 from transformers import LlamaForCausalLM, LlamaTokenizer
 
 # Hugging Face model_path
-model_path = 'psmathur/wizardlm_alpaca_dolly_orca_open_llama_7b'
+model_path = 'psmathur/orca_mini_7b'
 tokenizer = LlamaTokenizer.from_pretrained(model_path)
 model = LlamaForCausalLM.from_pretrained(
     model_path, torch_dtype=torch.float16, device_map='auto',
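
For readers following this change, below is a minimal sketch of how the updated `model_path` from this commit would be used end to end with the snippet shown in the second hunk. The prompt template, example instruction, and generation settings are illustrative assumptions, not taken verbatim from the README.

```python
import torch
from transformers import LlamaForCausalLM, LlamaTokenizer

# Hugging Face model_path (the value introduced by this commit)
model_path = 'psmathur/orca_mini_7b'
tokenizer = LlamaTokenizer.from_pretrained(model_path)
model = LlamaForCausalLM.from_pretrained(
    model_path, torch_dtype=torch.float16, device_map='auto',
)

# The README notes that a **System** prompt is added before each **instruction**;
# the exact template below is an assumption for illustration only.
system = 'You are an AI assistant that follows instruction extremely well. Help as much as you can.'
instruction = 'Tell me about alpacas.'
prompt = f"### System:\n{system}\n\n### User:\n{instruction}\n\n### Response:\n"

# Tokenize, generate, and decode the response.
inputs = tokenizer(prompt, return_tensors='pt').to(model.device)
with torch.no_grad():
    output = model.generate(**inputs, max_new_tokens=256)  # generation length chosen arbitrarily
print(tokenizer.decode(output[0], skip_special_tokens=True))
```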