Commit bf52c88 (parent: dc41361), Pankaj Mathur committed: Update README.md

Files changed (1): README.md (+2 −2)
README.md CHANGED

@@ -18,7 +18,7 @@ We build explain tuned [WizardLM dataset ~70K](https://github.com/nlpxucan/Wizar
 
 We leverage all of the 15 system instructions provided in Orca Research Paper. to generate custom datasets, in contrast to vanilla instruction tuning approaches used by original datasets.
 
-This helps student model aka [wizardlm_alpaca_dolly_orca_open_llama_13b](https://huggingface.co/psmathur/wizardlm_alpaca_dolly_orca_open_llama_13b) to learn ***thought*** process from teacher model, which is ChatGPT (gpt-3.5-turbo-0301 version).
+This helps student model aka this model to learn ***thought*** process from teacher model, which is ChatGPT (gpt-3.5-turbo-0301 version).
 
 Please see below example usage how the **System** prompt is added before each **instruction**.
 
@@ -53,7 +53,7 @@ import torch
 from transformers import LlamaForCausalLM, LlamaTokenizer
 
 # Hugging Face model_path
-model_path = 'psmathur/wizardlm_alpaca_dolly_orca_open_llama_13b'
+model_path = 'psmathur/orca_mini_13b'
 tokenizer = LlamaTokenizer.from_pretrained(model_path)
 model = LlamaForCausalLM.from_pretrained(
 model_path, torch_dtype=torch.float16, device_map='auto',
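For context, the "System prompt added before each instruction" usage that the README mentions can be sketched as a small prompt-building helper. The exact `### System:` / `### User:` / `### Response:` template below is an assumption for illustration; the authoritative format is in the full model card, not this diff:

```python
def make_prompt(system: str, instruction: str) -> str:
    # Prepend the system instruction before the user instruction,
    # as the README describes (template is an assumed sketch).
    return f"### System:\n{system}\n\n### User:\n{instruction}\n\n### Response:\n"

# Example: an Orca-style system instruction followed by a user query.
prompt = make_prompt(
    "You are an AI assistant that follows instruction extremely well.",
    "Tell me about orcas.",
)
print(prompt)
```

The resulting string would then be tokenized and passed to `model.generate` with the `LlamaTokenizer`/`LlamaForCausalLM` objects loaded in the snippet above.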