bhavyaaiplanet committed
Commit 3509b5c
1 Parent(s): 512fd9e

Update README.md

Files changed (1):
  1. README.md (+42 −1)
README.md CHANGED
@@ -1,8 +1,10 @@
 ---
+base_model: aiplanet/effi-7b
 license: apache-2.0
 language:
 - en
 library_name: transformers
+model_type: llama
 pipeline_tag: text-generation
 ---
@@ -19,4 +21,43 @@ effi 7b AWQ is a quantized version of effi 7b which is a 7 billion parameter mo
 - **Language(s) (NLP):** English
 - **Quantisation type:** AWQ (4-bit)
 - **License:** Apache 2.0
-- **Quantized from model:** Effi-7b
+- **Quantized from model:** Effi-7b
+
+
+### Example of usage
+
+```py
+import torch
+from transformers import AutoTokenizer, AutoModelForCausalLM
+
+quant_path = "aiplanet/effi-7b-awq"
+
+model = AutoModelForCausalLM.from_pretrained(quant_path).to(0)
+tokenizer = AutoTokenizer.from_pretrained(quant_path, trust_remote_code=True)
+
+
+tst = """
+
+### INSTRUCTION:
+Virgin Australia, the trading name of Virgin Australia Airlines Pty Ltd, is an Australian-based airline. It is the largest airline by fleet size to use the Virgin brand. It commenced services on 31 August 2000 as Virgin Blue, with two aircraft on a single route. It suddenly found itself as a major airline in Australia's domestic market after the collapse of Ansett Australia in September 2001. The airline has since grown to directly serve 32 cities in Australia, from hubs in Brisbane, Melbourne and Sydney. Are Virgin Australia and Virgin Blue the same airline?
+
+"""
+
+system_message = "Given your chain of thought reasoning, provide a rationale for the context in the source."
+
+template = f"""
+Context: {system_message}
+Human: {tst}
+"""
+
+# Tokenize the input
+input_ids = tokenizer(template, return_tensors="pt", truncation=True).input_ids.cuda()
+# Run the model to infer an output
+outputs = model.generate(input_ids=input_ids, max_new_tokens=512, top_p=0.9, temperature=0.1, top_k=1, repetition_penalty=1.1)
+
+
+# Print the result, stripping the prompt and anything after the closing instruction tag
+
+print(tokenizer.batch_decode(outputs.detach().cpu().numpy(), skip_special_tokens=True)[0][len(template):].split(" [/INST]")[0])
+```