Naveengo committed
Commit 8443345
1 Parent(s): 6133670

Update README.md

Files changed (1): README.md (+33, −0)
 
## Training Data

The model is trained on the 'b-mc2/sql-create-context' dataset, up to 5,000 rows.
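Each row of 'b-mc2/sql-create-context' carries `question`, `context` (a CREATE TABLE statement), and `answer` fields. As a sketch, one row can be mapped onto the instruction prompt used for this model as follows (the `format_row` helper is illustrative, and appending the answer after `### Response:` is an assumption about the training format, not stated on the model card):

```py
def format_row(row):
    # row: a dict with the dataset's question/context/answer fields
    return (
        "Below is an SQL instruction that describes a task, paired with an input "
        "that provides further context. Write an SQL query that appropriately "
        "completes the request using your expertise in SQL. "
        f"### Instruction: {row['question']}"
        f"### Input: {row['context']}"
        f"### Response: {row['answer']}"
    )

# Example row in the dataset's schema (illustrative values)
example = {
    "question": "How many heads of the departments are older than 56?",
    "context": "CREATE TABLE head (age INTEGER)",
    "answer": "SELECT COUNT(*) FROM head WHERE age > 56",
}
print(format_row(example))
```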
## Usage

Please install `transformers` and `peft` (the example below also uses `torch`):

```
!pip install torch transformers peft
```
To use the model, run the following:

```py
import torch
from peft import PeftModel, PeftConfig
from transformers import AutoModelForCausalLM, AutoTokenizer
from IPython.display import display, Markdown

config = PeftConfig.from_pretrained("Naveengo/gpt2-medium-on-sql-create-context")
model = AutoModelForCausalLM.from_pretrained(
    config.base_model_name_or_path, return_dict=True, load_in_8bit=False
)
tokenizer = AutoTokenizer.from_pretrained(config.base_model_name_or_path)

# Load the LoRA adapter on top of the base model
model = PeftModel.from_pretrained(model, "Naveengo/gpt2-medium-on-sql-create-context")

def make_inference(question, context):
    batch = tokenizer(
        "Below is an SQL instruction that describes a task, paired with an input "
        "that provides further context. Write an SQL query that appropriately "
        "completes the request using your expertise in SQL. "
        f"### Instruction: {question}### Input: {context}### Response:",
        return_tensors="pt",
    )

    # autocast is a no-op on CPU; it enables mixed precision on a CUDA device
    with torch.cuda.amp.autocast():
        output_tokens = model.generate(**batch, max_new_tokens=100)

    display(Markdown(tokenizer.decode(output_tokens[0], skip_special_tokens=True)))

# Pass your own question and CREATE TABLE context, for example:
make_inference(
    "How many heads of the departments are older than 56?",
    "CREATE TABLE head (age INTEGER)",
)
```
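Because GPT-2 echoes its prompt, the decoded output contains the full instruction followed by the generated SQL. A small helper can pull out just the query (`extract_sql` is illustrative, not part of the model card; the sample `decoded` string below stands in for real model output):

```py
def extract_sql(decoded: str) -> str:
    # The generated SQL follows the last "### Response:" marker in the
    # decoded text; everything before it is the echoed prompt.
    return decoded.rsplit("### Response:", 1)[-1].strip()

# Stand-in for a decoded generation (prompt echo + SQL answer)
decoded = (
    "### Instruction: How many heads of the departments are older than 56?"
    "### Input: CREATE TABLE head (age INTEGER)"
    "### Response: SELECT COUNT(*) FROM head WHERE age > 56"
)
print(extract_sql(decoded))  # SELECT COUNT(*) FROM head WHERE age > 56
```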

## Training procedure
