---
language:
- en
pipeline_tag: text-generation
---
This is LLaMA-2 7B fine-tuned with QLoRA, using bf16 as the compute dtype. The dataset was generated with the OpenAI API, with samples oriented towards abstract explanations of system design.
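
For reference, here is a minimal sketch of the kind of quantization and adapter setup typically used for QLoRA fine-tuning with a bf16 compute dtype. The exact training script is not part of this repo; the LoRA rank, alpha, and target modules below are illustrative assumptions, not the values actually used.

```python
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model

# 4-bit NF4 quantization with bf16 compute, as commonly used for QLoRA
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

base_model = AutoModelForCausalLM.from_pretrained(
    "meta-llama/Llama-2-7b-hf",  # gated base model, requires access approval
    quantization_config=bnb_config,
    device_map="auto",
)

# LoRA adapter config; rank/alpha/target modules here are placeholders
lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    target_modules=["q_proj", "v_proj"],
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
)
model = get_peft_model(base_model, lora_config)
```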


The LoRA adapters have been merged back into the original model. Training ran for 3 epochs with a batch size of 16.
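
A hedged sketch of how such a merge is typically done with `peft` (the adapter path and output directory below are placeholders, not the actual paths used):

```python
from transformers import AutoModelForCausalLM
from peft import PeftModel

# Load the base model in full precision before merging
base_model = AutoModelForCausalLM.from_pretrained("meta-llama/Llama-2-7b-hf")

# Attach the trained LoRA adapters and fold them into the base weights
model = PeftModel.from_pretrained(base_model, "path/to/lora-adapters")
merged = model.merge_and_unload()

merged.save_pretrained("system_design_expert_merged")
```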


```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline

model_path = "SaffalPoosh/system_design_expert"

# Load in bf16 on GPU to keep memory usage manageable
# (device_map="auto" requires the accelerate package)
model = AutoModelForCausalLM.from_pretrained(
    model_path,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)
tokenizer = AutoTokenizer.from_pretrained(model_path)

prompt = "Design an application like Whatsapp with tech stack you will use"
gen = pipeline("text-generation", model=model, tokenizer=tokenizer)
result = gen(prompt)
print(result[0]["generated_text"])
```
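
The pipeline call above relies on the model's default generation settings. For longer, more complete design answers you can pass explicit generation parameters; the values below are only illustrative:

```python
result = gen(prompt, max_new_tokens=512, do_sample=True, temperature=0.7, top_p=0.9)
print(result[0]["generated_text"])
```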