---
library_name: peft
base_model: mistralai/Mistral-7B-Instruct-v0.2
language:
- en
---

# Model Card for akingunduz/monica_llm

This model is a PEFT adapter fine-tuned from [mistralai/Mistral-7B-Instruct-v0.2](https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.2) on the [FRIENDS TV Series Screenplay Script](https://www.kaggle.com/datasets/blessondensil294/friends-tv-series-screenplay-script) dataset. Fine-tuning used only the parts of the dataset where Monica speaks.
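The filtering step can be sketched as follows. This is a minimal illustration with toy data; the column names `character` and `line` are assumptions, since the actual Kaggle CSV uses its own schema:

```python
import pandas as pd

# Toy stand-in for the FRIENDS screenplay dataset; the real Kaggle file
# has its own column names, so "character" and "line" are placeholders.
script = pd.DataFrame({
    "character": ["Monica", "Ross", "Monica", "Chandler"],
    "line": [
        "I know this jacket!",
        "We were on a break!",
        "Welcome to the real world.",
        "Could I BE wearing any more clothes?",
    ],
})

# Keep only the rows where Monica is the speaker.
monica_lines = script[script["character"] == "Monica"]["line"].tolist()
print(monica_lines)
```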


## Uses

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel
import torch

base_model = "mistralai/Mistral-7B-Instruct-v0.2"
adapter_model = "akingunduz/monica_llm"

# Load the base model, then attach the fine-tuned adapter on top of it.
model = AutoModelForCausalLM.from_pretrained(base_model)
model = PeftModel.from_pretrained(model, adapter_model)
tokenizer = AutoTokenizer.from_pretrained(base_model)

model = model.to("cuda")
model.eval()

def build_prompt(question):
    # Mistral-Instruct prompt format, addressing the question to Monica.
    return f"<s>[INST]@Monica. {question} [/INST]"

question = "Which city do you live?"
prompt = build_prompt(question)
inputs = tokenizer(prompt, return_tensors="pt")

with torch.no_grad():
    outputs = model.generate(input_ids=inputs["input_ids"].to("cuda"), max_new_tokens=10)
    print(tokenizer.batch_decode(outputs.detach().cpu().numpy(), skip_special_tokens=True)[0])
```

```
>>> [INST]@Monica. Which city do you live? [/INST]New York.
```

## Environmental Impact


Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).


### Framework versions

- PEFT 0.10.0