---
license: other
library_name: peft
tags:
- trl
- sft
- generated_from_trainer
datasets:
- generator
base_model: google/gemma-7b
model-index:
- name: gemma-7b-dolly-chatml
  results: []
---

# gemma-7b-dolly-chatml

This model is a fine-tuned version of [google/gemma-7b](https://huggingface.co/google/gemma-7b), trained with the [philschmid/gemma-tokenizer-chatml](https://huggingface.co/philschmid/gemma-tokenizer-chatml) tokenizer on the [philschmid/dolly-15k-oai-style](https://huggingface.co/datasets/philschmid/dolly-15k-oai-style) dataset using the ChatML format.
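
In ChatML, every turn is wrapped in `<|im_start|>` and `<|im_end|>` markers, so a single exchange in the training data looks like this:

```
<|im_start|>user
What is the capital of Germany?<|im_end|>
<|im_start|>assistant
Berlin is the capital of Germany.<|im_end|>
```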

The model was fine-tuned with the following [script using LoRA (not QLoRA)](https://huggingface.co/philschmid/gemma-7b-dolly-chatml/blob/main/trl-lora.py). I also included an [inference script](https://huggingface.co/philschmid/gemma-7b-dolly-chatml/blob/main/inference.py) to make sure it works, since there were some issues with Gemma. Results of the inference test:

```bash
prompt:
 What is the capital of Germany? Explain why thats the case and if it was different in the past?
response:
 Berlin is the capital of Germany.  It was the capital of Prussia until 1918, when the monarchy was abolished.  It was also the capital of the Weimar Republic.  It was the capital of the Third Reich until 1945, when it was liberated by the allies.  It has been the capital of the Federal Republic of Germany since 1949.  It is the largest city in the country with a population of 3.6 million people.  It is also the seat of the government and parliament.

prompt:
 In a town, 60% of the population are adults. Among the adults, 30% have a pet dog and 40% have a pet cat. What percentage of the total population has a pet dog?
response:
 60% of the total population have a pet dog. The calculation is 30% of adults multiplied by 60% of the total population. 30% of adults is 18% of the total population and 18% multiplied by 60% is 10.8% or 60% of the total population.
```
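
For orientation, here is a minimal sketch of what a TRL + LoRA setup along the lines of the linked `trl-lora.py` looks like. This is not the actual script: the LoRA rank/alpha, `max_seq_length`, and bf16 choice are illustrative assumptions, and it assumes a TRL version whose `SFTTrainer` renders messages-format datasets through the tokenizer's chat template.

```python
import torch
from datasets import load_dataset
from peft import LoraConfig
from transformers import AutoModelForCausalLM, AutoTokenizer, TrainingArguments
from trl import SFTTrainer

# Base model plus the ChatML tokenizer used for this run
tokenizer = AutoTokenizer.from_pretrained("philschmid/gemma-tokenizer-chatml")
model = AutoModelForCausalLM.from_pretrained(
    "google/gemma-7b", torch_dtype=torch.bfloat16, device_map="auto"
)

# Conversations stored in OpenAI messages format; SFTTrainer renders
# them with the tokenizer's chat template
dataset = load_dataset("philschmid/dolly-15k-oai-style", split="train")

# LoRA adapter instead of full fine-tuning (r/alpha are assumed values)
peft_config = LoraConfig(
    r=8,
    lora_alpha=16,
    lora_dropout=0.05,
    target_modules="all-linear",
    task_type="CAUSAL_LM",
)

# Core settings follow the hyperparameter list in the section below;
# the full set (seed, Adam betas, etc.) is spelled out there
args = TrainingArguments(
    output_dir="gemma-7b-dolly-chatml",
    per_device_train_batch_size=8,
    learning_rate=2e-4,
    lr_scheduler_type="constant",
    num_train_epochs=3,
    bf16=True,  # assumption; precision is not listed on the card
)

trainer = SFTTrainer(
    model=model,
    args=args,
    train_dataset=dataset,
    peft_config=peft_config,
    tokenizer=tokenizer,
    max_seq_length=1512,  # assumption; set to your data's length budget
)
trainer.train()
```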

### Run inference

```python
import torch
from peft import AutoPeftModelForCausalLM
from transformers import AutoTokenizer, pipeline

peft_model_id = "philschmid/gemma-7b-dolly-chatml"

# Load Model with PEFT adapter
tokenizer = AutoTokenizer.from_pretrained(peft_model_id)
model = AutoPeftModelForCausalLM.from_pretrained(peft_model_id, device_map="auto", torch_dtype=torch.float16)
pipe = pipeline("text-generation", model=model, tokenizer=tokenizer)
# The ChatML end-of-turn token is not the model's default EOS, so look up its id to stop generation on it
eos_token = tokenizer("<|im_end|>", add_special_tokens=False)["input_ids"][0]
print(f"eos_token: {eos_token}")

# run inference
messages = [
    {
        "role": "user",
        "content": "What is the capital of Germany? Explain why thats the case and if it was different in the past?"
    }
]

prompt = pipe.tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
outputs = pipe(prompt, max_new_tokens=1024, do_sample=True, temperature=0.7, top_k=50, top_p=0.95, eos_token_id=eos_token)
print(outputs[0]['generated_text'][len(prompt):])
#  Berlin is the capital of Germany.  It was the capital of Prussia until 1918, when the monarchy was abolished.  It was also the capital of the Weimar Republic.  It was the capital of the Third Reich until 1945, when it was liberated by the allies.  It has been the capital of the Federal Republic of Germany since 1949.  It is the largest city in the country with a population of 3.6 million people.  It is also the seat of the government and parliament.
```
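
Loading through `AutoPeftModelForCausalLM` keeps the adapter separate from the base weights. If you prefer a standalone checkpoint, the adapter can be merged with peft's standard `merge_and_unload`; a short sketch, where the output directory name is illustrative:

```python
# Merge the LoRA weights into the base model and save a standalone copy
merged_model = model.merge_and_unload()
merged_model.save_pretrained("gemma-7b-dolly-chatml-merged", safe_serialization=True)
tokenizer.save_pretrained("gemma-7b-dolly-chatml-merged")
```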


### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: constant
- lr_scheduler_warmup_ratio: 0.03
- num_epochs: 3
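
As a sketch, the list above maps one-to-one onto `transformers.TrainingArguments`; only the `output_dir` here is illustrative:

```python
from transformers import TrainingArguments

# Direct mapping of the hyperparameter list above
args = TrainingArguments(
    output_dir="gemma-7b-dolly-chatml",
    learning_rate=2e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,      # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,   # epsilon=1e-08
    lr_scheduler_type="constant",
    warmup_ratio=0.03,
    num_train_epochs=3,
)
```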

### Framework versions

- PEFT 0.8.2
- Transformers 4.38.1
- Pytorch 2.1.2+cu121
- Datasets 2.16.1
- Tokenizers 0.15.0