---
license: apache-2.0
library_name: peft
tags:
- trl
- sft
- generated_from_trainer
- falcon
- transformers
base_model: tiiuae/falcon-7b
model-index:
- name: falcon7b-linear-equations
  results: []
datasets:
- Menouar/LinearEquations
language:
- en
---

# falcon7b-linear-equations

This model is a fine-tuned version of [tiiuae/falcon-7b](https://huggingface.co/tiiuae/falcon-7b) on a simple dataset of [linear equations](https://huggingface.co/datasets/Menouar/LinearEquations).
For this task, fine-tuning [tiiuae/falcon-7b-instruct](https://huggingface.co/tiiuae/falcon-7b-instruct) would be easier, since it is already fine-tuned on a mixture of chat/instruct datasets.
Here we start from the raw base model instead, as that is the more challenging setting.

A merged version of this model, with the QLoRA adapters folded into the base weights, can be found at [falcon7b-linear-equations-merged](https://huggingface.co/Menouar/falcon7b-linear-equations-merged).

## Model description

The objective of this model is to test Falcon7B's ability to solve mathematical linear equations after fine-tuning. The linear equations are in the form:

```
Ay + ay + b + B = Dy + dy + c + C
```
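The closed-form solution this dataset teaches (visible in the model's worked output below) can be sketched in plain Python; `solve_linear` is a hypothetical helper for illustration, not part of the training code:

```python
from fractions import Fraction

def solve_linear(A, a, b, B, D, d, c, C):
    """Solve A*y + a*y + b + B = D*y + d*y + c + C for y.

    Mirrors the solution format the model is trained to produce:
    collapse each side to coef*y + const, then
    y = (rhs_const - lhs_const) / (lhs_coef - rhs_coef), when defined.
    """
    lhs_coef, lhs_const = A + a, b + B
    rhs_coef, rhs_const = D + d, c + C
    if lhs_coef == rhs_coef:
        raise ValueError("No unique solution: the y terms cancel")
    return Fraction(rhs_const - lhs_const, lhs_coef - rhs_coef)

# The example from this card: 10 + 4y - 9y + 5 = 4 + 8y - 2y + 8
print(solve_linear(A=4, a=-9, b=10, B=5, D=8, d=-2, c=4, C=8))  # 3/11
```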
This model was trained using TRL, QLoRA (LoRA adapters on a quantized base model), and Flash Attention.

Due to limited GPU resources, I only considered 20,000 samples for training.

For more information, check my [**Notebook**](https://colab.research.google.com/drive/1e8t5Cj6ZDAOc-z3bweWuBxF8mQZ9IPsH?usp=sharing).

```python
import torch
from peft import AutoPeftModelForCausalLM
from transformers import AutoTokenizer, pipeline

# Specify the model ID
peft_model_id = "Menouar/falcon7b-linear-equations"

# Load Model with PEFT adapter
model = AutoPeftModelForCausalLM.from_pretrained(
  peft_model_id,
  device_map="auto",
  torch_dtype=torch.float16
)

tokenizer = AutoTokenizer.from_pretrained(peft_model_id)

pipe = pipeline("text-generation", model=model, tokenizer=tokenizer)

equation = "Solve for y: 10 + 4y -9y +5 = 4 +8y - 2y + 8 ."

outputs = pipe(equation, 
               max_new_tokens=172, 
               do_sample=True, 
               temperature=0.1,
               top_k=50, top_p=0.1,
               eos_token_id=pipe.tokenizer.eos_token_id,
               pad_token_id=pipe.tokenizer.pad_token_id)

for seq in outputs:
    print(f"{seq['generated_text']}")

"""
Solve for y: 10 + 4y -9y +5 = 4 +8y - 2y + 8 .
The equation is in the form of ay + b = dy + c where:
a = 4 - 9 = -5
b = 10 + 5 = 15
d = 8 - 2 = 6
c = 4 + 8 = 12
The solution is y = (c - b)/(a - d) if a ≠ d
12 - 15 = -3
-5 - 6 = -11
y = -3 / -11
The fraction -3 / -11 = 3 / 11.
The solution is y = 3 / 11.
"""

```


## Intended uses & limitations
The model can solve any equation of the form ```Ay + ay + b + B = Dy + dy + c + C``` with integer coefficients ranging from -10 to 10. It cannot solve linear equations with more constants than A, a, b, B, D, d, c, C, nor equations whose constants lie outside the range [-10, 10]. These limitations stem from the samples in the dataset and from the limited ability of Large Language Models (LLMs) to perform simple arithmetic. The goal of this work is to demonstrate that fine-tuning an LLM on a specific dataset can yield excellent results on a narrow task, as this model shows compared to the original one.
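Given the range restriction above, inputs can be screened before querying the model; `in_supported_range` is a hypothetical guard sketched here, not part of the released code:

```python
def in_supported_range(*coeffs):
    """Return True if every coefficient is an integer in [-10, 10].

    The card states the model was only trained on integer coefficients
    in this range, so equations outside it are unlikely to be solved.
    """
    return all(isinstance(c, int) and -10 <= c <= 10 for c in coeffs)

# Coefficients of: 10 + 4y - 9y + 5 = 4 + 8y - 2y + 8
print(in_supported_range(4, -9, 10, 5, 8, -2, 4, 8))  # True
print(in_supported_range(25, 3, 1, 0, 2, 2, 4, 8))    # False (25 is out of range)
```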

## Evaluation

I will compile the evaluation section at a later time. For now, here is an example of a linear equation on which this model, Bard, and BingChat find the correct solution, while other models, including ChatGPT3.5, Llama 70B, Mixtral, and Falcon-7b-instruct, do not.
```
Solve for y: 10 + 4y -9y +5 = 4 +8y - 2y + 8 .
```
<strong>Falcon-7b-instruct</strong><br>
Solve for y: 10 + 4y -9y +5 = 4 +8y - 2y + 8 .<br>
First, we can solve for y.<br>
y = -2 + 8x<br>
We add 8 to x and subtract -2, giving:<br>
x = -2<br>
so 2 = +2<br>
y = 4<br>
we add 2 to y, so y = +6<br>
We can now plug these values into the original equation:<br>
10 + 4x -9x + 5 = -2 + 4x<br>
10 + 4(-2) = -6<br>
we end up with -1 = -1 + 4<br>
4 = 1<br>
y = 1<br>
Therefore, the solution is <strong><span style="color:red">
y = 1
</span></strong>. The equation has been solved.

<strong>Mixtral 8x7b</strong><br>
Let's first simplify the equation by combining like terms:<br>
10 - 5y + 5 = 4 + 6y<br>
Now, let's get all the y terms on one side and constants on the other side. Subtract 6y from both sides:<br>
10 - 5y - 6y + 5 = 4 + 6y - 6y<br>
This simplifies to:<br>
15 - 11y = 4<br>
Next, subtract 15 from both sides to isolate the y term:<br>
15 - 15 - 11y = 4 - 15<br>
This simplifies to:<br>
-11y = -11<br>
Finally, divide both sides by -11 to solve for y:<br>
y = -11 / -11<br>
So,<br>
<strong><span style="color:red">
y = 1
</span></strong>.

<strong>Llama 70B</strong><br>
To solve for y, we need to isolate the y term on one side of the equation.<br>
First, we can combine the like terms:<br>
10 + 4y - 9y + 5 = 4 + 8y - 2y + 8<br>
This simplifies to:<br>
10 - 5y + 5 = 4 + 6y<br>
Next, we can subtract 4 from both sides:<br>
6 - 5y + 5 = 6y<br>
Now, we can isolate the y term by subtracting 5 from both sides:<br>
-5y + 5 = 6y - 5<br>
This simplifies to:<br>
-5y = 11<br>
Finally, we can divide both sides by -5 to solve for y:<br>
y = -11/5<br>
Therefore, the value of y is <strong><span style="color:red">
-11/5
</span></strong>.

<strong>ChatGPT3.5</strong><br>
To solve for y in the given equation 10 + 4y - 9y + 5 = 4 + 8y - 2y + 8, follow these steps:<br>
Combine like terms on each side: (4y - 9y + 5) + 10 = (6y + 12 + 8y - 2y)<br>
Combine the y terms and constants separately: -5y + 15 = 12y + 12<br>
Move all terms involving y to one side and the constants to the other side: -5y - 12y = 12 - 15<br>
Combine like terms: -17y = -3<br>
Divide by -17: y = 3/17<br>
Therefore, the solution to the equation 10 + 4y - 9y + 5 = 4 + 8y - 2y + 8 is <strong><span style="color:red">
y = 3/17
</span></strong>.

<strong>Bard</strong><br>
Sure, I'd be happy to help you solve for y. Here is the solution:<br>
10 + 4y - 9y +5 = 4 +8y - 2y + 8<br>
First, combine like terms on each side of the equation.<br>
-5y + 15 = 12 + 6y<br>
Next, subtract 6y from both sides of the equation.<br>
-11y + 15 = 12<br>
Finally, subtract 15 from both sides of the equation to isolate y.<br>
-11y = -3<br>
Divide both sides of the equation by -11 to find y.<br>
<strong><span style="color:green">
y = 0.2727
</span></strong>.

<strong>BingChat</strong><br>
Combine like terms on both sides:<br>
10 + 4y - 9y + 5 = 15 - 5y<br>
4 + 8y - 2y + 8 = 12 + 6y<br>
This gives us:<br>
15 - 5y = 12 + 6y<br>
Solving for y in one step:<br>
<strong><span style="color:green">
y = 3 / 11
</span></strong>.

<strong>Falcon7b-linear-equations (this model)</strong><br>
The equation is in the form of ay + b = dy + c where:<br>
a = 4 - 9 = -5<br>
b = 10 + 5 = 15<br>
d = 8 - 2 = 6<br>
c = 4 + 8 = 12<br>
The solution is y = (c - b)/(a - d) if a ≠ d <br>
12 - 15 = -3<br>
-5 - 6 = -11<br>
y = -3 / -11<br>
The fraction -3 / -11 = 3 / 11.<br>
The solution is <strong><span style="color:green">
y = 3 / 11
</span></strong>.


## Training procedure

For more information, check my [**Notebook**](https://colab.research.google.com/drive/1e8t5Cj6ZDAOc-z3bweWuBxF8mQZ9IPsH?usp=sharing).

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 42
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 84
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: constant
- lr_scheduler_warmup_ratio: 0.03
- num_epochs: 3
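The hyperparameters above map onto a `transformers` training configuration roughly as follows; this is an illustrative sketch (the `output_dir` name is assumed, and the full setup lives in the Notebook linked above):

```python
from transformers import TrainingArguments

# Sketch of the training arguments implied by the list above.
args = TrainingArguments(
    output_dir="falcon7b-linear-equations",  # assumed name
    learning_rate=2e-4,
    per_device_train_batch_size=42,
    gradient_accumulation_steps=2,  # effective total batch size: 84
    lr_scheduler_type="constant",
    warmup_ratio=0.03,
    num_train_epochs=3,
    seed=42,
)
```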

### Training results

The training results can be found on [**TensorBoard**](https://huggingface.co/Menouar/falcon7b-linear-equations/tensorboard).

### Framework versions

- PEFT 0.8.2.dev0
- Transformers 4.38.0.dev0
- Pytorch 2.1.0+cu121
- Datasets 2.16.1
- Tokenizers 0.15.1