---
license: other
datasets:
- Open-Orca/OpenOrca
- ehartford/wizard_vicuna_70k_unfiltered
tags:
- code
- prompt
- reverse prompt
widget:
- text: "Photosynthesis is the process by which plants, algae and some bacteria convert carbon dioxide and water into glucose and oxygen, using the energy of sunlight. This process is fundamental to life on Earth, as it provides the basis for almost all food chains and also contributes to the carbon cycle by helping to regulate the concentration of carbon dioxide in the atmosphere.  \n[REVERSED-PROMPT]"  
  example_title: "reverse prompt"

---
# PREVIEW - training will end 4/9

Commit a87a7a188022bec44cffcb3ae9c250b8bacf7dd3 seems to be more stable than the last few commits; I will post the next one only on 6/9.
# core-prompt-reverser-opt-1.3b

This model is a fine-tuned version of [facebook/opt-1.3b](https://huggingface.co/facebook/opt-1.3b) on the datasets listed in the metadata (Open-Orca/OpenOrca and ehartford/wizard_vicuna_70k_unfiltered).
It achieves the following results on the evaluation set:
- Loss: 1.2950
- Accuracy: 0.7084
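
For context, an evaluation loss of 1.2950 corresponds to a token-level perplexity of roughly exp(1.2950) ≈ 3.65, assuming the reported loss is the mean cross-entropy in nats (the usual convention for causal-LM evaluation in 🤗 Transformers).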


## Model description

Based on the model name, tags, and the widget example, this model is intended to reverse prompts: given a passage of text followed by the `[REVERSED-PROMPT]` marker, it generates a prompt that could have produced that passage. No further details are provided.

## Intended uses & limitations

More information needed
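
As a minimal usage sketch (not an official example), the widget entry above suggests appending the `[REVERSED-PROMPT]` marker to the text whose prompt you want to recover. Assuming the checkpoint is published as a standard causal-LM repository (the id below is a placeholder), generation with 🤗 Transformers could look like this:

```python
# Minimal sketch, assuming a causal-LM checkpoint and the [REVERSED-PROMPT]
# marker from the widget example; the repository id below is a placeholder.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "<user>/core-prompt-reverser-opt-1.3b"  # placeholder repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

text = (
    "Photosynthesis is the process by which plants, algae and some bacteria "
    "convert carbon dioxide and water into glucose and oxygen, using the "
    "energy of sunlight.  \n[REVERSED-PROMPT]"
)

inputs = tokenizer(text, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64, do_sample=False)

# The continuation after [REVERSED-PROMPT] is the reconstructed prompt.
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:],
                       skip_special_tokens=True))
```

Whether greedy decoding or sampling works better for this task is not documented; the snippet defaults to greedy decoding.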

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the reproduction sketch after the list):
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 1.0
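
As a rough reproduction sketch (not the author's actual training script), these values map onto 🤗 `TrainingArguments` roughly as follows; model loading, dataset preparation, and the `Trainer` wiring are omitted and would need to be filled in:

```python
# Hedged sketch mapping the listed hyperparameters onto TrainingArguments;
# the output directory is an illustrative placeholder.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="core-prompt-reverser-opt-1.3b",  # placeholder output path
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=1.0,
)
```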

### Training results



### Framework versions

- Transformers 4.33.0.dev0
- Pytorch 2.1.0.dev20230605+cu121
- Datasets 2.14.4
- Tokenizers 0.13.3