---
tags:
- generated_from_trainer
datasets:
- uonlp/CulturaX
metrics:
- accuracy
model-index:
- name: gpt2+morf_s0-30-x-2_cx-en_00000-00009_50k
  results:
  - task:
      name: Causal Language Modeling
      type: text-generation
    dataset:
      name: uonlp/CulturaX en
      type: uonlp/CulturaX
      args: en
    metrics:
    - name: Accuracy
      type: accuracy
      value: 0.4329592727693433
license: mit
language:
- en
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# gpt2+morf_s0-30-x-2_cx-en_00000-00009_50k

This model was trained on the uonlp/CulturaX en dataset.
It achieves the following results on the evaluation set:
- Loss: 2.8423
- Accuracy: 0.4330
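
If the reported loss is the mean token-level cross-entropy in nats (the usual convention for the Hugging Face `Trainer` on causal language modeling), the corresponding evaluation perplexity works out to roughly exp(2.8423) ≈ 17.2:

```python
import math

# Assuming the evaluation loss is mean per-token cross-entropy in nats,
# perplexity is its exponential.
eval_loss = 2.8423
perplexity = math.exp(eval_loss)
print(f"perplexity ≈ {perplexity:.1f}")  # ≈ 17.2
```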

## Model description

More information needed

## Intended uses & limitations

More information needed
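
As a sketch of the intended text-generation use, the checkpoint can be loaded with the standard `transformers` API. The identifier below is a placeholder; substitute the actual Hub repo id or a local checkpoint path, and use the tokenizer shipped with the checkpoint rather than the stock GPT-2 tokenizer.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline

# Placeholder identifier -- replace with the real Hub repo id or a local path.
model_id = "path/to/gpt2+morf_s0-30-x-2_cx-en_00000-00009_50k"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

generator = pipeline("text-generation", model=model, tokenizer=tokenizer)
print(generator("The weather today is", max_new_tokens=30)[0]["generated_text"])
```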

## Training and evaluation data

More information needed
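
For reference, the English portion of CulturaX can be inspected with the `datasets` library. This is only a sketch, not the exact preprocessing used for training; access to uonlp/CulturaX may require accepting the dataset terms on the Hub and authenticating.

```python
from datasets import load_dataset

# Streaming avoids downloading the full English split up front.
culturax_en = load_dataset("uonlp/CulturaX", "en", streaming=True, split="train")

# Peek at a few documents.
for example in culturax_en.take(3):
    print(example["text"][:100])
```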

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 1.0
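
A hedged sketch of how these settings map onto `transformers.TrainingArguments`; the output directory is an assumption, and the evaluation cadence is inferred from the 10,000-step intervals in the results table below.

```python
from transformers import TrainingArguments

# Mirrors the hyperparameters listed above; output_dir is assumed.
training_args = TrainingArguments(
    output_dir="gpt2+morf_s0-30-x-2_cx-en_00000-00009_50k",
    learning_rate=5e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=1.0,
    evaluation_strategy="steps",
    eval_steps=10_000,
)
```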

### Training results

| Training Loss | Epoch | Step   | Validation Loss | Accuracy |
|:-------------:|:-----:|:------:|:---------------:|:--------:|
| 3.6569        | 0.03  | 10000  | 3.5764          | 0.3502   |
| 3.4317        | 0.06  | 20000  | 3.3581          | 0.3727   |
| 3.3161        | 0.09  | 30000  | 3.2447          | 0.3848   |
| 3.2463        | 0.13  | 40000  | 3.1761          | 0.3924   |
| 3.1897        | 0.16  | 50000  | 3.1277          | 0.3977   |
| 3.152         | 0.19  | 60000  | 3.0910          | 0.4022   |
| 3.1341        | 0.22  | 70000  | 3.0575          | 0.4060   |
| 3.1006        | 0.25  | 80000  | 3.0363          | 0.4084   |
| 3.0806        | 0.28  | 90000  | 3.0118          | 0.4115   |
| 3.0555        | 0.31  | 100000 | 2.9919          | 0.4138   |
| 3.038         | 0.34  | 110000 | 2.9786          | 0.4156   |
| 3.0291        | 0.38  | 120000 | 2.9651          | 0.4171   |
| 3.0182        | 0.41  | 130000 | 2.9499          | 0.4191   |
| 3.0145        | 0.44  | 140000 | 2.9381          | 0.4205   |
| 2.9891        | 0.47  | 150000 | 2.9272          | 0.4219   |
| 2.9836        | 0.5   | 160000 | 2.9191          | 0.4230   |
| 2.9717        | 0.53  | 170000 | 2.9103          | 0.4241   |
| 2.9651        | 0.56  | 180000 | 2.9039          | 0.4250   |
| 2.9615        | 0.59  | 190000 | 2.8971          | 0.4258   |
| 2.9556        | 0.63  | 200000 | 2.8882          | 0.4269   |
| 2.9452        | 0.66  | 210000 | 2.8825          | 0.4277   |
| 2.9412        | 0.69  | 220000 | 2.8766          | 0.4284   |
| 2.9402        | 0.72  | 230000 | 2.8722          | 0.4290   |
| 2.9299        | 0.75  | 240000 | 2.8675          | 0.4296   |
| 2.9302        | 0.78  | 250000 | 2.8623          | 0.4304   |
| 2.9165        | 0.81  | 260000 | 2.8585          | 0.4308   |
| 2.915         | 0.84  | 270000 | 2.8537          | 0.4314   |
| 2.92          | 0.88  | 280000 | 2.8506          | 0.4319   |
| 2.9186        | 0.91  | 290000 | 2.8484          | 0.4321   |
| 2.9084        | 0.94  | 300000 | 2.8458          | 0.4325   |
| 2.9142        | 0.97  | 310000 | 2.8438          | 0.4327   |


### Framework versions

- Transformers 4.37.1
- Pytorch 2.1.2+cu121
- Datasets 2.16.1
- Tokenizers 0.15.1