---
tags:
- generated_from_trainer
metrics:
- accuracy
license: apache-2.0
datasets:
- pszemraj/simple_wikipedia_LM
language:
- en
---

# griffin-v0.01-c3t-8layer-simplewiki-silu

- Architecture: Griffin (`recurrent_gemma` in `transformers`), 8 layers, SiLU activation
- Tokenizer: the Claude 3 tokenizer, wrapped as a Hugging Face GPT-2 (BPE) tokenizer

## Model description

A pretraining experiment on the [pszemraj/simple_wikipedia_LM](https://huggingface.co/datasets/pszemraj/simple_wikipedia_LM) dataset.

It achieves the following results on the evaluation set:
- Loss: 4.0476 (perplexity ≈ exp(4.0476) ≈ 57.3)
- Accuracy: 0.4224
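
For reference, a minimal generation sketch. The repo id below is an assumption based on the model name, and loading assumes the checkpoint's config registers the `recurrent_gemma` architecture (supported in Transformers since 4.40):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Hypothetical repo id inferred from the model name; adjust to the actual Hub path.
repo_id = "pszemraj/griffin-v0.01-c3t-8layer-simplewiki-silu"

tokenizer = AutoTokenizer.from_pretrained(repo_id)  # GPT-2-style wrapper around the Claude 3 vocab
model = AutoModelForCausalLM.from_pretrained(repo_id)  # resolves via the config's model_type

inputs = tokenizer("Simple English Wikipedia is", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32, do_sample=True, top_p=0.9)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```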

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a sketch mapping them onto `TrainingArguments` follows the list):
- learning_rate: 0.0003
- train_batch_size: 4
- eval_batch_size: 4
- seed: 80085
- gradient_accumulation_steps: 32
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.99) and epsilon=1e-07
- lr_scheduler_type: constant_with_warmup
- lr_scheduler_warmup_ratio: 0.05
- num_epochs: 2.0
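
A sketch of how these settings map onto `transformers.TrainingArguments`; `output_dir` is a placeholder, everything else mirrors the list above:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="griffin-simplewiki",  # placeholder
    learning_rate=3e-4,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    gradient_accumulation_steps=32,   # 4 x 32 = 128 effective train batch size
    num_train_epochs=2.0,
    seed=80085,
    adam_beta1=0.9,
    adam_beta2=0.99,
    adam_epsilon=1e-7,
    lr_scheduler_type="constant_with_warmup",
    warmup_ratio=0.05,
)
```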

### Training results

| Training Loss | Epoch  | Step | Validation Loss | Accuracy |
|:-------------:|:------:|:----:|:---------------:|:--------:|
| 13.3276       | 0.2548 | 100  | 12.0402         | 0.0131   |
| 8.9207        | 0.5095 | 200  | 8.0312          | 0.0360   |
| 7.2681        | 0.7643 | 300  | 6.4775          | 0.0506   |
| 6.3187        | 1.0190 | 400  | 5.6227          | 0.0434   |
| 5.5695        | 1.2738 | 500  | 4.7796          | 0.3635   |
| 5.2926        | 1.5285 | 600  | 4.3923          | 0.3952   |
| 4.878         | 1.7833 | 700  | 4.1877          | 0.4085   |
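
Since the reported validation loss is mean cross-entropy in nats, each entry in the table converts directly to a perplexity; a quick sketch:

```python
import math

# Validation loss -> perplexity for the checkpoints in the table above.
checkpoints = [(100, 12.0402), (200, 8.0312), (300, 6.4775), (400, 5.6227),
               (500, 4.7796), (600, 4.3923), (700, 4.1877)]
for step, loss in checkpoints:
    print(f"step {step:>4}: perplexity ≈ {math.exp(loss):,.1f}")
```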


### Framework versions

- Transformers 4.40.1
- Pytorch 2.2.0+cu121
- Datasets 2.19.0
- Tokenizers 0.19.1