---
license: gemma
tags:
- medical
base_model: google/gemma-2b
model-index:
- name: Gemma2b_V03_BRONCO_CARDIO_SUMMARY_CATALOG
  results: []
datasets:
- bigbio/bronco
- bigbio/cardiode
- Dev4Med/Notfallberichte-German-100
language:
- de
- en
metrics:
- f1
- precision
- recall
---

# Gemma2b_V03_BRONCO_CARDIO_SUMMARY_CATALOG

This model is a fine-tuned version of [google/gemma-2b](https://huggingface.co/google/gemma-2b) on the bigbio/bronco, bigbio/cardiode, and Dev4Med/Notfallberichte-German-100 datasets.
It achieves the following results on the evaluation set:
- Loss: 0.2659
- Num Input Tokens Seen: 32680580

## Model description

Gemma2b_V03_BRONCO_CARDIO_SUMMARY_CATALOG is [google/gemma-2b](https://huggingface.co/google/gemma-2b) adapted to German-language clinical text. Per the card metadata, it was fine-tuned on the BRONCO and CardioDE corpora and on German emergency reports (Dev4Med/Notfallberichte-German-100); the model name indicates summarization and catalog-style extraction as the target tasks.
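
A minimal usage sketch follows; the repo id is a placeholder, since the Hub path this checkpoint is published under is not stated on the card:

```python
# Minimal usage sketch. The repo id is a placeholder: substitute the actual
# Hub path this checkpoint is published under.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Gemma2b_V03_BRONCO_CARDIO_SUMMARY_CATALOG"  # placeholder repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# German clinical prompt ("Summarize the following doctor's letter:")
prompt = "Fasse den folgenden Arztbrief zusammen:\n..."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```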

## Intended uses & limitations

More information needed

## Training and evaluation data

Per the card metadata, training used the bigbio/bronco, bigbio/cardiode, and Dev4Med/Notfallberichte-German-100 datasets. The train/evaluation split is not documented.
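
A hedged sketch for loading these datasets with the `datasets` library; config names and access requirements are assumptions, and the clinical corpora may require additional access steps:

```python
from datasets import load_dataset

# Loading sketch; config names and access requirements are assumptions.
# The BigBIO loaders are script-based, hence trust_remote_code=True.
bronco = load_dataset("bigbio/bronco", trust_remote_code=True)
cardiode = load_dataset("bigbio/cardiode", trust_remote_code=True)
reports = load_dataset("Dev4Med/Notfallberichte-German-100")

print(bronco)
```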

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 50
- num_epochs: 5
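
For reference, a sketch of how the values above map onto `transformers.TrainingArguments`; `output_dir` is a placeholder, and anything not in the list is left at its default:

```python
from transformers import TrainingArguments

# Reconstruction of the listed hyperparameters; output_dir is illustrative.
training_args = TrainingArguments(
    output_dir="gemma2b-bronco-cardio",  # placeholder path
    learning_rate=5e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine",
    warmup_steps=50,
    num_train_epochs=5,
)
```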

### Training results

| Training Loss | Epoch  | Step  | Validation Loss | Input Tokens Seen |
|:-------------:|:------:|:-----:|:---------------:|:-----------------:|
| 0.5621        | 0.2500 | 1370  | 0.5018          | 1573488           |
| 0.4518        | 0.4999 | 2740  | 0.4115          | 3273680           |
| 0.386         | 0.7499 | 4110  | 0.3684          | 4936100           |
| 0.364         | 0.9998 | 5480  | 0.3308          | 6511060           |
| 0.2709        | 1.2498 | 6850  | 0.3131          | 8278928           |
| 0.2422        | 1.4997 | 8220  | 0.2944          | 9895248           |
| 0.2264        | 1.7497 | 9590  | 0.2707          | 11490512          |
| 0.235         | 1.9996 | 10960 | 0.2554          | 13071912          |
| 0.1685        | 2.2496 | 12330 | 0.2606          | 14810372          |
| 0.1538        | 2.4995 | 13700 | 0.2519          | 16231800          |
| 0.1566        | 2.7495 | 15070 | 0.2459          | 17815564          |
| 0.1548        | 2.9995 | 16440 | 0.2387          | 19581608          |
| 0.1182        | 3.2494 | 17810 | 0.2599          | 21161040          |
| 0.1106        | 3.4994 | 19180 | 0.2576          | 22698788          |
| 0.1176        | 3.7493 | 20550 | 0.2550          | 24438800          |
| 0.1175        | 3.9993 | 21920 | 0.2558          | 26121308          |
| 0.1032        | 4.2492 | 23290 | 0.2645          | 27832740          |
| 0.0998        | 4.4992 | 24660 | 0.2656          | 29363700          |
| 0.102         | 4.7491 | 26030 | 0.2658          | 31053092          |
| 0.1019        | 4.9991 | 27400 | 0.2659          | 32672364          |


### Framework versions

- Transformers 4.40.2
- Pytorch 2.2.2+cu121
- Datasets 2.19.0
- Tokenizers 0.19.1