---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
    We imagine, design and commercialize innovative off-grid systems that aim to generate
    power at sea, stabilize and collect data. The success of our low power platforms
    WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
    platform.
---
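The `pipeline_tag` and widget above describe text-classification inference. A minimal usage sketch, assuming the fine-tuned weights are hosted in this repository (the repo id below is a placeholder, not confirmed by this card):

```python
# Minimal inference sketch; "ORG/ft_4_10e6_base_x1" is a placeholder repo id.
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="ORG/ft_4_10e6_base_x1",  # placeholder: replace with the actual repo id
)

text = (
    "GEPS Techno is the pioneer of hybridization of renewable energies at sea. "
    "We imagine, design and commercialize innovative off-grid systems that aim "
    "to generate power at sea, stabilize and collect data."
)
# sequence_length = 400 in the config below, so truncate inputs to that length
print(classifier(text, truncation=True, max_length=400))
```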

## Environmental Impact (CodeCarbon default)

| Metric                   | Value                           |
|--------------------------|---------------------------------|
| Duration (in seconds)    | 93656.63418769836  |
| Emissions (CO2eq in kg)  | 0.0566730291084483 |
| CPU power (W)            | 42.5  |
| GPU power (W)            | [No GPU]  |
| RAM power (W)            | 3.75  |
| CPU energy (kWh)         | 1.105665677199434  |
| GPU energy (kWh)         | [No GPU]  |
| RAM energy (kWh)         | 0.0975578023115795  |
| Consumed energy (kWh)    | 1.2032234795110155  |
| Country name             | Switzerland  |
| Cloud provider           | nan  |
| Cloud region             | nan  |
| CPU count                | 2  |
| CPU model                | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz  |
| GPU count                | nan  |
| GPU model                | nan  |

## Environmental Impact (for one core)

| Metric                   | Value                           |
|--------------------------|---------------------------------|
| CPU energy (kWh)         | 0.18028902081131934  |
| Emissions (CO2eq in kg)  | 0.036682181723515186 |
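The figures above follow the CodeCarbon default tracker. A minimal sketch of how such tracking is typically set up (the `train()` call is a placeholder for the actual fine-tuning loop, which this card does not include):

```python
# Sketch of default CodeCarbon tracking around a training run.
from codecarbon import EmissionsTracker

tracker = EmissionsTracker()  # default settings, matching "CodeCarbon default" above
tracker.start()
try:
    train()  # placeholder: the fine-tuning loop is not part of this card
finally:
    emissions_kg = tracker.stop()  # estimated emissions in kg CO2eq

print(f"Emissions (CO2eq in kg): {emissions_kg}")
```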

## Note

June 19, 2024

## My Config

| Config                   | Value           |
|--------------------------|-----------------|
| checkpoint               | albert-base-v2  |
| model_name               | ft_4_10e6_base_x1 |
| sequence_length          | 400  |
| num_epoch                | 6  |
| learning_rate            | 1e-05  |
| batch_size               | 4  |
| weight_decay             | 0.0  |
| warm_up_prop             | 0.0  |
| drop_out_prob            | 0.1 |
| packing_length           | 100 |
| train_test_split         | 0.2 |
| num_steps                | 29328 |
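The training code itself is not part of this card; the sketch below only shows how the hyperparameters above would map onto Hugging Face `TrainingArguments`. The dataset, tokenization (`sequence_length` = 400, `packing_length` = 100), the 0.2 train/test split, and the number of labels are assumptions and are omitted here.

```python
# Hedged sketch: mapping the config table onto Hugging Face TrainingArguments.
from transformers import AutoModelForSequenceClassification, TrainingArguments

model = AutoModelForSequenceClassification.from_pretrained(
    "albert-base-v2",              # checkpoint
    classifier_dropout_prob=0.1,   # drop_out_prob (ALBERT classifier-head dropout)
)

args = TrainingArguments(
    output_dir="ft_4_10e6_base_x1",  # model_name
    num_train_epochs=6,              # num_epoch
    learning_rate=1e-05,             # learning_rate
    per_device_train_batch_size=4,   # batch_size
    weight_decay=0.0,                # weight_decay
    warmup_ratio=0.0,                # warm_up_prop
)
```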

## Training and Testing steps

| Epoch | Train Loss | Test Loss | F-beta Score |
|-------|------------|-----------|--------------|
| 0     | 0.000000   | 0.771826  | 0.452522     |
| 1     | 0.298150   | 0.225640  | 0.935236     |
| 2     | 0.192938   | 0.222287  | 0.935015     |
| 3     | 0.145186   | 0.231643  | 0.913735     |
| 4     | 0.103925   | 0.249424  | 0.927651     |
| 5     | 0.070025   | 0.270552  | 0.917309     |
| 6     | 0.045620   | 0.305436  | 0.916324     |
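The F-beta column is the standard weighted harmonic mean of precision and recall; the card does not state the beta value, so `beta=0.5` in the toy sketch below is only an illustrative assumption.

```python
# Illustrative F-beta computation on toy predictions; beta=0.5 is an assumption.
from sklearn.metrics import fbeta_score

y_true = [1, 0, 1, 1, 0, 1]  # toy gold labels
y_pred = [1, 0, 1, 0, 0, 1]  # toy model predictions
print(fbeta_score(y_true, y_pred, beta=0.5))
```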