---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
    We imagine, design and commercialize innovative off-grid systems that aim to generate
    power at sea, stabilize and collect data. The success of our low power platforms
    WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
    platform.
---

## Environmental Impact (CodeCarbon default)

| Metric                   | Value                           |
|--------------------------|---------------------------------|
| Duration (in seconds)    | 90950.34148287772  |
| Emissions (CO2eq in kg)  | 0.0550354524112484 |
| CPU power (W)            | 42.5  |
| GPU power (W)            | [No GPU]  |
| RAM power (W)            | 3.75  |
| CPU energy (kWh)         | 1.0737171362383515  |
| GPU energy (kWh)         | [No GPU]  |
| RAM energy (kWh)         | 0.0947389986897504  |
| Consumed energy (kWh)    | 1.168456134928101  |
| Country name             | Switzerland  |
| Cloud provider           | nan  |
| Cloud region             | nan  |
| CPU count                | 2  |
| CPU model                | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz  |
| GPU count                | nan  |
| GPU model                | nan  |
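The CPU and RAM energy figures above follow directly from power × duration. A minimal sketch of the conversion (power draws and duration taken from the table; CodeCarbon integrates over measurement intervals, so its reported values differ in the last decimals):

```python
# Convert a constant power draw (W) sustained over a duration (s) to energy in kWh.
def energy_kwh(power_watts: float, duration_seconds: float) -> float:
    # W * s = joules; 1 kWh = 3.6e6 joules
    return power_watts * duration_seconds / 3.6e6

duration = 90950.34148287772            # seconds, from the table above
cpu_energy = energy_kwh(42.5, duration)   # ~ 1.0737 kWh
ram_energy = energy_kwh(3.75, duration)   # ~ 0.0947 kWh
consumed = cpu_energy + ram_energy        # ~ 1.1685 kWh (no GPU)
```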

## Environmental Impact (for one core)

| Metric                   | Value                           |
|--------------------------|---------------------------------|
| CPU energy (kWh)         | 0.17507940735453958  |
| Emissions (CO2eq in kg)  | 0.035622217080793765 |

## Note

19 June 2024

## My Config

| Config                   | Value           |
|--------------------------|-----------------|
| checkpoint               | albert-base-v2  |
| model_name               | ft_2_12e6_base_x2 |
| sequence_length          | 400  |
| num_epoch                | 6  |
| learning_rate            | 1.2e-05  |
| batch_size               | 2  |
| weight_decay             | 0.0  |
| warm_up_prop             | 0.0  |
| drop_out_prob            | 0.1 |
| packing_length           | 100 |
| train_test_split         | 0.2 |
| num_steps                | 29328 |
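For use in a training script, the configuration above can be captured as a plain dictionary. Note that keys such as `warm_up_prop` and `packing_length` are this card's own labels, not standard Hugging Face `TrainingArguments` names:

```python
# Fine-tuning configuration, transcribed from the table above.
config = {
    "checkpoint": "albert-base-v2",
    "model_name": "ft_2_12e6_base_x2",
    "sequence_length": 400,
    "num_epoch": 6,
    "learning_rate": 1.2e-05,
    "batch_size": 2,
    "weight_decay": 0.0,
    "warm_up_prop": 0.0,
    "drop_out_prob": 0.1,
    "packing_length": 100,
    "train_test_split": 0.2,
    "num_steps": 29328,
}
```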

## Training and Testing steps

| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.705588 | 0.171226 |
| 1 | 0.299717 | 0.226786 | 0.923670 |
| 2 | 0.209650 | 0.222374 | 0.918367 |
| 3 | 0.164724 | 0.231910 | 0.921650 |
| 4 | 0.115635 | 0.278792 | 0.912720 |
| 5 | 0.078925 | 0.293567 | 0.927080 |
| 6 | 0.050375 | 0.352881 | 0.924322 |
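The F-beta score in the last column is the weighted harmonic mean of precision and recall. The card does not state which beta was used, so the sketch below implements only the generic formula (it reduces to F1 when beta = 1):

```python
def fbeta_score(precision: float, recall: float, beta: float = 1.0) -> float:
    """Weighted harmonic mean of precision and recall.

    beta > 1 weights recall more heavily; beta < 1 favors precision.
    """
    if precision == 0.0 and recall == 0.0:
        return 0.0
    b2 = beta ** 2
    return (1.0 + b2) * precision * recall / (b2 * precision + recall)
```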