---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
    We imagine, design and commercialize innovative off-grid systems that aim to generate
    power at sea, stabilize and collect data. The success of our low power platforms
    WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
    platform.
---
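
The widget above runs an example sentence through the classifier. For local use, a minimal inference sketch with the `transformers` pipeline is shown below; the repository id is a placeholder for wherever this checkpoint is hosted.

```python
from transformers import pipeline

# Placeholder repo id: replace with the actual Hub path of this checkpoint.
classifier = pipeline("text-classification", model="<user>/ft_8_2e6_base_x4")

text = (
    "GEPS Techno is the pioneer of hybridization of renewable energies at sea. "
    "We imagine, design and commercialize innovative off-grid systems that aim "
    "to generate power at sea, stabilize and collect data."
)
print(classifier(text))  # e.g. [{'label': '...', 'score': ...}]
```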

## Environmental Impact (CODE CARBON DEFAULT)

| Metric                   | Value                           |
|--------------------------|---------------------------------|
| Duration (in seconds)    | 75576.03192782402  |
| Emissions (CO2eq in kg)  | 0.0457322172624719 |
| CPU power (W)            | 42.5  |
| GPU power (W)            | [No GPU]  |
| RAM power (W)            | 3.75  |
| CPU energy (kWh)         | 0.8922151447104096  |
| GPU energy (kWh)         | [No GPU]  |
| RAM energy (kWh)         | 0.0787242676687737  |
| Consumed energy (kWh)    | 0.9709394123791834  |
| Country name             | Switzerland  |
| Cloud provider           | nan  |
| Cloud region             | nan  |
| CPU count                | 2  |
| CPU model                | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz  |
| GPU count                | nan  |
| GPU model                | nan  |

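These figures are the default CodeCarbon output for the full run. As a quick sanity check, the energy values are consistent with power × duration: 42.5 W × 75,576 s ≈ 0.892 kWh for the CPU and 3.75 W × 75,576 s ≈ 0.079 kWh for RAM, which sum to the ≈ 0.971 kWh of consumed energy reported above. A minimal tracking sketch, assuming the `codecarbon` package (the training entry point is a stand-in):

```python
from codecarbon import EmissionsTracker


def run_training():
    # Stand-in for the actual fine-tuning loop (not part of this card).
    pass


# Default tracker: samples CPU/GPU/RAM power over the run and converts the
# measured energy to kg CO2eq using the local (here: Swiss) grid mix.
tracker = EmissionsTracker()
tracker.start()
try:
    run_training()
finally:
    emissions_kg = tracker.stop()  # returns emissions in kg CO2eq
    print(f"Emissions: {emissions_kg} kg CO2eq")
```
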
## Environmental Impact (for one core)

| Metric                   | Value                           |
|--------------------------|---------------------------------|
| CPU energy (kWh)         | 0.14548386146106124  |
| Emissions (CO2eq in kg)  | 0.029600612505064405 |

## Note

14 June 2024

## My Config

| Config                   | Value           |
|--------------------------|-----------------|
| checkpoint               | albert-base-v2  |
| model_name               | ft_8_2e6_base_x4 |
| sequence_length          | 400  |
| num_epoch                | 6  |
| learning_rate            | 2e-06  |
| batch_size               | 8  |
| weight_decay             | 0.0  |
| warm_up_prop             | 0.0  |
| drop_out_prob            | 0.1 |
| packing_length           | 100 |
| train_test_split         | 0.2 |
| num_steps                | 29328 |

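The hyperparameters above map fairly directly onto a standard `transformers` fine-tuning setup. The sketch below shows that mapping under stated assumptions: the label count, the dataset, and the custom packing step (`packing_length`) are not described in this card, so they appear only as placeholders.

```python
from transformers import (
    AlbertForSequenceClassification,
    AlbertTokenizerFast,
    TrainingArguments,
)

checkpoint = "albert-base-v2"
tokenizer = AlbertTokenizerFast.from_pretrained(checkpoint)
model = AlbertForSequenceClassification.from_pretrained(
    checkpoint,
    num_labels=2,                   # assumption: the card does not state the label count
    classifier_dropout_prob=0.1,    # drop_out_prob
)

args = TrainingArguments(
    output_dir="ft_8_2e6_base_x4",  # model_name
    num_train_epochs=6,             # num_epoch
    learning_rate=2e-6,             # learning_rate
    per_device_train_batch_size=8,  # batch_size
    weight_decay=0.0,               # weight_decay
    warmup_ratio=0.0,               # warm_up_prop
)

# A Trainer would then be built on a dataset tokenized to at most 400 tokens
# (sequence_length) with a 20% held-out test split (train_test_split), e.g.:
# Trainer(model=model, args=args, train_dataset=..., eval_dataset=...).train()
```
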
## Training and Testing steps

| Epoch | Train Loss | Test Loss | F-beta Score |
|-------|------------|-----------|--------------|
| 0 | 0.000000 | 0.745325 | 0.335876 |
| 1 | 0.355815 | 0.256684 | 0.905312 |
| 2 | 0.224772 | 0.226835 | 0.921716 |
| 3 | 0.185306 | 0.223102 | 0.918413 |
| 4 | 0.151347 | 0.234494 | 0.923235 |
| 5 | 0.119629 | 0.247906 | 0.919491 |
| 6 | 0.089336 | 0.264783 | 0.907933 |
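
The reported metric is an F-beta score on the held-out split; the card does not state which beta was used. A minimal sketch of how it could be computed with scikit-learn, using beta = 2 purely as an illustrative assumption:

```python
from sklearn.metrics import fbeta_score

# Toy labels and predictions, only to illustrate the metric call.
y_true = [1, 0, 1, 1, 0, 1, 0, 0]
y_pred = [1, 0, 0, 1, 0, 1, 1, 0]

# beta=2 weighs recall higher than precision; the actual beta used for this
# card is not stated, so this value is an assumption.
print(fbeta_score(y_true, y_pred, beta=2))
```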