---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
    We imagine, design and commercialize innovative off-grid systems that aim to generate
    power at sea, stabilize and collect data. The success of our low power platforms
    WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
    platform.
---
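
## Usage

The card does not include an inference snippet. Below is a minimal sketch assuming the model is published as a Hugging Face repository and loadable with the `transformers` text-classification pipeline; the repository id is a hypothetical placeholder, not taken from the card.

```python
# Minimal inference sketch; the repository id below is a hypothetical
# placeholder, since the card does not state where the model is published.
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="your-namespace/ft_32_1e6_base_x2",  # hypothetical placeholder
)

text = (
    "GEPS Techno is the pioneer of hybridization of renewable energies at sea. "
    "We imagine, design and commercialize innovative off-grid systems that aim "
    "to generate power at sea, stabilize and collect data."
)
print(classifier(text))
```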

## Environmental Impact (CodeCarbon default)

| Metric                | Value                                         |
|-----------------------|-----------------------------------------------|
| Duration (s)          | 84232.47019672394                             |
| Emissions (kg CO2eq)  | 0.0509703346790505                            |
| CPU power (W)         | 42.5                                          |
| GPU power (W)         | [No GPU]                                      |
| RAM power (W)         | 3.75                                          |
| CPU energy (kWh)      | 0.9944085552609624                            |
| GPU energy (kWh)      | [No GPU]                                      |
| RAM energy (kWh)      | 0.0877411799686654                            |
| Consumed energy (kWh) | 1.0821497352296257                            |
| Country name          | Switzerland                                   |
| Cloud provider        | n/a                                           |
| Cloud region          | n/a                                           |
| CPU count             | 2                                             |
| CPU model             | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count             | n/a                                           |
| GPU model             | n/a                                           |
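
These fields follow CodeCarbon's default report. As a hedged illustration only, the sketch below shows how metrics of this kind are typically collected with `codecarbon.EmissionsTracker`; the actual tracking code used for this run is not part of the card, and `train_model()` is a stand-in for the fine-tuning loop.

```python
# Illustration only: collecting emissions metrics with CodeCarbon's default tracker.
# train_model() is a stand-in; the real fine-tuning loop is not included in the card.
import time

from codecarbon import EmissionsTracker


def train_model():
    time.sleep(1)  # stand-in for the actual training work


tracker = EmissionsTracker()  # default configuration
tracker.start()
try:
    train_model()
finally:
    emissions_kg = tracker.stop()  # estimated emissions in kg CO2eq
    print(f"Estimated emissions: {emissions_kg} kg CO2eq")
```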

## Environmental Impact (for one core)

| Metric                | Value               |
|-----------------------|---------------------|
| CPU energy (kWh)      | 0.16214750512869358 |
| Emissions (kg CO2eq)  | 0.03299105082705021 |

## Note

June 19, 2024

## My Config

| Config                   | Value           |
|--------------------------|-----------------|
| checkpoint               | albert-base-v2  |
| model_name               | ft_32_1e6_base_x2 |
| sequence_length          | 400  |
| num_epoch                | 6  |
| learning_rate            | 1e-06  |
| batch_size               | 32  |
| weight_decay             | 0.0  |
| warm_up_prop             | 0.0  |
| drop_out_prob            | 0.1 |
| packing_length           | 100 |
| train_test_split         | 0.2 |
| num_steps                | 29328 |
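
This configuration maps onto a standard `transformers` fine-tuning setup. The sketch below is a hedged reconstruction under that assumption, not the actual training script: the dataset, label set, and use of `transformers.Trainer` are illustrative, and mapping `drop_out_prob` to ALBERT's `classifier_dropout_prob` is an assumption.

```python
# Hedged reconstruction of the configuration above; not the actual training script.
# Dataset loading and the Trainer call are left as placeholders.
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    TrainingArguments,
)

checkpoint = "albert-base-v2"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSequenceClassification.from_pretrained(
    checkpoint,
    classifier_dropout_prob=0.1,  # assumed mapping of drop_out_prob
)

training_args = TrainingArguments(
    output_dir="ft_32_1e6_base_x2",
    num_train_epochs=6,              # num_epoch
    learning_rate=1e-6,              # learning_rate
    per_device_train_batch_size=32,  # batch_size
    weight_decay=0.0,                # weight_decay
    warmup_ratio=0.0,                # warm_up_prop
)


def tokenize(batch):
    # sequence_length = 400 from the config table
    return tokenizer(batch["text"], truncation=True, max_length=400)


# trainer = transformers.Trainer(model=model, args=training_args,
#                                train_dataset=..., eval_dataset=...)
# trainer.train()
```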

## Training and Testing Steps

| Epoch | Train Loss | Test Loss | F-beta Score |
|-------|------------|-----------|--------------|
| 0     | 0.000000   | 0.725595  | 0.272803     |
| 1     | 0.513231   | 0.405007  | 0.871395     |
| 2     | 0.350059   | 0.319134  | 0.884396     |
| 3     | 0.277223   | 0.271397  | 0.901570     |
| 4     | 0.234082   | 0.251465  | 0.902470     |
| 5     | 0.208181   | 0.246724  | 0.894811     |
| 6     | 0.188840   | 0.236156  | 0.919973     |
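
The card does not state the beta value or averaging mode behind the F-beta scores above. As a hedged illustration of how such a score is computed, the sketch below uses `sklearn.metrics.fbeta_score` on toy labels; the actual evaluation settings are an assumption.

```python
# Illustration only: computing an F-beta score with scikit-learn.
# The beta value and averaging used for the table above are not stated in the card.
from sklearn.metrics import fbeta_score

y_true = [0, 1, 1, 0, 1, 0]  # toy labels, not the model's test set
y_pred = [0, 1, 0, 0, 1, 1]

score = fbeta_score(y_true, y_pred, beta=0.5)  # beta = 0.5 weights precision higher
print(f"F-beta (beta=0.5): {score:.6f}")
```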