---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
    We imagine, design and commercialize innovative off-grid systems that aim to generate
    power at sea, stabilize and collect data. The success of our low power platforms
    WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
    platform.
---

## Environmental Impact (CodeCarbon default)

| Metric                   | Value                           |
|--------------------------|---------------------------------|
| Duration (in seconds)    | [More Information Needed]  |
| Emissions (CO2eq in kg)  | [More Information Needed] |
| CPU power (W)            | [NO CPU]  |
| GPU power (W)            | [No GPU]  |
| RAM power (W)            | [More Information Needed]  |
| CPU energy (kWh)         | [No CPU]  |
| GPU energy (kWh)         | [No GPU]  |
| RAM energy (kWh)         | [More Information Needed]  |
| Consumed energy (kWh)    | [More Information Needed]  |
| Country name             | [More Information Needed]  |
| Cloud provider           | [No Cloud]  |
| Cloud region             | [No Cloud]  |
| CPU count                | [No CPU]  |
| CPU model                | [No CPU]  |
| GPU count                | [No GPU]  |
| GPU model                | [No GPU]  |

## Environmental Impact (for one core)

| Metric                   | Value                           |
|--------------------------|---------------------------------|
| CPU energy (kWh)         | [No CPU]  |
| Emissions (CO2eq in kg)  | [More Information Needed] |

## Note

June 19, 2024

## My Config

| Config                   | Value           |
|--------------------------|-----------------|
| checkpoint               | albert-base-v2  |
| model_name               | ft_4_18e6_base_x4 |
| sequence_length          | 400  |
| num_epoch                | 6  |
| learning_rate            | 1.8e-05  |
| batch_size               | 4  |
| weight_decay             | 0.0  |
| warm_up_prop             | 0.0  |
| drop_out_prob            | 0.1 |
| packing_length           | 100 |
| train_test_split         | 0.2 |
| num_steps                | 29328 |
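As a rough sanity check, the reported `num_steps` is consistent with the other hyperparameters: 29,328 optimizer steps over 6 epochs with a batch size of 4 imply about 4,888 steps per epoch and roughly 19,552 training examples (about 24,440 total before the 0.2 train/test split). A minimal sketch of that arithmetic (the dataset size is an inference from the table, not a reported figure):

```python
# Figures taken from the config table above; the dataset size is inferred.
num_steps = 29328
num_epoch = 6
batch_size = 4
train_test_split = 0.2  # fraction held out for testing

steps_per_epoch = num_steps // num_epoch                   # 4888
train_examples = steps_per_epoch * batch_size              # 19552
total_examples = train_examples / (1 - train_test_split)   # ~24440 before the split

print(steps_per_epoch, train_examples, round(total_examples))
```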

## Training and Testing steps

| Epoch | Train Loss | Test Loss | F-beta Score |
|-------|------------|-----------|--------------|
| 0 | 0.000000 | 0.708547 | 0.309799 |
| 1 | 0.309565 | 0.261743 | 0.927694 |
| 2 | 0.280837 | 0.329386 | 0.899541 |
| 3 | 0.282914 | 0.423529 | 0.720371 |
| 4 | 0.283815 | 0.358924 | 0.889532 |
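The last column is an F-beta score; the card does not state which beta was used, so the helper below keeps beta as a free parameter (not a value taken from the card) and shows how the metric combines precision and recall:

```python
def f_beta(precision: float, recall: float, beta: float = 1.0) -> float:
    """Weighted harmonic mean of precision and recall.

    beta > 1 weighs recall more heavily; beta < 1 favors precision.
    """
    if precision == 0.0 and recall == 0.0:
        return 0.0
    b2 = beta * beta
    return (1 + b2) * precision * recall / (b2 * precision + recall)

# When precision equals recall, the score equals both, for any beta.
print(f_beta(0.5, 0.5, beta=2.0))  # 0.5
```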