---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
    We imagine, design and commercialize innovative off-grid systems that aim to generate
    power at sea, stabilize and collect data. The success of our low power platforms
    WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
    platform.
---
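
## How to Use

The widget example above can be reproduced locally with the Hugging Face Transformers text-classification pipeline. This is a minimal sketch; the repository id is an assumption derived from `model_name` in the config below, not something stated elsewhere on this card.

```python
# Minimal inference sketch for this text-classification model.
# The repository id below is an assumption derived from `model_name` in "My Config";
# replace it with the actual Hub path of this model.
from transformers import pipeline

classifier = pipeline("text-classification", model="damgomz/ft_2_9e6_x4")

text = (
    "GEPS Techno is the pioneer of hybridization of renewable energies at sea. "
    "We imagine, design and commercialize innovative off-grid systems that aim to "
    "generate power at sea, stabilize and collect data."
)
print(classifier(text))  # e.g. [{'label': ..., 'score': ...}]
```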

## Environmental Impact (CODE CARBON DEFAULT)

| Metric                   | Value                           |
|--------------------------|---------------------------------|
| Duration (in seconds)    | [More Information Needed]  |
| Emissions (CO2eq in kg)  | [More Information Needed] |
| CPU power (W)            | [NO CPU]  |
| GPU power (W)            | [No GPU]  |
| RAM power (W)            | [More Information Needed]  |
| CPU energy (kWh)         | [No CPU]  |
| GPU energy (kWh)         | [No GPU]  |
| RAM energy (kWh)         | [More Information Needed]  |
| Consumed energy (kWh)    | [More Information Needed]  |
| Country name             | [More Information Needed]  |
| Cloud provider           | [No Cloud]  |
| Cloud region             | [No Cloud]  |
| CPU count                | [No CPU]  |
| CPU model                | [No CPU]  |
| GPU count                | [No GPU]  |
| GPU model                | [No GPU]  |
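
The values above follow the default CodeCarbon report layout. Below is a minimal sketch of how such metrics are typically collected with the `codecarbon` package; the output file name and the wrapped training function are illustrative assumptions, not taken from this training run.

```python
# Illustrative only: how CodeCarbon-style metrics such as the table above are
# typically collected. Nothing here is taken from this model's actual training code.
from codecarbon import EmissionsTracker

def train_model():
    # Placeholder for the actual fine-tuning loop (hypothetical).
    pass

tracker = EmissionsTracker(output_file="emissions.csv")  # logs duration, energy, CO2eq, hardware
tracker.start()
try:
    train_model()
finally:
    emissions_kg = tracker.stop()  # returns estimated emissions in kg CO2eq
    print(f"Estimated emissions: {emissions_kg} kg CO2eq")
```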

## Environmental Impact (for one core)

| Metric                   | Value                           |
|--------------------------|---------------------------------|
| CPU energy (kWh)         | [No CPU]  |
| Emissions (CO2eq in kg)  | [More Information Needed] |

## Note

12 July 2024

## My Config

| Config                   | Value           |
|--------------------------|-----------------|
| checkpoint               | damgomz/fp_bs32_lr1e4_x4  |
| model_name               | ft_2_9e6_x4 |
| sequence_length          | 400  |
| num_epoch                | 6  |
| learning_rate            | 9e-06  |
| batch_size               | 2  |
| weight_decay             | 0.0  |
| warm_up_prop             | 0.0  |
| drop_out_prob            | 0.1 |
| packing_length           | 100 |
| train_test_split         | 0.2 |
| num_steps                | 29328 |
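
For reference, here is a minimal sketch of how the hyperparameters above map onto a standard Transformers fine-tuning setup. The card does not state which training loop was actually used, so the use of `TrainingArguments` is an assumption.

```python
# Sketch of how the "My Config" values map onto a standard Transformers setup.
# Whether the original run used Trainer/TrainingArguments is an assumption.
from transformers import AutoModelForSequenceClassification, AutoTokenizer, TrainingArguments

checkpoint = "damgomz/fp_bs32_lr1e4_x4"      # checkpoint (from the table above)
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSequenceClassification.from_pretrained(checkpoint)

# sequence_length = 400 would typically be applied at tokenization time:
# tokenizer(texts, truncation=True, max_length=400)

args = TrainingArguments(
    output_dir="ft_2_9e6_x4",                # model_name
    num_train_epochs=6,                      # num_epoch
    learning_rate=9e-6,                      # learning_rate
    per_device_train_batch_size=2,           # batch_size
    weight_decay=0.0,                        # weight_decay
    warmup_ratio=0.0,                        # warm_up_prop
)
```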

## Training and Testing steps

| Epoch | Train Loss | Test Loss | F-beta Score |
|-------|------------|-----------|--------------|
| 0     | 0.000000   | 0.703402  | 0.510109     |
| 1     | 0.255059   | 0.205928  | 0.936629     |
| 2     | 0.155122   | 0.221829  | 0.915467     |
| 3     | 0.091348   | 0.263971  | 0.928592     |
| 4     | 0.042537   | 0.318423  | 0.931796     |
| 5     | 0.020457   | 0.400993  | 0.934956     |
| 6     | 0.014147   | 0.411178  | 0.928118     |
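
The F-beta score reported above weights precision and recall through a beta parameter, but the card does not state which beta was used. The snippet below is a purely illustrative computation with scikit-learn, with beta = 0.5 and toy labels as assumed placeholders.

```python
# Illustrative F-beta computation; the beta used for this card is not stated,
# so beta=0.5 below is an assumption, as are the toy labels.
from sklearn.metrics import fbeta_score

y_true = [1, 0, 1, 1, 0, 1]
y_pred = [1, 0, 0, 1, 0, 1]
print(fbeta_score(y_true, y_pred, beta=0.5))
```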