---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
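This model is intended for text classification; the widget example above can be reproduced with the `transformers` pipeline. A minimal usage sketch, assuming the fine-tuned weights are published on the Hub as `damgomz/ft_1_17e6_x2` (the repository id is an assumption, substitute the actual one):

```python
# Minimal sketch: classify a passage with this model via the transformers pipeline.
# Assumption: the fine-tuned checkpoint is available as "damgomz/ft_1_17e6_x2".
from transformers import pipeline

classifier = pipeline("text-classification", model="damgomz/ft_1_17e6_x2")

text = (
    "GEPS Techno is the pioneer of hybridization of renewable energies at sea. "
    "We imagine, design and commercialize innovative off-grid systems that aim "
    "to generate power at sea, stabilize and collect data."
)
print(classifier(text))  # e.g. [{'label': ..., 'score': ...}]
```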
## Environmental Impact (CodeCarbon default)
| Metric                   | Value                                         |
|--------------------------|-----------------------------------------------|
| Duration (seconds)       | 67082.63524746895                             |
| Emissions (kg CO2eq)     | 0.0405927710270507                            |
| CPU power (W)            | 42.5                                          |
| GPU power (W)            | [No GPU]                                      |
| RAM power (W)            | 3.75                                          |
| CPU energy (kWh)         | 0.7919466437844789                            |
| GPU energy (kWh)         | [No GPU]                                      |
| RAM energy (kWh)         | 0.0698773308031263                            |
| Consumed energy (kWh)    | 0.861823974587604                             |
| Country name             | Switzerland                                   |
| Cloud provider           | n/a                                           |
| Cloud region             | n/a                                           |
| CPU count                | 2                                             |
| CPU model                | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count                | n/a                                           |
| GPU model                | n/a                                           |
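The figures above follow the default CodeCarbon output. A minimal sketch of how such measurements are typically collected with the `codecarbon` package (the tracked workload below is a placeholder, not the actual training script):

```python
# Sketch: estimate energy use and emissions of a workload with codecarbon.
from codecarbon import EmissionsTracker

def run_training():
    # placeholder for the actual fine-tuning loop
    pass

tracker = EmissionsTracker()  # default configuration, as reported in the table
tracker.start()
try:
    run_training()
finally:
    emissions_kg = tracker.stop()  # estimated emissions in kg CO2eq
    print(f"Estimated emissions: {emissions_kg} kg CO2eq")
```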
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.1291340728513777 |
| Emissions (kg CO2eq)     | 0.026274032138592 |
## Note
12 July 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | damgomz/fp_bs16_lr1e4_x2 |
| model_name | ft_1_17e6_x2 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1.7e-05 |
| batch_size | 1 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
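A hypothetical mapping of this configuration onto `transformers.TrainingArguments`; the original training script is not shown, so the argument names below are assumptions:

```python
# Sketch: the hyperparameters above expressed as TrainingArguments (names assumed).
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="ft_1_17e6_x2",          # model_name from the config table
    num_train_epochs=6,                 # num_epoch
    learning_rate=1.7e-5,               # learning_rate
    per_device_train_batch_size=1,      # batch_size
    weight_decay=0.0,                   # weight_decay
    warmup_ratio=0.0,                   # warm_up_prop (assumed mapping)
)
# sequence_length (400) and packing_length (100) would be applied at tokenization
# time, e.g. tokenizer(..., truncation=True, max_length=400).
```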
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|-------|------------|-----------|--------------|
| 0 | 0.000000 | 0.730259 | 0.352548 |
| 1 | 0.302777 | 0.287648 | 0.934797 |
| 2 | 0.203525 | 0.249329 | 0.894996 |
| 3 | 0.132525 | 0.231056 | 0.935879 |
| 4 | 0.068135 | 0.276123 | 0.936830 |
| 5 | 0.030918 | 0.342866 | 0.926386 |
| 6 | 0.014191 | 0.363380 | 0.926961 |
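The table reports an F-beta score per epoch. A minimal sketch of how such a score can be computed with scikit-learn; the labels are placeholders and beta=0.5 is an assumption, since the card does not state which beta was used:

```python
# Sketch: compute an F-beta score with scikit-learn (placeholder data, beta assumed).
from sklearn.metrics import fbeta_score

y_true = [0, 1, 1, 0, 1]   # placeholder ground-truth labels
y_pred = [0, 1, 0, 0, 1]   # placeholder model predictions
print(fbeta_score(y_true, y_pred, beta=0.5))
```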