| modelId (string, 5-122 chars) | author (string, 2-42 chars) | last_modified (unknown) | downloads (int64, 0-738M) | likes (int64, 0-11k) | library_name (245 classes) | tags (sequence, length 1-4.05k) | pipeline_tag (48 classes) | createdAt (unknown) | card (string, 1-901k chars) |
|---|---|---|---|---|---|---|---|---|---|
Chahatdatascience/config-1 | Chahatdatascience | "2024-06-23T11:59:49Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"wav2vec2",
"automatic-speech-recognition",
"endpoints_compatible",
"region:us"
] | automatic-speech-recognition | "2024-06-23T10:31:35Z" | Entry not found |
itay-nakash/model_2ec771cb72_sweep_glowing-spaceship-845 | itay-nakash | "2024-06-23T10:34:18Z" | 0 | 0 | null | [
"region:us"
] | null | "2024-06-23T10:34:18Z" | Entry not found |
limaatulya/my_awesome_billsum_model_82 | limaatulya | "2024-06-23T10:39:07Z" | 0 | 0 | transformers | [
"transformers",
"tensorboard",
"safetensors",
"t5",
"text2text-generation",
"generated_from_trainer",
"base_model:google-t5/t5-small",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | text2text-generation | "2024-06-23T10:34:56Z" | ---
license: apache-2.0
base_model: google-t5/t5-small
tags:
- generated_from_trainer
metrics:
- rouge
model-index:
- name: my_awesome_billsum_model_82
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# my_awesome_billsum_model_82
This model is a fine-tuned version of [google-t5/t5-small](https://huggingface.co/google-t5/t5-small) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2651
- Rouge1: 0.9769
- Rouge2: 0.8861
- Rougel: 0.9414
- Rougelsum: 0.9398
- Gen Len: 4.9583
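The card omits a usage example, so a minimal inference sketch is shown below (not part of the original card). It assumes the model was trained with the usual `summarize:` task prefix from the billsum tutorial, which the model name suggests but the card does not state.

```python
# Minimal inference sketch (assumption: the model expects the billsum
# tutorial's "summarize: " task prefix; the card does not confirm this).
from transformers import pipeline

summarizer = pipeline("text2text-generation", model="limaatulya/my_awesome_billsum_model_82")
text = "summarize: The people of the State of California do enact as follows: ..."
print(summarizer(text, max_new_tokens=16)[0]["generated_text"])
```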
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100
- mixed_precision_training: Native AMP
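For readers who want to reproduce this run, the hyperparameters above map onto a standard `Seq2SeqTrainingArguments` configuration roughly as sketched below. The original training script is not in the card, so `output_dir` and anything not listed above are assumptions.

```python
# Hedged reconstruction of the run configuration from the list above.
# Anything not in that list (e.g. output_dir, eval/save strategy) is assumed.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="my_awesome_billsum_model_82",  # assumed
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    fp16=True,  # "Native AMP" mixed-precision training
)
```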
### Training results
| Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum | Gen Len |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:-------:|
| No log | 1.0 | 12 | 0.1788 | 0.9675 | 0.8359 | 0.9136 | 0.9111 | 4.9792 |
| No log | 2.0 | 24 | 0.1578 | 0.9706 | 0.8564 | 0.9219 | 0.9199 | 5.0 |
| No log | 3.0 | 36 | 0.1606 | 0.974 | 0.8654 | 0.9317 | 0.9307 | 4.9375 |
| No log | 4.0 | 48 | 0.1720 | 0.9769 | 0.8861 | 0.9414 | 0.9398 | 4.9583 |
| No log | 5.0 | 60 | 0.1800 | 0.9769 | 0.8861 | 0.9414 | 0.9398 | 4.9583 |
| No log | 6.0 | 72 | 0.1871 | 0.9769 | 0.8861 | 0.9414 | 0.9398 | 4.9583 |
| No log | 7.0 | 84 | 0.1840 | 0.974 | 0.8654 | 0.9317 | 0.9307 | 4.9375 |
| No log | 8.0 | 96 | 0.1802 | 0.9769 | 0.8861 | 0.9414 | 0.9398 | 4.9583 |
| No log | 9.0 | 108 | 0.1672 | 0.9769 | 0.8861 | 0.9414 | 0.9398 | 4.9583 |
| No log | 10.0 | 120 | 0.1875 | 0.9769 | 0.8861 | 0.9414 | 0.9398 | 4.9583 |
| No log | 11.0 | 132 | 0.2060 | 0.9728 | 0.8655 | 0.9285 | 0.927 | 4.9792 |
| No log | 12.0 | 144 | 0.2068 | 0.9728 | 0.8655 | 0.9285 | 0.927 | 4.9792 |
| No log | 13.0 | 156 | 0.2064 | 0.9769 | 0.8861 | 0.9414 | 0.9398 | 4.9583 |
| No log | 14.0 | 168 | 0.2066 | 0.9769 | 0.8861 | 0.9414 | 0.9398 | 4.9583 |
| No log | 15.0 | 180 | 0.1867 | 0.9769 | 0.8861 | 0.9414 | 0.9398 | 4.9583 |
| No log | 16.0 | 192 | 0.1947 | 0.974 | 0.8654 | 0.9317 | 0.9307 | 4.9375 |
| No log | 17.0 | 204 | 0.1979 | 0.9769 | 0.8861 | 0.9414 | 0.9398 | 4.9583 |
| No log | 18.0 | 216 | 0.1971 | 0.9769 | 0.8861 | 0.9414 | 0.9398 | 4.9583 |
| No log | 19.0 | 228 | 0.1865 | 0.9769 | 0.8861 | 0.9414 | 0.9398 | 4.9583 |
| No log | 20.0 | 240 | 0.1757 | 0.9769 | 0.8861 | 0.9414 | 0.9398 | 4.9583 |
| No log | 21.0 | 252 | 0.1735 | 0.9769 | 0.8861 | 0.9414 | 0.9398 | 4.9583 |
| No log | 22.0 | 264 | 0.1846 | 0.9769 | 0.8861 | 0.9414 | 0.9398 | 4.9583 |
| No log | 23.0 | 276 | 0.2039 | 0.9769 | 0.8861 | 0.9414 | 0.9398 | 4.9583 |
| No log | 24.0 | 288 | 0.2251 | 0.9769 | 0.8861 | 0.9414 | 0.9398 | 4.9583 |
| No log | 25.0 | 300 | 0.2272 | 0.9769 | 0.8861 | 0.9414 | 0.9398 | 4.9583 |
| No log | 26.0 | 312 | 0.2165 | 0.9769 | 0.8861 | 0.9414 | 0.9398 | 4.9583 |
| No log | 27.0 | 324 | 0.2202 | 0.9769 | 0.8861 | 0.9414 | 0.9398 | 4.9583 |
| No log | 28.0 | 336 | 0.2166 | 0.9769 | 0.8861 | 0.9414 | 0.9398 | 4.9583 |
| No log | 29.0 | 348 | 0.2151 | 0.9769 | 0.8861 | 0.9414 | 0.9398 | 4.9583 |
| No log | 30.0 | 360 | 0.2151 | 0.9769 | 0.8861 | 0.9414 | 0.9398 | 4.9583 |
| No log | 31.0 | 372 | 0.2136 | 0.9769 | 0.8861 | 0.9414 | 0.9398 | 4.9583 |
| No log | 32.0 | 384 | 0.2206 | 0.9769 | 0.8861 | 0.9414 | 0.9398 | 4.9583 |
| No log | 33.0 | 396 | 0.2233 | 0.9769 | 0.8861 | 0.9414 | 0.9398 | 4.9583 |
| No log | 34.0 | 408 | 0.2220 | 0.9769 | 0.8861 | 0.9414 | 0.9398 | 4.9583 |
| No log | 35.0 | 420 | 0.2263 | 0.9769 | 0.8861 | 0.9414 | 0.9398 | 4.9583 |
| No log | 36.0 | 432 | 0.2298 | 0.9769 | 0.8861 | 0.9414 | 0.9398 | 4.9583 |
| No log | 37.0 | 444 | 0.2413 | 0.9769 | 0.8861 | 0.9414 | 0.9398 | 4.9583 |
| No log | 38.0 | 456 | 0.2407 | 0.9769 | 0.8861 | 0.9414 | 0.9398 | 4.9583 |
| No log | 39.0 | 468 | 0.2407 | 0.9769 | 0.8861 | 0.9414 | 0.9398 | 4.9583 |
| No log | 40.0 | 480 | 0.2420 | 0.9769 | 0.8861 | 0.9414 | 0.9398 | 4.9583 |
| No log | 41.0 | 492 | 0.2424 | 0.9769 | 0.8861 | 0.9414 | 0.9398 | 4.9583 |
| 0.0483 | 42.0 | 504 | 0.2442 | 0.9769 | 0.8861 | 0.9414 | 0.9398 | 4.9583 |
| 0.0483 | 43.0 | 516 | 0.2466 | 0.9769 | 0.8861 | 0.9414 | 0.9398 | 4.9583 |
| 0.0483 | 44.0 | 528 | 0.2416 | 0.9769 | 0.8861 | 0.9414 | 0.9398 | 4.9583 |
| 0.0483 | 45.0 | 540 | 0.2442 | 0.9769 | 0.8861 | 0.9414 | 0.9398 | 4.9583 |
| 0.0483 | 46.0 | 552 | 0.2457 | 0.9769 | 0.8861 | 0.9414 | 0.9398 | 4.9583 |
| 0.0483 | 47.0 | 564 | 0.2383 | 0.9769 | 0.8861 | 0.9414 | 0.9398 | 4.9583 |
| 0.0483 | 48.0 | 576 | 0.2481 | 0.9769 | 0.8861 | 0.9414 | 0.9398 | 4.9583 |
| 0.0483 | 49.0 | 588 | 0.2512 | 0.9769 | 0.8861 | 0.9414 | 0.9398 | 4.9583 |
| 0.0483 | 50.0 | 600 | 0.2510 | 0.9769 | 0.8861 | 0.9414 | 0.9398 | 4.9583 |
| 0.0483 | 51.0 | 612 | 0.2516 | 0.9769 | 0.8861 | 0.9414 | 0.9398 | 4.9583 |
| 0.0483 | 52.0 | 624 | 0.2491 | 0.9769 | 0.8861 | 0.9414 | 0.9398 | 4.9583 |
| 0.0483 | 53.0 | 636 | 0.2480 | 0.9769 | 0.8861 | 0.9414 | 0.9398 | 4.9583 |
| 0.0483 | 54.0 | 648 | 0.2493 | 0.9769 | 0.8861 | 0.9414 | 0.9398 | 4.9583 |
| 0.0483 | 55.0 | 660 | 0.2417 | 0.9769 | 0.8861 | 0.9414 | 0.9398 | 4.9583 |
| 0.0483 | 56.0 | 672 | 0.2320 | 0.9769 | 0.8861 | 0.9414 | 0.9398 | 4.9583 |
| 0.0483 | 57.0 | 684 | 0.2270 | 0.9769 | 0.8861 | 0.9414 | 0.9398 | 4.9583 |
| 0.0483 | 58.0 | 696 | 0.2351 | 0.9769 | 0.8861 | 0.9414 | 0.9398 | 4.9583 |
| 0.0483 | 59.0 | 708 | 0.2414 | 0.9769 | 0.8861 | 0.9414 | 0.9398 | 4.9583 |
| 0.0483 | 60.0 | 720 | 0.2490 | 0.9769 | 0.8861 | 0.9414 | 0.9398 | 4.9583 |
| 0.0483 | 61.0 | 732 | 0.2489 | 0.9769 | 0.8861 | 0.9414 | 0.9398 | 4.9583 |
| 0.0483 | 62.0 | 744 | 0.2496 | 0.9769 | 0.8861 | 0.9414 | 0.9398 | 4.9583 |
| 0.0483 | 63.0 | 756 | 0.2505 | 0.9769 | 0.8861 | 0.9414 | 0.9398 | 4.9583 |
| 0.0483 | 64.0 | 768 | 0.2515 | 0.9769 | 0.8861 | 0.9414 | 0.9398 | 4.9583 |
| 0.0483 | 65.0 | 780 | 0.2511 | 0.9769 | 0.8861 | 0.9414 | 0.9398 | 4.9583 |
| 0.0483 | 66.0 | 792 | 0.2521 | 0.9769 | 0.8861 | 0.9414 | 0.9398 | 4.9583 |
| 0.0483 | 67.0 | 804 | 0.2530 | 0.9769 | 0.8861 | 0.9414 | 0.9398 | 4.9583 |
| 0.0483 | 68.0 | 816 | 0.2536 | 0.9769 | 0.8861 | 0.9414 | 0.9398 | 4.9583 |
| 0.0483 | 69.0 | 828 | 0.2535 | 0.9769 | 0.8861 | 0.9414 | 0.9398 | 4.9583 |
| 0.0483 | 70.0 | 840 | 0.2575 | 0.9769 | 0.8861 | 0.9414 | 0.9398 | 4.9583 |
| 0.0483 | 71.0 | 852 | 0.2593 | 0.9769 | 0.8861 | 0.9414 | 0.9398 | 4.9583 |
| 0.0483 | 72.0 | 864 | 0.2588 | 0.9769 | 0.8861 | 0.9414 | 0.9398 | 4.9583 |
| 0.0483 | 73.0 | 876 | 0.2654 | 0.9769 | 0.8861 | 0.9414 | 0.9398 | 4.9583 |
| 0.0483 | 74.0 | 888 | 0.2622 | 0.9769 | 0.8861 | 0.9414 | 0.9398 | 4.9583 |
| 0.0483 | 75.0 | 900 | 0.2597 | 0.9769 | 0.8861 | 0.9414 | 0.9398 | 4.9583 |
| 0.0483 | 76.0 | 912 | 0.2586 | 0.9769 | 0.8861 | 0.9414 | 0.9398 | 4.9583 |
| 0.0483 | 77.0 | 924 | 0.2566 | 0.9769 | 0.8861 | 0.9414 | 0.9398 | 4.9583 |
| 0.0483 | 78.0 | 936 | 0.2554 | 0.9769 | 0.8861 | 0.9414 | 0.9398 | 4.9583 |
| 0.0483 | 79.0 | 948 | 0.2560 | 0.9769 | 0.8861 | 0.9414 | 0.9398 | 4.9583 |
| 0.0483 | 80.0 | 960 | 0.2582 | 0.9769 | 0.8861 | 0.9414 | 0.9398 | 4.9583 |
| 0.0483 | 81.0 | 972 | 0.2614 | 0.9769 | 0.8861 | 0.9414 | 0.9398 | 4.9583 |
| 0.0483 | 82.0 | 984 | 0.2652 | 0.9769 | 0.8861 | 0.9414 | 0.9398 | 4.9583 |
| 0.0483 | 83.0 | 996 | 0.2685 | 0.9769 | 0.8861 | 0.9414 | 0.9398 | 4.9583 |
| 0.0231 | 84.0 | 1008 | 0.2696 | 0.9769 | 0.8861 | 0.9414 | 0.9398 | 4.9583 |
| 0.0231 | 85.0 | 1020 | 0.2700 | 0.9769 | 0.8861 | 0.9414 | 0.9398 | 4.9583 |
| 0.0231 | 86.0 | 1032 | 0.2715 | 0.9769 | 0.8861 | 0.9414 | 0.9398 | 4.9583 |
| 0.0231 | 87.0 | 1044 | 0.2697 | 0.9769 | 0.8861 | 0.9414 | 0.9398 | 4.9583 |
| 0.0231 | 88.0 | 1056 | 0.2692 | 0.9769 | 0.8861 | 0.9414 | 0.9398 | 4.9583 |
| 0.0231 | 89.0 | 1068 | 0.2666 | 0.9769 | 0.8861 | 0.9414 | 0.9398 | 4.9583 |
| 0.0231 | 90.0 | 1080 | 0.2666 | 0.9769 | 0.8861 | 0.9414 | 0.9398 | 4.9583 |
| 0.0231 | 91.0 | 1092 | 0.2671 | 0.9769 | 0.8861 | 0.9414 | 0.9398 | 4.9583 |
| 0.0231 | 92.0 | 1104 | 0.2665 | 0.9769 | 0.8861 | 0.9414 | 0.9398 | 4.9583 |
| 0.0231 | 93.0 | 1116 | 0.2655 | 0.9769 | 0.8861 | 0.9414 | 0.9398 | 4.9583 |
| 0.0231 | 94.0 | 1128 | 0.2646 | 0.9769 | 0.8861 | 0.9414 | 0.9398 | 4.9583 |
| 0.0231 | 95.0 | 1140 | 0.2652 | 0.9769 | 0.8861 | 0.9414 | 0.9398 | 4.9583 |
| 0.0231 | 96.0 | 1152 | 0.2656 | 0.9769 | 0.8861 | 0.9414 | 0.9398 | 4.9583 |
| 0.0231 | 97.0 | 1164 | 0.2657 | 0.9769 | 0.8861 | 0.9414 | 0.9398 | 4.9583 |
| 0.0231 | 98.0 | 1176 | 0.2656 | 0.9769 | 0.8861 | 0.9414 | 0.9398 | 4.9583 |
| 0.0231 | 99.0 | 1188 | 0.2654 | 0.9769 | 0.8861 | 0.9414 | 0.9398 | 4.9583 |
| 0.0231 | 100.0 | 1200 | 0.2651 | 0.9769 | 0.8861 | 0.9414 | 0.9398 | 4.9583 |
### Framework versions
- Transformers 4.41.2
- Pytorch 2.3.0+cu121
- Datasets 2.20.0
- Tokenizers 0.19.1
|
itay-nakash/model_6d5c5a99e5_sweep_rare-bush-847 | itay-nakash | "2024-06-23T10:36:28Z" | 0 | 0 | null | [
"region:us"
] | null | "2024-06-23T10:36:28Z" | Entry not found |
itay-nakash/model_71dd0b85f5_sweep_ruby-microwave-850 | itay-nakash | "2024-06-23T10:36:31Z" | 0 | 0 | null | [
"region:us"
] | null | "2024-06-23T10:36:31Z" | Entry not found |
itay-nakash/model_9539ee4e06_sweep_glamorous-thunder-851 | itay-nakash | "2024-06-23T10:37:41Z" | 0 | 0 | null | [
"region:us"
] | null | "2024-06-23T10:37:40Z" | Entry not found |
itay-nakash/model_47b4c49ddb_sweep_lemon-eon-854 | itay-nakash | "2024-06-23T10:39:55Z" | 0 | 0 | null | [
"region:us"
] | null | "2024-06-23T10:39:55Z" | Entry not found |
Meghna0122/AI_waste_segregation | Meghna0122 | "2024-06-23T10:40:39Z" | 0 | 0 | null | [
"region:us"
] | null | "2024-06-23T10:40:39Z" | Entry not found |
damgomz/ft_32_14e6_base_x1 | damgomz | "2024-06-23T23:58:22Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:41:32Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 49553.42141032219 |
| Emissions (Co2eq in kg) | 0.0299855659153916 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.5850048634934761 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0516178391113878 |
| Consumed energy (kWh) | 0.6366227026048621 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
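With no GPU, these CodeCarbon figures reduce to power times duration; the short check below (not from the card) reproduces the energy rows of the table above.

```python
# Sanity check of the CodeCarbon table: energy (kWh) = power (W) * time (s) / 3.6e6.
duration_s = 49553.42141032219
cpu_kwh = 42.5 * duration_s / 3.6e6   # ~0.5850, matches "CPU energy"
ram_kwh = 3.75 * duration_s / 3.6e6   # ~0.0516, matches "RAM energy"
print(cpu_kwh, ram_kwh, cpu_kwh + ram_kwh)  # total ~0.6366 kWh, matches "Consumed energy"
```

Dividing the reported emissions by the consumed energy implies roughly 0.047 kg CO2eq/kWh, plausible for Switzerland's low-carbon grid.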
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.0953903362148702 |
| Emissions (Co2eq in kg) | 0.01940842338570952 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | albert-base-v2 |
| model_name | ft_32_14e6_base_x1 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1.4e-05 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.705695 | 0.349682 |
| 1 | 0.324650 | 0.245880 | 0.931878 |
| 2 | 0.186641 | 0.227299 | 0.899948 |
| 3 | 0.136289 | 0.220477 | 0.910546 |
| 4 | 0.083458 | 0.254802 | 0.912736 |
| 5 | 0.058306 | 0.268142 | 0.917724 |
| 6 | 0.045264 | 0.318445 | 0.899427 |
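The card ships a widget example but no loading code; a minimal sketch follows (not part of the original card). The label mapping is undocumented, so the raw pipeline output is printed as-is; the same pattern applies to the sibling `damgomz/ft_32_*` checkpoints later in this dump.

```python
# Minimal usage sketch (not part of the original card); label names are
# undocumented, so the raw pipeline output is shown unmapped.
from transformers import pipeline

clf = pipeline("text-classification", model="damgomz/ft_32_14e6_base_x1")
print(clf("GEPS Techno is the pioneer of hybridization of renewable energies at sea."))
```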
|
damgomz/ft_32_14e6_x2 | damgomz | "2024-06-23T23:44:53Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:41:36Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 48748.43414139747 |
| Emissions (Co2eq in kg) | 0.0294984522243246 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.5755014980089319 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0507793075859547 |
| Consumed energy (kWh) | 0.6262808055948847 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.09384073572219012 |
| Emissions (Co2eq in kg) | 0.019093136705380674 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | damgomz/fp_bs16_lr5_x2 |
| model_name | ft_32_14e6_x2 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1.4e-05 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.697253 | 0.168707 |
| 1 | 0.322929 | 0.210988 | 0.925817 |
| 2 | 0.166583 | 0.191676 | 0.934927 |
| 3 | 0.112184 | 0.226396 | 0.931735 |
| 4 | 0.062347 | 0.254119 | 0.918397 |
| 5 | 0.038214 | 0.337791 | 0.906920 |
| 6 | 0.026824 | 0.340202 | 0.923523 |
|
damgomz/ft_32_14e6_base_x4 | damgomz | "2024-06-24T00:07:51Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:41:47Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 50126.91908168793 |
| Emissions (Co2eq in kg) | 0.0303325962416147 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.591775274712012 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0522152189423639 |
| Consumed energy (kWh) | 0.6439904936543758 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.09649431923224926 |
| Emissions (Co2eq in kg) | 0.019633043306994436 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | albert-base-v2 |
| model_name | ft_32_14e6_base_x4 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1.4e-05 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.706296 | 0.060120 |
| 1 | 0.324685 | 0.232908 | 0.917821 |
| 2 | 0.196300 | 0.220065 | 0.922584 |
| 3 | 0.146310 | 0.245110 | 0.891744 |
| 4 | 0.096830 | 0.267668 | 0.923489 |
| 5 | 0.059735 | 0.353932 | 0.895680 |
| 6 | 0.038564 | 0.376377 | 0.906003 |
|
damgomz/ft_32_14e6_x1 | damgomz | "2024-06-24T00:00:00Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:41:47Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 49655.92636561394 |
| Emissions (Co2eq in kg) | 0.030047590251623 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.5862149367236428 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0517246028110386 |
| Consumed energy (kWh) | 0.637939539534682 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.09558765825380683 |
| Emissions (Co2eq in kg) | 0.019448571159865456 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | damgomz/ThunBERT_bs16_lr5_MLM |
| model_name | ft_32_14e6_x1 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1.4e-05 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.711359 | 0.526633 |
| 1 | 0.285098 | 0.202718 | 0.936462 |
| 2 | 0.159492 | 0.210159 | 0.949438 |
| 3 | 0.106755 | 0.212429 | 0.918103 |
| 4 | 0.060597 | 0.251258 | 0.917170 |
| 5 | 0.024925 | 0.310591 | 0.920161 |
| 6 | 0.017295 | 0.324555 | 0.911884 |
|
damgomz/ft_32_14e6_base_x2 | damgomz | "2024-06-23T23:44:13Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:41:48Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 48707.72151255608 |
| Emissions (Co2eq in kg) | 0.0294738165084835 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.5750208622387717 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0507369031692544 |
| Consumed energy (kWh) | 0.6257577654080286 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.09376236391167045 |
| Emissions (Co2eq in kg) | 0.019077190925751133 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | albert-base-v2 |
| model_name | ft_32_14e6_base_x2 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1.4e-05 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.708510 | 0.332460 |
| 1 | 0.308986 | 0.230178 | 0.909861 |
| 2 | 0.184904 | 0.211419 | 0.924938 |
| 3 | 0.128324 | 0.239823 | 0.924368 |
| 4 | 0.074300 | 0.282441 | 0.936272 |
| 5 | 0.044291 | 0.327128 | 0.909196 |
| 6 | 0.032622 | 0.391054 | 0.921756 |
|
damgomz/ft_32_14e6_x4 | damgomz | "2024-06-24T00:14:58Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:41:55Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 50553.10890841484 |
| Emissions (Co2eq in kg) | 0.0305904927130599 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.5968067046892317 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0526591816626487 |
| Consumed energy (kWh) | 0.649465886351881 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.09731473464869857 |
| Emissions (Co2eq in kg) | 0.01979996765579581 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | damgomz/fp_bs16_lr5_x4 |
| model_name | ft_32_14e6_x4 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1.4e-05 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.695373 | 0.498067 |
| 1 | 0.301758 | 0.220890 | 0.935375 |
| 2 | 0.170199 | 0.214461 | 0.934385 |
| 3 | 0.116739 | 0.219118 | 0.931017 |
| 4 | 0.069409 | 0.290862 | 0.911435 |
| 5 | 0.039477 | 0.319148 | 0.925073 |
| 6 | 0.026176 | 0.362954 | 0.922905 |
|
damgomz/ft_32_10e6_base_x1 | damgomz | "2024-06-24T01:20:37Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:42:14Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 54492.29978084564 |
| Emissions (Co2eq in kg) | 0.0329741509828358 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.6433108421163447 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0567624256027241 |
| Consumed energy (kWh) | 0.7000732677190684 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.10489767707812786 |
| Emissions (Co2eq in kg) | 0.021342817414164543 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | albert-base-v2 |
| model_name | ft_32_10e6_base_x1 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1e-05 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.734880 | 0.591007 |
| 1 | 0.339730 | 0.227890 | 0.925695 |
| 2 | 0.193068 | 0.229836 | 0.933185 |
| 3 | 0.142397 | 0.214513 | 0.917664 |
| 4 | 0.092663 | 0.246131 | 0.917786 |
| 5 | 0.060364 | 0.298660 | 0.887692 |
| 6 | 0.040224 | 0.363768 | 0.884661 |
|
damgomz/ft_32_13e6_base_x12 | damgomz | "2024-06-23T17:42:20Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:42:24Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | [More Information Needed] |
| Emissions (Co2eq in kg) | [More Information Needed] |
| CPU power (W) | [No CPU] |
| GPU power (W) | [No GPU] |
| RAM power (W) | [More Information Needed] |
| CPU energy (kWh) | [No CPU] |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | [More Information Needed] |
| Consumed energy (kWh) | [More Information Needed] |
| Country name | [More Information Needed] |
| Cloud provider | [No Cloud] |
| Cloud region | [No Cloud] |
| CPU count | [No CPU] |
| CPU model | [No CPU] |
| GPU count | [No GPU] |
| GPU model | [No GPU] |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | [No CPU] |
| Emissions (Co2eq in kg) | [More Information Needed] |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | albert-base-v2 |
| model_name | ft_32_13e6_base_x12 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1.3e-05 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.716555 | 0.000706 |
| 1 | 0.365165 | 0.277071 | 0.929461 |
| 2 | 0.234518 | 0.232122 | 0.903520 |
| 3 | 0.193173 | 0.228818 | 0.914826 |
| 4 | 0.163885 | 0.222521 | 0.921731 |
|
Kyouma45/fine_tune_paligemma | Kyouma45 | "2024-06-23T15:35:35Z" | 0 | 0 | null | [
"safetensors",
"license:mit",
"region:us"
] | null | "2024-06-23T10:42:26Z" | ---
license: mit
---
|
damgomz/ft_32_13e6_x12 | damgomz | "2024-06-24T01:13:26Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:42:36Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 54053.0493388176 |
| Emissions (Co2eq in kg) | 0.0327083536805448 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.6381252421420488 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0563048905630907 |
| Consumed energy (kWh) | 0.6944301327051408 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.10405211997722387 |
| Emissions (Co2eq in kg) | 0.021170777657703557 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | damgomz/fp_bs16_lr5_x12 |
| model_name | ft_32_13e6_x12 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1.3e-05 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.714470 | 0.508991 |
| 1 | 0.313275 | 0.218205 | 0.919343 |
| 2 | 0.180288 | 0.214982 | 0.938783 |
| 3 | 0.126071 | 0.235109 | 0.926142 |
| 4 | 0.078124 | 0.285606 | 0.910749 |
| 5 | 0.046535 | 0.336369 | 0.920548 |
| 6 | 0.022901 | 0.382385 | 0.902624 |
|
damgomz/ft_32_18e6_base_x4 | damgomz | "2024-06-24T03:21:59Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:42:45Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 61774.44494199753 |
| Emissions (Co2eq in kg) | 0.0373806765880717 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.7292802593949762 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0643478289273878 |
| Consumed energy (kWh) | 0.7936280883223655 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.11891580651334524 |
| Emissions (Co2eq in kg) | 0.0241949909356157 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | albert-base-v2 |
| model_name | ft_32_18e6_base_x4 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1.8e-05 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.724677 | 0.306478 |
| 1 | 0.325122 | 0.243701 | 0.896817 |
| 2 | 0.193125 | 0.240784 | 0.896417 |
| 3 | 0.139279 | 0.282203 | 0.912782 |
| 4 | 0.098899 | 0.261478 | 0.927773 |
| 5 | 0.063704 | 0.340761 | 0.911821 |
| 6 | 0.037984 | 0.413277 | 0.918257 |
|
damgomz/ft_32_18e6_x4 | damgomz | "2024-06-24T03:20:25Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:42:55Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 61679.77289366722 |
| Emissions (Co2eq in kg) | 0.0373233783619048 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.7281624145484634 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0642491767533124 |
| Consumed energy (kWh) | 0.7924115913017743 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.11873356282030939 |
| Emissions (Co2eq in kg) | 0.02415791105001966 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | damgomz/fp_bs16_lr5_x4 |
| model_name | ft_32_18e6_x4 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1.8e-05 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.729157 | 0.659373 |
| 1 | 0.298039 | 0.216890 | 0.927334 |
| 2 | 0.170153 | 0.203970 | 0.936771 |
| 3 | 0.112154 | 0.244054 | 0.901885 |
| 4 | 0.070577 | 0.272153 | 0.930896 |
| 5 | 0.041795 | 0.281028 | 0.925729 |
| 6 | 0.027958 | 0.358824 | 0.933668 |
|
smarthehe/qwen_unsafe_mix3 | smarthehe | "2024-06-23T10:43:39Z" | 0 | 0 | null | [
"region:us"
] | null | "2024-06-23T10:43:39Z" | Entry not found |
damgomz/ft_32_10e6_x8 | damgomz | "2024-06-24T03:19:43Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:44:55Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 61638.38862705231 |
| Emissions (Co2eq in kg) | 0.0372983508123459 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.7276740724540425 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0642061596502861 |
| Consumed energy (kWh) | 0.7918802321043279 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.11865389810707568 |
| Emissions (Co2eq in kg) | 0.02414170221226215 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | damgomz/fp_bs16_lr5_x8 |
| model_name | ft_32_10e6_x8 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1e-05 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.702677 | 0.204757 |
| 1 | 0.321131 | 0.222224 | 0.910643 |
| 2 | 0.187377 | 0.213300 | 0.932097 |
| 3 | 0.140453 | 0.235801 | 0.907473 |
| 4 | 0.095066 | 0.254595 | 0.921588 |
| 5 | 0.062238 | 0.295134 | 0.923693 |
| 6 | 0.033753 | 0.372447 | 0.915545 |
|
tchen175/your-repo-name | tchen175 | "2024-06-23T10:45:13Z" | 0 | 0 | null | [
"region:us"
] | null | "2024-06-23T10:45:13Z" | Entry not found |
damgomz/ft_32_12e6_base_x8 | damgomz | "2024-06-24T03:00:51Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:45:31Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 60506.266407728195 |
| Emissions (Co2eq in kg) | 0.0366132875121688 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.7143088167700514 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0630268557670214 |
| Consumed energy (kWh) | 0.7773356725370759 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.11647456283487678 |
| Emissions (Co2eq in kg) | 0.02369828767636021 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | albert-base-v2 |
| model_name | ft_32_12e6_base_x8 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1.2e-05 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.716820 | 0.341096 |
| 1 | 0.341059 | 0.242102 | 0.902834 |
| 2 | 0.213414 | 0.232133 | 0.921278 |
| 3 | 0.165743 | 0.246041 | 0.915003 |
| 4 | 0.128356 | 0.239500 | 0.920798 |
| 5 | 0.088278 | 0.307523 | 0.896415 |
| 6 | 0.061984 | 0.337869 | 0.917907 |
|
WhiteGiverPlus/open-web-math-filtered | WhiteGiverPlus | "2024-06-23T10:46:00Z" | 0 | 0 | null | [
"region:us"
] | null | "2024-06-23T10:46:00Z" | Entry not found |
damgomz/ft_32_10e6_base_x8 | damgomz | "2024-06-24T04:06:25Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:46:21Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 64439.61116409302 |
| Emissions (Co2eq in kg) | 0.038993406428673 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.7607439085890851 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.067124001405885 |
| Consumed energy (kWh) | 0.8278679099949702 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.12404625149087906 |
| Emissions (Co2eq in kg) | 0.025238847705936433 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | albert-base-v2 |
| model_name | ft_32_10e6_base_x8 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1e-05 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.722382 | 0.623123 |
| 1 | 0.342272 | 0.250376 | 0.902948 |
| 2 | 0.212349 | 0.227735 | 0.926831 |
| 3 | 0.174827 | 0.249475 | 0.931909 |
| 4 | 0.133557 | 0.247448 | 0.923699 |
| 5 | 0.092758 | 0.272226 | 0.916025 |
| 6 | 0.067174 | 0.297975 | 0.923680 |
|
damgomz/ft_32_10e6_x12 | damgomz | "2024-06-24T04:11:10Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:46:29Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 64725.36603331566 |
| Emissions (Co2eq in kg) | 0.0391663320908972 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.7641175464818881 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0674217433611552 |
| Consumed energy (kWh) | 0.8315392898430448 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.12459632961413264 |
| Emissions (Co2eq in kg) | 0.025350768363048632 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | damgomz/fp_bs16_lr5_x12 |
| model_name | ft_32_10e6_x12 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1e-05 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.718682 | 0.255915 |
| 1 | 0.325344 | 0.234154 | 0.922187 |
| 2 | 0.188915 | 0.211627 | 0.934530 |
| 3 | 0.138495 | 0.225317 | 0.921540 |
| 4 | 0.093250 | 0.259228 | 0.932649 |
| 5 | 0.053548 | 0.294073 | 0.920488 |
| 6 | 0.031332 | 0.352463 | 0.908372 |
|
damgomz/ft_32_10e6_base_x12 | damgomz | "2024-06-24T04:14:45Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:46:29Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 64939.65093708038 |
| Emissions (Co2eq in kg) | 0.0392959976393814 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.7666472850430347 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0676449303957324 |
| Consumed energy (kWh) | 0.8342922154387682 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.12500882805387975 |
| Emissions (Co2eq in kg) | 0.02543469661702315 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | albert-base-v2 |
| model_name | ft_32_10e6_base_x12 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1e-05 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.723034 | 0.340371 |
| 1 | 0.375047 | 0.287499 | 0.926210 |
| 2 | 0.246502 | 0.245994 | 0.915720 |
| 3 | 0.195115 | 0.223204 | 0.924802 |
| 4 | 0.167885 | 0.228968 | 0.921891 |
| 5 | 0.144182 | 0.236791 | 0.897302 |
| 6 | 0.115180 | 0.251106 | 0.922498 |
|
damgomz/ft_32_11e6_base_x2 | damgomz | "2024-06-24T02:24:45Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:46:53Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 58340.561074733734 |
| Emissions (Co2eq in kg) | 0.0353027897885612 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.6887415678969708 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0607709645027915 |
| Consumed energy (kWh) | 0.7495125323997637 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.11230558006886243 |
| Emissions (Co2eq in kg) | 0.022850053087604044 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | albert-base-v2 |
| model_name | ft_32_11e6_base_x2 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1.1e-05 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.688052 | 0.405682 |
| 1 | 0.323996 | 0.236435 | 0.917469 |
| 2 | 0.186470 | 0.217464 | 0.907873 |
| 3 | 0.134185 | 0.244582 | 0.908838 |
| 4 | 0.084835 | 0.269350 | 0.931134 |
| 5 | 0.046221 | 0.325139 | 0.906680 |
| 6 | 0.029917 | 0.346234 | 0.920390 |
|
damgomz/ft_32_11e6_x1 | damgomz | "2024-06-24T02:26:44Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:46:56Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 58452.322566747665 |
| Emissions (Co2eq in kg) | 0.0353704143805683 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.6900609134708846 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0608873547308146 |
| Consumed energy (kWh) | 0.7509482682017012 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.11252072094098925 |
| Emissions (Co2eq in kg) | 0.022893826338642835 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | damgomz/ThunBERT_bs16_lr5_MLM |
| model_name | ft_32_11e6_x1 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1.1e-05 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.725747 | 0.313528 |
| 1 | 0.291823 | 0.197327 | 0.946298 |
| 2 | 0.161598 | 0.190254 | 0.946652 |
| 3 | 0.106739 | 0.219191 | 0.926633 |
| 4 | 0.058725 | 0.236603 | 0.934019 |
| 5 | 0.027315 | 0.284712 | 0.935789 |
| 6 | 0.011687 | 0.333927 | 0.926002 |
|
oz1115/t5-large_PREFIX_TUNING_SEQ2SEQ | oz1115 | "2024-06-23T10:47:11Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"arxiv:1910.09700",
"endpoints_compatible",
"region:us"
] | null | "2024-06-23T10:47:06Z" | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
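The card leaves this section empty; the repo name suggests a PEFT prefix-tuning adapter for `t5-large`, so a heavily hedged loading sketch is given below. If the weights were instead pushed merged, plain `AutoModelForSeq2SeqLM` would apply.

```python
# Hedged sketch only: assumes the repo holds a PEFT prefix-tuning adapter for
# t5-large, as the name suggests; the card itself gives no information.
from peft import AutoPeftModelForSeq2SeqLM
from transformers import AutoTokenizer

model = AutoPeftModelForSeq2SeqLM.from_pretrained("oz1115/t5-large_PREFIX_TUNING_SEQ2SEQ")
tokenizer = AutoTokenizer.from_pretrained("t5-large")

inputs = tokenizer("translate English to German: Hello, world!", return_tensors="pt")
print(tokenizer.decode(model.generate(**inputs)[0], skip_special_tokens=True))
```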
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
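One concrete way to obtain such estimates at training time (an assumption; this card specifies no tooling) is the codecarbon library, which produced the "CODE CARBON DEFAULT" tables elsewhere in this dump.

```python
from codecarbon import EmissionsTracker

tracker = EmissionsTracker()
tracker.start()
# ... training loop goes here ...
emissions_kg = tracker.stop()  # estimated CO2eq in kg
print(f"Estimated emissions: {emissions_kg:.4f} kg CO2eq")
```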
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
ameen2/testfinal | ameen2 | "2024-06-23T10:47:22Z" | 0 | 0 | null | [
"region:us"
] | null | "2024-06-23T10:47:22Z" | Entry not found |
sakizuuuuu/yunahillitsaki | sakizuuuuu | "2024-06-23T10:48:30Z" | 0 | 0 | null | [
"license:openrail",
"region:us"
] | null | "2024-06-23T10:47:27Z" | ---
license: openrail
---
|
SwimChoi/villama2-7b-chat-Denmark-lora | SwimChoi | "2024-06-23T10:47:46Z" | 0 | 0 | peft | [
"peft",
"safetensors",
"arxiv:1910.09700",
"base_model:meta-llama/Llama-2-7b-chat-hf",
"region:us"
] | null | "2024-06-23T10:47:42Z" | ---
library_name: peft
base_model: meta-llama/Llama-2-7b-chat-hf
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
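As a placeholder, a hedged loading sketch: the base model comes from the card metadata (`meta-llama/Llama-2-7b-chat-hf`, a gated repository), while everything else is an assumption.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "meta-llama/Llama-2-7b-chat-hf"  # declared base model (gated repo)
base = AutoModelForCausalLM.from_pretrained(base_id)
tokenizer = AutoTokenizer.from_pretrained(base_id)
model = PeftModel.from_pretrained(base, "SwimChoi/villama2-7b-chat-Denmark-lora")
```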
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
### Framework versions
- PEFT 0.10.1.dev0 |
damgomz/ft_32_11e6_x2 | damgomz | "2024-06-24T03:14:02Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:47:44Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale the device up to WAVEGEM, the 150-kW capacity
platform.
---
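The widget example above can be reproduced locally; a minimal sketch, assuming the standard text-classification pipeline declared in the metadata:

```python
from transformers import pipeline

clf = pipeline("text-classification", model="damgomz/ft_32_11e6_x2")
print(clf("GEPS Techno is the pioneer of hybridization of renewable energies at sea."))
```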
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 61296.654522418976 |
| Emissions (CO2eq in kg) | 0.0370915599331974 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.7236396963198996 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0638501644728085 |
| Consumed energy (kWh) | 0.7874898607927094 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.11799605995565653 |
| Emissions (CO2eq in kg) | 0.024007856354614096 |
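As a sanity check, the CODE CARBON DEFAULT figures above are simply constant power draws integrated over the run duration:

```python
duration_s = 61296.654522418976      # Duration (in seconds)
cpu_kwh = 42.5 * duration_s / 3.6e6  # CPU power (W) * seconds -> kWh
ram_kwh = 3.75 * duration_s / 3.6e6  # RAM power (W) * seconds -> kWh
print(cpu_kwh, ram_kwh, cpu_kwh + ram_kwh)
# ~0.7236, ~0.0639, ~0.7875 -- matching the table values
```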
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | damgomz/fp_bs16_lr5_x2 |
| model_name | ft_32_11e6_x2 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1.1e-05 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.701859 | 0.675479 |
| 1 | 0.326096 | 0.208626 | 0.909665 |
| 2 | 0.169499 | 0.200487 | 0.937234 |
| 3 | 0.115596 | 0.226355 | 0.927341 |
| 4 | 0.069538 | 0.255445 | 0.920743 |
| 5 | 0.037195 | 0.317331 | 0.924852 |
| 6 | 0.025854 | 0.344384 | 0.926451 |
|
damgomz/ft_32_11e6_base_x4 | damgomz | "2024-06-24T03:26:28Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:48:04Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 62042.28346157074 |
| Emissions (CO2eq in kg) | 0.0375427490804359 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.7324422083359654 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0646268364918729 |
| Consumed energy (kWh) | 0.7970690448278377 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.11943139566352368 |
| Emissions (CO2eq in kg) | 0.02429989435578187 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | albert-base-v2 |
| model_name | ft_32_11e6_base_x4 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1.1e-05 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.699534 | 0.263259 |
| 1 | 0.327131 | 0.254763 | 0.894355 |
| 2 | 0.193591 | 0.231249 | 0.916772 |
| 3 | 0.146482 | 0.230188 | 0.927157 |
| 4 | 0.100900 | 0.251480 | 0.918744 |
| 5 | 0.059710 | 0.303986 | 0.918022 |
| 6 | 0.040972 | 0.381227 | 0.896910 |
|
damgomz/ft_32_18e6_base_x2 | damgomz | "2024-06-24T03:02:48Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:48:10Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 60623.04388284683 |
| Emissions (CO2eq in kg) | 0.0366839478122947 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.7156873837255774 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0631484756467243 |
| Consumed energy (kWh) | 0.778835859372302 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.11669935947448015 |
| Emissions (CO2eq in kg) | 0.023744025520781673 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | albert-base-v2 |
| model_name | ft_32_18e6_base_x2 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1.8e-05 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.686975 | 0.457443 |
| 1 | 0.299301 | 0.233460 | 0.917952 |
| 2 | 0.178452 | 0.234576 | 0.927325 |
| 3 | 0.122078 | 0.265160 | 0.899049 |
| 4 | 0.075007 | 0.359669 | 0.891128 |
| 5 | 0.046868 | 0.363505 | 0.916224 |
| 6 | 0.037300 | 0.425532 | 0.877115 |
|
damgomz/ft_32_18e6_x2 | damgomz | "2024-06-24T03:07:07Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:48:20Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 60882.29117107391 |
| Emissions (CO2eq in kg) | 0.0368408189030136 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.7187478788114237 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0634185062550008 |
| Consumed energy (kWh) | 0.7821663850664239 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.11719841050431727 |
| Emissions (CO2eq in kg) | 0.023845564042003945 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | damgomz/fp_bs16_lr5_x2 |
| model_name | ft_32_18e6_x2 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1.8e-05 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.709016 | 0.333987 |
| 1 | 0.296607 | 0.210878 | 0.914499 |
| 2 | 0.159499 | 0.213675 | 0.948872 |
| 3 | 0.103251 | 0.231337 | 0.924179 |
| 4 | 0.057495 | 0.268172 | 0.924240 |
| 5 | 0.031612 | 0.340274 | 0.911495 |
| 6 | 0.025240 | 0.308503 | 0.929302 |
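One reading of the log above: test loss bottoms out at epoch 1 while train loss keeps falling, i.e. the model overfits afterwards, so one would typically keep the epoch-1 checkpoint.

```python
test_loss = [0.709016, 0.210878, 0.213675, 0.231337, 0.268172, 0.340274, 0.308503]
best_epoch = min(range(len(test_loss)), key=test_loss.__getitem__)
print(best_epoch, test_loss[best_epoch])  # -> 1 0.210878
```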
|
SwimChoi/villama2-7b-chat-Bulgaria-lora | SwimChoi | "2024-06-23T10:49:07Z" | 0 | 0 | peft | [
"peft",
"safetensors",
"arxiv:1910.09700",
"base_model:meta-llama/Llama-2-7b-chat-hf",
"region:us"
] | null | "2024-06-23T10:49:03Z" | ---
library_name: peft
base_model: meta-llama/Llama-2-7b-chat-hf
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
### Framework versions
- PEFT 0.10.1.dev0 |
damgomz/ft_32_12e6_base_x1 | damgomz | "2024-06-24T04:38:32Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:49:23Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 66367.16772437096 |
| Emissions (CO2eq in kg) | 0.040159805071551 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.783499805670645 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0691318809506792 |
| Consumed energy (kWh) | 0.8526316866213264 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.12775679786941407 |
| Emissions (CO2eq in kg) | 0.025993807358711953 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | albert-base-v2 |
| model_name | ft_32_12e6_base_x1 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1.2e-05 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.727164 | 0.269014 |
| 1 | 0.342232 | 0.228468 | 0.919074 |
| 2 | 0.194223 | 0.210547 | 0.914134 |
| 3 | 0.132516 | 0.225793 | 0.919228 |
| 4 | 0.094546 | 0.247126 | 0.922731 |
| 5 | 0.063055 | 0.296505 | 0.898574 |
| 6 | 0.040402 | 0.310421 | 0.918942 |
|
damgomz/ft_32_13e6_x1 | damgomz | "2024-06-24T03:22:46Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:49:31Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 61821.70661497116 |
| Emissions (CO2eq in kg) | 0.0374092776802647 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.7298382610713453 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0643970562311509 |
| Consumed energy (kWh) | 0.7942353173024941 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.11900678523381948 |
| Emissions (CO2eq in kg) | 0.024213501757530374 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | damgomz/ThunBERT_bs16_lr5_MLM |
| model_name | ft_32_13e6_x1 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1.3e-05 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.716095 | 0.454027 |
| 1 | 0.299995 | 0.204867 | 0.949707 |
| 2 | 0.162957 | 0.202141 | 0.925927 |
| 3 | 0.109887 | 0.226468 | 0.943325 |
| 4 | 0.059780 | 0.236016 | 0.934651 |
| 5 | 0.028645 | 0.288232 | 0.917235 |
| 6 | 0.016775 | 0.319037 | 0.915728 |
|
damgomz/ft_32_13e6_base_x1 | damgomz | "2024-06-24T03:25:20Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:49:33Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 61969.37371945381 |
| Emissions (CO2eq in kg) | 0.0374986335659504 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.7315815464261486 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0645508831602831 |
| Consumed energy (kWh) | 0.7961324295864309 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.1192910444099486 |
| Emissions (CO2eq in kg) | 0.02427133804011941 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | albert-base-v2 |
| model_name | ft_32_13e6_base_x1 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1.3e-05 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.747683 | 0.372014 |
| 1 | 0.322004 | 0.252306 | 0.935384 |
| 2 | 0.190558 | 0.236110 | 0.930112 |
| 3 | 0.137793 | 0.221180 | 0.929360 |
| 4 | 0.095192 | 0.237832 | 0.930265 |
| 5 | 0.064447 | 0.275405 | 0.930897 |
| 6 | 0.042057 | 0.294993 | 0.913735 |
|
damgomz/ft_32_11e6_x4 | damgomz | "2024-06-24T04:56:55Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:49:43Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 67470.11894965172 |
| Emissions (CO2eq in kg) | 0.0408272077312933 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.796520581936009 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0702807116945584 |
| Consumed energy (kWh) | 0.8668012936305676 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.12987997897807954 |
| Emissions (CO2eq in kg) | 0.026425796588613586 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | damgomz/fp_bs16_lr5_x4 |
| model_name | ft_32_11e6_x4 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1.1e-05 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.698941 | 0.833385 |
| 1 | 0.319600 | 0.228110 | 0.913771 |
| 2 | 0.181743 | 0.221087 | 0.913368 |
| 3 | 0.127414 | 0.240794 | 0.935666 |
| 4 | 0.076371 | 0.276369 | 0.927906 |
| 5 | 0.043185 | 0.350709 | 0.905340 |
| 6 | 0.023650 | 0.382893 | 0.918261 |
|
damgomz/ft_32_13e6_base_x2 | damgomz | "2024-06-24T03:32:57Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:49:48Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 62432.55982017517 |
| Emissions (CO2eq in kg) | 0.037778911534293 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.737049633434747 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0650333697773516 |
| Consumed energy (kWh) | 0.8020830032120988 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.1201826776538372 |
| Emissions (CO2eq in kg) | 0.02445275259623527 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | albert-base-v2 |
| model_name | ft_32_13e6_base_x2 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1.3e-05 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.768791 | 0.666957 |
| 1 | 0.314481 | 0.239888 | 0.929780 |
| 2 | 0.185034 | 0.218459 | 0.924059 |
| 3 | 0.125580 | 0.236036 | 0.934296 |
| 4 | 0.077402 | 0.290235 | 0.926335 |
| 5 | 0.047750 | 0.322272 | 0.919217 |
| 6 | 0.030746 | 0.332044 | 0.906778 |
|
damgomz/ft_32_10e6_x1 | damgomz | "2024-06-24T03:49:17Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:50:05Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 63412.08262372017 |
| Emissions (CO2eq in kg) | 0.0383716406076001 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.7486134944193908 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0660537229984999 |
| Consumed energy (kWh) | 0.8146672174178916 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.1220682590506613 |
| Emissions (CO2eq in kg) | 0.024836399027623732 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | damgomz/ThunBERT_bs16_lr5_MLM |
| model_name | ft_32_10e6_x1 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1e-05 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.717120 | 0.498894 |
| 1 | 0.313023 | 0.201526 | 0.930088 |
| 2 | 0.165281 | 0.193125 | 0.918497 |
| 3 | 0.117815 | 0.206144 | 0.928350 |
| 4 | 0.068427 | 0.234662 | 0.940792 |
| 5 | 0.033853 | 0.295565 | 0.907556 |
| 6 | 0.013773 | 0.316535 | 0.931560 |
|
damgomz/ft_32_12e6_x8 | damgomz | "2024-06-24T03:52:12Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:50:12Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 63587.756796360016 |
| Emissions (CO2eq in kg) | 0.0384779483901636 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.7506875043221624 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0662367306232453 |
| Consumed energy (kWh) | 0.8169242349454086 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.12240643183299302 |
| Emissions (CO2eq in kg) | 0.024905204745241005 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | damgomz/fp_bs16_lr5_x8 |
| model_name | ft_32_12e6_x8 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1.2e-05 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.696962 | 0.461799 |
| 1 | 0.315536 | 0.218524 | 0.918846 |
| 2 | 0.184044 | 0.212446 | 0.912889 |
| 3 | 0.134632 | 0.229983 | 0.912806 |
| 4 | 0.087121 | 0.274184 | 0.915734 |
| 5 | 0.053320 | 0.307776 | 0.923862 |
| 6 | 0.031863 | 0.373288 | 0.921723 |
|
damgomz/ft_32_11e6_x8 | damgomz | "2024-06-24T05:12:31Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:50:16Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 68405.32388114929 |
| Emissions (CO2eq in kg) | 0.0413931148120575 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.8075611485959747 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0712549022746583 |
| Consumed energy (kWh) | 0.8788160508706312 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.1316802484712124 |
| Emissions (CO2eq in kg) | 0.026792085186783474 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | damgomz/fp_bs16_lr5_x8 |
| model_name | ft_32_11e6_x8 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1.1e-05 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.698797 | 0.535476 |
| 1 | 0.311891 | 0.216233 | 0.923965 |
| 2 | 0.181901 | 0.216357 | 0.933944 |
| 3 | 0.132345 | 0.218361 | 0.923078 |
| 4 | 0.088638 | 0.283297 | 0.919674 |
| 5 | 0.055888 | 0.293291 | 0.924256 |
| 6 | 0.035945 | 0.366886 | 0.914551 |
|
damgomz/ft_32_10e6_base_x2 | damgomz | "2024-06-24T03:55:09Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:50:17Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 63764.16578125954 |
| Emissions (CO2eq in kg) | 0.0385846889448661 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.7527699625846387 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.066420478061338 |
| Consumed energy (kWh) | 0.819190440645976 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.1227460191289246 |
| Emissions (CO2eq in kg) | 0.02497429826432665 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | albert-base-v2 |
| model_name | ft_32_10e6_base_x2 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1e-05 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.705993 | 0.463870 |
| 1 | 0.325754 | 0.227605 | 0.915561 |
| 2 | 0.189222 | 0.230228 | 0.927057 |
| 3 | 0.132360 | 0.241706 | 0.904789 |
| 4 | 0.082430 | 0.286002 | 0.917147 |
| 5 | 0.046848 | 0.313149 | 0.916334 |
| 6 | 0.026017 | 0.372005 | 0.909557 |
|
damgomz/ft_32_11e6_base_x8 | damgomz | "2024-06-24T05:18:32Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:50:20Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 68766.21276664734 |
| Emissions (CO2eq in kg) | 0.0416115036104767 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.8118218287082183 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0716308285703264 |
| Consumed energy (kWh) | 0.8834526572785455 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.13237495957579612 |
| Emissions (CO2eq in kg) | 0.026933433333603537 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | albert-base-v2 |
| model_name | ft_32_11e6_base_x8 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1.1e-05 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.710617 | 0.509886 |
| 1 | 0.338455 | 0.250426 | 0.890439 |
| 2 | 0.214005 | 0.222999 | 0.925721 |
| 3 | 0.165656 | 0.228164 | 0.936637 |
| 4 | 0.128775 | 0.273561 | 0.884398 |
| 5 | 0.093063 | 0.281123 | 0.909365 |
| 6 | 0.063092 | 0.332259 | 0.906240 |
|
damgomz/ft_32_3e6_x4 | damgomz | "2024-06-24T03:51:54Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:50:23Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 63568.50374484062 |
| Emissions (CO2eq in kg) | 0.038466277002403 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.7504598892933815 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0662165507482984 |
| Consumed energy (kWh) | 0.8166764400416776 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.12236936970881819 |
| Emissions (CO2eq in kg) | 0.02489766396672924 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | damgomz/fp_bs16_lr5_x4 |
| model_name | ft_32_3e6_x4 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 3e-06 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.703375 | 0.154776 |
| 1 | 0.420278 | 0.274999 | 0.894289 |
| 2 | 0.235648 | 0.230832 | 0.927197 |
| 3 | 0.184423 | 0.218322 | 0.914166 |
| 4 | 0.154378 | 0.225190 | 0.903943 |
| 5 | 0.127300 | 0.224724 | 0.908620 |
| 6 | 0.104520 | 0.228851 | 0.922266 |
|
SwimChoi/villama2-7b-chat-Belgium-lora | SwimChoi | "2024-06-23T10:50:30Z" | 0 | 0 | peft | [
"peft",
"safetensors",
"arxiv:1910.09700",
"base_model:meta-llama/Llama-2-7b-chat-hf",
"region:us"
] | null | "2024-06-23T10:50:26Z" | ---
library_name: peft
base_model: meta-llama/Llama-2-7b-chat-hf
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
### Framework versions
- PEFT 0.10.1.dev0 |
damgomz/ft_32_9e6_base_x1 | damgomz | "2024-06-24T04:07:46Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:50:27Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 64521.65361833573 |
| Emissions (CO2eq in kg) | 0.0390430510238909 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.7617124852599358 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0672094277332226 |
| Consumed energy (kWh) | 0.8289219129931618 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.12420418321529628 |
| Emissions (CO2eq in kg) | 0.025270981000514826 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | albert-base-v2 |
| model_name | ft_32_9e6_base_x1 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 9e-06 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.708413 | 0.765260 |
| 1 | 0.351244 | 0.243379 | 0.913919 |
| 2 | 0.197510 | 0.226436 | 0.914135 |
| 3 | 0.140162 | 0.216890 | 0.936142 |
| 4 | 0.096865 | 0.262625 | 0.912753 |
| 5 | 0.063076 | 0.296102 | 0.917618 |
| 6 | 0.041265 | 0.310545 | 0.909462 |
|
damgomz/ft_32_11e6_base_x12 | damgomz | "2024-06-24T05:26:02Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:50:30Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 69216.67906451225 |
| Emissions (Co2eq in kg) | 0.0418840844673213 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.8171397599284844 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0721000538190207 |
| Consumed energy (kWh) | 0.8892398137475067 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.13324210719918608 |
| Emissions (Co2eq in kg) | 0.027109865966933967 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | albert-base-v2 |
| model_name | ft_32_11e6_base_x12 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1.1e-05 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
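Every card in this series carries `pipeline_tag: text-classification`, so if the fine-tuned weights are actually published under the listed id (the card itself does not confirm this), loading them should be a one-liner:
```python
# Assumes the repo damgomz/ft_32_11e6_base_x12 hosts the fine-tuned weights.
from transformers import pipeline

clf = pipeline("text-classification", model="damgomz/ft_32_11e6_base_x12")
print(clf("We design off-grid systems that generate power at sea."))
```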
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.705994 | 0.202603 |
| 1 | 0.382078 | 0.266890 | 0.892989 |
| 2 | 0.237866 | 0.231193 | 0.893412 |
| 3 | 0.190063 | 0.220246 | 0.918815 |
| 4 | 0.162988 | 0.232781 | 0.894941 |
| 5 | 0.133773 | 0.235240 | 0.927856 |
| 6 | 0.097648 | 0.299021 | 0.898977 |
|
damgomz/ft_32_12e6_x12 | damgomz | "2024-06-24T04:08:44Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:50:30Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 64579.179790735245 |
| Emissions (Co2eq in kg) | 0.0390778682930347 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.7623917148543731 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0672694025961063 |
| Consumed energy (kWh) | 0.8296611174504743 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.12431492109716535 |
| Emissions (Co2eq in kg) | 0.025293512084704636 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | damgomz/fp_bs16_lr5_x12 |
| model_name | ft_32_12e6_x12 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1.2e-05 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.693914 | 0.744271 |
| 1 | 0.310272 | 0.217811 | 0.926124 |
| 2 | 0.181070 | 0.214467 | 0.935254 |
| 3 | 0.131239 | 0.237009 | 0.924359 |
| 4 | 0.083097 | 0.273741 | 0.923407 |
| 5 | 0.051251 | 0.313456 | 0.924904 |
| 6 | 0.030361 | 0.367706 | 0.905008 |
|
damgomz/ft_32_11e6_x12 | damgomz | "2024-06-24T05:35:49Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:50:31Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 69803.57610917091 |
| Emissions (Co2eq in kg) | 0.0422392223178722 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.8240683347692086 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0727114006477098 |
| Consumed energy (kWh) | 0.8967797354169186 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.134371884010154 |
| Emissions (Co2eq in kg) | 0.027339733976091938 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | damgomz/fp_bs16_lr5_x12 |
| model_name | ft_32_11e6_x12 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1.1e-05 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.704184 | 0.263856 |
| 1 | 0.321729 | 0.222099 | 0.914283 |
| 2 | 0.186099 | 0.221556 | 0.911754 |
| 3 | 0.136086 | 0.232018 | 0.920171 |
| 4 | 0.090092 | 0.276801 | 0.910476 |
| 5 | 0.053885 | 0.330174 | 0.925608 |
| 6 | 0.036190 | 0.351063 | 0.907287 |
|
damgomz/ft_32_12e6_base_x12 | damgomz | "2024-06-24T04:05:53Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:50:35Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 64407.97327756882 |
| Emissions (Co2eq in kg) | 0.0389742685598844 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.7603705253824606 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0670910690610606 |
| Consumed energy (kWh) | 0.8274615944435236 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.12398534855931997 |
| Emissions (Co2eq in kg) | 0.025226456200381123 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | albert-base-v2 |
| model_name | ft_32_12e6_base_x12 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1.2e-05 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.711305 | 0.348983 |
| 1 | 0.371980 | 0.281835 | 0.911239 |
| 2 | 0.233438 | 0.235064 | 0.887677 |
| 3 | 0.190237 | 0.228277 | 0.928511 |
| 4 | 0.157534 | 0.223249 | 0.918371 |
| 5 | 0.126999 | 0.271003 | 0.915995 |
| 6 | 0.099254 | 0.316929 | 0.933282 |
|
damgomz/ft_32_3e6_base_x8 | damgomz | "2024-06-24T04:08:43Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:50:37Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 64577.61004495621 |
| Emissions (Co2eq in kg) | 0.0390769031568386 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.7623729121769484 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0672677144942184 |
| Consumed energy (kWh) | 0.8296406266711681 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.12431189933654069 |
| Emissions (Co2eq in kg) | 0.025292897267607844 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | albert-base-v2 |
| model_name | ft_32_3e6_base_x8 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 3e-06 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.732783 | 0.331379 |
| 1 | 0.400023 | 0.305358 | 0.884755 |
| 2 | 0.267000 | 0.251737 | 0.897428 |
| 3 | 0.220699 | 0.240164 | 0.918675 |
| 4 | 0.186956 | 0.222951 | 0.916962 |
| 5 | 0.163473 | 0.257377 | 0.918532 |
| 6 | 0.144596 | 0.238047 | 0.902378 |
|
damgomz/ft_32_8e6_x4 | damgomz | "2024-06-24T04:15:36Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:50:39Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 64991.684680223465 |
| Emissions (Co2eq in kg) | 0.0393274767441526 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.7672614934131512 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0676990540094674 |
| Consumed energy (kWh) | 0.8349605474226162 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.12510899300943018 |
| Emissions (Co2eq in kg) | 0.025455076499754193 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | damgomz/fp_bs16_lr5_x4 |
| model_name | ft_32_8e6_x4 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 8e-06 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.723429 | 0.500086 |
| 1 | 0.338580 | 0.231171 | 0.931261 |
| 2 | 0.184393 | 0.220689 | 0.914968 |
| 3 | 0.138274 | 0.213500 | 0.929883 |
| 4 | 0.097061 | 0.250910 | 0.921640 |
| 5 | 0.061709 | 0.287389 | 0.936064 |
| 6 | 0.038919 | 0.330413 | 0.922205 |
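A pattern that repeats across these runs: train loss falls monotonically while test loss bottoms out around epochs 2 or 3, so the later epochs overfit. A tiny sketch of checkpoint selection by test loss, using the rows of this card's table:
```python
# (epoch, test_loss, f_beta) rows copied from the table above.
rows = [
    (0, 0.723429, 0.500086), (1, 0.231171, 0.931261),
    (2, 0.220689, 0.914968), (3, 0.213500, 0.929883),
    (4, 0.250910, 0.921640), (5, 0.287389, 0.936064),
    (6, 0.330413, 0.922205),
]
best = min(rows, key=lambda r: r[1])
print(f"best epoch by test loss: {best[0]} (F-beta {best[2]})")  # epoch 3
```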
|
damgomz/ft_32_7e6_base_x8 | damgomz | "2024-06-24T04:06:01Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:50:40Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 64410.82313990593 |
| Emissions (Co2eq in kg) | 0.038975982343514 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.7604040039589012 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0670939757764339 |
| Consumed energy (kWh) | 0.827497979735336 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.12399083454431892 |
| Emissions (Co2eq in kg) | 0.02522757239646315 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | albert-base-v2 |
| model_name | ft_32_7e6_base_x8 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 7e-06 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.718227 | 0.498851 |
| 1 | 0.359866 | 0.252811 | 0.911557 |
| 2 | 0.222826 | 0.242313 | 0.940853 |
| 3 | 0.178143 | 0.234529 | 0.922860 |
| 4 | 0.146431 | 0.229160 | 0.913996 |
| 5 | 0.116026 | 0.253457 | 0.930856 |
| 6 | 0.080290 | 0.305930 | 0.912067 |
|
damgomz/ft_32_13e6_x8 | damgomz | "2024-06-24T02:21:26Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:50:48Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 58134.33598256111 |
| Emissions (Co2eq in kg) | 0.0351779995462016 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.6863069588215808 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.06055615546902 |
| Consumed energy (kWh) | 0.7468631142906021 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.11190859676643013 |
| Emissions (Co2eq in kg) | 0.022769281593169766 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | damgomz/fp_bs16_lr5_x8 |
| model_name | ft_32_13e6_x8 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1.3e-05 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.702471 | 0.333360 |
| 1 | 0.305049 | 0.223681 | 0.913020 |
| 2 | 0.177468 | 0.213456 | 0.919667 |
| 3 | 0.128068 | 0.250218 | 0.908615 |
| 4 | 0.080486 | 0.261785 | 0.918915 |
| 5 | 0.044342 | 0.315824 | 0.912664 |
| 6 | 0.030384 | 0.366964 | 0.921336 |
|
damgomz/ft_32_7e6_x8 | damgomz | "2024-06-24T03:54:37Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:50:57Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 63732.416165828705 |
| Emissions (Co2eq in kg) | 0.0385654774232085 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.7523951675482922 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0663873938242594 |
| Consumed energy (kWh) | 0.818782561372552 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.12268490111922026 |
| Emissions (Co2eq in kg) | 0.02496186299828291 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | damgomz/fp_bs16_lr5_x8 |
| model_name | ft_32_7e6_x8 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 7e-06 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.715937 | 0.340792 |
| 1 | 0.347706 | 0.230302 | 0.914348 |
| 2 | 0.198954 | 0.222575 | 0.912178 |
| 3 | 0.156681 | 0.228229 | 0.910182 |
| 4 | 0.114634 | 0.247019 | 0.917011 |
| 5 | 0.079275 | 0.280220 | 0.913811 |
| 6 | 0.045209 | 0.327616 | 0.911830 |
|
damgomz/ft_32_8e6_base_x8 | damgomz | "2024-06-24T04:33:29Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:50:58Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 66063.97350144386 |
| Emissions (Co2eq in kg) | 0.0399763235131582 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.7799201850107952 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0688160098291934 |
| Consumed energy (kWh) | 0.8487361948399871 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.12717314899027943 |
| Emissions (Co2eq in kg) | 0.025875056288065513 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | albert-base-v2 |
| model_name | ft_32_8e6_base_x8 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 8e-06 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.702749 | 0.410326 |
| 1 | 0.352046 | 0.262524 | 0.903588 |
| 2 | 0.217527 | 0.235564 | 0.918831 |
| 3 | 0.175118 | 0.240517 | 0.935229 |
| 4 | 0.138116 | 0.235018 | 0.932630 |
| 5 | 0.102132 | 0.320497 | 0.858109 |
| 6 | 0.076790 | 0.307403 | 0.909850 |
|
damgomz/ft_32_8e6_x8 | damgomz | "2024-06-24T04:30:23Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:51:06Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 65877.98792648315 |
| Emissions (Co2eq in kg) | 0.0398637902725759 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.7777247187856171 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0686222860035796 |
| Consumed energy (kWh) | 0.8463470047891971 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.12681512675848006 |
| Emissions (Co2eq in kg) | 0.025802211937872566 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | damgomz/fp_bs16_lr5_x8 |
| model_name | ft_32_8e6_x8 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 8e-06 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.694601 | 0.133249 |
| 1 | 0.331750 | 0.253148 | 0.897015 |
| 2 | 0.196440 | 0.210067 | 0.922130 |
| 3 | 0.149955 | 0.218822 | 0.925623 |
| 4 | 0.111981 | 0.235200 | 0.920259 |
| 5 | 0.073023 | 0.282821 | 0.919379 |
| 6 | 0.043514 | 0.343314 | 0.902525 |
|
damgomz/ft_32_8e6_base_x4 | damgomz | "2024-06-24T04:24:17Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:51:17Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 65512.133805036545 |
| Emissions (Co2eq in kg) | 0.0396424103586092 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.7734056813597672 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.068241212731848 |
| Consumed energy (kWh) | 0.8416468940916164 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.12611085757469537 |
| Emissions (Co2eq in kg) | 0.025658919073639316 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | albert-base-v2 |
| model_name | ft_32_8e6_base_x4 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 8e-06 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.723883 | 0.296267 |
| 1 | 0.333335 | 0.260794 | 0.899626 |
| 2 | 0.207532 | 0.226265 | 0.904981 |
| 3 | 0.154799 | 0.226320 | 0.922738 |
| 4 | 0.109941 | 0.255653 | 0.913665 |
| 5 | 0.073189 | 0.285200 | 0.910820 |
| 6 | 0.046709 | 0.327089 | 0.934989 |
|
damgomz/ft_32_7e6_base_x12 | damgomz | "2024-06-23T21:11:59Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:51:20Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | [More Information Needed] |
| Emissions (Co2eq in kg) | [More Information Needed] |
| CPU power (W)            | [No CPU]                  |
| GPU power (W) | [No GPU] |
| RAM power (W) | [More Information Needed] |
| CPU energy (kWh) | [No CPU] |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | [More Information Needed] |
| Consumed energy (kWh) | [More Information Needed] |
| Country name | [More Information Needed] |
| Cloud provider | [No Cloud] |
| Cloud region | [No Cloud] |
| CPU count | [No CPU] |
| CPU model | [No CPU] |
| GPU count | [No GPU] |
| GPU model | [No GPU] |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | [No CPU] |
| Emissions (Co2eq in kg) | [More Information Needed] |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | albert-base-v2 |
| model_name | ft_32_7e6_base_x12 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 7e-06 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.698241 | 0.343033 |
| 1 | 0.388068 | 0.293261 | 0.883665 |
| 2 | 0.249783 | 0.291608 | 0.910089 |
| 3 | 0.199442 | 0.224127 | 0.914403 |
| 4 | 0.175401 | 0.235763 | 0.932895 |
| 5 | 0.149710 | 0.254013 | 0.899420 |
| 6 | 0.121423 | 0.259248 | 0.916229 |
|
damgomz/ft_32_10e6_x2 | damgomz | "2024-06-24T04:22:57Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:51:31Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 65431.41835165024 |
| Emissions (Co2eq in kg) | 0.0395935657937122 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.7724527332756258 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0681571432389316 |
| Consumed energy (kWh) | 0.8406098765145591 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.12595548032692672 |
| Emissions (Co2eq in kg) | 0.025627305521063008 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | damgomz/fp_bs16_lr5_x2 |
| model_name | ft_32_10e6_x2 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1e-05 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.715400 | 0.665456 |
| 1 | 0.334307 | 0.214862 | 0.923360 |
| 2 | 0.169888 | 0.213791 | 0.914006 |
| 3 | 0.119825 | 0.213322 | 0.921408 |
| 4 | 0.071265 | 0.261560 | 0.918987 |
| 5 | 0.037882 | 0.301269 | 0.922922 |
| 6 | 0.022619 | 0.348024 | 0.924876 |
|
RemBish/4 | RemBish | "2024-06-23T10:51:32Z" | 0 | 0 | null | [
"region:us"
] | null | "2024-06-23T10:51:32Z" | Entry not found |
damgomz/ft_32_12e6_x1 | damgomz | "2024-06-24T04:46:21Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:51:37Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 66836.08788919449 |
| Emissions (Co2eq in kg) | 0.0404435442708852 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.7890354871817743 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0696202583625913 |
| Consumed energy (kWh) | 0.8586557455443669 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.1286594691866994 |
| Emissions (Co2eq in kg) | 0.026177467756601173 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | damgomz/ThunBERT_bs16_lr5_MLM |
| model_name | ft_32_12e6_x1 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1.2e-05 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.721379 | 0.459164 |
| 1 | 0.301007 | 0.204025 | 0.934254 |
| 2 | 0.160431 | 0.191494 | 0.938615 |
| 3 | 0.107821 | 0.208335 | 0.923460 |
| 4 | 0.057000 | 0.243971 | 0.919176 |
| 5 | 0.026577 | 0.287986 | 0.920489 |
| 6 | 0.011730 | 0.313262 | 0.919374 |
|
damgomz/ft_32_8e6_x12 | damgomz | "2024-06-24T02:07:48Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:51:40Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | [More Information Needed] |
| Emissions (Co2eq in kg) | [More Information Needed] |
| CPU power (W)            | [No CPU]                  |
| GPU power (W) | [No GPU] |
| RAM power (W) | [More Information Needed] |
| CPU energy (kWh) | [No CPU] |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | [More Information Needed] |
| Consumed energy (kWh) | [More Information Needed] |
| Country name | [More Information Needed] |
| Cloud provider | [No Cloud] |
| Cloud region | [No Cloud] |
| CPU count | [No CPU] |
| CPU model | [No CPU] |
| GPU count | [No GPU] |
| GPU model | [No GPU] |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | [No CPU] |
| Emissions (Co2eq in kg) | [More Information Needed] |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | damgomz/fp_bs16_lr5_x12 |
| model_name | ft_32_8e6_x12 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 8e-06 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.701202 | 0.261583 |
| 1 | 0.346899 | 0.237972 | 0.925277 |
|
damgomz/ft_32_8e6_base_x12 | damgomz | "2024-06-24T04:44:10Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:51:41Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 66705.5286591053 |
| Emissions (Co2eq in kg) | 0.0403645490822522 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.7874943062462757 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.069484294680506 |
| Consumed energy (kWh) | 0.8569786009267806 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.12840814266877768 |
| Emissions (Co2eq in kg) | 0.026126332058149574 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | albert-base-v2 |
| model_name | ft_32_8e6_base_x12 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 8e-06 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.701417 | 0.220863 |
| 1 | 0.379794 | 0.292918 | 0.894969 |
| 2 | 0.257515 | 0.247921 | 0.908974 |
| 3 | 0.206821 | 0.219694 | 0.912354 |
| 4 | 0.176935 | 0.223465 | 0.912714 |
| 5 | 0.159976 | 0.231351 | 0.915515 |
| 6 | 0.128451 | 0.245311 | 0.912663 |
|
damgomz/ft_32_12e6_x2 | damgomz | "2024-06-24T04:51:38Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:51:48Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 67153.54514575005 |
| Emissions (Co2eq in kg) | 0.0406356388022253 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.7927831561014057 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0699509430095555 |
| Consumed energy (kWh) | 0.8627340991109596 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.12927057440556883 |
| Emissions (Co2eq in kg) | 0.026301805182085435 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | damgomz/fp_bs16_lr5_x2 |
| model_name | ft_32_12e6_x2 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1.2e-05 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.711447 | 0.667741 |
| 1 | 0.327382 | 0.208078 | 0.946239 |
| 2 | 0.165004 | 0.198591 | 0.934509 |
| 3 | 0.113291 | 0.229870 | 0.935010 |
| 4 | 0.068590 | 0.253574 | 0.926723 |
| 5 | 0.037144 | 0.320013 | 0.920483 |
| 6 | 0.026105 | 0.338016 | 0.916673 |
|
damgomz/ft_32_12e6_base_x2 | damgomz | "2024-06-24T04:53:32Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:51:53Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 67267.01394557953 |
| Emissions (Co2eq in kg) | 0.0407043058450788 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.7941228226302419 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0700691445271172 |
| Consumed energy (kWh) | 0.8641919671573616 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.1294890018452406 |
| Emissions (Co2eq in kg) | 0.026346247128685312 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | albert-base-v2 |
| model_name | ft_32_12e6_base_x2 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1.2e-05 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.700174 | 0.457632 |
| 1 | 0.310045 | 0.220452 | 0.923740 |
| 2 | 0.184045 | 0.228084 | 0.903657 |
| 3 | 0.130219 | 0.243912 | 0.922642 |
| 4 | 0.077956 | 0.262247 | 0.915680 |
| 5 | 0.044828 | 0.349786 | 0.904203 |
| 6 | 0.029975 | 0.360177 | 0.895238 |
|
damgomz/ft_32_10e6_x4 | damgomz | "2024-06-24T04:34:56Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:51:56Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 66150.79423570633 |
| Emissions (Co2eq in kg) | 0.0400288626203477 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.7809451703021916 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0689064808378616 |
| Consumed energy (kWh) | 0.8498516511400547 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.12734027890373467 |
| Emissions (Co2eq in kg) | 0.025909061075651645 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | damgomz/fp_bs16_lr5_x4 |
| model_name | ft_32_10e6_x4 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1e-05 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.704944 | 0.380010 |
| 1 | 0.329930 | 0.228898 | 0.913834 |
| 2 | 0.185894 | 0.209666 | 0.943654 |
| 3 | 0.132470 | 0.229512 | 0.942634 |
| 4 | 0.086788 | 0.285951 | 0.906477 |
| 5 | 0.053113 | 0.280450 | 0.918628 |
| 6 | 0.036825 | 0.399850 | 0.898333 |
|
damgomz/ft_32_10e6_base_x4 | damgomz | "2024-06-24T04:32:12Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:52:01Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 65986.95319247246 |
| Emissions (Co2eq in kg) | 0.0399297276165722 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.7790110763902462 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0687358422860502 |
| Consumed energy (kWh) | 0.8477469186762958 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.12702488489550948 |
| Emissions (Co2eq in kg) | 0.025844890000385045 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | albert-base-v2 |
| model_name | ft_32_10e6_base_x4 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1e-05 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.693870 | 0.572993 |
| 1 | 0.338701 | 0.255779 | 0.903640 |
| 2 | 0.199152 | 0.226268 | 0.900425 |
| 3 | 0.153141 | 0.230576 | 0.927822 |
| 4 | 0.107643 | 0.256932 | 0.909967 |
| 5 | 0.074252 | 0.323469 | 0.901497 |
| 6 | 0.045206 | 0.336367 | 0.903879 |
|
damgomz/ft_32_12e6_base_x4 | damgomz | "2024-06-24T05:06:18Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:52:10Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 68032.67645573616 |
| Emissions (Co2eq in kg) | 0.0411676188243899 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.8031618302038982 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0708667217247187 |
| Consumed energy (kWh) | 0.8740285519286201 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.13096290217729212 |
| Emissions (Co2eq in kg) | 0.026646131611829993 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | albert-base-v2 |
| model_name | ft_32_12e6_base_x4 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1.2e-05 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.711194 | 0.490326 |
| 1 | 0.326585 | 0.260003 | 0.875910 |
| 2 | 0.199336 | 0.218389 | 0.917410 |
| 3 | 0.147479 | 0.232200 | 0.922176 |
| 4 | 0.104865 | 0.278120 | 0.909022 |
| 5 | 0.062551 | 0.300760 | 0.921123 |
| 6 | 0.045031 | 0.370350 | 0.917923 |
|
damgomz/ft_32_12e6_x4 | damgomz | "2024-06-24T05:10:47Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:52:16Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 68301.78279232979 |
| Emissions (Co2eq in kg) | 0.0413304590832366 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.8063387830394879 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0711470258230964 |
| Consumed energy (kWh) | 0.8774858088625861 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.13148093187523482 |
| Emissions (Co2eq in kg) | 0.0267515315936625 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | damgomz/fp_bs16_lr5_x4 |
| model_name | ft_32_12e6_x4 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1.2e-05 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.714699 | 0.501074 |
| 1 | 0.317879 | 0.211429 | 0.927694 |
| 2 | 0.178985 | 0.206148 | 0.924799 |
| 3 | 0.126014 | 0.218194 | 0.928925 |
| 4 | 0.080316 | 0.255463 | 0.917471 |
| 5 | 0.041582 | 0.303495 | 0.928233 |
| 6 | 0.025777 | 0.347183 | 0.922220 |
|
damgomz/ft_32_4e6_base_x2 | damgomz | "2024-06-24T04:54:28Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:52:32Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 67315.8733716011 |
| Emissions (Co2eq in kg) | 0.040733867263106 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.794699585605662 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0701199991824726 |
| Consumed energy (kWh) | 0.8648195847881367 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.12958305624033212 |
| Emissions (Co2eq in kg) | 0.026365383737210434 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | albert-base-v2 |
| model_name | ft_32_4e6_base_x2 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 4e-06 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.709685 | 0.505596 |
| 1 | 0.383320 | 0.260129 | 0.901155 |
| 2 | 0.223109 | 0.218879 | 0.927198 |
| 3 | 0.170178 | 0.215907 | 0.918276 |
| 4 | 0.134197 | 0.228274 | 0.924236 |
| 5 | 0.095533 | 0.248810 | 0.918708 |
| 6 | 0.061871 | 0.293193 | 0.920600 |
|
damgomz/ft_32_4e6_base_x1 | damgomz | "2024-06-24T04:50:56Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:52:33Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 67111.1572368145 |
| Emissions (Co2eq in kg) | 0.0406099937601366 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.7922828469347613 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0699067830123009 |
| Consumed energy (kWh) | 0.8621896299470633 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.1291889776808679 |
| Emissions (CO2eq in kg) | 0.026285203251085677 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | albert-base-v2 |
| model_name | ft_32_4e6_base_x1 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 4e-06 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.734676 | 0.279331 |
| 1 | 0.410059 | 0.268231 | 0.899700 |
| 2 | 0.230817 | 0.222677 | 0.908848 |
| 3 | 0.165796 | 0.223237 | 0.911309 |
| 4 | 0.119962 | 0.234589 | 0.913667 |
| 5 | 0.084190 | 0.260907 | 0.905875 |
| 6 | 0.056812 | 0.293063 | 0.922714 |
|
damgomz/ft_32_4e6_x1 | damgomz | "2024-06-24T04:49:11Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:52:40Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 67005.8401055336 |
| Emissions (CO2eq in kg) | 0.0405462653666652 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.7910395445439562 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.069797069682181 |
| Consumed energy (kWh) | 0.8608366142261359 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.12898624220315216 |
| Emissions (CO2eq in kg) | 0.026243954041333988 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | damgomz/ThunBERT_bs16_lr5_MLM |
| model_name | ft_32_4e6_x1 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 4e-06 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.710066 | 0.311295 |
| 1 | 0.387319 | 0.235759 | 0.923767 |
| 2 | 0.201637 | 0.200985 | 0.932140 |
| 3 | 0.155918 | 0.195535 | 0.927746 |
| 4 | 0.123290 | 0.201726 | 0.939085 |
| 5 | 0.090970 | 0.208165 | 0.922166 |
| 6 | 0.064448 | 0.230991 | 0.921129 |
|
slelab/AES7 | slelab | "2024-06-23T11:16:05Z" | 0 | 0 | null | [
"region:us"
] | null | "2024-06-23T10:52:44Z" | Entry not found |
arash666/zakani | arash666 | "2024-06-23T10:56:38Z" | 0 | 0 | null | [
"license:openrail",
"region:us"
] | null | "2024-06-23T10:52:48Z" | ---
license: openrail
---
|
damgomz/ft_32_17e6_base_x1 | damgomz | "2024-06-24T05:09:33Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:53:16Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 68219.93429875374 |
| Emissions (CO2eq in kg) | 0.0412809365651874 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.80537261104286 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0710617866384486 |
| Consumed energy (kWh) | 0.8764343976813106 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.13132337352510096 |
| Emissions (CO2eq in kg) | 0.026719474267011878 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | albert-base-v2 |
| model_name | ft_32_17e6_base_x1 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1.7e-05 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.712503 | 0.377147 |
| 1 | 0.327415 | 0.225597 | 0.932051 |
| 2 | 0.192420 | 0.218485 | 0.914538 |
| 3 | 0.137561 | 0.234475 | 0.931493 |
| 4 | 0.096490 | 0.240244 | 0.910265 |
| 5 | 0.071523 | 0.289267 | 0.938977 |
| 6 | 0.042446 | 0.320110 | 0.924455 |
|
damgomz/ft_32_15e6_base_x4 | damgomz | "2024-06-24T05:21:52Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:53:23Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 68967.13651156425 |
| Emissions (CO2eq in kg) | 0.0417330736987883 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.8141936003218114 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0718401080663006 |
| Consumed energy (kWh) | 0.8860337083881097 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.1327617377847612 |
| Emissions (CO2eq in kg) | 0.02701212846702933 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | albert-base-v2 |
| model_name | ft_32_15e6_base_x4 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1.5e-05 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.712657 | 0.436186 |
| 1 | 0.323969 | 0.228080 | 0.903114 |
| 2 | 0.193001 | 0.215644 | 0.921454 |
| 3 | 0.132909 | 0.240050 | 0.923265 |
| 4 | 0.093689 | 0.307646 | 0.899092 |
| 5 | 0.059671 | 0.327657 | 0.908845 |
| 6 | 0.036950 | 0.387655 | 0.922914 |
|
damgomz/ft_32_17e6_x1 | damgomz | "2024-06-24T05:12:44Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:53:24Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 68419.44018101692 |
| Emissions (CO2eq in kg) | 0.041401656007626 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.807727794874541 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0712695938778417 |
| Consumed energy (kWh) | 0.8789973887523836 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.1317074223484576 |
| Emissions (CO2eq in kg) | 0.026797614070898295 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | damgomz/ThunBERT_bs16_lr5_MLM |
| model_name | ft_32_17e6_x1 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1.7e-05 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.713540 | 0.305240 |
| 1 | 0.280652 | 0.195848 | 0.935175 |
| 2 | 0.161281 | 0.207306 | 0.946549 |
| 3 | 0.104281 | 0.225213 | 0.929004 |
| 4 | 0.052321 | 0.242878 | 0.915553 |
| 5 | 0.022788 | 0.309547 | 0.916768 |
| 6 | 0.017620 | 0.312562 | 0.932147 |
|
damgomz/ft_32_15e6_x4 | damgomz | "2024-06-24T05:23:52Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:53:27Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 69087.04266500473 |
| Emissions (CO2eq in kg) | 0.0418056329522813 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.8156092032848126 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0719650085637966 |
| Consumed energy (kWh) | 0.8875742118486087 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.13299255713013408 |
| Emissions (CO2eq in kg) | 0.027059091710460184 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | damgomz/fp_bs16_lr5_x4 |
| model_name | ft_32_15e6_x4 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1.5e-05 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.696276 | 0.296121 |
| 1 | 0.309404 | 0.216955 | 0.932773 |
| 2 | 0.170270 | 0.214914 | 0.920081 |
| 3 | 0.120092 | 0.244058 | 0.920266 |
| 4 | 0.074628 | 0.277751 | 0.923188 |
| 5 | 0.045547 | 0.328615 | 0.926466 |
| 6 | 0.023289 | 0.412109 | 0.899151 |
|
damgomz/ft_32_17e6_base_x2 | damgomz | "2024-06-24T05:13:16Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:53:28Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 68449.6021873951 |
| Emissions (CO2eq in kg) | 0.041419911966503 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.8080839666964277 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0713010138029854 |
| Consumed energy (kWh) | 0.8793849804994166 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.13176548421073553 |
| Emissions (CO2eq in kg) | 0.02680942752339641 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | albert-base-v2 |
| model_name | ft_32_17e6_base_x2 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1.7e-05 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.706356 | 0.346910 |
| 1 | 0.312243 | 0.234096 | 0.924459 |
| 2 | 0.181088 | 0.234735 | 0.924702 |
| 3 | 0.124604 | 0.269449 | 0.935804 |
| 4 | 0.082481 | 0.287491 | 0.907270 |
| 5 | 0.050122 | 0.321535 | 0.911518 |
| 6 | 0.034494 | 0.345586 | 0.908977 |
|
damgomz/ft_32_3e6_x12 | damgomz | "2024-06-24T05:31:37Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:53:48Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 69551.49281001091 |
| Emissions (CO2eq in kg) | 0.0420866736224328 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.8210922209223125 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0724487574525178 |
| Consumed energy (kWh) | 0.89354097837483 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.13388662365927098 |
| Emissions (CO2eq in kg) | 0.027241001350587605 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | damgomz/fp_bs16_lr5_x12 |
| model_name | ft_32_3e6_x12 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 3e-06 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.706071 | 0.426771 |
| 1 | 0.424142 | 0.294416 | 0.891731 |
| 2 | 0.252724 | 0.254003 | 0.904933 |
| 3 | 0.205829 | 0.231836 | 0.914915 |
| 4 | 0.174420 | 0.217850 | 0.920659 |
| 5 | 0.148440 | 0.223897 | 0.923999 |
| 6 | 0.126112 | 0.228574 | 0.924397 |
|
damgomz/ft_32_3e6_base_x12 | damgomz | "2024-06-24T05:34:12Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:53:51Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 69706.04548239708 |
| Emissions (CO2eq in kg) | 0.0421801920116139 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.8229167307608689 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0726097338919837 |
| Consumed energy (kWh) | 0.8955264646528512 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.13418413755361436 |
| Emissions (CO2eq in kg) | 0.027301534480605523 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | albert-base-v2 |
| model_name | ft_32_3e6_base_x12 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 3e-06 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.727545 | 0.570724 |
| 1 | 0.433773 | 0.348463 | 0.892629 |
| 2 | 0.304504 | 0.319445 | 0.870198 |
| 3 | 0.261253 | 0.263748 | 0.903881 |
| 4 | 0.233191 | 0.261137 | 0.914327 |
| 5 | 0.211979 | 0.248463 | 0.893180 |
| 6 | 0.189862 | 0.243533 | 0.897099 |
|
damgomz/ft_32_15e6_base_x8 | damgomz | "2024-06-24T05:52:07Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:53:51Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 70782.1638636589 |
| Emissions (CO2eq in kg) | 0.0428313743077283 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.8356209404538092 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0737307571264605 |
| Consumed energy (kWh) | 0.9093516975802703 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.1362556654375434 |
| Emissions (CO2eq in kg) | 0.02772301417993307 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | albert-base-v2 |
| model_name | ft_32_15e6_base_x8 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1.5e-05 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.721794 | 0.555497 |
| 1 | 0.338023 | 0.236548 | 0.905157 |
| 2 | 0.204050 | 0.226602 | 0.903291 |
| 3 | 0.158461 | 0.240485 | 0.931850 |
| 4 | 0.116072 | 0.285530 | 0.903037 |
| 5 | 0.077379 | 0.331275 | 0.902012 |
| 6 | 0.052431 | 0.371383 | 0.917452 |
|
damgomz/ft_32_15e6_x8 | damgomz | "2024-06-24T05:47:00Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:54:00Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 70474.84282803535 |
| Emissions (CO2eq in kg) | 0.0426454132750979 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.8319929236936919 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0734106404577694 |
| Consumed energy (kWh) | 0.9054035641514606 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.13566407244396805 |
| Emissions (CO2eq in kg) | 0.027602646774313847 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | damgomz/fp_bs16_lr5_x8 |
| model_name | ft_32_15e6_x8 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1.5e-05 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.705517 | 0.527863 |
| 1 | 0.304984 | 0.222122 | 0.935708 |
| 2 | 0.175308 | 0.214142 | 0.924931 |
| 3 | 0.124303 | 0.235468 | 0.932318 |
| 4 | 0.074439 | 0.278880 | 0.923392 |
| 5 | 0.040802 | 0.343101 | 0.917024 |
| 6 | 0.026713 | 0.386120 | 0.913340 |
|
damgomz/ft_32_3e6_x1 | damgomz | "2024-06-24T05:23:48Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:54:08Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 69083.08872318268 |
| Emissions (CO2eq in kg) | 0.0418032390840844 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.8155625146046288 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0719608730959396 |
| Consumed energy (kWh) | 0.887523387700568 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.13298494579212666 |
| Emissions (CO2eq in kg) | 0.02705754308324655 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | damgomz/ThunBERT_bs16_lr5_MLM |
| model_name | ft_32_3e6_x1 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 3e-06 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.705479 | 0.345646 |
| 1 | 0.426854 | 0.276031 | 0.902913 |
| 2 | 0.223947 | 0.215637 | 0.925376 |
| 3 | 0.173298 | 0.202359 | 0.929803 |
| 4 | 0.141759 | 0.202937 | 0.918944 |
| 5 | 0.115066 | 0.202317 | 0.931141 |
| 6 | 0.090090 | 0.222893 | 0.927478 |
|
damgomz/ft_32_3e6_x2 | damgomz | "2024-06-24T05:23:40Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:54:10Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 69074.97848033905 |
| Emissions (CO2eq in kg) | 0.0417983275126095 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.8154667030245069 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0719524072408675 |
| Consumed energy (kWh) | 0.8874191102653777 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.13296933357465265 |
| Emissions (CO2eq in kg) | 0.027054366571466128 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | damgomz/fp_bs16_lr5_x2 |
| model_name | ft_32_3e6_x2 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 3e-06 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.704349 | 0.668309 |
| 1 | 0.481622 | 0.263432 | 0.908632 |
| 2 | 0.221568 | 0.213809 | 0.934762 |
| 3 | 0.173743 | 0.203447 | 0.937904 |
| 4 | 0.143606 | 0.199034 | 0.935103 |
| 5 | 0.117202 | 0.218097 | 0.927914 |
| 6 | 0.091375 | 0.228271 | 0.920324 |
|
damgomz/ft_32_3e6_x8 | damgomz | "2024-06-24T05:41:10Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:54:16Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 70124.81861972809 |
| Emissions (CO2eq in kg) | 0.0424335590118775 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.8278600520286308 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0730456402470668 |
| Consumed energy (kWh) | 0.9009056922756968 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.13499027584297657 |
| Emissions (CO2eq in kg) | 0.0274655539593935 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | damgomz/fp_bs16_lr5_x8 |
| model_name | ft_32_3e6_x8 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 3e-06 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.732318 | 0.332367 |
| 1 | 0.424917 | 0.277960 | 0.898419 |
| 2 | 0.242365 | 0.235143 | 0.917440 |
| 3 | 0.199845 | 0.226370 | 0.922332 |
| 4 | 0.176337 | 0.221045 | 0.916011 |
| 5 | 0.148313 | 0.218984 | 0.924779 |
| 6 | 0.125811 | 0.229210 | 0.927771 |
|
damgomz/ft_32_3e6_base_x2 | damgomz | "2024-06-24T05:31:32Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:54:16Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 69546.51192593575 |
| Emissions (CO2eq in kg) | 0.0420836593227041 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.821033409884905 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0724435719775655 |
| Consumed energy (kWh) | 0.8934769818624689 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.1338770354574263 |
| Emissions (CO2eq in kg) | 0.027239050504324833 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | albert-base-v2 |
| model_name | ft_32_3e6_base_x2 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 3e-06 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.695845 | 0.588297 |
| 1 | 0.402406 | 0.288985 | 0.897483 |
| 2 | 0.237773 | 0.231385 | 0.917217 |
| 3 | 0.190397 | 0.218945 | 0.932204 |
| 4 | 0.155807 | 0.228457 | 0.927297 |
| 5 | 0.120795 | 0.243433 | 0.903127 |
| 6 | 0.088440 | 0.254021 | 0.918233 |
|
damgomz/ft_32_3e6_base_x4 | damgomz | "2024-06-24T05:36:59Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:54:23Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 69874.17082333565 |
| Emissions (CO2eq in kg) | 0.0422819351554865 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.8249016842146675 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0727848862506446 |
| Consumed energy (kWh) | 0.8976865704653092 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.1345077788349211 |
| Emissions (CO2eq in kg) | 0.027367383572473127 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | albert-base-v2 |
| model_name | ft_32_3e6_base_x4 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 3e-06 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.701788 | 0.354232 |
| 1 | 0.393833 | 0.295560 | 0.894343 |
| 2 | 0.248546 | 0.238341 | 0.910043 |
| 3 | 0.202545 | 0.242598 | 0.883174 |
| 4 | 0.171609 | 0.234381 | 0.900331 |
| 5 | 0.138293 | 0.238748 | 0.919080 |
| 6 | 0.113556 | 0.250436 | 0.925072 |
|
SwimChoi/villama2-7b-chat-France-lora | SwimChoi | "2024-06-23T10:54:27Z" | 0 | 0 | peft | [
"peft",
"safetensors",
"arxiv:1910.09700",
"base_model:meta-llama/Llama-2-7b-chat-hf",
"region:us"
] | null | "2024-06-23T10:54:24Z" | ---
library_name: peft
base_model: meta-llama/Llama-2-7b-chat-hf
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
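No snippet is given in the card; a minimal sketch, assuming the LoRA adapter loads with PEFT on top of the gated base model listed above:

```python
# Minimal sketch: attach the LoRA adapter to the Llama-2 base model.
# The adapter repo id comes from this card; access to the gated
# meta-llama base model (and a recent peft/transformers) is assumed.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "meta-llama/Llama-2-7b-chat-hf"
tokenizer = AutoTokenizer.from_pretrained(base_id)
base = AutoModelForCausalLM.from_pretrained(base_id)

model = PeftModel.from_pretrained(base, "SwimChoi/villama2-7b-chat-France-lora")
model.eval()
```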
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
### Framework versions
- PEFT 0.10.1.dev0 |
damgomz/ft_32_16e6_base_x12 | damgomz | "2024-06-24T06:00:16Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:54:28Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 71268.81805562973 |
| Emissions (CO2eq in kg) | 0.0431258556864645 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.8413661629032767 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.074237660507361 |
| Consumed energy (kWh) | 0.9156038234106392 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.1371924747570872 |
| Emissions (CO2eq in kg) | 0.02791362040512164 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | albert-base-v2 |
| model_name | ft_32_16e6_base_x12 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1.6e-05 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.745482 | 0.425709 |
| 1 | 0.368146 | 0.265085 | 0.898074 |
| 2 | 0.225921 | 0.238311 | 0.888580 |
| 3 | 0.187288 | 0.263459 | 0.891032 |
| 4 | 0.150420 | 0.238125 | 0.911862 |
| 5 | 0.115398 | 0.255953 | 0.922487 |
| 6 | 0.086221 | 0.300761 | 0.907217 |
|
damgomz/ft_32_15e6_base_x12 | damgomz | "2024-06-24T06:13:37Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:54:33Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 72071.44087576866 |
| Emissions (CO2eq in kg) | 0.0436115410278336 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.8508416568411714 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.075073738327374 |
| Consumed energy (kWh) | 0.9259153951685444 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.13873752368585465 |
| Emissions (CO2eq in kg) | 0.028227981009676054 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | albert-base-v2 |
| model_name | ft_32_15e6_base_x12 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1.5e-05 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.720602 | 0.167068 |
| 1 | 0.372920 | 0.254849 | 0.907320 |
| 2 | 0.227426 | 0.238683 | 0.918506 |
| 3 | 0.187936 | 0.225428 | 0.926747 |
| 4 | 0.154460 | 0.245129 | 0.916622 |
| 5 | 0.120701 | 0.262675 | 0.909515 |
| 6 | 0.093956 | 0.312437 | 0.928707 |
|
damgomz/ft_32_9e6_x4 | damgomz | "2024-06-24T05:42:14Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:54:41Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 70189.35336184502 |
| Emissions (CO2eq in kg) | 0.0424726542233455 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.8286224981147402 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.073113223490864 |
| Consumed energy (kWh) | 0.9017357216056044 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.13511450522155163 |
| Emissions (CO2eq in kg) | 0.027490830066722628 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | damgomz/fp_bs16_lr5_x4 |
| model_name | ft_32_9e6_x4 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 9e-06 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.733363 | 0.496702 |
| 1 | 0.339374 | 0.266059 | 0.926195 |
| 2 | 0.186584 | 0.208755 | 0.926505 |
| 3 | 0.133704 | 0.229455 | 0.930974 |
| 4 | 0.092516 | 0.267764 | 0.927905 |
| 5 | 0.057660 | 0.299329 | 0.908951 |
| 6 | 0.038804 | 0.326848 | 0.928435 |
|
WhiteGiverPlus/open-web-math-md100 | WhiteGiverPlus | "2024-06-23T10:54:49Z" | 0 | 0 | null | [
"region:us"
] | null | "2024-06-23T10:54:49Z" | Entry not found |