modelId (string, 5–122 chars) | author (string, 2–42 chars) | last_modified (unknown) | downloads (int64, 0–738M) | likes (int64, 0–11k) | library_name (245 classes) | tags (sequence, 1–4.05k items) | pipeline_tag (48 classes) | createdAt (unknown) | card (string, 1–901k chars)
---|---|---|---|---|---|---|---|---|---
damgomz/ft_32_16e6_x12 | damgomz | "2024-06-24T06:10:27Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:54:56Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 71881.81767988205 |
| Emissions (CO2eq in kg) | 0.0434967905069397 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.8486029742063748 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0748761558957398 |
| Consumed energy (kWh) | 0.923479130102116 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
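The CodeCarbon figures above are internally consistent: energy in kWh is power (W) × duration (s) / 3.6e6, and dividing emissions by consumed energy recovers the grid carbon intensity CodeCarbon applied for Switzerland (≈47 g CO2eq/kWh). A quick sanity check in plain Python (tolerances allow for CodeCarbon sampling power rather than assuming it constant):

```python
# Sanity-check the CodeCarbon table: energy = power * time, emissions = energy * grid intensity.
duration_s = 71881.81767988205   # Duration (in seconds)
cpu_power_w = 42.5               # CPU power (W)
ram_power_w = 3.75               # RAM power (W)

# Watts * seconds -> joules; 3.6e6 joules per kWh.
cpu_energy_kwh = cpu_power_w * duration_s / 3.6e6
ram_energy_kwh = ram_power_w * duration_s / 3.6e6

# Carbon intensity implied by the table (emissions / consumed energy).
intensity_kg_per_kwh = 0.0434967905069397 / 0.923479130102116

print(round(cpu_energy_kwh, 4))        # ~0.8486, matching "CPU energy (kWh)"
print(round(intensity_kg_per_kwh, 4))  # ~0.0471 kg CO2eq per kWh
```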
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.13837249903377294 |
| Emissions (CO2eq in kg) | 0.02815371192462047 |
## Note
June 19, 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | damgomz/fp_bs16_lr5_x12 |
| model_name | ft_32_16e6_x12 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1.6e-05 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
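The same configuration, transcribed into a plain Python dict (field names and values copied from the table above; handy for feeding a training script of your own):

```python
# Fine-tuning configuration for ft_32_16e6_x12, transcribed from the card's config table.
config = {
    "checkpoint": "damgomz/fp_bs16_lr5_x12",
    "model_name": "ft_32_16e6_x12",
    "sequence_length": 400,
    "num_epoch": 6,
    "learning_rate": 1.6e-05,
    "batch_size": 32,
    "weight_decay": 0.0,
    "warm_up_prop": 0.0,
    "drop_out_prob": 0.1,
    "packing_length": 100,
    "train_test_split": 0.2,
    "num_steps": 29328,
}

print(config["learning_rate"])  # 1.6e-05
```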
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.706009 | 0.371699 |
| 1 | 0.303564 | 0.217143 | 0.929244 |
| 2 | 0.175745 | 0.231566 | 0.905030 |
| 3 | 0.120437 | 0.238707 | 0.928846 |
| 4 | 0.068286 | 0.308548 | 0.921530 |
| 5 | 0.041702 | 0.314831 | 0.907265 |
| 6 | 0.025832 | 0.378665 | 0.924337 |
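The card does not state which beta its F-beta score uses, so here is the generic formula as a minimal sketch (beta = 1 reduces it to the familiar F1; beta > 1 weights recall more heavily, beta < 1 weights precision):

```python
def f_beta(precision: float, recall: float, beta: float = 1.0) -> float:
    """Weighted harmonic mean of precision and recall."""
    if precision == 0.0 and recall == 0.0:
        return 0.0
    b2 = beta * beta
    return (1 + b2) * precision * recall / (b2 * precision + recall)

# With beta = 1 this is F1, the plain harmonic mean of precision and recall.
print(round(f_beta(0.8, 0.6), 4))  # 0.6857
```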
|
damgomz/ft_32_4e6_x2 | damgomz | "2024-06-24T06:04:01Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:54:56Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 71495.37512636185 |
| Emissions (CO2eq in kg) | 0.0432629400180004 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.8440406805665959 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0744735764130948 |
| Consumed energy (kWh) | 0.9185142569796908 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.13762859711824657 |
| Emissions (CO2eq in kg) | 0.02800235525782506 |
## Note
June 19, 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | damgomz/fp_bs16_lr5_x2 |
| model_name | ft_32_4e6_x2 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 4e-06 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.697838 | 0.165181 |
| 1 | 0.430814 | 0.231088 | 0.928863 |
| 2 | 0.197756 | 0.205576 | 0.930748 |
| 3 | 0.157784 | 0.203073 | 0.936039 |
| 4 | 0.122974 | 0.210342 | 0.930354 |
| 5 | 0.090873 | 0.220113 | 0.930634 |
| 6 | 0.060774 | 0.264006 | 0.922399 |
|
damgomz/ft_32_2e6_base_x1 | damgomz | "2024-06-24T06:30:40Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:54:57Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 73095.39277100563 |
| Emissions (CO2eq in kg) | 0.0442311379684663 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.862929785768026 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0761402545382579 |
| Consumed energy (kWh) | 0.9390700403062856 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.14070863108418583 |
| Emissions (CO2eq in kg) | 0.02862902883531054 |
## Note
June 19, 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | albert-base-v2 |
| model_name | ft_32_2e6_base_x1 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 2e-06 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.719601 | 0.333675 |
| 1 | 0.482296 | 0.366204 | 0.876392 |
| 2 | 0.298587 | 0.270772 | 0.896133 |
| 3 | 0.222985 | 0.245894 | 0.908798 |
| 4 | 0.177491 | 0.236516 | 0.912284 |
| 5 | 0.137443 | 0.242232 | 0.893873 |
| 6 | 0.105196 | 0.269823 | 0.916260 |
|
damgomz/ft_32_15e6_base_x2 | damgomz | "2024-06-24T06:09:38Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:55:15Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 71832.20314526558 |
| Emissions (CO2eq in kg) | 0.0434667651147476 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.848017205489013 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0748244563599428 |
| Consumed energy (kWh) | 0.9228416618489558 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.13827699105463623 |
| Emissions (CO2eq in kg) | 0.028134279565229015 |
## Note
June 19, 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | albert-base-v2 |
| model_name | ft_32_15e6_base_x2 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1.5e-05 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.725213 | 0.323829 |
| 1 | 0.308142 | 0.225736 | 0.909666 |
| 2 | 0.182370 | 0.253100 | 0.916715 |
| 3 | 0.127769 | 0.271641 | 0.909688 |
| 4 | 0.074474 | 0.293281 | 0.907815 |
| 5 | 0.042815 | 0.323060 | 0.914852 |
| 6 | 0.030504 | 0.367846 | 0.917341 |
|
damgomz/ft_32_15e6_x1 | damgomz | "2024-06-24T06:03:50Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:55:15Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 71484.84540772438 |
| Emissions (CO2eq in kg) | 0.0432565779769739 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.843916534531778 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.074462650134663 |
| Consumed energy (kWh) | 0.9183791846664384 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.13760832740986942 |
| Emissions (CO2eq in kg) | 0.02799823111802538 |
## Note
June 19, 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | damgomz/ThunBERT_bs16_lr5_MLM |
| model_name | ft_32_15e6_x1 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1.5e-05 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.695889 | 0.254652 |
| 1 | 0.275330 | 0.215623 | 0.933210 |
| 2 | 0.152076 | 0.198205 | 0.933075 |
| 3 | 0.098771 | 0.229899 | 0.917546 |
| 4 | 0.055592 | 0.266245 | 0.915720 |
| 5 | 0.028065 | 0.306953 | 0.918602 |
| 6 | 0.013205 | 0.313088 | 0.925786 |
|
damgomz/ft_32_4e6_base_x4 | damgomz | "2024-06-24T06:09:51Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:55:17Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 71845.55230617523 |
| Emissions (CO2eq in kg) | 0.0434748378985542 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.8481747363578943 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0748383185359342 |
| Consumed energy (kWh) | 0.9230130548938292 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.13830268818938735 |
| Emissions (CO2eq in kg) | 0.0281395079865853 |
## Note
June 19, 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | albert-base-v2 |
| model_name | ft_32_4e6_base_x4 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 4e-06 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.726919 | 0.469267 |
| 1 | 0.380991 | 0.280183 | 0.879980 |
| 2 | 0.234251 | 0.240912 | 0.927142 |
| 3 | 0.188952 | 0.234519 | 0.911161 |
| 4 | 0.151393 | 0.247042 | 0.904971 |
| 5 | 0.120314 | 0.243643 | 0.921235 |
| 6 | 0.085023 | 0.265345 | 0.908701 |
|
damgomz/ft_32_15e6_x2 | damgomz | "2024-06-24T06:11:55Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:55:17Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 71970.71953129768 |
| Emissions (CO2eq in kg) | 0.0435505778786255 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.8496523629311062 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0749687253393237 |
| Consumed energy (kWh) | 0.9246210882704298 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.13854363509774806 |
| Emissions (CO2eq in kg) | 0.028188531816424927 |
## Note
June 19, 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | damgomz/fp_bs16_lr5_x2 |
| model_name | ft_32_15e6_x2 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1.5e-05 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.696626 | 0.415477 |
| 1 | 0.303863 | 0.197197 | 0.934375 |
| 2 | 0.162620 | 0.212696 | 0.920553 |
| 3 | 0.106096 | 0.230343 | 0.928876 |
| 4 | 0.059013 | 0.286442 | 0.919312 |
| 5 | 0.031572 | 0.315769 | 0.928705 |
| 6 | 0.018381 | 0.415040 | 0.901857 |
|
damgomz/ft_32_1e6_x4 | damgomz | "2024-06-24T06:40:57Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:55:23Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 73710.30487847328 |
| Emissions (CO2eq in kg) | 0.0446032272363886 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.8701890616261289 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0767807954127587 |
| Consumed energy (kWh) | 0.9469698570388888 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.14189233689106107 |
| Emissions (CO2eq in kg) | 0.02886986941073537 |
## Note
June 19, 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | damgomz/fp_bs16_lr5_x4 |
| model_name | ft_32_1e6_x4 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1e-06 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.712072 | 0.331035 |
| 1 | 0.571019 | 0.398052 | 0.854781 |
| 2 | 0.328747 | 0.295188 | 0.873769 |
| 3 | 0.265386 | 0.262781 | 0.910609 |
| 4 | 0.234499 | 0.247737 | 0.910782 |
| 5 | 0.210701 | 0.236986 | 0.905888 |
| 6 | 0.192468 | 0.230505 | 0.919231 |
|
damgomz/ft_32_9e6_base_x12 | damgomz | "2024-06-24T06:14:17Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:55:27Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 72108.18771147728 |
| Emissions (CO2eq in kg) | 0.0436337686919641 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.8512753254933474 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.075111984584232 |
| Consumed energy (kWh) | 0.9263873100775806 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.13880826134459376 |
| Emissions (CO2eq in kg) | 0.028242373520328597 |
## Note
June 19, 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | albert-base-v2 |
| model_name | ft_32_9e6_base_x12 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 9e-06 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.769496 | 0.568915 |
| 1 | 0.394504 | 0.299390 | 0.892960 |
| 2 | 0.248500 | 0.255195 | 0.907169 |
| 3 | 0.201805 | 0.245713 | 0.927074 |
| 4 | 0.172395 | 0.228271 | 0.920754 |
| 5 | 0.145818 | 0.238505 | 0.897134 |
| 6 | 0.120750 | 0.262823 | 0.918250 |
|
SwimChoi/villama2-7b-chat-United_Kingdom-lora | SwimChoi | "2024-06-23T10:55:47Z" | 0 | 0 | peft | [
"peft",
"safetensors",
"arxiv:1910.09700",
"base_model:meta-llama/Llama-2-7b-chat-hf",
"region:us"
] | null | "2024-06-23T10:55:42Z" | ---
library_name: peft
base_model: meta-llama/Llama-2-7b-chat-hf
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
### Framework versions
- PEFT 0.10.1.dev0 |
Rakif215/test_model | Rakif215 | "2024-06-23T11:09:27Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"text-generation-inference",
"unsloth",
"llama",
"trl",
"en",
"base_model:unsloth/llama-3-8b-bnb-4bit",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | null | "2024-06-23T10:56:14Z" | ---
base_model: unsloth/llama-3-8b-bnb-4bit
language:
- en
license: apache-2.0
tags:
- text-generation-inference
- transformers
- unsloth
- llama
- trl
---
# Uploaded model
- **Developed by:** Rakif215
- **License:** apache-2.0
- **Finetuned from model :** unsloth/llama-3-8b-bnb-4bit
This llama model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Huggingface's TRL library.
[<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
|
a1-b2-c3-d4-archana/flan-t5-base-depression_dataset_reddit_cleaned_classification | a1-b2-c3-d4-archana | "2024-06-23T11:56:33Z" | 0 | 0 | peft | [
"peft",
"tensorboard",
"safetensors",
"generated_from_trainer",
"base_model:google/flan-t5-base",
"license:apache-2.0",
"region:us"
] | null | "2024-06-23T10:56:37Z" | ---
base_model: google/flan-t5-base
library_name: peft
license: apache-2.0
metrics:
- f1
tags:
- generated_from_trainer
model-index:
- name: flan-t5-base-depression_dataset_reddit_cleaned_classification
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# flan-t5-base-depression_dataset_reddit_cleaned_classification
This model is a fine-tuned version of [google/flan-t5-base](https://huggingface.co/google/flan-t5-base) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0360
- F1: 98.05
- Gen Len: 2.494
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2
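The linear scheduler listed above decays the learning rate from its initial value to zero over the run; a minimal sketch of that schedule (no warmup, matching the hyperparameters above; `total_steps` is a hypothetical placeholder, since the card does not report the step count):

```python
def linear_lr(step: int, total_steps: int, base_lr: float = 0.0003) -> float:
    """Linearly decay the learning rate from base_lr to 0, with no warmup."""
    return base_lr * max(0.0, 1.0 - step / total_steps)

total_steps = 10_000  # hypothetical; the card does not report the total step count
print(linear_lr(0, total_steps))      # 0.0003 at the start of training
print(linear_lr(5_000, total_steps))  # 0.00015 halfway through
```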
### Training results
### Framework versions
- PEFT 0.11.1
- Transformers 4.41.2
- Pytorch 2.3.0+cu121
- Datasets 2.20.0
- Tokenizers 0.19.1 |
damgomz/ft_32_9e6_base_x4 | damgomz | "2024-06-24T02:14:45Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:56:43Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 57740.51975607872 |
| Emissions (CO2eq in kg) | 0.0349396951409503 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.681657758170532 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0601459220262864 |
| Consumed energy (kWh) | 0.7418036801968183 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.11115050053045154 |
| Emissions (CO2eq in kg) | 0.022615036904464165 |
## Note
June 19, 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | albert-base-v2 |
| model_name | ft_32_9e6_base_x4 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 9e-06 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.718265 | 0.417621 |
| 1 | 0.329715 | 0.237797 | 0.920274 |
| 2 | 0.203771 | 0.226090 | 0.916015 |
| 3 | 0.154634 | 0.241886 | 0.917662 |
| 4 | 0.105754 | 0.272587 | 0.909722 |
| 5 | 0.069278 | 0.333512 | 0.902355 |
| 6 | 0.047279 | 0.374489 | 0.887363 |
|
damgomz/ft_32_13e6_x4 | damgomz | "2024-06-24T02:42:29Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:56:47Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 59404.27415037155 |
| Emissions (Co2eq in kg) | 0.0359464566714897 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.7012992403136343 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0618789654885728 |
| Consumed energy (kWh) | 0.7631782058022063 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.11435322773946523 |
| Emissions (Co2eq in kg) | 0.023266674042228857 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | damgomz/fp_bs16_lr5_x4 |
| model_name | ft_32_13e6_x4 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1.3e-05 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.696288 | 0.094655 |
| 1 | 0.309722 | 0.218758 | 0.913903 |
| 2 | 0.173660 | 0.217908 | 0.906518 |
| 3 | 0.124395 | 0.223244 | 0.920656 |
| 4 | 0.073899 | 0.262751 | 0.933872 |
| 5 | 0.039802 | 0.328386 | 0.907399 |
| 6 | 0.023364 | 0.409061 | 0.914888 |
|
altamash96/message-classifer | altamash96 | "2024-06-23T10:57:00Z" | 0 | 0 | null | [
"license:mit",
"region:us"
] | null | "2024-06-23T10:57:00Z" | ---
license: mit
---
|
damgomz/ft_32_13e6_base_x8 | damgomz | "2024-06-24T03:01:16Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:57:18Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 60530.468687057495 |
| Emissions (Co2eq in kg) | 0.0366279330673225 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.7145945333186138 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0630520785860717 |
| Consumed energy (kWh) | 0.7776466119046849 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.11652115222258569 |
| Emissions (Co2eq in kg) | 0.023707766902430854 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | albert-base-v2 |
| model_name | ft_32_13e6_base_x8 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1.3e-05 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.703413 | 0.599719 |
| 1 | 0.343454 | 0.234083 | 0.918774 |
| 2 | 0.209892 | 0.227700 | 0.915555 |
| 3 | 0.160849 | 0.231055 | 0.924774 |
| 4 | 0.126931 | 0.250159 | 0.911939 |
| 5 | 0.087303 | 0.325753 | 0.916357 |
| 6 | 0.058676 | 0.369368 | 0.909896 |
|
damgomz/ft_32_16e6_x1 | damgomz | "2024-06-24T07:23:39Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:58:02Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 76273.86081504822 |
| Emissions (Co2eq in kg) | 0.0461544709922368 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.9004531301612628 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0794511392223338 |
| Consumed energy (kWh) | 0.9799042693835982 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.1468271820689678 |
| Emissions (Co2eq in kg) | 0.02987392881922722 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | damgomz/ThunBERT_bs16_lr5_MLM |
| model_name | ft_32_16e6_x1 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1.6e-05 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.696664 | 0.510751 |
| 1 | 0.277794 | 0.199627 | 0.946928 |
| 2 | 0.155206 | 0.209703 | 0.915917 |
| 3 | 0.097123 | 0.240390 | 0.923577 |
| 4 | 0.048558 | 0.258102 | 0.916016 |
| 5 | 0.025158 | 0.292661 | 0.930326 |
| 6 | 0.013297 | 0.377909 | 0.927573 |
|
damgomz/ft_32_16e6_x2 | damgomz | "2024-06-24T07:28:54Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:58:11Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 76589.04898381233 |
| Emissions (Co2eq in kg) | 0.0463451953789473 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.9041740978658196 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0797794355824589 |
| Consumed energy (kWh) | 0.9839535334482776 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.14743391929383873 |
| Emissions (Co2eq in kg) | 0.029997377518659826 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | damgomz/fp_bs16_lr5_x2 |
| model_name | ft_32_16e6_x2 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1.6e-05 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.697933 | 0.332367 |
| 1 | 0.298595 | 0.197205 | 0.934918 |
| 2 | 0.159834 | 0.219332 | 0.916928 |
| 3 | 0.104084 | 0.223367 | 0.939561 |
| 4 | 0.056699 | 0.282072 | 0.914538 |
| 5 | 0.031568 | 0.327818 | 0.917533 |
| 6 | 0.021717 | 0.372366 | 0.933519 |
|
damgomz/ft_32_16e6_base_x2 | damgomz | "2024-06-24T07:26:48Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:58:19Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 76463.57305526733 |
| Emissions (Co2eq in kg) | 0.0462692720003635 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.9026928755258524 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0796487308171892 |
| Consumed energy (kWh) | 0.9823416063430404 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.1471923781313896 |
| Emissions (Co2eq in kg) | 0.029948232779979704 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | albert-base-v2 |
| model_name | ft_32_16e6_base_x2 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1.6e-05 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.737350 | 0.165081 |
| 1 | 0.311884 | 0.222528 | 0.926436 |
| 2 | 0.176962 | 0.214609 | 0.916218 |
| 3 | 0.122318 | 0.246520 | 0.919283 |
| 4 | 0.072663 | 0.304516 | 0.886896 |
| 5 | 0.044193 | 0.331732 | 0.917947 |
| 6 | 0.030915 | 0.362305 | 0.895415 |
|
damgomz/ft_32_16e6_base_x1 | damgomz | "2024-06-24T07:26:47Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:58:23Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 76458.34525895119 |
| Emissions (Co2eq in kg) | 0.0462661062558712 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.9026311467150866 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0796432477960984 |
| Consumed energy (kWh) | 0.9822743945111838 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.14718231462348103 |
| Emissions (Co2eq in kg) | 0.029946185226422548 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | albert-base-v2 |
| model_name | ft_32_16e6_base_x1 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1.6e-05 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.704099 | 0.350817 |
| 1 | 0.317000 | 0.214237 | 0.933752 |
| 2 | 0.184154 | 0.213382 | 0.927406 |
| 3 | 0.137625 | 0.231805 | 0.925672 |
| 4 | 0.103309 | 0.263324 | 0.916410 |
| 5 | 0.065206 | 0.274044 | 0.924229 |
| 6 | 0.047976 | 0.337566 | 0.881619 |
|
SwimChoi/villama2-7b-chat-Slovenia-lora | SwimChoi | "2024-06-23T10:58:29Z" | 0 | 0 | peft | [
"peft",
"safetensors",
"arxiv:1910.09700",
"base_model:meta-llama/Llama-2-7b-chat-hf",
"region:us"
] | null | "2024-06-23T10:58:25Z" | ---
library_name: peft
base_model: meta-llama/Llama-2-7b-chat-hf
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
### Framework versions
- PEFT 0.10.1.dev0
|
damgomz/ft_32_16e6_base_x4 | damgomz | "2024-06-24T07:35:39Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:58:31Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 76993.88854002953 |
| Emissions (Co2eq in kg) | 0.0465901744011517 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.9089535264518518 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0802011496941249 |
| Consumed energy (kWh) | 0.9891546761459789 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.14821323543955683 |
| Emissions (Co2eq in kg) | 0.03015593967817823 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | albert-base-v2 |
| model_name | ft_32_16e6_base_x4 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1.6e-05 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.739663 | 0.407901 |
| 1 | 0.339062 | 0.228771 | 0.909112 |
| 2 | 0.194211 | 0.227631 | 0.942929 |
| 3 | 0.142600 | 0.235747 | 0.901380 |
| 4 | 0.095002 | 0.263570 | 0.920270 |
| 5 | 0.064711 | 0.334313 | 0.924025 |
| 6 | 0.040565 | 0.377199 | 0.918595 |
|
damgomz/ft_32_16e6_x4 | damgomz | "2024-06-24T07:38:10Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:58:44Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 77144.10741114616 |
| Emissions (Co2eq in kg) | 0.046681072769273 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.9107269244251996 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0803576124057173 |
| Consumed energy (kWh) | 0.9910845368309172 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.14850240676645637 |
| Emissions (Co2eq in kg) | 0.03021477540269891 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | damgomz/fp_bs16_lr5_x4 |
| model_name | ft_32_16e6_x4 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1.6e-05 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.712482 | 0.495563 |
| 1 | 0.299999 | 0.232740 | 0.914570 |
| 2 | 0.173250 | 0.210255 | 0.917777 |
| 3 | 0.115689 | 0.240210 | 0.936128 |
| 4 | 0.068289 | 0.289696 | 0.909382 |
| 5 | 0.041385 | 0.308911 | 0.915930 |
| 6 | 0.023509 | 0.373765 | 0.920001 |
|
damgomz/ft_32_18e6_base_x1 | damgomz | "2024-06-24T07:48:32Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:58:55Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 77766.42371606827 |
| Emissions (Co2eq in kg) | 0.0470576441493907 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.9180736756990344 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0810058374618491 |
| Consumed energy (kWh) | 0.9990795131608836 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.14970036565343142 |
| Emissions (Co2eq in kg) | 0.03045851595546007 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | albert-base-v2 |
| model_name | ft_32_18e6_base_x1 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1.8e-05 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.700395 | 0.790555 |
| 1 | 0.318444 | 0.230828 | 0.920868 |
| 2 | 0.189804 | 0.222478 | 0.902729 |
| 3 | 0.161528 | 0.230840 | 0.925825 |
| 4 | 0.115337 | 0.252489 | 0.905116 |
| 5 | 0.078328 | 0.268760 | 0.908056 |
| 6 | 0.056382 | 0.281858 | 0.912528 |
|
damgomz/ft_32_16e6_base_x8 | damgomz | "2024-06-24T07:57:39Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:59:04Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 78313.53327870369 |
| Emissions (Co2eq in kg) | 0.0473887137294402 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.9245326654970666 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0815757765623429 |
| Consumed energy (kWh) | 1.0061084420594089 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.1507535515615046 |
| Emissions (Co2eq in kg) | 0.030672800534158943 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | albert-base-v2 |
| model_name | ft_32_16e6_base_x8 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1.6e-05 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.733605 | 0.521574 |
| 1 | 0.343848 | 0.250949 | 0.927429 |
| 2 | 0.207227 | 0.216056 | 0.905269 |
| 3 | 0.158365 | 0.231901 | 0.928852 |
| 4 | 0.115495 | 0.261195 | 0.920515 |
| 5 | 0.076285 | 0.315937 | 0.903032 |
| 6 | 0.055799 | 0.309163 | 0.927541 |
|
damgomz/ft_32_18e6_x1 | damgomz | "2024-06-24T07:55:32Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:59:10Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 78184.71831536293 |
| Emissions (Co2eq in kg) | 0.0473107595238934 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.9230118314592392 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0814415670650699 |
| Consumed energy (kWh) | 1.0044533985243094 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.15050558275707365 |
| Emissions (Co2eq in kg) | 0.03062234800685048 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | damgomz/ThunBERT_bs16_lr5_MLM |
| model_name | ft_32_18e6_x1 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1.8e-05 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.698141 | 0.206651 |
| 1 | 0.272480 | 0.193901 | 0.939546 |
| 2 | 0.154576 | 0.201308 | 0.942076 |
| 3 | 0.099094 | 0.219778 | 0.934614 |
| 4 | 0.059062 | 0.277845 | 0.919980 |
| 5 | 0.028669 | 0.309169 | 0.928023 |
| 6 | 0.015430 | 0.360813 | 0.917627 |
|
damgomz/ft_32_17e6_x4 | damgomz | "2024-06-24T08:05:46Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:59:15Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 78798.89624452591 |
| Emissions (Co2eq in kg) | 0.0476824025173412 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.9302624583818844 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0820812820342679 |
| Consumed energy (kWh) | 1.0123437404161528 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
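The CodeCarbon figures above can be sanity-checked by hand: assuming a constant power draw, energy in kWh is just power (W) × duration (s) / 3.6e6. The sketch below reproduces this card's CPU and RAM energies to within measurement noise (CodeCarbon integrates over sampling intervals, so the reported values are close to, but not bit-identical with, this back-of-envelope estimate); the consumed energy is exactly the CPU + RAM sum.

```python
# Sanity-check the CodeCarbon table: energy (kWh) = power (W) x time (s) / 3.6e6.
DURATION_S = 78798.89624452591   # "Duration (in seconds)" from the table above
CPU_POWER_W = 42.5
RAM_POWER_W = 3.75

def watts_to_kwh(power_w: float, seconds: float) -> float:
    """Convert a constant power draw over a duration to kilowatt-hours."""
    return power_w * seconds / 3.6e6

cpu_kwh = watts_to_kwh(CPU_POWER_W, DURATION_S)   # ~0.9303, vs reported 0.9302624...
ram_kwh = watts_to_kwh(RAM_POWER_W, DURATION_S)   # ~0.0821, vs reported 0.0820812...
print(cpu_kwh, ram_kwh, cpu_kwh + ram_kwh)
```

The reported "Consumed energy" (1.01234... kWh) is the sum of the reported CPU and RAM energies.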
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.15168787527071237 |
| Emissions (CO2eq in kg) | 0.03086290102910598 |
## Note
June 19, 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | damgomz/fp_bs16_lr5_x4 |
| model_name | ft_32_17e6_x4 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1.7e-05 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.713710 | 0.618507 |
| 1 | 0.306459 | 0.211496 | 0.934590 |
| 2 | 0.172426 | 0.218808 | 0.940250 |
| 3 | 0.117376 | 0.263535 | 0.929094 |
| 4 | 0.067785 | 0.288533 | 0.914189 |
| 5 | 0.043456 | 0.303358 | 0.917680 |
| 6 | 0.030474 | 0.370767 | 0.926662 |
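These fine-tuned ALBERT checkpoints can be loaded with the Hugging Face Transformers `pipeline` API. The helper below is a minimal sketch (the model id is taken from this card; the local import keeps the definition lightweight, and weights are downloaded on first call):

```python
def load_classifier(model_id: str = "damgomz/ft_32_17e6_x4"):
    """Build a text-classification pipeline for one of these ALBERT checkpoints.

    The import is local so merely defining this helper does not require
    transformers to be installed; weights are downloaded on first call.
    """
    from transformers import pipeline
    return pipeline("text-classification", model=model_id)

# Example (downloads the checkpoint on first use):
# clf = load_classifier()
# clf("GEPS Techno is the pioneer of hybridization of renewable energies at sea.")
```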
|
damgomz/ft_32_17e6_base_x4 | damgomz | "2024-06-24T08:05:14Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:59:22Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 78768.32392835617 |
| Emissions (CO2eq in kg) | 0.0476639024126978 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.9299015276362512 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0820494375810027 |
| Consumed energy (kWh) | 1.011950965217253 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.15162902356208563 |
| Emissions (CO2eq in kg) | 0.0308509268719395 |
## Note
June 19, 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | albert-base-v2 |
| model_name | ft_32_17e6_base_x4 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1.7e-05 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.702786 | 0.053460 |
| 1 | 0.315886 | 0.238229 | 0.924759 |
| 2 | 0.193645 | 0.229385 | 0.923752 |
| 3 | 0.136677 | 0.235313 | 0.915139 |
| 4 | 0.086468 | 0.321663 | 0.884123 |
| 5 | 0.060418 | 0.296657 | 0.920084 |
| 6 | 0.041358 | 0.356031 | 0.923231 |
|
damgomz/ft_32_16e6_x8 | damgomz | "2024-06-24T08:01:41Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:59:32Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 78555.43899655342 |
| Emissions (CO2eq in kg) | 0.0475350863926364 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.9273883515700714 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0818277243653932 |
| Consumed energy (kWh) | 1.0092160759354667 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.15121922006836533 |
| Emissions (CO2eq in kg) | 0.030767546940316755 |
## Note
June 19, 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | damgomz/fp_bs16_lr5_x8 |
| model_name | ft_32_16e6_x8 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1.6e-05 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.696512 | 0.495578 |
| 1 | 0.302255 | 0.226014 | 0.918930 |
| 2 | 0.170988 | 0.226232 | 0.908398 |
| 3 | 0.115273 | 0.246663 | 0.921976 |
| 4 | 0.070741 | 0.280451 | 0.918229 |
| 5 | 0.041068 | 0.323950 | 0.912251 |
| 6 | 0.027156 | 0.391745 | 0.899214 |
|
damgomz/ft_32_17e6_x8 | damgomz | "2024-06-24T08:22:56Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T10:59:57Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 79829.65154862404 |
| Emissions (CO2eq in kg) | 0.0483061263122423 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.9424309913574008 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0831550113228459 |
| Consumed energy (kWh) | 1.0255860026802477 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.15367207923110127 |
| Emissions (CO2eq in kg) | 0.031266613523211084 |
## Note
June 19, 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | damgomz/fp_bs16_lr5_x8 |
| model_name | ft_32_17e6_x8 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1.7e-05 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.725172 | 0.488614 |
| 1 | 0.299050 | 0.215116 | 0.928639 |
| 2 | 0.173200 | 0.209131 | 0.928052 |
| 3 | 0.117315 | 0.227597 | 0.930839 |
| 4 | 0.070195 | 0.272285 | 0.916066 |
| 5 | 0.039318 | 0.328803 | 0.928794 |
| 6 | 0.024986 | 0.369566 | 0.909873 |
|
damgomz/ft_32_17e6_base_x8 | damgomz | "2024-06-24T08:23:05Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T11:00:01Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 79839.45226216316 |
| Emissions (CO2eq in kg) | 0.0483120636537122 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.9425468512781836 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0831652069310347 |
| Consumed energy (kWh) | 1.0257120582092154 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.1536909456046641 |
| Emissions (CO2eq in kg) | 0.031270452136013906 |
## Note
June 19, 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | albert-base-v2 |
| model_name | ft_32_17e6_base_x8 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1.7e-05 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.712283 | 0.640911 |
| 1 | 0.326548 | 0.247295 | 0.926940 |
| 2 | 0.206238 | 0.237259 | 0.930069 |
| 3 | 0.159045 | 0.226758 | 0.922457 |
| 4 | 0.117338 | 0.288397 | 0.922248 |
| 5 | 0.077770 | 0.307476 | 0.915256 |
| 6 | 0.056569 | 0.378800 | 0.908784 |
|
damgomz/ft_32_17e6_x12 | damgomz | "2024-06-24T08:32:54Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T11:00:26Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 80426.77821731567 |
| Emissions (CO2eq in kg) | 0.048667475162368 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.9494807264938968 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0837770634122195 |
| Consumed energy (kWh) | 1.033257789906117 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.15482154806833265 |
| Emissions (CO2eq in kg) | 0.0315004881351153 |
## Note
June 19, 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | damgomz/fp_bs16_lr5_x12 |
| model_name | ft_32_17e6_x12 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1.7e-05 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.698811 | 0.596278 |
| 1 | 0.295764 | 0.222173 | 0.927828 |
| 2 | 0.172616 | 0.233945 | 0.924010 |
| 3 | 0.119810 | 0.259121 | 0.915427 |
| 4 | 0.072792 | 0.272543 | 0.914879 |
| 5 | 0.040004 | 0.330694 | 0.929110 |
| 6 | 0.025648 | 0.365092 | 0.922332 |
|
damgomz/ft_32_17e6_base_x12 | damgomz | "2024-06-24T08:38:37Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T11:00:33Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 80765.59874010086 |
| Emissions (CO2eq in kg) | 0.0488724939203312 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.9534805622928684 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0841299750700593 |
| Consumed energy (kWh) | 1.0376105373629272 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.15547377757469413 |
| Emissions (CO2eq in kg) | 0.031633192839872835 |
## Note
June 19, 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | albert-base-v2 |
| model_name | ft_32_17e6_base_x12 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1.7e-05 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.732474 | 0.231398 |
| 1 | 0.375652 | 0.278516 | 0.908269 |
| 2 | 0.221592 | 0.247247 | 0.896841 |
| 3 | 0.184791 | 0.228275 | 0.906988 |
| 4 | 0.152636 | 0.245225 | 0.922379 |
| 5 | 0.113951 | 0.248734 | 0.918360 |
| 6 | 0.084203 | 0.296184 | 0.914779 |
|
damgomz/ft_32_7e6_base_x1 | damgomz | "2024-06-24T08:09:10Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T11:00:41Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 79004.84612965584 |
| Emissions (CO2eq in kg) | 0.0478070245138415 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.9326938064159624 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0822957800862689 |
| Consumed energy (kWh) | 1.0149895865022311 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.15208432879958747 |
| Emissions (CO2eq in kg) | 0.0309435647341152 |
## Note
June 19, 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | albert-base-v2 |
| model_name | ft_32_7e6_base_x1 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 7e-06 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.727935 | 0.566018 |
| 1 | 0.358678 | 0.239306 | 0.930819 |
| 2 | 0.200254 | 0.203475 | 0.928775 |
| 3 | 0.146464 | 0.210787 | 0.921165 |
| 4 | 0.101517 | 0.239812 | 0.920853 |
| 5 | 0.065537 | 0.267798 | 0.910738 |
| 6 | 0.042361 | 0.303130 | 0.932235 |
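The tables in these cards report an F-beta score without stating the beta value, so the metric itself is worth spelling out. The sketch below is a pure-Python version of the general formula (the beta values shown are illustrative, not the ones used for these cards; beta < 1 weights precision more heavily, beta > 1 weights recall):

```python
def fbeta_score(precision: float, recall: float, beta: float = 1.0) -> float:
    """Weighted harmonic mean of precision and recall (reduces to F1 when beta == 1)."""
    if precision == 0.0 and recall == 0.0:
        return 0.0
    b2 = beta * beta
    return (1.0 + b2) * precision * recall / (b2 * precision + recall)

print(fbeta_score(0.9, 0.8))        # beta = 1: plain F1, the harmonic mean
print(fbeta_score(0.9, 0.8, 0.5))   # beta = 0.5: leans toward precision
```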
|
srihari5544/whisper-small-en-scratch-2 | srihari5544 | "2024-06-23T11:14:18Z" | 0 | 0 | transformers | [
"transformers",
"tensorboard",
"safetensors",
"whisper",
"automatic-speech-recognition",
"generated_from_trainer",
"base_model:openai/whisper-small",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | automatic-speech-recognition | "2024-06-23T11:00:52Z" | ---
license: apache-2.0
base_model: openai/whisper-small
tags:
- generated_from_trainer
metrics:
- wer
model-index:
- name: whisper-small-en-scratch-2
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# whisper-small-en-scratch-2
This model is a fine-tuned version of [openai/whisper-small](https://huggingface.co/openai/whisper-small) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.5575
- Wer: 33.9105
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 6
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 100
- num_epochs: 4
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:------:|:----:|:---------------:|:-------:|
| No log | 0.2222 | 2 | 1.8510 | 44.5887 |
| No log | 0.4444 | 4 | 1.8490 | 44.5887 |
| 0.7847 | 0.6667 | 6 | 1.8453 | 44.7330 |
| 0.7847 | 0.8889 | 8 | 1.8406 | 44.7330 |
| 0.7387 | 1.1111 | 10 | 1.8346 | 44.8773 |
| 0.7387 | 1.3333 | 12 | 1.8272 | 45.0216 |
| 0.7387 | 1.5556 | 14 | 1.8185 | 45.4545 |
| 0.7199 | 1.7778 | 16 | 1.8079 | 45.5988 |
| 0.7199 | 2.0 | 18 | 1.7952 | 45.5988 |
| 0.7153 | 2.2222 | 20 | 1.7810 | 45.1659 |
| 0.7153 | 2.4444 | 22 | 1.7643 | 45.0216 |
| 0.7153 | 2.6667 | 24 | 1.7443 | 34.3434 |
| 0.6205 | 2.8889 | 26 | 1.7225 | 34.1991 |
| 0.6205 | 3.1111 | 28 | 1.7001 | 34.1991 |
| 0.4817 | 3.3333 | 30 | 1.6743 | 33.6219 |
| 0.4817 | 3.5556 | 32 | 1.6421 | 33.6219 |
| 0.4817 | 3.7778 | 34 | 1.6030 | 33.9105 |
| 0.3973 | 4.0 | 36 | 1.5575 | 33.9105 |
### Framework versions
- Transformers 4.42.0.dev0
- Pytorch 1.13.1+cu117
- Datasets 2.19.3.dev0
- Tokenizers 0.19.1
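Since this is a standard Whisper fine-tune, it can be used through the Transformers `pipeline` API. A minimal sketch (the audio path is a placeholder; the local import keeps the definition lightweight, and weights download on first call):

```python
def load_asr(model_id: str = "srihari5544/whisper-small-en-scratch-2"):
    """Build an automatic-speech-recognition pipeline for this Whisper checkpoint."""
    from transformers import pipeline  # local import; weights download on first call
    return pipeline("automatic-speech-recognition", model=model_id)

# Example ("sample.wav" is a placeholder audio file):
# asr = load_asr()
# asr("sample.wav")["text"]
```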
|
damgomz/ft_32_7e6_x1 | damgomz | "2024-06-24T08:08:43Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T11:00:53Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 78978.28361129761 |
| Emissions (CO2eq in kg) | 0.0477909535721322 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.9323802866213852 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0822680981715519 |
| Consumed energy (kWh) | 1.0146483847929388 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.15203319595174786 |
| Emissions (CO2eq in kg) | 0.03093316108109156 |
## Note
June 19, 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | damgomz/ThunBERT_bs16_lr5_MLM |
| model_name | ft_32_7e6_x1 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 7e-06 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.703596 | 0.346171 |
| 1 | 0.324011 | 0.236833 | 0.924990 |
| 2 | 0.178597 | 0.196135 | 0.936993 |
| 3 | 0.127945 | 0.200144 | 0.928396 |
| 4 | 0.091947 | 0.225701 | 0.912197 |
| 5 | 0.053299 | 0.244373 | 0.933727 |
| 6 | 0.030095 | 0.281922 | 0.926978 |
|
damgomz/ft_32_7e6_x2 | damgomz | "2024-06-24T08:13:20Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T11:00:55Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 79255.22461080551 |
| Emissions (CO2eq in kg) | 0.047958526166105 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.9356495326735894 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0825565811266503 |
| Consumed energy (kWh) | 1.0182061138002385 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.1525663073758006 |
| Emissions (CO2eq in kg) | 0.031041629639232154 |
## Note
June 19, 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | damgomz/fp_bs16_lr5_x2 |
| model_name | ft_32_7e6_x2 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 7e-06 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.699140 | 0.660155 |
| 1 | 0.363280 | 0.211358 | 0.923057 |
| 2 | 0.177077 | 0.191385 | 0.931281 |
| 3 | 0.131015 | 0.206242 | 0.931328 |
| 4 | 0.087442 | 0.244335 | 0.924667 |
| 5 | 0.048868 | 0.297485 | 0.904074 |
| 6 | 0.029791 | 0.331672 | 0.915473 |
|
SwimChoi/villama2-7b-chat-Cyprus-lora | SwimChoi | "2024-06-23T11:01:12Z" | 0 | 0 | peft | [
"peft",
"safetensors",
"arxiv:1910.09700",
"base_model:meta-llama/Llama-2-7b-chat-hf",
"region:us"
] | null | "2024-06-23T11:01:09Z" | ---
library_name: peft
base_model: meta-llama/Llama-2-7b-chat-hf
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
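The how-to section above is still a placeholder. Under the assumption that this repository holds a standard PEFT LoRA adapter (as the `peft` library tag and `base_model` field suggest), a minimal loading sketch looks like this (both repositories download on first call; access to the gated Llama-2 base model is required):

```python
def load_adapter():
    """Attach this LoRA adapter to its Llama-2 base model.

    Imports are local so defining the helper does not require transformers/peft
    to be installed; weights are downloaded on first call.
    """
    from transformers import AutoModelForCausalLM, AutoTokenizer
    from peft import PeftModel

    base_id = "meta-llama/Llama-2-7b-chat-hf"              # from this card's base_model field
    adapter_id = "SwimChoi/villama2-7b-chat-Cyprus-lora"   # this repository
    tokenizer = AutoTokenizer.from_pretrained(base_id)
    base = AutoModelForCausalLM.from_pretrained(base_id)
    model = PeftModel.from_pretrained(base, adapter_id)
    return tokenizer, model

# Example:
# tokenizer, model = load_adapter()
```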
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
### Framework versions
- PEFT 0.10.1.dev0 |
damgomz/ft_32_7e6_base_x2 | damgomz | "2024-06-24T08:16:55Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T11:01:14Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 79470.73570871353 |
| Emissions (CO2eq in kg) | 0.0480889349404615 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.938193734708262 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.08278108409519 |
| Consumed energy (kWh) | 1.0209748188034555 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.15298116623927355 |
| Emissions (CO2eq in kg) | 0.031126038152579465 |
## Note
June 19, 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | albert-base-v2 |
| model_name | ft_32_7e6_base_x2 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 7e-06 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.707066 | 0.335475 |
| 1 | 0.340416 | 0.250506 | 0.891045 |
| 2 | 0.202082 | 0.236799 | 0.888755 |
| 3 | 0.151717 | 0.224433 | 0.911224 |
| 4 | 0.104353 | 0.256283 | 0.904543 |
| 5 | 0.061494 | 0.298868 | 0.908277 |
| 6 | 0.035558 | 0.332564 | 0.908750 |
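The cards report an F-beta score without stating the beta value, so for illustration here is a generic pure-Python implementation (beta=1 reduces to the familiar F1; the precision/recall numbers in the example are hypothetical, not taken from this model):

```python
def fbeta(precision: float, recall: float, beta: float = 1.0) -> float:
    """Weighted harmonic mean of precision and recall.

    beta > 1 weights recall more heavily; beta < 1 favors precision.
    """
    if precision == 0.0 and recall == 0.0:
        return 0.0
    b2 = beta * beta
    return (1 + b2) * precision * recall / (b2 * precision + recall)

# Hypothetical classifier with precision 0.90 and recall 0.93.
print(round(fbeta(0.90, 0.93), 4))          # F1
print(round(fbeta(0.90, 0.93, beta=2), 4))  # recall-weighted F2
```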
|
damgomz/ft_32_7e6_base_x4 | damgomz | "2024-06-24T08:22:26Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T11:01:36Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low-power platforms
WAVEPEAL enabled us to scale the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 79800.8502380848 |
| Emissions (Co2eq in kg) | 0.0482886922008511 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.94209092052132 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.083124939032644 |
| Consumed energy (kWh) | 1.0252158595539629 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.15361663670831321 |
| Emissions (Co2eq in kg) | 0.03125533300991654 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | albert-base-v2 |
| model_name | ft_32_7e6_base_x4 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 7e-06 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
Epoch | Train Loss | Test Loss | F-beta Score
---|---|---|---
| 0 | 0.000000 | 0.718756 | 0.500131 |
| 1 | 0.353712 | 0.241377 | 0.916281 |
| 2 | 0.213126 | 0.211512 | 0.917583 |
| 3 | 0.161509 | 0.222965 | 0.923510 |
| 4 | 0.125749 | 0.234407 | 0.927739 |
| 5 | 0.080237 | 0.272451 | 0.918019 |
| 6 | 0.049158 | 0.324561 | 0.907007 |
|
damgomz/ft_32_7e6_x4 | damgomz | "2024-06-24T08:28:24Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T11:01:45Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low-power platforms
WAVEPEAL enabled us to scale the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 80159.37039136887 |
| Emissions (Co2eq in kg) | 0.0485056404355235 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.946323495596483 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0834983857971926 |
| Consumed energy (kWh) | 1.0298218813936757 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.15430678800338507 |
| Emissions (Co2eq in kg) | 0.03139575340328614 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | damgomz/fp_bs16_lr5_x4 |
| model_name | ft_32_7e6_x4 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 7e-06 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
Epoch | Train Loss | Test Loss | F-beta Score
---|---|---|---
| 0 | 0.000000 | 0.714926 | 0.621487 |
| 1 | 0.349946 | 0.245496 | 0.917076 |
| 2 | 0.194886 | 0.215978 | 0.924458 |
| 3 | 0.148132 | 0.220138 | 0.924500 |
| 4 | 0.112454 | 0.221735 | 0.927324 |
| 5 | 0.074575 | 0.272739 | 0.915914 |
| 6 | 0.043918 | 0.300753 | 0.916970 |
|
damgomz/ft_32_17e6_x2 | damgomz | "2024-06-24T08:19:05Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T11:01:50Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low-power platforms
WAVEPEAL enabled us to scale the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 79599.91120457649 |
| Emissions (Co2eq in kg) | 0.0481670780880689 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.9397184047894326 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0829154689726731 |
| Consumed energy (kWh) | 1.0226338737621052 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.15322982906880975 |
| Emissions (Co2eq in kg) | 0.031176631888459122 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | damgomz/fp_bs16_lr5_x2 |
| model_name | ft_32_17e6_x2 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1.7e-05 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
Epoch | Train Loss | Test Loss | F-beta Score
---|---|---|---
| 0 | 0.000000 | 0.705081 | 0.501642 |
| 1 | 0.301390 | 0.212497 | 0.932608 |
| 2 | 0.161391 | 0.201730 | 0.930413 |
| 3 | 0.109724 | 0.211987 | 0.929595 |
| 4 | 0.056979 | 0.286972 | 0.941885 |
| 5 | 0.034072 | 0.319155 | 0.931504 |
| 6 | 0.018354 | 0.374343 | 0.921303 |
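In this run the test loss bottoms out at epoch 2 while the F-beta score peaks later (0.941885 at epoch 4), so which checkpoint counts as "best" depends on the selection criterion. A minimal sketch of picking the checkpoint by minimum test loss, with the losses copied from the table above:

```python
# Test losses per epoch from the card (epoch 0 is the pre-fine-tuning eval).
test_loss = [0.705081, 0.212497, 0.201730, 0.211987, 0.286972, 0.319155, 0.374343]

# Index of the epoch with the lowest test loss.
best_epoch = min(range(len(test_loss)), key=test_loss.__getitem__)
print(best_epoch, test_loss[best_epoch])  # 2 0.20173
```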
|
damgomz/ft_32_6e6_x8 | damgomz | "2024-06-24T08:42:57Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T11:02:01Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low-power platforms
WAVEPEAL enabled us to scale the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 81029.61869692802 |
| Emissions (Co2eq in kg) | 0.0490322391045145 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.9565971863961876 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0844048964579901 |
| Consumed energy (kWh) | 1.041002082854176 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.15598201599158645 |
| Emissions (Co2eq in kg) | 0.031736600656296805 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | damgomz/fp_bs16_lr5_x8 |
| model_name | ft_32_6e6_x8 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 6e-06 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
Epoch | Train Loss | Test Loss | F-beta Score
---|---|---|---
| 0 | 0.000000 | 0.696089 | 0.349678 |
| 1 | 0.352716 | 0.238475 | 0.913756 |
| 2 | 0.203627 | 0.220733 | 0.915962 |
| 3 | 0.159582 | 0.217955 | 0.927139 |
| 4 | 0.124792 | 0.237107 | 0.911855 |
| 5 | 0.090320 | 0.270415 | 0.914128 |
| 6 | 0.056342 | 0.299888 | 0.930272 |
|
damgomz/ft_32_3e6_base_x1 | damgomz | "2024-06-24T08:13:34Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T11:02:09Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low-power platforms
WAVEPEAL enabled us to scale the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 79268.96590352058 |
| Emissions (Co2eq in kg) | 0.047966836155913 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.9358116411315084 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0825709018275143 |
| Consumed energy (kWh) | 1.0183825429590248 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.15259275936427713 |
| Emissions (Co2eq in kg) | 0.03104701164554556 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | albert-base-v2 |
| model_name | ft_32_3e6_base_x1 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 3e-06 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
Epoch | Train Loss | Test Loss | F-beta Score
---|---|---|---
| 0 | 0.000000 | 0.709158 | 0.275495 |
| 1 | 0.434058 | 0.312007 | 0.896704 |
| 2 | 0.249555 | 0.246421 | 0.912820 |
| 3 | 0.187484 | 0.217297 | 0.907058 |
| 4 | 0.146818 | 0.226701 | 0.910997 |
| 5 | 0.108932 | 0.238091 | 0.906553 |
| 6 | 0.075487 | 0.250010 | 0.921942 |
|
damgomz/ft_32_6e6_base_x2 | damgomz | "2024-06-24T08:51:29Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T11:02:24Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low-power platforms
WAVEPEAL enabled us to scale the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 81544.04263401031 |
| Emissions (Co2eq in kg) | 0.0493435342201205 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.9626704126213974 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0849407680355013 |
| Consumed energy (kWh) | 1.0476111806568973 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.15697228207046984 |
| Emissions (Co2eq in kg) | 0.03193808336498737 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | albert-base-v2 |
| model_name | ft_32_6e6_base_x2 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 6e-06 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
Epoch | Train Loss | Test Loss | F-beta Score
---|---|---|---
| 0 | 0.000000 | 0.704487 | 0.355798 |
| 1 | 0.341921 | 0.251525 | 0.905432 |
| 2 | 0.204760 | 0.210480 | 0.920269 |
| 3 | 0.155029 | 0.230605 | 0.899533 |
| 4 | 0.109544 | 0.237654 | 0.918624 |
| 5 | 0.065636 | 0.299642 | 0.914945 |
| 6 | 0.036088 | 0.349576 | 0.904576 |
|
damgomz/ft_32_6e6_x1 | damgomz | "2024-06-24T08:45:13Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T11:02:24Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low-power platforms
WAVEPEAL enabled us to scale the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 81167.52666306496 |
| Emissions (Co2eq in kg) | 0.0491156926103222 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.958225337481499 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0845485443962114 |
| Consumed energy (kWh) | 1.0427738818777144 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.15624748882640002 |
| Emissions (Co2eq in kg) | 0.03179061460970044 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | damgomz/ThunBERT_bs16_lr5_MLM |
| model_name | ft_32_6e6_x1 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 6e-06 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
Epoch | Train Loss | Test Loss | F-beta Score
---|---|---|---
| 0 | 0.000000 | 0.708064 | 0.369049 |
| 1 | 0.341608 | 0.213487 | 0.915795 |
| 2 | 0.181108 | 0.197728 | 0.919906 |
| 3 | 0.134788 | 0.205607 | 0.918615 |
| 4 | 0.096347 | 0.213107 | 0.937047 |
| 5 | 0.059694 | 0.242867 | 0.925012 |
| 6 | 0.033484 | 0.280116 | 0.923403 |
|
damgomz/ft_32_2e6_x4 | damgomz | "2024-06-24T08:25:12Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T11:02:29Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low-power platforms
WAVEPEAL enabled us to scale the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 79966.68921732903 |
| Emissions (Co2eq in kg) | 0.0483890442038945 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.9440487454518672 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0832976846429208 |
| Consumed energy (kWh) | 1.027346430094787 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.15393587674335837 |
| Emissions (Co2eq in kg) | 0.03132028661012053 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | damgomz/fp_bs16_lr5_x4 |
| model_name | ft_32_2e6_x4 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 2e-06 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
Epoch | Train Loss | Test Loss | F-beta Score
---|---|---|---
| 0 | 0.000000 | 0.706277 | 0.486549 |
| 1 | 0.466249 | 0.305492 | 0.888577 |
| 2 | 0.260374 | 0.250720 | 0.907067 |
| 3 | 0.210482 | 0.230002 | 0.909887 |
| 4 | 0.181776 | 0.219238 | 0.916580 |
| 5 | 0.157157 | 0.215187 | 0.927132 |
| 6 | 0.137032 | 0.218023 | 0.916048 |
|
damgomz/ft_32_6e6_base_x1 | damgomz | "2024-06-24T08:47:29Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T11:02:34Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low-power platforms
WAVEPEAL enabled us to scale the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 81303.61992526054 |
| Emissions (Co2eq in kg) | 0.0491980418320612 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.9598319297734268 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0846903061039744 |
| Consumed energy (kWh) | 1.0445222358773971 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.15650946835612656 |
| Emissions (Co2eq in kg) | 0.03184391780406038 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | albert-base-v2 |
| model_name | ft_32_6e6_base_x1 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 6e-06 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
Epoch | Train Loss | Test Loss | F-beta Score
---|---|---|---
| 0 | 0.000000 | 0.742219 | 0.760623 |
| 1 | 0.379147 | 0.253497 | 0.905509 |
| 2 | 0.210643 | 0.224062 | 0.935292 |
| 3 | 0.154627 | 0.227763 | 0.930814 |
| 4 | 0.110174 | 0.232698 | 0.911840 |
| 5 | 0.076161 | 0.253944 | 0.917156 |
| 6 | 0.047911 | 0.286995 | 0.921704 |
|
damgomz/ft_32_6e6_x2 | damgomz | "2024-06-24T08:50:57Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T11:02:35Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low-power platforms
WAVEPEAL enabled us to scale the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 81511.28000450134 |
| Emissions (Co2eq in kg) | 0.0493237009703858 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.9622834697269724 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0849066317652662 |
| Consumed energy (kWh) | 1.0471901014922378 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.15690921400866506 |
| Emissions (Co2eq in kg) | 0.031925251335096355 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | damgomz/fp_bs16_lr5_x2 |
| model_name | ft_32_6e6_x2 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 6e-06 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
Epoch | Train Loss | Test Loss | F-beta Score
---|---|---|---
| 0 | 0.000000 | 0.700447 | 0.604789 |
| 1 | 0.380918 | 0.225158 | 0.915259 |
| 2 | 0.185228 | 0.197471 | 0.926696 |
| 3 | 0.142736 | 0.203819 | 0.924086 |
| 4 | 0.102383 | 0.220093 | 0.932317 |
| 5 | 0.065159 | 0.254997 | 0.921107 |
| 6 | 0.038284 | 0.294963 | 0.923176 |
|
damgomz/ft_32_6e6_x12 | damgomz | "2024-06-24T08:55:04Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T11:02:37Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low-power platforms
WAVEPEAL enabled us to scale the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 81758.97024178505 |
| Emissions (Co2eq in kg) | 0.0494735859419808 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.9652076878360556 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0851646172525982 |
| Consumed energy (kWh) | 1.0503723050886578 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.15738601771543623 |
| Emissions (Co2eq in kg) | 0.03202226334469914 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | damgomz/fp_bs16_lr5_x12 |
| model_name | ft_32_6e6_x12 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 6e-06 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
Epoch | Train Loss | Test Loss | F-beta Score
---|---|---|---
| 0 | 0.000000 | 0.704507 | 0.496770 |
| 1 | 0.359602 | 0.251930 | 0.894508 |
| 2 | 0.208636 | 0.213533 | 0.929166 |
| 3 | 0.166296 | 0.224096 | 0.906348 |
| 4 | 0.128563 | 0.232381 | 0.920644 |
| 5 | 0.092740 | 0.257756 | 0.925921 |
| 6 | 0.060820 | 0.297630 | 0.923177 |
|
damgomz/ft_32_6e6_x4 | damgomz | "2024-06-24T09:03:01Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T11:02:47Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low-power platforms
WAVEPEAL enabled us to scale the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 82235.88673067093 |
| Emissions (Co2eq in kg) | 0.0497621632536577 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.9708376775056118 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0856614043325184 |
| Consumed energy (kWh) | 1.056499081838129 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.15830408195654153 |
| Emissions (Co2eq in kg) | 0.03220905563617944 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | damgomz/fp_bs16_lr5_x4 |
| model_name | ft_32_6e6_x4 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 6e-06 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
Epoch | Train Loss | Test Loss | F-beta Score
---|---|---|---
| 0 | 0.000000 | 0.700966 | 0.495226 |
| 1 | 0.351622 | 0.237150 | 0.923190 |
| 2 | 0.194934 | 0.203977 | 0.921933 |
| 3 | 0.148990 | 0.212348 | 0.931770 |
| 4 | 0.109384 | 0.231842 | 0.937159 |
| 5 | 0.078790 | 0.257602 | 0.928530 |
| 6 | 0.048718 | 0.304421 | 0.906031 |
|
damgomz/ft_32_6e6_base_x12 | damgomz | "2024-06-23T22:51:35Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T11:02:48Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low-power platforms
WAVEPEAL enabled us to scale the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | [More Information Needed] |
| Emissions (Co2eq in kg) | [More Information Needed] |
| CPU power (W)            | [No CPU]                       |
| GPU power (W) | [No GPU] |
| RAM power (W) | [More Information Needed] |
| CPU energy (kWh) | [No CPU] |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | [More Information Needed] |
| Consumed energy (kWh) | [More Information Needed] |
| Country name | [More Information Needed] |
| Cloud provider | [No Cloud] |
| Cloud region | [No Cloud] |
| CPU count | [No CPU] |
| CPU model | [No CPU] |
| GPU count | [No GPU] |
| GPU model | [No GPU] |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | [No CPU] |
| Emissions (Co2eq in kg) | [More Information Needed] |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | albert-base-v2 |
| model_name | ft_32_6e6_base_x12 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 6e-06 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
Epoch | Train Loss | Test Loss | F-beta Score
---|---|---|---
| 0 | 0.000000 | 0.724442 | 0.535691 |
| 1 | 0.386584 | 0.296452 | 0.891890 |
| 2 | 0.263757 | 0.269844 | 0.882921 |
| 3 | 0.222154 | 0.248667 | 0.915979 |
| 4 | 0.190100 | 0.235574 | 0.917539 |
|
damgomz/ft_32_1e6_base_x1 | damgomz | "2024-06-24T09:28:08Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T11:02:52Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low-power platforms
WAVEPEAL enabled us to scale the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 83741.29759025574 |
| Emissions (Co2eq in kg) | 0.0506731160404761 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.9886099754863296 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0872295192266502 |
| Consumed energy (kWh) | 1.0758394947129823 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.16120199786124229 |
| Emissions (Co2eq in kg) | 0.032798674889516835 |
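CodeCarbon derives these energy figures from a constant power draw over the run duration (kWh = W × s / 3,600,000). A minimal sanity check in Python, using the duration and power values reported for this run; small deviations from the reported numbers are expected, since CodeCarbon integrates power over sampling intervals:

```python
# Sanity-check the reported energy figures: kWh = watts * seconds / 3_600_000.
duration_s = 83741.29759025574   # Duration (in seconds)
cpu_power_w = 42.5               # CPU power (W)
ram_power_w = 3.75               # RAM power (W)

cpu_kwh = cpu_power_w * duration_s / 3_600_000
ram_kwh = ram_power_w * duration_s / 3_600_000

print(round(cpu_kwh, 4))  # ≈ 0.9886 (reported: 0.9886099754863296)
print(round(ram_kwh, 4))  # ≈ 0.0872 (reported: 0.0872295192266502)
```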
## Note
June 19, 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | albert-base-v2 |
| model_name | ft_32_1e6_base_x1 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1e-06 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
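A rough back-of-the-envelope check on the schedule, assuming `num_steps` spans all `num_epoch` training epochs (the cards do not state this explicitly): 29328 steps over 6 epochs gives 4888 optimizer steps per epoch, which at batch size 32 would correspond to roughly 156k training examples.

```python
# Hypothetical check of the training schedule above.
# Assumes num_steps covers all num_epoch epochs (not stated in the card).
num_steps = 29328
num_epoch = 6
batch_size = 32

steps_per_epoch = num_steps // num_epoch
approx_train_examples = steps_per_epoch * batch_size

print(steps_per_epoch)        # 4888
print(approx_train_examples)  # 156416
```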
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.801658 | 0.334963 |
| 1 | 0.565470 | 0.464311 | 0.856138 |
| 2 | 0.406829 | 0.365098 | 0.884232 |
| 3 | 0.316148 | 0.301531 | 0.881326 |
| 4 | 0.255616 | 0.268551 | 0.910092 |
| 5 | 0.212830 | 0.246640 | 0.913674 |
| 6 | 0.181439 | 0.245415 | 0.905188 |
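The F-beta score in the last column generalizes F1 by weighting recall against precision. A minimal sketch of the metric; the β these cards use is an assumption, since they do not state it (β = 1 reduces to F1):

```python
# F-beta from precision and recall: (1 + b^2) * P * R / (b^2 * P + R).
# The beta used by these cards is not stated; beta=1 reduces to F1.
def fbeta(precision: float, recall: float, beta: float = 1.0) -> float:
    if precision == 0.0 and recall == 0.0:
        return 0.0
    b2 = beta * beta
    return (1 + b2) * precision * recall / (b2 * precision + recall)

print(fbeta(0.9, 0.8))            # harmonic mean of P and R
print(fbeta(0.9, 0.8, beta=2.0))  # recall-weighted, pulled toward 0.8
```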
damgomz/ft_32_6e6_base_x4 | damgomz | "2024-06-24T08:58:14Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T11:02:57Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 81948.49645328522 |
| Emissions (Co2eq in kg) | 0.0495882706839072 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.967445109701324 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.085362063902368 |
| Consumed energy (kWh) | 1.052807173603686 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.15775085567257402 |
| Emissions (Co2eq in kg) | 0.03209649444420338 |
## Note
June 19, 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | albert-base-v2 |
| model_name | ft_32_6e6_base_x4 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 6e-06 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.709260 | 0.345444 |
| 1 | 0.352131 | 0.265789 | 0.904976 |
| 2 | 0.216744 | 0.220877 | 0.917889 |
| 3 | 0.166817 | 0.221011 | 0.913739 |
| 4 | 0.127526 | 0.226639 | 0.931894 |
| 5 | 0.084004 | 0.271546 | 0.922001 |
| 6 | 0.059928 | 0.307502 | 0.899673 |
damgomz/ft_32_19e6_base_x1 | damgomz | "2024-06-24T08:51:09Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T11:02:58Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 81521.76889538765 |
| Emissions (Co2eq in kg) | 0.0493300581815189 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.9624074856905492 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0849175855716069 |
| Consumed energy (kWh) | 1.047325071262159 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.15692940512362122 |
| Emissions (Co2eq in kg) | 0.03192935948402683 |
## Note
June 19, 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | albert-base-v2 |
| model_name | ft_32_19e6_base_x1 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1.9e-05 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.697413 | 0.498850 |
| 1 | 0.309828 | 0.229587 | 0.914338 |
| 2 | 0.185570 | 0.260364 | 0.896540 |
| 3 | 0.144719 | 0.223334 | 0.931744 |
| 4 | 0.100979 | 0.234904 | 0.919179 |
| 5 | 0.074212 | 0.260365 | 0.928091 |
| 6 | 0.052888 | 0.309077 | 0.922794 |
damgomz/ft_32_2e6_x2 | damgomz | "2024-06-24T08:33:36Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T11:03:00Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 80470.50708627701 |
| Emissions (Co2eq in kg) | 0.0486939097815886 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.9499965548243796 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0838224677354101 |
| Consumed energy (kWh) | 1.033819022559789 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.15490572614108322 |
| Emissions (Co2eq in kg) | 0.031517615275458495 |
## Note
June 19, 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | damgomz/fp_bs16_lr5_x2 |
| model_name | ft_32_2e6_x2 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 2e-06 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.704877 | 0.786814 |
| 1 | 0.542718 | 0.339129 | 0.885753 |
| 2 | 0.267687 | 0.239245 | 0.910360 |
| 3 | 0.206070 | 0.218004 | 0.907588 |
| 4 | 0.176140 | 0.204541 | 0.932401 |
| 5 | 0.152299 | 0.204936 | 0.930888 |
| 6 | 0.132149 | 0.207065 | 0.925418 |
damgomz/ft_32_5e6_x1 | damgomz | "2024-06-24T09:31:22Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T11:03:02Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 83936.26286959648 |
| Emissions (Co2eq in kg) | 0.0507910899241483 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.9909116231575594 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0874325717459121 |
| Consumed energy (kWh) | 1.078344194903471 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.16157730602397322 |
| Emissions (Co2eq in kg) | 0.03287503629059195 |
## Note
June 19, 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | damgomz/ThunBERT_bs16_lr5_MLM |
| model_name | ft_32_5e6_x1 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 5e-06 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.717080 | 0.404168 |
| 1 | 0.370401 | 0.234197 | 0.926642 |
| 2 | 0.194178 | 0.193134 | 0.931831 |
| 3 | 0.149313 | 0.206890 | 0.920385 |
| 4 | 0.111487 | 0.208223 | 0.928962 |
| 5 | 0.076829 | 0.224699 | 0.919291 |
| 6 | 0.044347 | 0.252501 | 0.924167 |
damgomz/ft_32_19e6_x2 | damgomz | "2024-06-24T08:54:50Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T11:03:02Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 81744.97626900673 |
| Emissions (Co2eq in kg) | 0.049465129258145 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.9650425936813166 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0851501677917937 |
| Consumed energy (kWh) | 1.0501927614731128 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.15735907931783794 |
| Emissions (Co2eq in kg) | 0.03201678237202763 |
## Note
June 19, 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | damgomz/fp_bs16_lr5_x2 |
| model_name | ft_32_19e6_x2 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1.9e-05 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.727473 | 0.165280 |
| 1 | 0.305742 | 0.209376 | 0.944935 |
| 2 | 0.163984 | 0.199954 | 0.931879 |
| 3 | 0.102745 | 0.261541 | 0.917667 |
| 4 | 0.060469 | 0.274702 | 0.918155 |
| 5 | 0.034150 | 0.323645 | 0.931082 |
| 6 | 0.026382 | 0.337212 | 0.919652 |
damgomz/ft_32_19e6_x1 | damgomz | "2024-06-24T08:56:42Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T11:03:02Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 81857.15504932404 |
| Emissions (Co2eq in kg) | 0.0495329941011243 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.9663666981958676 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0852668998407818 |
| Consumed energy (kWh) | 1.0516335980366511 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.15757502346994876 |
| Emissions (Co2eq in kg) | 0.03206071906098524 |
## Note
June 19, 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | damgomz/ThunBERT_bs16_lr5_MLM |
| model_name | ft_32_19e6_x1 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1.9e-05 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.711379 | 0.730108 |
| 1 | 0.276182 | 0.191061 | 0.935506 |
| 2 | 0.154496 | 0.223484 | 0.916736 |
| 3 | 0.095163 | 0.227535 | 0.939098 |
| 4 | 0.051238 | 0.283583 | 0.934291 |
| 5 | 0.023429 | 0.328065 | 0.918598 |
| 6 | 0.019536 | 0.370221 | 0.923137 |
damgomz/ft_32_5e6_base_x2 | damgomz | "2024-06-24T09:36:02Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T11:03:04Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 84216.49479651451 |
| Emissions (Co2eq in kg) | 0.0509606668864246 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.9942199889325468 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0877244896650316 |
| Consumed energy (kWh) | 1.0819444785975798 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.1621167524832904 |
| Emissions (Co2eq in kg) | 0.03298479379530151 |
## Note
June 19, 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | albert-base-v2 |
| model_name | ft_32_5e6_base_x2 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 5e-06 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.723977 | 0.220560 |
| 1 | 0.358270 | 0.253965 | 0.922467 |
| 2 | 0.210130 | 0.225206 | 0.917343 |
| 3 | 0.156564 | 0.218143 | 0.935316 |
| 4 | 0.115515 | 0.240640 | 0.923552 |
| 5 | 0.070426 | 0.274872 | 0.918056 |
| 6 | 0.038166 | 0.319556 | 0.915421 |
damgomz/ft_32_1e6_x2 | damgomz | "2024-06-24T09:29:48Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T11:03:07Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 83842.70004177094 |
| Emissions (Co2eq in kg) | 0.050734471040808 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.9898069643634896 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0873351566277441 |
| Consumed energy (kWh) | 1.0771421209912329 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.16139719758040905 |
| Emissions (Co2eq in kg) | 0.032838390849693616 |
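Dividing the reported emissions by the consumed energy recovers the grid carbon intensity CodeCarbon applied, roughly 0.047 kg CO2eq per kWh here, consistent with a low-carbon Swiss grid. A quick check with this card's numbers:

```python
# Back out the carbon intensity used for this run: kg CO2eq per kWh.
emissions_kg = 0.050734471040808    # Emissions (Co2eq in kg)
consumed_kwh = 1.0771421209912329   # Consumed energy (kWh)

intensity = emissions_kg / consumed_kwh
print(round(intensity, 4))  # ≈ 0.0471 kg CO2eq per kWh
```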
## Note
June 19, 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | damgomz/fp_bs16_lr5_x2 |
| model_name | ft_32_1e6_x2 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1e-06 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.699989 | 0.331379 |
| 1 | 0.665445 | 0.571562 | 0.837237 |
| 2 | 0.401150 | 0.314154 | 0.883665 |
| 3 | 0.269103 | 0.247632 | 0.913446 |
| 4 | 0.222599 | 0.227815 | 0.919434 |
| 5 | 0.197380 | 0.213939 | 0.925075 |
| 6 | 0.180440 | 0.206377 | 0.925302 |
damgomz/ft_32_19e6_base_x2 | damgomz | "2024-06-24T08:59:31Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T11:03:08Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 82016.95650601387 |
| Emissions (Co2eq in kg) | 0.0496296998664142 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.9682533590194252 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0854333963873485 |
| Consumed energy (kWh) | 1.05368675540677 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.1578826412740767 |
| Emissions (Co2eq in kg) | 0.03212330796485543 |
## Note
June 19, 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | albert-base-v2 |
| model_name | ft_32_19e6_base_x2 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1.9e-05 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.784040 | 0.495607 |
| 1 | 0.318247 | 0.241446 | 0.934972 |
| 2 | 0.182249 | 0.210981 | 0.923626 |
| 3 | 0.126879 | 0.228292 | 0.908654 |
| 4 | 0.076337 | 0.313764 | 0.928976 |
| 5 | 0.054462 | 0.306271 | 0.910717 |
| 6 | 0.038489 | 0.326108 | 0.918542 |
damgomz/ft_32_1e6_x1 | damgomz | "2024-06-24T09:30:43Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T11:03:12Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 83898.19526410103 |
| Emissions (Co2eq in kg) | 0.0507680587805448 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.9904622792965836 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0873929420478643 |
| Consumed energy (kWh) | 1.0778552213444477 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.16150402588339446 |
| Emissions (Co2eq in kg) | 0.03286012647843957 |
## Note
June 19, 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | damgomz/ThunBERT_bs16_lr5_MLM |
| model_name | ft_32_1e6_x1 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1e-06 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.696478 | 0.232135 |
| 1 | 0.552451 | 0.438652 | 0.874495 |
| 2 | 0.374388 | 0.330575 | 0.893278 |
| 3 | 0.284570 | 0.267519 | 0.913257 |
| 4 | 0.233110 | 0.236401 | 0.920771 |
| 5 | 0.202751 | 0.223335 | 0.908936 |
| 6 | 0.182598 | 0.212633 | 0.923205 |
damgomz/ft_32_5e6_base_x1 | damgomz | "2024-06-24T09:34:47Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T11:03:12Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 84141.62449789047 |
| Emissions (Co2eq in kg) | 0.0509153560592317 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.9933359973132604 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0876464883926012 |
| Consumed energy (kWh) | 1.080982485705861 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.16197262715843916 |
| Emissions (Co2eq in kg) | 0.0329554695950071 |
## Note
June 19, 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | albert-base-v2 |
| model_name | ft_32_5e6_base_x1 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 5e-06 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.730071 | 0.550302 |
| 1 | 0.374210 | 0.248296 | 0.901349 |
| 2 | 0.206695 | 0.234511 | 0.913291 |
| 3 | 0.152858 | 0.228780 | 0.930295 |
| 4 | 0.111139 | 0.244988 | 0.917061 |
| 5 | 0.070761 | 0.259285 | 0.911558 |
| 6 | 0.046198 | 0.291387 | 0.909853 |
damgomz/ft_32_5e6_x2 | damgomz | "2024-06-24T09:37:04Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T11:03:13Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 84278.55794286728 |
| Emissions (Co2eq in kg) | 0.0509982140895369 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.9949525065354152 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0877891356704137 |
| Consumed energy (kWh) | 1.0827416422058342 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.1622362240400195 |
| Emissions (Co2eq in kg) | 0.03300910186095635 |
## Note
June 19, 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | damgomz/fp_bs16_lr5_x2 |
| model_name | ft_32_5e6_x2 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 5e-06 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.717108 | 0.666354 |
| 1 | 0.412750 | 0.241664 | 0.900959 |
| 2 | 0.198323 | 0.198621 | 0.929248 |
| 3 | 0.150829 | 0.204761 | 0.916151 |
| 4 | 0.114953 | 0.221281 | 0.935168 |
| 5 | 0.078120 | 0.246305 | 0.926568 |
| 6 | 0.048869 | 0.282362 | 0.922933 |
damgomz/ft_32_19e6_x4 | damgomz | "2024-06-24T09:41:17Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T11:03:19Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 84531.86149334908 |
| Emissions (Co2eq in kg) | 0.0511514947529426 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.9979429230343956 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0880530166047314 |
| Consumed energy (kWh) | 1.08599593963913 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.16272383337469695 |
| Emissions (Co2eq in kg) | 0.03310831241822838 |
## Note
June 19, 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | damgomz/fp_bs16_lr5_x4 |
| model_name | ft_32_19e6_x4 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1.9e-05 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.707783 | 0.637775 |
| 1 | 0.311302 | 0.222628 | 0.929990 |
| 2 | 0.169911 | 0.215388 | 0.939025 |
| 3 | 0.110805 | 0.250348 | 0.918396 |
| 4 | 0.063846 | 0.282927 | 0.928378 |
| 5 | 0.041853 | 0.303873 | 0.926867 |
| 6 | 0.023700 | 0.414345 | 0.931528 |
damgomz/ft_32_19e6_base_x4 | damgomz | "2024-06-24T09:08:01Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T11:03:23Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 82536.04255104065 |
| Emissions (Co2eq in kg) | 0.0499437995576998 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.9743813026997792 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.085974094376713 |
| Consumed energy (kWh) | 1.060355397076493 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.15888188191075323 |
| Emissions (Co2eq in kg) | 0.03232661666582425 |
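The energy figures in these CodeCarbon tables follow the usual constant-power relation, energy (kWh) = power (W) × duration (s) / 3.6×10⁶ J per kWh. A minimal sketch checking the numbers in the table above (values copied from the table; CodeCarbon integrates power over sampling intervals, so tiny deviations are expected):

```python
# Values from the "Environmental Impact (CODE CARBON DEFAULT)" table above.
duration_s = 82536.04255104065
cpu_power_w = 42.5
ram_power_w = 3.75

cpu_kwh = cpu_power_w * duration_s / 3.6e6
ram_kwh = ram_power_w * duration_s / 3.6e6
total_kwh = cpu_kwh + ram_kwh

print(round(cpu_kwh, 4))    # ≈ 0.9744 (table: 0.9743813...)
print(round(total_kwh, 4))  # ≈ 1.0604 (table: 1.0603553...)
```

Dividing the reported emissions (0.0499 kg CO2eq) by the consumed energy (1.060 kWh) gives roughly 0.047 kg CO2eq/kWh, a plausible grid carbon intensity for Switzerland.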
## Note
June 19, 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | albert-base-v2 |
| model_name | ft_32_19e6_base_x4 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1.9e-05 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.716812 | 0.321399 |
| 1 | 0.316622 | 0.222771 | 0.923284 |
| 2 | 0.188987 | 0.254623 | 0.899390 |
| 3 | 0.140403 | 0.233587 | 0.925130 |
| 4 | 0.091719 | 0.278762 | 0.922702 |
| 5 | 0.055096 | 0.395175 | 0.882975 |
| 6 | 0.044405 | 0.355451 | 0.910667 |
|
buddhadilesh/my-pet-dog-srg | buddhadilesh | "2024-06-23T11:33:48Z" | 0 | 0 | null | [
"Text-to-Image",
" Diffusers",
" Safetensors ",
" StableDiffusionPipeline ",
"NxtWave-GenAI-Webinar ",
"stable-diffusion ",
"license:creativeml-openrail-m",
"region:us"
] | null | "2024-06-23T11:03:24Z" | ---
license: creativeml-openrail-m
tags:
- Text-to-Image
- Diffusers
- Safetensors
- StableDiffusionPipeline
- NxtWave-GenAI-Webinar
- stable-diffusion
---
### Dog Dreambooth model trained by buddhadilesh following the "Build your own Gen AI model" session by NxtWave.
Project Submission Code: 22L61A0585
Sample pictures of this concept:
|
damgomz/ft_32_5e6_base_x4 | damgomz | "2024-06-24T09:43:21Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T11:03:24Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 84655.95785808563 |
| Emissions (Co2eq in kg) | 0.0512265887715501 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.9994080464348196 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0881822121913236 |
| Consumed energy (kWh) | 1.0875902586261474 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.16296271887681482 |
| Emissions (Co2eq in kg) | 0.0331569168277502 |
## Note
June 19, 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | albert-base-v2 |
| model_name | ft_32_5e6_base_x4 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 5e-06 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.726668 | 0.334553 |
| 1 | 0.362076 | 0.270883 | 0.878418 |
| 2 | 0.223458 | 0.230738 | 0.894364 |
| 3 | 0.178340 | 0.232899 | 0.913810 |
| 4 | 0.141190 | 0.229558 | 0.913377 |
| 5 | 0.103393 | 0.273028 | 0.896759 |
| 6 | 0.073538 | 0.281520 | 0.908772 |
|
khongtrunght/khongtrunght | khongtrunght | "2024-06-23T11:03:24Z" | 0 | 0 | null | [
"region:us"
] | null | "2024-06-23T11:03:24Z" | Entry not found |
damgomz/ft_32_5e6_x4 | damgomz | "2024-06-24T09:42:18Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T11:03:26Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 84593.03540968895 |
| Emissions (Co2eq in kg) | 0.0511885078659687 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.9986650677652844 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0881166962305703 |
| Consumed energy (kWh) | 1.0867817639958537 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.16284159316365124 |
| Emissions (Co2eq in kg) | 0.03313227220212817 |
## Note
June 19, 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | damgomz/fp_bs16_lr5_x4 |
| model_name | ft_32_5e6_x4 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 5e-06 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.726340 | 0.332935 |
| 1 | 0.366748 | 0.236639 | 0.913753 |
| 2 | 0.202514 | 0.216440 | 0.925653 |
| 3 | 0.158706 | 0.207701 | 0.930613 |
| 4 | 0.125466 | 0.231257 | 0.904056 |
| 5 | 0.097189 | 0.246146 | 0.928232 |
| 6 | 0.065838 | 0.261989 | 0.925164 |
|
damgomz/ft_32_6e6_base_x8 | damgomz | "2024-06-24T09:20:21Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T11:03:30Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 83275.29592871666 |
| Emissions (Co2eq in kg) | 0.0503911391020655 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.9831087263792726 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0867441239225366 |
| Consumed energy (kWh) | 1.06985285030181 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.16030494466277959 |
| Emissions (Co2eq in kg) | 0.03261615757208069 |
## Note
June 19, 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | albert-base-v2 |
| model_name | ft_32_6e6_base_x8 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 6e-06 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.718706 | 0.428597 |
| 1 | 0.368201 | 0.260952 | 0.920472 |
| 2 | 0.229549 | 0.232955 | 0.919445 |
| 3 | 0.191444 | 0.227946 | 0.900556 |
| 4 | 0.155176 | 0.238332 | 0.898591 |
| 5 | 0.121049 | 0.265901 | 0.924556 |
| 6 | 0.089754 | 0.278236 | 0.909430 |
|
SwimChoi/villama2-7b-chat-Sweden-lora | SwimChoi | "2024-06-23T11:03:52Z" | 0 | 0 | peft | [
"peft",
"safetensors",
"arxiv:1910.09700",
"base_model:meta-llama/Llama-2-7b-chat-hf",
"region:us"
] | null | "2024-06-23T11:03:49Z" | ---
library_name: peft
base_model: meta-llama/Llama-2-7b-chat-hf
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
### Framework versions
- PEFT 0.10.1.dev0 |
tapan247/Llama-2-7b-chat-finetune | tapan247 | "2024-06-23T11:15:44Z" | 0 | 0 | transformers | [
"transformers",
"pytorch",
"llama",
"text-generation",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | text-generation | "2024-06-23T11:04:02Z" | Entry not found |
damgomz/ft_32_5e6_base_x8 | damgomz | "2024-06-24T10:00:49Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T11:04:02Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 85703.25652909279 |
| Emissions (Co2eq in kg) | 0.051860334927213 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 1.0117721012261205 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0892732041222352 |
| Consumed energy (kWh) | 1.1010453053483589 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.16497876881850362 |
| Emissions (Co2eq in kg) | 0.03356710880722801 |
## Note
June 19, 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | albert-base-v2 |
| model_name | ft_32_5e6_base_x8 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 5e-06 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
Epoch | Train Loss | Test Loss | F-beta Score
---|---|---|---
| 0 | 0.000000 | 0.718306 | 0.012435 |
| 1 | 0.368546 | 0.267537 | 0.921596 |
| 2 | 0.228472 | 0.228982 | 0.898315 |
| 3 | 0.185974 | 0.233243 | 0.933199 |
| 4 | 0.158744 | 0.224832 | 0.909920 |
| 5 | 0.131466 | 0.234063 | 0.926528 |
| 6 | 0.098023 | 0.270619 | 0.917607 |
|
damgomz/ft_32_19e6_x8 | damgomz | "2024-06-24T09:59:59Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T11:04:04Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 85653.45368814468 |
| Emissions (Co2eq in kg) | 0.0518301931545655 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 1.0111840223819015 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0892213438431421 |
| Consumed energy (kWh) | 1.1004053662250386 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.1648828983496785 |
| Emissions (Co2eq in kg) | 0.03354760269452334 |
## Note
June 19, 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | damgomz/fp_bs16_lr5_x8 |
| model_name | ft_32_19e6_x8 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1.9e-05 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
Epoch | Train Loss | Test Loss | F-beta Score
---|---|---|---
| 0 | 0.000000 | 0.702052 | 0.497521 |
| 1 | 0.298530 | 0.223112 | 0.928086 |
| 2 | 0.168823 | 0.224202 | 0.920357 |
| 3 | 0.112629 | 0.247916 | 0.919562 |
| 4 | 0.065288 | 0.281048 | 0.936415 |
| 5 | 0.037261 | 0.348494 | 0.896289 |
| 6 | 0.025527 | 0.416870 | 0.930228 |
|
damgomz/ft_32_18e6_base_x8 | damgomz | "2024-06-24T09:26:48Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T11:04:07Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 83661.49552226067 |
| Emissions (Co2eq in kg) | 0.0506248307469959 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.9876679645523444 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0871463864592213 |
| Consumed energy (kWh) | 1.0748143510115693 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.16104837888035178 |
| Emissions (Co2eq in kg) | 0.03276741907955209 |
## Note
June 19, 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | albert-base-v2 |
| model_name | ft_32_18e6_base_x8 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1.8e-05 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.729164 | 0.515696 |
| 1 | 0.327088 | 0.233053 | 0.922262 |
| 2 | 0.200008 | 0.226412 | 0.926072 |
| 3 | 0.154432 | 0.243826 | 0.923200 |
| 4 | 0.107959 | 0.308675 | 0.907737 |
| 5 | 0.078035 | 0.288907 | 0.927639 |
| 6 | 0.048632 | 0.382458 | 0.909777 |
|
damgomz/ft_32_5e6_x12 | damgomz | "2024-06-23T21:36:37Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T11:04:15Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | [More Information Needed] |
| Emissions (Co2eq in kg) | [More Information Needed] |
| CPU power (W) | [No CPU] |
| GPU power (W) | [No GPU] |
| RAM power (W) | [More Information Needed] |
| CPU energy (kWh) | [No CPU] |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | [More Information Needed] |
| Consumed energy (kWh) | [More Information Needed] |
| Country name | [More Information Needed] |
| Cloud provider | [No Cloud] |
| Cloud region | [No Cloud] |
| CPU count | [No CPU] |
| CPU model | [No CPU] |
| GPU count | [No GPU] |
| GPU model | [No GPU] |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | [No CPU] |
| Emissions (Co2eq in kg) | [More Information Needed] |
## Note
June 19, 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | damgomz/fp_bs16_lr5_x12 |
| model_name | ft_32_5e6_x12 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 5e-06 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.701135 | 0.039939 |
| 1 | 0.374848 | 0.262630 | 0.917082 |
| 2 | 0.223393 | 0.223830 | 0.924577 |
|
damgomz/ft_32_5e6_base_x12 | damgomz | "2024-06-24T09:37:34Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T11:04:15Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 84307.6551721096 |
| Emissions (Co2eq in kg) | 0.0510158327103399 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.99529624085625 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0878194617899756 |
| Consumed energy (kWh) | 1.0831157026462264 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.16229223620631097 |
| Emissions (Co2eq in kg) | 0.033020498275742924 |
## Note
June 19, 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | albert-base-v2 |
| model_name | ft_32_5e6_base_x12 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 5e-06 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.718784 | 0.668353 |
| 1 | 0.402769 | 0.321245 | 0.851871 |
| 2 | 0.282093 | 0.276767 | 0.878442 |
| 3 | 0.234059 | 0.246080 | 0.916013 |
| 4 | 0.203769 | 0.232428 | 0.910324 |
| 5 | 0.177173 | 0.232535 | 0.919689 |
| 6 | 0.155451 | 0.230514 | 0.917727 |
|
damgomz/ft_32_2e6_x8 | damgomz | "2024-06-24T09:04:02Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T11:04:16Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 82297.30823588371 |
| Emissions (Co2eq in kg) | 0.0497993389093254 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.9715629698327852 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0857253873003026 |
| Consumed energy (kWh) | 1.0572883571330844 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.15842231835407614 |
| Emissions (Co2eq in kg) | 0.03223311239238779 |
## Note
June 19, 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | damgomz/fp_bs16_lr5_x8 |
| model_name | ft_32_2e6_x8 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 2e-06 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.696429 | 0.619710 |
| 1 | 0.466246 | 0.305018 | 0.875728 |
| 2 | 0.267149 | 0.252896 | 0.906483 |
| 3 | 0.223972 | 0.232961 | 0.912724 |
| 4 | 0.196987 | 0.223361 | 0.913181 |
| 5 | 0.176798 | 0.225579 | 0.920709 |
| 6 | 0.161701 | 0.231319 | 0.918238 |
|
damgomz/ft_32_5e6_x8 | damgomz | "2024-06-24T10:04:19Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T11:04:18Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 85913.62610125542 |
| Emissions (Co2eq in kg) | 0.0519876239158291 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 1.0142554613759092 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0894923128927751 |
| Consumed energy (kWh) | 1.10374777426868 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.16538373024491668 |
| Emissions (Co2eq in kg) | 0.03364950355632504 |
## Note
June 19, 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | damgomz/fp_bs16_lr5_x8 |
| model_name | ft_32_5e6_x8 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 5e-06 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.704706 | 0.165325 |
| 1 | 0.372469 | 0.276302 | 0.883169 |
| 2 | 0.216595 | 0.220360 | 0.921653 |
| 3 | 0.174007 | 0.211170 | 0.935329 |
| 4 | 0.140274 | 0.217598 | 0.934812 |
| 5 | 0.107311 | 0.253637 | 0.894713 |
| 6 | 0.078157 | 0.265494 | 0.921922 |
|
damgomz/ft_32_19e6_base_x8 | damgomz | "2024-06-24T09:57:57Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T11:04:25Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 85531.66389083862 |
| Emissions (Co2eq in kg) | 0.0517564975194601 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 1.0097462786599969 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.089094457618644 |
| Consumed energy (kWh) | 1.0988407362786388 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.16464845298986436 |
| Emissions (Co2eq in kg) | 0.03349990169057846 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | albert-base-v2 |
| model_name | ft_32_19e6_base_x8 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1.9e-05 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.690710 | 0.749326 |
| 1 | 0.333250 | 0.246641 | 0.907186 |
| 2 | 0.202962 | 0.219654 | 0.926463 |
| 3 | 0.155026 | 0.235011 | 0.911817 |
| 4 | 0.110104 | 0.257781 | 0.924271 |
| 5 | 0.082243 | 0.309697 | 0.904224 |
| 6 | 0.052416 | 0.345349 | 0.920317 |
|
damgomz/ft_32_18e6_base_x12 | damgomz | "2024-06-24T08:27:03Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T11:04:27Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | [More Information Needed] |
| Emissions (Co2eq in kg) | [More Information Needed] |
| CPU power (W) | [No CPU] |
| GPU power (W) | [No GPU] |
| RAM power (W) | [More Information Needed] |
| CPU energy (kWh) | [No CPU] |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | [More Information Needed] |
| Consumed energy (kWh) | [More Information Needed] |
| Country name | [More Information Needed] |
| Cloud provider | [No Cloud] |
| Cloud region | [No Cloud] |
| CPU count | [No CPU] |
| CPU model | [No CPU] |
| GPU count | [No GPU] |
| GPU model | [No GPU] |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | [No CPU] |
| Emissions (Co2eq in kg) | [More Information Needed] |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | albert-base-v2 |
| model_name | ft_32_18e6_base_x12 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1.8e-05 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.707339 | 0.780756 |
| 1 | 0.370445 | 0.253259 | 0.920701 |
| 2 | 0.220112 | 0.240859 | 0.891298 |
| 3 | 0.184156 | 0.228855 | 0.913678 |
| 4 | 0.147339 | 0.233711 | 0.923496 |
|
damgomz/ft_32_2e6_x12 | damgomz | "2024-06-23T15:57:02Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T11:04:35Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | [More Information Needed] |
| Emissions (Co2eq in kg) | [More Information Needed] |
| CPU power (W) | [No CPU] |
| GPU power (W) | [No GPU] |
| RAM power (W) | [More Information Needed] |
| CPU energy (kWh) | [No CPU] |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | [More Information Needed] |
| Consumed energy (kWh) | [More Information Needed] |
| Country name | [More Information Needed] |
| Cloud provider | [No Cloud] |
| Cloud region | [No Cloud] |
| CPU count | [No CPU] |
| CPU model | [No CPU] |
| GPU count | [No GPU] |
| GPU model | [No GPU] |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | [No CPU] |
| Emissions (Co2eq in kg) | [More Information Needed] |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | damgomz/fp_bs16_lr5_x12 |
| model_name | ft_32_2e6_x12 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 2e-06 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.705599 | 0.413386 |
| 1 | 0.467405 | 0.326145 | 0.890617 |
|
prabinpanta0/celsius-to-fahrenheit | prabinpanta0 | "2024-06-23T12:22:20Z" | 0 | 1 | tensorflow | [
"tensorflow",
"keras",
"temperature-conversion",
"celsius-to-fahrenheit",
"neural-network",
"en",
"dataset:prabinpanta0/celsius-to-fahrenheit",
"license:mit",
"region:us"
] | null | "2024-06-23T11:04:37Z" | ---
license: mit
language: en
metrics: mean_squared_error
library_name: tensorflow
tags:
- temperature-conversion
- celsius-to-fahrenheit
- tensorflow
- neural-network
datasets:
- prabinpanta0/celsius-to-fahrenheit
---
# My Temperature Conversion Model
This model is a simple neural network that converts temperatures from Celsius to Fahrenheit.
## Model Description
This model was created as a practice exercise for Udacity's "Intro to TensorFlow for Deep Learning" course, produced in collaboration with TensorFlow. It was trained on a dataset of temperature values in Celsius paired with their corresponding values in Fahrenheit, using a small neural network built with TensorFlow.
## Usage
To use this model, you can load it with TensorFlow and make predictions as shown below:
```python
import numpy as np
import tensorflow as tf

# Load the saved model from the local 'celsius-to-fahrenheit' directory.
model = tf.keras.models.load_model('celsius-to-fahrenheit')

# Keras expects a batch of inputs, so wrap the single value in a 2-D array.
prediction = model.predict(np.array([[100.0]]))
print(f"Prediction for 100°C in Fahrenheit: {prediction[0][0]}")
```
## Training
The model was trained using the following parameters:
- Optimizer: Adam
- Loss function: Mean Squared Error
- Epochs: 1000
- Batch size: 10
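The actual model is a tiny Keras network trained with Adam on mean squared error, as listed above. Since the task is linear (F = 1.8 × C + 32), a dependency-free sketch of the same fit can use a single unit y = w·c + b trained by plain gradient descent; the dataset size, learning rate, and epoch count below are illustrative, not the card's actual values:

```python
def make_dataset(n=50):
    """Celsius inputs paired with exact Fahrenheit targets."""
    celsius = [float(c) for c in range(-20, -20 + n)]
    fahrenheit = [1.8 * c + 32.0 for c in celsius]
    return celsius, fahrenheit

def fit_linear(xs, ys, lr=1e-3, epochs=10000):
    """Minimize mean squared error over w, b by full-batch gradient descent."""
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(epochs):
        # Gradients of MSE = (1/n) * sum((w*x + b - y)**2)
        dw = sum(2.0 * (w * x + b - y) * x for x, y in zip(xs, ys)) / n
        db = sum(2.0 * (w * x + b - y) for x, y in zip(xs, ys)) / n
        w -= lr * dw
        b -= lr * db
    return w, b

xs, ys = make_dataset()
w, b = fit_linear(xs, ys)
print(w, b)  # converges toward the true coefficients 1.8 and 32.0
```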
## Metrics
The model was evaluated based on the Mean Squared Error loss during training.
![image/png](https://cdn-uploads.huggingface.co/production/uploads/662ccaab9d047b3700b1d4cd/Pc4sHWyXfsoUbjdSAY_zA.png)
## Model Output
![image/png](https://cdn-uploads.huggingface.co/production/uploads/662ccaab9d047b3700b1d4cd/AbEhf1yTPAbqq59fxmGLG.png)
## Datasets
The model was trained on the [prabinpanta0/celsius-to-fahrenheit](https://huggingface.co/datasets/prabinpanta0/celsius-to-fahrenheit) dataset.
## License
This model is released under the MIT license. |
damgomz/ft_32_18e6_x12 | damgomz | "2024-06-23T20:04:04Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T11:04:38Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | [More Information Needed] |
| Emissions (Co2eq in kg) | [More Information Needed] |
| CPU power (W) | [No CPU] |
| GPU power (W) | [No GPU] |
| RAM power (W) | [More Information Needed] |
| CPU energy (kWh) | [No CPU] |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | [More Information Needed] |
| Consumed energy (kWh) | [More Information Needed] |
| Country name | [More Information Needed] |
| Cloud provider | [No Cloud] |
| Cloud region | [No Cloud] |
| CPU count | [No CPU] |
| CPU model | [No CPU] |
| GPU count | [No GPU] |
| GPU model | [No GPU] |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | [No CPU] |
| Emissions (Co2eq in kg) | [More Information Needed] |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | damgomz/fp_bs16_lr5_x12 |
| model_name | ft_32_18e6_x12 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1.8e-05 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.733001 | 0.833548 |
| 1 | 0.291751 | 0.222184 | 0.939866 |
| 2 | 0.179155 | 0.220241 | 0.923579 |
| 3 | 0.114086 | 0.227378 | 0.936924 |
| 4 | 0.068266 | 0.294327 | 0.933305 |
| 5 | 0.036952 | 0.358650 | 0.919452 |
| 6 | 0.029002 | 0.350685 | 0.930841 |
|
damgomz/ft_32_19e6_x12 | damgomz | "2024-06-24T10:10:06Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T11:04:42Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 86260.43604326248 |
| Emissions (Co2eq in kg) | 0.0521974785330882 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 1.0183495854083031 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.089853606309245 |
| Consumed energy (kWh) | 1.108203191717549 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
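As a sanity check, CodeCarbon's default figures in the table above follow energy = power × duration, and the reported values reproduce to within rounding:

```python
# Values copied from this card's CODE CARBON DEFAULT table.
duration_s = 86260.43604326248   # Duration (in seconds)
cpu_power_w = 42.5               # CPU power (W)
ram_power_w = 3.75               # RAM power (W)

J_PER_KWH = 3.6e6  # joules (watt-seconds) per kilowatt-hour

cpu_kwh = cpu_power_w * duration_s / J_PER_KWH
ram_kwh = ram_power_w * duration_s / J_PER_KWH
total_kwh = cpu_kwh + ram_kwh

# Agrees with the reported 1.01835, 0.08985, and 1.10820 kWh, up to the
# small differences introduced by CodeCarbon's interval-based accounting.
print(cpu_kwh, ram_kwh, total_kwh)
```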
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.16605133938328026 |
| Emissions (Co2eq in kg) | 0.0337853374502778 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | damgomz/fp_bs16_lr5_x12 |
| model_name | ft_32_19e6_x12 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1.9e-05 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.697563 | 0.368138 |
| 1 | 0.297268 | 0.229660 | 0.937037 |
| 2 | 0.170018 | 0.217992 | 0.918936 |
| 3 | 0.111955 | 0.250201 | 0.932553 |
| 4 | 0.065125 | 0.296230 | 0.912781 |
| 5 | 0.042337 | 0.353860 | 0.933457 |
| 6 | 0.025348 | 0.390722 | 0.921170 |
|
damgomz/ft_32_19e6_base_x12 | damgomz | "2024-06-24T10:14:06Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T11:04:49Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale-up the device up to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 86500.26889634132 |
| Emissions (Co2eq in kg) | 0.0523426177021578 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 1.0211812222340055 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0901034150806569 |
| Consumed energy (kWh) | 1.1112846373146603 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.16651301762545703 |
| Emissions (Co2eq in kg) | 0.03387927198440035 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | albert-base-v2 |
| model_name | ft_32_19e6_base_x12 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1.9e-05 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.703764 | 0.206445 |
| 1 | 0.361659 | 0.262179 | 0.924176 |
| 2 | 0.222014 | 0.237443 | 0.886685 |
| 3 | 0.188105 | 0.230483 | 0.910450 |
| 4 | 0.152888 | 0.242579 | 0.923003 |
| 5 | 0.113482 | 0.288091 | 0.910115 |
| 6 | 0.083856 | 0.315568 | 0.904766 |
|
ariaze/ARAZmixPony | ariaze | "2024-06-29T16:45:24Z" | 0 | 0 | null | [
"license:unknown",
"region:us"
] | null | "2024-06-23T11:07:41Z" | ---
license: unknown
---
|
hngan/sdxl300 | hngan | "2024-06-23T11:11:23Z" | 0 | 0 | null | [
"region:us"
] | null | "2024-06-23T11:10:21Z" | Entry not found |
CreeperCatcher/AIstinct | CreeperCatcher | "2024-06-23T11:21:36Z" | 0 | 0 | null | [
"region:us"
] | null | "2024-06-23T11:11:23Z" | Entry not found |
SwimChoi/villama2-7b-chat-Estonia-lora | SwimChoi | "2024-06-23T11:11:47Z" | 0 | 0 | peft | [
"peft",
"safetensors",
"arxiv:1910.09700",
"base_model:meta-llama/Llama-2-7b-chat-hf",
"region:us"
] | null | "2024-06-23T11:11:44Z" | ---
library_name: peft
base_model: meta-llama/Llama-2-7b-chat-hf
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
### Framework versions
- PEFT 0.10.1.dev0 |
woweenie/v71-sd21-curated2-5e6-cd0.02-embeddingperturb1-3k-half | woweenie | "2024-06-23T11:13:13Z" | 0 | 0 | null | [
"region:us"
] | null | "2024-06-23T11:13:13Z" | Entry not found |
woweenie/v71-sd21-curated2-5e6-cd0.02-embeddingperturb1-3k | woweenie | "2024-06-23T11:20:23Z" | 0 | 0 | diffusers | [
"diffusers",
"autotrain_compatible",
"endpoints_compatible",
"diffusers:StableDiffusionPipeline",
"region:us"
] | text-to-image | "2024-06-23T11:13:52Z" | Entry not found |
SwimChoi/villama2-7b-chat-Switzerland-lora | SwimChoi | "2024-06-23T11:17:04Z" | 0 | 0 | peft | [
"peft",
"safetensors",
"arxiv:1910.09700",
"base_model:meta-llama/Llama-2-7b-chat-hf",
"region:us"
] | null | "2024-06-23T11:17:00Z" | ---
library_name: peft
base_model: meta-llama/Llama-2-7b-chat-hf
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
### Framework versions
- PEFT 0.10.1.dev0 |
slelab/AES8 | slelab | "2024-06-23T11:42:25Z" | 0 | 0 | null | [
"region:us"
] | null | "2024-06-23T11:19:07Z" | Entry not found |
ardipm/predictive_credit_card | ardipm | "2024-06-23T11:19:43Z" | 0 | 0 | null | [
"region:us"
] | null | "2024-06-23T11:19:43Z" | Entry not found |
SwimChoi/villama2-7b-chat-Portugal-lora | SwimChoi | "2024-06-23T11:21:00Z" | 0 | 0 | peft | [
"peft",
"safetensors",
"arxiv:1910.09700",
"base_model:meta-llama/Llama-2-7b-chat-hf",
"region:us"
] | null | "2024-06-23T11:20:57Z" | ---
library_name: peft
base_model: meta-llama/Llama-2-7b-chat-hf
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
### Framework versions
- PEFT 0.10.1.dev0 |
damgomz/ft_32_15e6_x12 | damgomz | "2024-06-24T04:09:31Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T11:22:29Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale up the device to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | [More Information Needed] |
| Emissions (Co2eq in kg) | [More Information Needed] |
| CPU power (W) | [No CPU] |
| GPU power (W) | [No GPU] |
| RAM power (W) | [More Information Needed] |
| CPU energy (kWh) | [No CPU] |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | [More Information Needed] |
| Consumed energy (kWh) | [More Information Needed] |
| Country name | [More Information Needed] |
| Cloud provider | [No Cloud] |
| Cloud region | [No Cloud] |
| CPU count | [No CPU] |
| CPU model | [No CPU] |
| GPU count | [No GPU] |
| GPU model | [No GPU] |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | [No CPU] |
| Emissions (Co2eq in kg) | [More Information Needed] |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | damgomz/fp_bs16_lr5_x12 |
| model_name | ft_32_15e6_x12 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1.5e-05 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.713766 | 0.835124 |
| 1 | 0.322328 | 0.217589 | 0.910550 |
| 2 | 0.177534 | 0.208082 | 0.919247 |
|
damgomz/ft_32_11e6_base_x1 | damgomz | "2024-06-24T16:22:26Z" | 0 | 0 | transformers | [
"transformers",
"safetensors",
"albert",
"text-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | "2024-06-23T11:24:01Z" | ---
language: en
tags:
- text-classification
pipeline_tag: text-classification
widget:
- text: GEPS Techno is the pioneer of hybridization of renewable energies at sea.
We imagine, design and commercialize innovative off-grid systems that aim to generate
power at sea, stabilize and collect data. The success of our low power platforms
WAVEPEAL enabled us to scale up the device to WAVEGEM, the 150-kW capacity
platform.
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 108600.56103658676 |
| Emissions (Co2eq in kg) | 0.0657157773761293 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 1.28208679255512 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.1131230198932193 |
| Consumed energy (kWh) | 1.395209812448341 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
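The CPU and RAM energy figures above follow directly from the reported power draw and duration (energy in kWh = power in W × duration in s / 3.6e6 J per kWh). A minimal sketch reproducing the table's values — the numbers are copied from the table above; small deviations from the reported figures are expected because CodeCarbon integrates power over measurement intervals rather than multiplying once:

```python
# Rough check of the CodeCarbon energy figures reported above.
# energy [kWh] = power [W] * duration [s] / 3.6e6 [J per kWh]
duration_s = 108600.56103658676  # Duration (in seconds)
cpu_power_w = 42.5               # CPU power (W)
ram_power_w = 3.75               # RAM power (W)

cpu_energy_kwh = cpu_power_w * duration_s / 3.6e6
ram_energy_kwh = ram_power_w * duration_s / 3.6e6
total_kwh = cpu_energy_kwh + ram_energy_kwh

print(f"CPU: {cpu_energy_kwh:.4f} kWh, RAM: {ram_energy_kwh:.4f} kWh, total: {total_kwh:.4f} kWh")
# close to the reported 1.2821 kWh (CPU), 0.1131 kWh (RAM), 1.3952 kWh (consumed)
```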
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.20905607999542952 |
| Emissions (Co2eq in kg) | 0.04253521973932981 |
## Note
19 June 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | albert-base-v2 |
| model_name | ft_32_11e6_base_x1 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1.1e-05 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
## Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.775399 | 0.278800 |
| 1 | 0.342542 | 0.227123 | 0.912421 |
| 2 | 0.195072 | 0.210132 | 0.930485 |
| 3 | 0.144655 | 0.222816 | 0.926510 |
| 4 | 0.102367 | 0.240379 | 0.915922 |
| 5 | 0.068007 | 0.254159 | 0.908042 |
| 6 | 0.046899 | 0.296062 | 0.932938 |
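The tables above track an F-beta score per epoch. For reference, F-beta combines precision and recall with a weight beta on recall; a minimal sketch of the formula (the beta value used for these runs is not stated in the card, so the values below are purely illustrative):

```python
def fbeta(precision: float, recall: float, beta: float) -> float:
    """F-beta score: beta > 1 weights recall more, beta < 1 weights precision more."""
    if precision == 0.0 and recall == 0.0:
        return 0.0
    b2 = beta * beta
    return (1 + b2) * precision * recall / (b2 * precision + recall)

# With beta = 1 this reduces to the familiar F1 score (harmonic mean):
print(fbeta(0.8, 0.6, beta=1.0))  # 0.6857...
```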
|