Dataset Preview
The full dataset viewer is not available; only a preview of the rows is shown.
The dataset generation failed because of a cast error
Error code: DatasetGenerationCastError
Exception: DatasetGenerationCastError
Message: An error occurred while generating the dataset. All the data files must have the same columns, but at some point there is 1 missing column ({'private'}). This happened while the json dataset builder was generating data using hf://datasets/cot-leaderboard/cot-leaderboard-requests/ChavyvAkvar/habib-DPO-v3_eval_request_False_float16_Original.json (at revision 72eb1845594a8ec023c146933ab274766e8a2dfd). Please either edit the data files to have matching columns, or separate them into different configurations (see the docs at https://hf.co/docs/hub/datasets-manual-configuration#multiple-configurations).

Traceback:

Traceback (most recent call last):
  File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/builder.py", line 2011, in _prepare_split_single
    writer.write_table(table)
  File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/arrow_writer.py", line 585, in write_table
    pa_table = table_cast(pa_table, self._schema)
  File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/table.py", line 2302, in table_cast
    return cast_table_to_schema(table, schema)
  File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/table.py", line 2256, in cast_table_to_schema
    raise CastError(
datasets.table.CastError: Couldn't cast
model: string
base_model: string
revision: string
precision: string
weight_type: string
status: string
submitted_time: timestamp[s]
model_type: string
likes: int64
params: double
license: string
to
{'model': Value(dtype='string', id=None), 'base_model': Value(dtype='string', id=None), 'revision': Value(dtype='string', id=None), 'private': Value(dtype='bool', id=None), 'precision': Value(dtype='string', id=None), 'weight_type': Value(dtype='string', id=None), 'status': Value(dtype='string', id=None), 'submitted_time': Value(dtype='timestamp[s]', id=None), 'model_type': Value(dtype='string', id=None), 'likes': Value(dtype='int64', id=None), 'params': Value(dtype='float64', id=None), 'license': Value(dtype='string', id=None)}
because column names don't match

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/src/services/worker/src/worker/job_runners/config/parquet_and_info.py", line 1324, in compute_config_parquet_and_info_response
    parquet_operations = convert_to_parquet(builder)
  File "/src/services/worker/src/worker/job_runners/config/parquet_and_info.py", line 938, in convert_to_parquet
    builder.download_and_prepare(
  File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/builder.py", line 1027, in download_and_prepare
    self._download_and_prepare(
  File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/builder.py", line 1122, in _download_and_prepare
    self._prepare_split(split_generator, **prepare_split_kwargs)
  File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/builder.py", line 1882, in _prepare_split
    for job_id, done, content in self._prepare_split_single(
  File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/builder.py", line 2013, in _prepare_split_single
    raise DatasetGenerationCastError.from_cast_error(
datasets.exceptions.DatasetGenerationCastError: An error occurred while generating the dataset. All the data files must have the same columns, but at some point there is 1 missing column ({'private'}).
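The first fix the error message suggests, editing the data files so they all share one schema, can be sketched as a short script that backfills the missing `private` key in each request JSON file. This is a sketch only: the default value (`False`) and the local-clone layout are assumptions, not anything the leaderboard maintainers prescribe.

```python
import json
from pathlib import Path

# Keys that some request files are missing, with assumed default values.
# 'private' is the column the cast error complains about.
DEFAULTS = {"private": False}

def normalize(path: Path) -> bool:
    """Backfill missing schema keys in one request JSON file.

    Returns True if the file was modified."""
    record = json.loads(path.read_text())
    missing = {k: v for k, v in DEFAULTS.items() if k not in record}
    if not missing:
        return False
    record.update(missing)
    path.write_text(json.dumps(record, indent=2))
    return True

def normalize_all(root: Path) -> int:
    """Normalize every *.json request file under root; return how many were edited."""
    return sum(normalize(p) for p in sorted(root.rglob("*.json")))
```

Run against a local clone of the dataset repo, this would make every file match the `{'model', 'base_model', 'revision', 'private', ...}` schema, after which the edited files can be committed back to the Hub.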
| model (string) | base_model (string) | revision (string) | private (bool) | precision (string) | weight_type (string) | status (string) | submitted_time (timestamp[s]) | model_type (string) | likes (int64) | params (float64) | license (string) |
|---|---|---|---|---|---|---|---|---|---|---|---|
| 01-ai/Yi-34B-Chat | | main | false | bfloat16 | Original | FINISHED | 2024-05-09T19:26:01 | instruction-tuned | 322 | 34.389 | other |
| 01-ai/Yi-34B | | main | false | bfloat16 | Original | FINISHED | 2024-02-01T13:27:36 | pretrained | 1,158 | 34.389 | other |
| 01-ai/Yi-6B | | main | false | float16 | Original | FINISHED | 2024-02-01T13:25:27 | pretrained | 335 | 6.061 | other |
| ChavyvAkvar/habib-DPO-v3 | | main | null | float16 | Original | FINISHED | 2024-04-14T04:35:27 | 🔶 : fine-tuned | 0 | 7.242 | apache-2.0 |
| Deci/DeciLM-7B-instruct | | main | false | float16 | Original | FINISHED | 2024-01-30T12:25:35 | instruction-tuned | 92 | 7.044 | apache-2.0 |
| Deci/DeciLM-7B | | main | false | float16 | Original | FINISHED | 2024-01-30T12:37:15 | pretrained | 205 | 7.044 | apache-2.0 |
| HuggingFaceH4/zephyr-7b-beta | | main | false | bfloat16 | Original | FINISHED | 2024-03-17T15:54:35 | instruction-tuned | 1,365 | 7.242 | mit |
| Intel/neural-chat-7b-v3-1 | | main | null | float16 | Original | FINISHED | 2024-01-29T17:34:13 | instruction-tuned | 508 | 7.242 | apache-2.0 |
| LeroyDyer/Mixtral_AI_Chat_1.0 | | main | null | float16 | Original | PENDING | 2024-05-03T09:21:21 | | 0 | 7.242 | apache-2.0 |
| LeroyDyer/Mixtral_AI_CyberBrain_3_0 | | main | null | float16 | Original | PENDING | 2024-05-02T05:18:29 | 🔶 : fine-tuned | 1 | 7.242 | mit |
| LeroyDyer/Mixtral_AI_CyberTron_DeepMind_III_UFT | | main | null | float16 | Original | PENDING | 2024-05-07T10:20:51 | 🔶 : fine-tuned | 0 | 7.242 | apache-2.0 |
| LeroyDyer/Mixtral_AI_CyberUltron_DPO | | main | null | float16 | Original | PENDING | 2024-05-01T16:17:06 | 🟢 : pretrained | 1 | 7.242 | apache-2.0 |
| LeroyDyer/Mixtral_AI_Cyber_4.0 | | main | null | float16 | Original | PENDING | 2024-05-02T05:18:46 | 🔶 : fine-tuned | 0 | 7.242 | mit |
| LeroyDyer/Mixtral_AI_Cyber_Boss | | main | null | float16 | Original | PENDING | 2024-05-02T05:19:15 | 🔶 : fine-tuned | 1 | 7.242 | openrail |
| Locutusque/Hercules-4.0-Mistral-v0.2-7B | | main | null | bfloat16 | Original | FINISHED | 2024-04-02T16:26:13 | ⭕ : instruction-tuned | 10 | 7.242 | apache-2.0 |
| Locutusque/OpenCerebrum-1.0-7b-DPO | | main | null | bfloat16 | Original | FINISHED | 2024-04-02T16:26:38 | ⭕ : instruction-tuned | 11 | 7.242 | apache-2.0 |
| NousResearch/Hermes-2-Pro-Mistral-7B | | main | false | bfloat16 | Original | FINISHED | 2024-04-15T05:18:42 | instruction-tuned | 413 | 7.242 | apache-2.0 |
| NousResearch/Nous-Hermes-llama-2-7b | | main | false | bfloat16 | Original | FINISHED | 2024-03-22T06:20:52 | instruction-tuned | 64 | 6.738 | mit |
| OpenBuddy/openbuddy-llama3-8b-v21.1-8k | | main | null | bfloat16 | Original | FINISHED | 2024-04-21T14:49:27 | ⭕ : instruction-tuned | 9 | 8.03 | other |
| OpenBuddy/openbuddy-mistral-22b-v21.1-32k | | main | null | bfloat16 | Original | PENDING | 2024-05-18T14:15:52 | ⭕ : instruction-tuned | 1 | 22.354 | apache-2.0 |
| OpenBuddy/openbuddy-mistral2-7b-v20.2-32k | | main | null | bfloat16 | Original | FINISHED | 2024-04-03T01:56:39 | ⭕ : instruction-tuned | 0 | 7.279 | apache-2.0 |
| OpenBuddy/openbuddy-yi1.5-9b-v21.1-32k | | main | null | bfloat16 | Original | PENDING | 2024-05-22T02:06:26 | ⭕ : instruction-tuned | 0 | 8.85 | apache-2.0 |
| PetroGPT/WestSeverus-7B-DPO-v2 | | main | null | bfloat16 | Original | PENDING | 2024-04-26T11:36:26 | | 3 | 0 | apache-2.0 |
| abacusai/Smaug-34B-v0.1 | | main | null | bfloat16 | Original | PENDING | 2024-04-24T18:14:01 | 🟦 : RL-tuned | 54 | 34.389 | other |
| allenai/tulu-2-13b | | main | false | bfloat16 | Original | FINISHED | 2024-03-24T09:42:16 | instruction-tuned | 3 | 13 | null |
| allenai/tulu-2-70b | | main | false | bfloat16 | Original | FINISHED | 2024-03-23T10:18:39 | instruction-tuned | 8 | 70 | null |
| allenai/tulu-2-7b | | main | false | float16 | Original | FINISHED | 2023-01-31T06:38:21 | instruction-tuned | 4 | 7 | null |
| allenai/tulu-2-dpo-13b | | main | false | bfloat16 | Original | FINISHED | 2024-03-24T09:42:48 | instruction-tuned | 16 | 13 | null |
| allenai/tulu-2-dpo-70b | | main | false | bfloat16 | Original | FINISHED | 2024-03-24T05:41:00 | instruction-tuned | 141 | 68.977 | other |
| allenai/tulu-2-dpo-7b | | main | false | bfloat16 | Original | FINISHED | 2024-03-26T05:55:57 | instruction-tuned | 17 | 7 | null |
| bunnycore/Mnemosyne-7B | bunnycore/Mnemosyne-7B | main | null | bfloat16 | Original | FINISHED | 2024-04-08T20:38:04 | ⭕ : instruction-tuned | 1 | 7.242 | apache-2.0 |
| cookinai/LlamaReflect-8B-CoT-safetensors | | main | null | bfloat16 | Original | FINISHED | 2024-04-27T14:15:34 | 🔶 : fine-tuned | 0 | 8.03 | apache-2.0 |
| databricks/dolly-v2-3b | | main | false | bfloat16 | Original | FINISHED | 2024-03-20T10:41:26 | instruction-tuned | 269 | 3 | mit |
| davidkim205/Rhea-72b-v0.5 | | main | null | bfloat16 | Original | PENDING | 2024-04-24T18:10:58 | 🟦 : RL-tuned | 61 | 72.289 | apache-2.0 |
| google/gemma-2b-it | | main | false | bfloat16 | Original | FINISHED | 2024-03-17T10:20:54 | instruction-tuned | 389 | 2.506 | other |
| google/gemma-2b | | main | false | bfloat16 | Original | FINISHED | 2024-03-17T07:13:12 | pretrained | 561 | 2.506 | other |
| google/gemma-7b-it | | main | false | bfloat16 | Original | FINISHED | 2024-04-09T12:11:58 | instruction-tuned | 1,035 | 8.538 | gemma |
| google/gemma-7b | | main | false | bfloat16 | Original | FINISHED | 2024-05-14T15:49:14 | pretrained | 2,869 | 8.538 | gemma |
| ibm/merlinite-7b | | main | null | float16 | Original | PENDING | 2024-05-05T17:16:17 | 🔶 : fine-tuned | 98 | 7.242 | apache-2.0 |
| ichigoberry/pandafish-2-7b-32k | | main | null | bfloat16 | Original | FINISHED | 2024-04-06T18:48:52 | ⭕ : instruction-tuned | 3 | 7.242 | apache-2.0 |
| internlm/internlm2-7b | | main | false | bfloat16 | Original | FINISHED | 2024-05-09T19:25:34 | pretrained | 35 | 7 | other |
| internlm/internlm2-chat-20b | | main | false | bfloat16 | Original | FINISHED | 2024-05-09T19:24:52 | instruction-tuned | 73 | 19.861 | other |
| internlm/internlm2-chat-7b | | main | false | bfloat16 | Original | FINISHED | 2024-05-09T19:24:16 | instruction-tuned | 66 | 7.738 | other |
| internlm/internlm2-math-20b | | main | false | bfloat16 | Original | FINISHED | 2024-05-14T08:32:50 | instruction-tuned | 17 | 19.861 | other |
| internlm/internlm2-math-7b | | main | false | bfloat16 | Original | FINISHED | 2024-05-13T17:50:01 | instruction-tuned | 22 | 7.738 | other |
| jeonsworld/CarbonVillain-en-10.7B-v4 | | main | null | float16 | Original | PENDING | 2024-04-24T18:15:48 | 🔶 : fine-tuned | 6 | 10.732 | cc-by-nc-sa-4.0 |
| meta-llama/Llama-2-13b-chat-hf | | main | false | float16 | Original | FINISHED | 2024-03-24T09:40:18 | instruction-tuned | 903 | 13.016 | null |
| meta-llama/Llama-2-13b-hf | | main | false | float16 | Original | FINISHED | 2024-03-24T09:38:15 | pretrained | 519 | 13.016 | null |
| meta-llama/Llama-2-70b-chat-hf | | main | false | float16 | Original | FINISHED | 2024-03-25T05:45:15 | instruction-tuned | 2,044 | 68.977 | null |
| meta-llama/Llama-2-70b-hf | | main | null | bfloat16 | Original | FINISHED | 2024-03-23T15:22:51 | pretrained | 777 | 68.977 | LLAMA2 |
| meta-llama/Llama-2-7b-hf | | main | null | float16 | Original | FINISHED | 2024-01-26T16:41:09 | pretrained | 1,050 | 6.74 | LLAMA2 |
| meta-llama/Meta-Llama-3-70B-Instruct | | main | false | bfloat16 | Original | FINISHED | 2024-05-13T13:05:16 | instruction-tuned | 947 | 70.554 | llama3 |
| meta-llama/Meta-Llama-3-70B | | main | false | bfloat16 | Original | FINISHED | 2024-05-15T06:44:35 | pretrained | 630 | 70.554 | llama3 |
| meta-llama/Meta-Llama-3-8B-Instruct | | main | null | bfloat16 | Original | FINISHED | 2024-04-20T15:58:07 | ⭕ : instruction-tuned | 891 | 8.03 | other |
| meta-llama/Meta-Llama-3-8B | | main | null | bfloat16 | Original | FINISHED | 2024-04-19T01:06:36 | 🟢 : pretrained | 568 | 8.03 | other |
| microsoft/Orca-2-13b | | main | null | float16 | Original | FINISHED | 2024-02-08T06:20:22 | ⭕ : instruction-tuned | 643 | 0 | other |
| microsoft/Orca-2-7b | | main | null | float16 | Original | FINISHED | 2024-02-08T06:17:10 | ⭕ : instruction-tuned | 200 | 0 | other |
| microsoft/phi-2 | | main | false | float16 | Original | FINISHED | 2024-02-05T19:19:37 | pretrained | 2,672 | 2.78 | mit |
| mistralai/Mistral-7B-Instruct-v0.2 | | main | null | float16 | Original | FINISHED | 2024-01-26T16:41:09 | instruction-tuned | 731 | 7.242 | apache-2.0 |
| mistralai/Mistral-7B-v0.1 | | main | false | float16 | Original | FINISHED | 2024-01-30T12:38:07 | pretrained | 2,692 | 7.242 | apache-2.0 |
| mistralai/Mixtral-8x7B-Instruct-v0.1 | | main | false | bfloat16 | Original | FINISHED | 2024-02-01T13:32:17 | instruction-tuned | 2,570 | 46.703 | apache-2.0 |
| mistralai/Mixtral-8x7B-v0.1 | | main | false | bfloat16 | Original | FINISHED | 2024-02-01T13:32:41 | pretrained | 1,185 | 46.703 | apache-2.0 |
| mlabonne/AlphaMonarch-7B | | main | null | bfloat16 | Original | FINISHED | 2024-04-10T22:24:12 | 🟦 : RL-tuned | 139 | 7.242 | cc-by-nc-4.0 |
| openbmb/Eurus-70b-sft | | main | false | float16 | Original | FINISHED | 2024-04-10T06:59:32 | instruction-tuned | 4 | 68.977 | apache-2.0 |
| openbmb/Eurus-7b-kto | | main | false | bfloat16 | Original | FINISHED | 2024-04-10T07:00:08 | instruction-tuned | 8 | 7.242 | apache-2.0 |
| openchat/openchat-3.5-0106 | | main | false | bfloat16 | Original | FINISHED | 2024-02-07T20:52:36 | instruction-tuned | 163 | 7.242 | apache-2.0 |
| refine-ai/Power-Llama-3-7B-Instruct | rhysjones/Phi-3-mini-mango-1-llamafied | main | null | float16 | Original | PENDING | 2024-05-08T03:32:17 | ⭕ : instruction-tuned | 1 | 6.539 | mit |
| teknium/OpenHermes-2.5-Mistral-7B | | main | false | bfloat16 | Original | FINISHED | 2024-03-21T11:30:14 | instruction-tuned | 713 | 7.242 | apache-2.0 |
| upstage/SOLAR-10.7B-Instruct-v1.0 | | main | false | float16 | Original | FINISHED | 2024-04-09T13:23:01 | instruction-tuned | 567 | 10.732 | cc-by-nc-4.0 |
| upstage/SOLAR-10.7B-v1.0 | | main | false | float16 | Original | FINISHED | 2024-04-09T13:23:35 | pretrained | 222 | 10.732 | apache-2.0 |
| wandb/gemma-2b-zephyr-dpo | | main | false | bfloat16 | Original | FINISHED | 2024-04-22T04:41:03 | instruction-tuned | 2 | 2.506 | other |
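Until the offending file gains the missing column, the request files themselves remain perfectly readable: plain pandas, for example, tolerates a missing column by filling it with NaN instead of failing the way the viewer's strict Arrow cast does. A minimal sketch, assuming the JSON request files have been downloaded locally (the directory layout is a hypothetical, not part of the dataset):

```python
import json
from pathlib import Path

import pandas as pd

def load_requests(root: Path) -> pd.DataFrame:
    """Read every request JSON under root into one DataFrame.

    Files missing a column (e.g. 'private') simply get NaN in that
    column, rather than raising a cast error."""
    records = [json.loads(p.read_text()) for p in sorted(root.rglob("*.json"))]
    return pd.DataFrame.from_records(records)
```

This trades schema enforcement for robustness, which is usually the right call for ad-hoc inspection of leaderboard requests; the rows with NaN in `private` also identify exactly which files need fixing.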
No dataset card yet.
Downloads last month: 0