The full dataset viewer is not available; only a preview of the rows is shown.
The dataset generation failed because of a cast error.

Error code: DatasetGenerationCastError
Exception: DatasetGenerationCastError
Message: An error occurred while generating the dataset. All the data files must have the same columns, but at some point there are 174 new columns ({'report.per_token.throughput.value', 'report.per_token.efficiency', 'report.per_token.latency.p95', 'config.launcher.numactl', 'report.load.memory.max_process_vram', 'config.backend.hub_kwargs.local_files_only', 'config.environment.gpu_vram_mb', 'report.prefill.efficiency.value', 'report.load.memory.max_reserved', 'config.backend.device_ids', 'config.backend.intra_op_num_threads', 'config.backend.quantization_scheme', 'report.prefill.latency.values', 'config.scenario.generate_kwargs.max_new_tokens', 'config.environment.peft_commit', 'config.backend.hub_kwargs.trust_remote_code', 'config.scenario.new_tokens', 'report.prefill.latency.mean', 'config.backend.inter_op_num_threads', 'report.prefill.energy.cpu', 'config.scenario.input_shapes.num_choices', 'config.backend.task', 'config.backend.torch_dtype', 'config.backend.torch_compile', 'config.scenario.latency', 'report.decode.memory.max_ram', 'report.decode.energy.ram', 'report.per_token.memory', 'report.prefill.latency.p90', 'config.backend.hub_kwargs.revision', 'config.environment.accelerate_commit', 'report.decode.latency.total', 'config.environment.processor', 'report.prefill.energy.unit', 'report.decode.latency.p50', 'config.backend.version', 'report.load.latency.p90', 'config.backend.quantization_config.exllama_config.version', 'report.decode.latency.p99', 'config.environment.peft_version', 'config.backend.peft_type', 'report.prefill.efficiency.unit', 'report.load.latency.stdev', 'config.environment.platform', 'report.loa ... on', 'config.backend.device', 'config.backend.quantization_config.exllama_config.max_input_len', 'report.prefill.energy.gpu', 'report.per_token.latency.total', 'config.environment.optimum_version', 'report.decode.memory.max_reserved', 'report.load.memory.max_allocated', 'report.decode.throughput.value', 'report.per_token.latency.stdev', 'report.decode.throughput.unit', 'config.backend.autocast_dtype', 'config.backend.library', 'config.environment.optimum_benchmark_version', 'report.decode.energy.cpu', 'config.backend.quantization_config.exllama_config.max_batch_size', 'report.load.throughput', 'config.environment.optimum_benchmark_commit', 'config.launcher.name', 'report.prefill.memory.max_reserved', 'config.environment.diffusers_commit', 'config.environment.optimum_commit', 'config.scenario.input_shapes.batch_size', 'report.load.energy.total', 'report.prefill.throughput.unit', 'report.per_token.latency.count', 'report.prefill.memory.max_process_vram', 'config.environment.transformers_commit', 'config.backend.low_cpu_mem_usage', 'report.prefill.memory.max_allocated', 'report.per_token.latency.p90', 'report.decode.memory.unit', 'config.environment.cpu_count', 'report.decode.latency.p90', 'config.environment.machine', 'report.decode.latency.count', 'report.per_token.latency.mean', 'report.load.memory.max_global_vram', 'config.environment.system', 'report.decode.energy.unit', 'config.backend.no_weights', 'config.scenario.memory', 'report.traceback', 'report.decode.latency.unit'}) and 34 missing columns ({'Hub ❤️', 'Architecture', 'MUSR', 'Submission Date', 'Weight type', '#Params (B)', 'BBH Raw', 'Generation', 'MoE', 'Upload To Hub Date', 'IFEval Raw', 'T', 'fullname', 'GPQA Raw', 'MMLU-PRO', 'MATH Lvl 5', 'Model', 'MMLU-PRO Raw', 'Hub License', 'GPQA', 'Available on the hub', 'IFEval', 'Not_Merged', 'Chat Template', 'Precision', 'Average ⬆️', "Maintainer's Highlight", 'Flagged', 'Model sha', 'Type', 'Base Model', 'BBH', 'MATH Lvl 5 Raw', 'MUSR Raw'}).
This happened while the csv dataset builder was generating data using hf://datasets/optimum-benchmark/llm-perf-leaderboard/perf-df-awq-1xA10.csv (at revision 10db6ab4da6848bda3621e6fa0eb30d613c32500) Please either edit the data files to have matching columns, or separate them into different configurations (see docs at https://hf.co/docs/hub/datasets-manual-configuration#multiple-configurations) Traceback: Traceback (most recent call last): File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/builder.py", line 1869, in _prepare_split_single writer.write_table(table) File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/arrow_writer.py", line 580, in write_table pa_table = table_cast(pa_table, self._schema) File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/table.py", line 2292, in table_cast return cast_table_to_schema(table, schema) File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/table.py", line 2240, in cast_table_to_schema raise CastError( datasets.table.CastError: Couldn't cast config.name: string config.backend.name: string config.backend.version: string config.backend._target_: string config.backend.task: string config.backend.library: string config.backend.model_type: string config.backend.model: string config.backend.processor: string config.backend.device: string config.backend.device_ids: int64 config.backend.seed: int64 config.backend.inter_op_num_threads: double config.backend.intra_op_num_threads: double config.backend.model_kwargs.trust_remote_code: bool config.backend.no_weights: bool config.backend.device_map: double config.backend.torch_dtype: string config.backend.eval_mode: bool config.backend.to_bettertransformer: bool config.backend.low_cpu_mem_usage: double config.backend.attn_implementation: string config.backend.cache_implementation: double config.backend.autocast_enabled: bool config.backend.autocast_dtype: double config.backend.torch_compile: bool 
config.backend.torch_compile_target: string config.backend.quantization_scheme: string config.backend.quantization_config.bits: int64 config.backend.quantization_config.version: string config.backend.quantization_config.exllama_config.version: double config.backend.quantization_config.exllama_config.max_input_len: double config.backend.quantization_config.exllama_config.max_batch_size: double config.backend.deepspeed_inference: bool config.backend.peft_type: double config.scenario.name: string config.scenario._target_: string config.scenario.iterations: int64 config.scenario.duration: ... rt.decode.latency.mean: double report.decode.latency.stdev: double report.decode.latency.p50: double report.decode.latency.p90: double report.decode.latency.p95: double report.decode.latency.p99: double report.decode.latency.values: string report.decode.throughput.unit: string report.decode.throughput.value: double report.decode.energy.unit: string report.decode.energy.cpu: double report.decode.energy.ram: double report.decode.energy.gpu: double report.decode.energy.total: double report.decode.efficiency.unit: string report.decode.efficiency.value: double report.per_token.memory: double report.per_token.latency.unit: string report.per_token.latency.count: double report.per_token.latency.total: double report.per_token.latency.mean: double report.per_token.latency.stdev: double report.per_token.latency.p50: double report.per_token.latency.p90: double report.per_token.latency.p95: double report.per_token.latency.p99: double report.per_token.latency.values: string report.per_token.throughput.unit: string report.per_token.throughput.value: double report.per_token.energy: double report.per_token.efficiency: double report.traceback: string config.backend.processor_kwargs.trust_remote_code: bool config.backend.hub_kwargs.trust_remote_code: bool config.backend.hub_kwargs.revision: string config.backend.hub_kwargs.force_download: bool config.backend.hub_kwargs.local_files_only: bool -- 
schema metadata -- pandas: '{"index_columns": [{"kind": "range", "name": null, "start": 0, "' + 27877 to {'T': Value(dtype='string', id=None), 'Model': Value(dtype='string', id=None), 'Average ⬆️': Value(dtype='float64', id=None), 'IFEval': Value(dtype='float64', id=None), 'IFEval Raw': Value(dtype='float64', id=None), 'BBH': Value(dtype='float64', id=None), 'BBH Raw': Value(dtype='float64', id=None), 'MATH Lvl 5': Value(dtype='float64', id=None), 'MATH Lvl 5 Raw': Value(dtype='float64', id=None), 'GPQA': Value(dtype='float64', id=None), 'GPQA Raw': Value(dtype='float64', id=None), 'MUSR': Value(dtype='float64', id=None), 'MUSR Raw': Value(dtype='float64', id=None), 'MMLU-PRO': Value(dtype='float64', id=None), 'MMLU-PRO Raw': Value(dtype='float64', id=None), 'Type': Value(dtype='string', id=None), 'Architecture': Value(dtype='string', id=None), 'Weight type': Value(dtype='string', id=None), 'Precision': Value(dtype='string', id=None), 'Not_Merged': Value(dtype='bool', id=None), 'Hub License': Value(dtype='string', id=None), '#Params (B)': Value(dtype='int64', id=None), 'Hub ❤️': Value(dtype='int64', id=None), 'Available on the hub': Value(dtype='bool', id=None), 'Model sha': Value(dtype='string', id=None), 'Flagged': Value(dtype='bool', id=None), 'MoE': Value(dtype='bool', id=None), 'Submission Date': Value(dtype='string', id=None), 'Upload To Hub Date': Value(dtype='string', id=None), 'Chat Template': Value(dtype='bool', id=None), "Maintainer's Highlight": Value(dtype='bool', id=None), 'fullname': Value(dtype='string', id=None), 'Generation': Value(dtype='int64', id=None), 'Base Model': Value(dtype='string', id=None)} because column names don't match During handling of the above exception, another exception occurred: Traceback (most recent call last): File "/src/services/worker/src/worker/job_runners/config/parquet_and_info.py", line 1392, in compute_config_parquet_and_info_response parquet_operations = convert_to_parquet(builder) File
"/src/services/worker/src/worker/job_runners/config/parquet_and_info.py", line 1041, in convert_to_parquet builder.download_and_prepare( File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/builder.py", line 924, in download_and_prepare self._download_and_prepare( File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/builder.py", line 999, in _download_and_prepare self._prepare_split(split_generator, **prepare_split_kwargs) File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/builder.py", line 1740, in _prepare_split for job_id, done, content in self._prepare_split_single( File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/builder.py", line 1871, in _prepare_split_single raise DatasetGenerationCastError.from_cast_error( datasets.exceptions.DatasetGenerationCastError: An error occurred while generating the dataset. All the data files must have the same columns, but at some point there are 174 new columns and 34 missing columns (the same lists given above). This happened while the csv dataset builder was generating data using hf://datasets/optimum-benchmark/llm-perf-leaderboard/perf-df-awq-1xA10.csv (at revision 10db6ab4da6848bda3621e6fa0eb30d613c32500). Please either edit the data files to have matching columns, or separate them into different configurations (see docs at https://hf.co/docs/hub/datasets-manual-configuration#multiple-configurations)
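The cast error above can be reproduced in miniature: the csv builder takes the column set of the first file it reads and tries to cast every later file to that schema, reporting the columns a later file adds ("new") and the ones it lacks ("missing"). A minimal sketch with Python's standard library — the two inline CSVs and their column names are invented stand-ins for the leaderboard and benchmark-report files, not taken from the real data:

```python
import csv
import io

# Hypothetical stand-ins: a leaderboard-style CSV and a
# benchmark-report-style CSV with a completely different header.
leaderboard_csv = "T,Model,IFEval\n💬,org/model,81.63\n"
benchmark_csv = "config.name,report.decode.latency.p50\nawq,0.03\n"

def header(text: str) -> set[str]:
    """Return the set of column names in a CSV given as a string."""
    return set(next(csv.reader(io.StringIO(text))))

first, second = header(leaderboard_csv), header(benchmark_csv)

# The builder casts every file to the first file's schema; when the
# column sets differ it raises DatasetGenerationCastError, listing
# the "new" and "missing" columns exactly as in the message above.
new_columns = second - first      # columns the later file adds
missing_columns = first - second  # columns the later file lacks
```

As the error text suggests, the usual fix is either to make the files' columns match or to declare each CSV family as its own configuration via a `configs` section in the dataset's README (see the linked manual-configuration docs).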
T | Model | Average ⬆️ | IFEval | IFEval Raw | BBH | BBH Raw | MATH Lvl 5 | MATH Lvl 5 Raw | GPQA | GPQA Raw | MUSR | MUSR Raw | MMLU-PRO | MMLU-PRO Raw | Type | Architecture | Weight type | Precision | Not_Merged | Hub License | #Params (B) | Hub ❤️ | Available on the hub | Model sha | Flagged | MoE | Submission Date | Upload To Hub Date | Chat Template | Maintainer's Highlight | fullname | Generation | Base Model |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
string | string | float64 | float64 | float64 | float64 | float64 | float64 | float64 | float64 | float64 | float64 | float64 | float64 | float64 | string | string | string | string | bool | string | int64 | int64 | bool | string | bool | bool | string | string | bool | bool | string | int64 | string |
💬 | dfurman/CalmeRys-78B-Orpo-v0.1 | 50.78 | 81.63 | 0.82 | 61.92 | 0.73 | 37.92 | 0.38 | 20.02 | 0.4 | 36.37 | 0.59 | 66.8 | 0.7 | 💬 chat models (RLHF, DPO, IFT, ...) | Qwen2ForCausalLM | Original | bfloat16 | true | mit | 77 | 23 | true | 7988deb48419c3f56bb24c139c23e5c476ec03f8 | true | true | 2024-09-24 | 2024-09-24 | true | false | dfurman/CalmeRys-78B-Orpo-v0.1 | 1 | dfurman/CalmeRys-78B-Orpo-v0.1 (Merge) |
💬 | MaziyarPanahi/calme-2.4-rys-78b | 50.26 | 80.11 | 0.8 | 62.16 | 0.73 | 37.69 | 0.38 | 20.36 | 0.4 | 34.57 | 0.58 | 66.69 | 0.7 | 💬 chat models (RLHF, DPO, IFT, ...) | Qwen2ForCausalLM | Original | bfloat16 | true | mit | 77 | 32 | true | 0a35e51ffa9efa644c11816a2d56434804177acb | true | true | 2024-09-03 | 2024-08-07 | true | false | MaziyarPanahi/calme-2.4-rys-78b | 2 | dnhkng/RYS-XLarge |
🔶 | rombodawg/Rombos-LLM-V2.5-Qwen-72b | 45.39 | 71.55 | 0.72 | 61.27 | 0.72 | 47.58 | 0.48 | 19.8 | 0.4 | 17.32 | 0.46 | 54.83 | 0.59 | 🔶 fine-tuned on domain-specific datasets | Qwen2ForCausalLM | Original | bfloat16 | true | other | 72 | 17 | true | 5260f182e7859e13d515c4cb3926ac85ad057504 | true | true | 2024-09-30 | 2024-09-30 | false | false | rombodawg/Rombos-LLM-V2.5-Qwen-72b | 1 | rombodawg/Rombos-LLM-V2.5-Qwen-72b (Merge) |
🔶 | dnhkng/RYS-XLarge | 44.75 | 79.96 | 0.8 | 58.77 | 0.71 | 38.97 | 0.39 | 17.9 | 0.38 | 23.72 | 0.5 | 49.2 | 0.54 | 🔶 fine-tuned on domain-specific datasets | Qwen2ForCausalLM | Original | bfloat16 | true | mit | 77 | 70 | true | 0f84dd9dde60f383e1e2821496befb4ce9a11ef6 | true | true | 2024-08-07 | 2024-07-24 | false | false | dnhkng/RYS-XLarge | 0 | dnhkng/RYS-XLarge |
💬 | MaziyarPanahi/calme-2.1-rys-78b | 44.14 | 81.36 | 0.81 | 59.47 | 0.71 | 36.4 | 0.36 | 19.24 | 0.39 | 19 | 0.47 | 49.38 | 0.54 | 💬 chat models (RLHF, DPO, IFT, ...) | Qwen2ForCausalLM | Original | bfloat16 | true | mit | 77 | 3 | true | e746f5ddc0c9b31a2382d985a4ec87fa910847c7 | true | true | 2024-08-08 | 2024-08-06 | true | false | MaziyarPanahi/calme-2.1-rys-78b | 1 | dnhkng/RYS-XLarge |
🔶 | rombodawg/Rombos-LLM-V2.5-Qwen-32b | 44.1 | 68.27 | 0.68 | 58.26 | 0.7 | 39.12 | 0.39 | 19.57 | 0.4 | 24.73 | 0.5 | 54.62 | 0.59 | 🔶 fine-tuned on domain-specific datasets | Qwen2ForCausalLM | Original | bfloat16 | true | apache-2.0 | 32 | 12 | true | 234abe4b494dbe83ba805b791f74feb33462a33d | true | true | 2024-10-07 | 2024-09-30 | false | false | rombodawg/Rombos-LLM-V2.5-Qwen-32b | 1 | rombodawg/Rombos-LLM-V2.5-Qwen-32b (Merge) |
💬 | MaziyarPanahi/calme-2.3-rys-78b | 44.02 | 80.66 | 0.81 | 59.57 | 0.71 | 36.56 | 0.37 | 20.58 | 0.4 | 17 | 0.45 | 49.73 | 0.55 | 💬 chat models (RLHF, DPO, IFT, ...) | Qwen2ForCausalLM | Original | bfloat16 | true | mit | 77 | 4 | true | a8a4e55c2f7054d25c2f0ab3a3b3d806eb915180 | true | true | 2024-09-03 | 2024-08-06 | true | false | MaziyarPanahi/calme-2.3-rys-78b | 1 | dnhkng/RYS-XLarge |
💬 | MaziyarPanahi/calme-2.2-rys-78b | 43.92 | 79.86 | 0.8 | 59.27 | 0.71 | 37.92 | 0.38 | 20.92 | 0.41 | 16.83 | 0.45 | 48.73 | 0.54 | 💬 chat models (RLHF, DPO, IFT, ...) | Qwen2ForCausalLM | Original | bfloat16 | true | mit | 77 | 3 | true | 8d0dde25c9042705f65559446944a19259c3fc8e | true | true | 2024-08-08 | 2024-08-06 | true | false | MaziyarPanahi/calme-2.2-rys-78b | 1 | dnhkng/RYS-XLarge |
💬 | MaziyarPanahi/calme-2.1-qwen2-72b | 43.61 | 81.63 | 0.82 | 57.33 | 0.7 | 36.03 | 0.36 | 17.45 | 0.38 | 20.15 | 0.47 | 49.05 | 0.54 | 💬 chat models (RLHF, DPO, IFT, ...) | Qwen2ForCausalLM | Original | bfloat16 | true | other | 72 | 27 | true | 0369c39770f45f2464587918f2dbdb8449ea3a0d | true | true | 2024-06-26 | 2024-06-08 | true | false | MaziyarPanahi/calme-2.1-qwen2-72b | 2 | Qwen/Qwen2-72B |
🔶 | dnhkng/RYS-XLarge-base | 43.56 | 79.1 | 0.79 | 58.69 | 0.7 | 34.67 | 0.35 | 17.23 | 0.38 | 22.42 | 0.49 | 49.23 | 0.54 | 🔶 fine-tuned on domain-specific datasets | ? | Adapter | bfloat16 | true | mit | 77 | 3 | true | c718b3d9e24916e3b0347d3fdaa5e5a097c2f603 | true | true | 2024-08-30 | 2024-08-02 | true | false | dnhkng/RYS-XLarge-base | 0 | dnhkng/RYS-XLarge-base |
💬 | arcee-ai/Arcee-Nova | 43.5 | 79.07 | 0.79 | 56.74 | 0.69 | 40.48 | 0.4 | 18.01 | 0.39 | 17.22 | 0.46 | 49.47 | 0.55 | 💬 chat models (RLHF, DPO, IFT, ...) | Qwen2ForCausalLM | Original | bfloat16 | true | other | 72 | 38 | true | ec3bfe88b83f81481daa04b6789c1e0d32827dc5 | true | true | 2024-09-19 | 2024-07-16 | true | false | arcee-ai/Arcee-Nova | 0 | arcee-ai/Arcee-Nova |
💬 | MaziyarPanahi/calme-2.2-qwen2-72b | 43.4 | 80.08 | 0.8 | 56.8 | 0.69 | 41.16 | 0.41 | 16.55 | 0.37 | 16.52 | 0.45 | 49.27 | 0.54 | 💬 chat models (RLHF, DPO, IFT, ...) | Qwen2ForCausalLM | Original | bfloat16 | true | other | 72 | 5 | true | 529e9bd80a76d943409bc92bb246aa7ca63dd9e6 | true | true | 2024-08-06 | 2024-07-09 | true | false | MaziyarPanahi/calme-2.2-qwen2-72b | 1 | Qwen/Qwen2-72B |
💬 | dfurman/Qwen2-72B-Orpo-v0.1 | 43.32 | 78.8 | 0.79 | 57.41 | 0.7 | 35.42 | 0.35 | 17.9 | 0.38 | 20.87 | 0.48 | 49.5 | 0.55 | 💬 chat models (RLHF, DPO, IFT, ...) | Qwen2ForCausalLM | Original | bfloat16 | true | other | 72 | 4 | true | 26c7bbaa728822c60bb47b2808972140653aae4c | true | true | 2024-08-22 | 2024-07-05 | true | false | dfurman/Qwen2-72B-Orpo-v0.1 | 1 | dfurman/Qwen2-72B-Orpo-v0.1 (Merge) |
🔶 | Undi95/MG-FinalMix-72B | 43.28 | 80.14 | 0.8 | 57.5 | 0.7 | 33.61 | 0.34 | 18.01 | 0.39 | 21.22 | 0.48 | 49.19 | 0.54 | 🔶 fine-tuned on domain-specific datasets | Qwen2ForCausalLM | Original | bfloat16 | false | other | 72 | 3 | true | 6c9c2f5d052495dcd49f44bf5623d21210653c65 | true | true | 2024-07-13 | 2024-06-25 | true | false | Undi95/MG-FinalMix-72B | 1 | Undi95/MG-FinalMix-72B (Merge) |
💬 | Qwen/Qwen2-72B-Instruct | 42.49 | 79.89 | 0.8 | 57.48 | 0.7 | 35.12 | 0.35 | 16.33 | 0.37 | 17.17 | 0.46 | 48.92 | 0.54 | 💬 chat models (RLHF, DPO, IFT, ...) | Qwen2ForCausalLM | Original | bfloat16 | true | other | 72 | 672 | true | 1af63c698f59c4235668ec9c1395468cb7cd7e79 | true | true | 2024-06-26 | 2024-05-28 | false | true | Qwen/Qwen2-72B-Instruct | 1 | Qwen/Qwen2-72B |
🔶 | abacusai/Dracarys-72B-Instruct | 42.37 | 78.56 | 0.79 | 56.94 | 0.69 | 33.61 | 0.34 | 18.79 | 0.39 | 16.81 | 0.46 | 49.51 | 0.55 | 🔶 fine-tuned on domain-specific datasets | Qwen2ForCausalLM | Original | bfloat16 | true | other | 72 | 17 | true | 10cabc4beb57a69df51533f65e39a7ad22821370 | true | true | 2024-08-16 | 2024-08-14 | true | true | abacusai/Dracarys-72B-Instruct | 0 | abacusai/Dracarys-72B-Instruct |
🔶 | VAGOsolutions/Llama-3.1-SauerkrautLM-70b-Instruct | 42.24 | 86.56 | 0.87 | 57.24 | 0.7 | 29.91 | 0.3 | 12.19 | 0.34 | 19.39 | 0.47 | 48.17 | 0.53 | 🔶 fine-tuned on domain-specific datasets | LlamaForCausalLM | Original | bfloat16 | true | llama3.1 | 70 | 15 | true | e8e74aa789243c25a3a8f7565780a402f5050bbb | true | true | 2024-08-26 | 2024-07-29 | true | false | VAGOsolutions/Llama-3.1-SauerkrautLM-70b-Instruct | 0 | VAGOsolutions/Llama-3.1-SauerkrautLM-70b-Instruct |
💬 | anthracite-org/magnum-v1-72b | 42.21 | 76.06 | 0.76 | 57.65 | 0.7 | 35.27 | 0.35 | 18.79 | 0.39 | 15.62 | 0.45 | 49.85 | 0.55 | 💬 chat models (RLHF, DPO, IFT, ...) | Qwen2ForCausalLM | Original | bfloat16 | true | other | 72 | 160 | true | f8f85021bace7e8250ed8559c5b78b8b34f0c4cc | true | true | 2024-09-21 | 2024-06-17 | true | false | anthracite-org/magnum-v1-72b | 2 | Qwen/Qwen2-72B |
💬 | alpindale/magnum-72b-v1 | 42.17 | 76.06 | 0.76 | 57.65 | 0.7 | 35.27 | 0.35 | 18.79 | 0.39 | 15.62 | 0.45 | 49.64 | 0.55 | 💬 chat models (RLHF, DPO, IFT, ...) | Qwen2ForCausalLM | Original | bfloat16 | true | other | 72 | 160 | true | fef27e0f235ae8858b84b765db773a2a954110dd | true | true | 2024-07-25 | 2024-06-17 | true | false | alpindale/magnum-72b-v1 | 2 | Qwen/Qwen2-72B |
💬 | meta-llama/Meta-Llama-3.1-70B-Instruct | 41.74 | 86.69 | 0.87 | 55.93 | 0.69 | 28.02 | 0.28 | 14.21 | 0.36 | 17.69 | 0.46 | 47.88 | 0.53 | 💬 chat models (RLHF, DPO, IFT, ...) | LlamaForCausalLM | Original | bfloat16 | true | llama3.1 | 70 | 615 | true | b9461463b511ed3c0762467538ea32cf7c9669f2 | true | true | 2024-08-15 | 2024-07-16 | true | true | meta-llama/Meta-Llama-3.1-70B-Instruct | 1 | meta-llama/Meta-Llama-3.1-70B |
🔶 | dnhkng/RYS-Llama3.1-Large | 41.6 | 84.92 | 0.85 | 55.41 | 0.69 | 28.4 | 0.28 | 16.55 | 0.37 | 17.09 | 0.46 | 47.21 | 0.52 | 🔶 fine-tuned on domain-specific datasets | LlamaForCausalLM | Original | bfloat16 | true | mit | 81 | 1 | true | 52cc979de78155b33689efa48f52a8aab184bd86 | true | true | 2024-08-22 | 2024-08-11 | true | false | dnhkng/RYS-Llama3.1-Large | 0 | dnhkng/RYS-Llama3.1-Large |
🔶 | rombodawg/Rombos-LLM-V2.6-Nemotron-70b | 41.49 | 75.27 | 0.75 | 55.81 | 0.69 | 30.59 | 0.31 | 20.81 | 0.41 | 18.39 | 0.47 | 48.1 | 0.53 | 🔶 fine-tuned on domain-specific datasets | LlamaForCausalLM | Original | bfloat16 | true | llama3.1 | 70 | 1 | true | 951c9cdf68d6e679c78625d1a1f396eb71cdf746 | true | true | 2024-10-17 | 2024-10-17 | false | false | rombodawg/Rombos-LLM-V2.6-Nemotron-70b | 0 | rombodawg/Rombos-LLM-V2.6-Nemotron-70b |
💬 | anthracite-org/magnum-v2-72b | 41.15 | 75.6 | 0.76 | 57.85 | 0.7 | 31.65 | 0.32 | 18.12 | 0.39 | 14.18 | 0.44 | 49.51 | 0.55 | 💬 chat models (RLHF, DPO, IFT, ...) | Qwen2ForCausalLM | Original | bfloat16 | true | other | 72 | 31 | true | c9c5826ef42b9fcc8a8e1079be574481cf0b6cc6 | true | true | 2024-09-05 | 2024-08-18 | true | false | anthracite-org/magnum-v2-72b | 2 | Qwen/Qwen2-72B |
💬 | abacusai/Smaug-Qwen2-72B-Instruct | 41.08 | 78.25 | 0.78 | 56.27 | 0.69 | 35.35 | 0.35 | 14.88 | 0.36 | 15.18 | 0.44 | 46.56 | 0.52 | 💬 chat models (RLHF, DPO, IFT, ...) | Qwen2ForCausalLM | Original | bfloat16 | true | other | 72 | 8 | true | af015925946d0c60ef69f512c3b35f421cf8063d | true | true | 2024-07-29 | 2024-06-26 | true | true | abacusai/Smaug-Qwen2-72B-Instruct | 0 | abacusai/Smaug-Qwen2-72B-Instruct |
🤝 | paulml/ECE-ILAB-Q1 | 40.93 | 78.65 | 0.79 | 53.7 | 0.67 | 26.13 | 0.26 | 18.23 | 0.39 | 18.81 | 0.46 | 50.06 | 0.55 | 🤝 base merges and moerges | Qwen2ForCausalLM | Original | bfloat16 | false | other | 72 | 0 | true | 393bea0ee85e4c752acd5fd77ce07f577fc13bd9 | true | true | 2024-09-16 | 2024-06-06 | false | false | paulml/ECE-ILAB-Q1 | 0 | paulml/ECE-ILAB-Q1 |
🔶 | KSU-HW-SEC/Llama3.1-70b-SVA-FT-1000step | 40.33 | 72.38 | 0.72 | 55.49 | 0.69 | 29.61 | 0.3 | 19.46 | 0.4 | 17.83 | 0.46 | 47.24 | 0.53 | 🔶 fine-tuned on domain-specific datasets | LlamaForCausalLM | Original | bfloat16 | true | null | 70 | 0 | false | b195fea0d8f350ff29243d4e88654b1baa5af79e | true | true | 2024-09-08 | null | false | false | KSU-HW-SEC/Llama3.1-70b-SVA-FT-1000step | 0 | Removed |
💬 | MaziyarPanahi/calme-2.3-llama3.1-70b | 40.3 | 86.05 | 0.86 | 55.59 | 0.69 | 21.45 | 0.21 | 12.53 | 0.34 | 17.74 | 0.46 | 48.48 | 0.54 | 💬 chat models (RLHF, DPO, IFT, ...) | LlamaForCausalLM | Original | bfloat16 | true | null | 70 | 3 | false | a39c79250721b75beefa1b1763895eafd010f6f6 | true | true | 2024-09-18 | 2024-09-10 | true | false | MaziyarPanahi/calme-2.3-llama3.1-70b | 2 | meta-llama/Meta-Llama-3.1-70B |
💬 | upstage/solar-pro-preview-instruct | 39.61 | 84.16 | 0.84 | 54.82 | 0.68 | 20.09 | 0.2 | 16.11 | 0.37 | 15.01 | 0.44 | 47.48 | 0.53 | 💬 chat models (RLHF, DPO, IFT, ...) | SolarForCausalLM | Original | bfloat16 | true | mit | 22 | 407 | true | b4db141b5fb08b23f8bc323bc34e2cff3e9675f8 | true | true | 2024-09-11 | 2024-09-09 | true | true | upstage/solar-pro-preview-instruct | 0 | upstage/solar-pro-preview-instruct |
🔶 | pankajmathur/orca_mini_v7_72b | 39.06 | 59.3 | 0.59 | 55.06 | 0.68 | 26.44 | 0.26 | 18.01 | 0.39 | 24.21 | 0.51 | 51.35 | 0.56 | 🔶 fine-tuned on domain-specific datasets | Qwen2ForCausalLM | Original | bfloat16 | true | apache-2.0 | 72 | 11 | true | 447f11912cfa496e32e188a55214043a05760d3a | true | true | 2024-06-26 | 2024-06-26 | false | false | pankajmathur/orca_mini_v7_72b | 0 | pankajmathur/orca_mini_v7_72b |
💬 | MaziyarPanahi/calme-2.1-qwen2.5-72b | 38.38 | 86.62 | 0.87 | 61.66 | 0.73 | 2.27 | 0.02 | 15.1 | 0.36 | 13.3 | 0.43 | 51.32 | 0.56 | 💬 chat models (RLHF, DPO, IFT, ...) | Qwen2ForCausalLM | Original | bfloat16 | true | other | 72 | 1 | true | eb6c92dec932070ea872f39469ca5b9daf2d34e6 | true | true | 2024-09-26 | 2024-09-19 | true | false | MaziyarPanahi/calme-2.1-qwen2.5-72b | 1 | Qwen/Qwen2.5-72B |
🤝 | gbueno86/Meta-LLama-3-Cat-Smaug-LLama-70b | 38.27 | 80.72 | 0.81 | 51.51 | 0.67 | 26.81 | 0.27 | 10.29 | 0.33 | 15 | 0.44 | 45.28 | 0.51 | 🤝 base merges and moerges | LlamaForCausalLM | Original | bfloat16 | false | llama3 | 70 | 1 | true | 2d73b7e1c7157df482555944d6a6b1362bc6c3c5 | true | true | 2024-06-27 | 2024-05-24 | true | false | gbueno86/Meta-LLama-3-Cat-Smaug-LLama-70b | 1 | gbueno86/Meta-LLama-3-Cat-Smaug-LLama-70b (Merge) |
💬 | Qwen/Qwen2.5-72B-Instruct | 38.21 | 86.38 | 0.86 | 61.87 | 0.73 | 1.21 | 0.01 | 16.67 | 0.38 | 11.74 | 0.42 | 51.4 | 0.56 | 💬 chat models (RLHF, DPO, IFT, ...) | Qwen2ForCausalLM | Original | bfloat16 | true | other | 72 | 347 | true | a13fff9ad76700c7ecff2769f75943ba8395b4a7 | true | true | 2024-10-16 | 2024-09-16 | true | true | Qwen/Qwen2.5-72B-Instruct | 1 | Qwen/Qwen2.5-72B |
💬 | MaziyarPanahi/calme-2.2-qwen2.5-72b | 38.01 | 84.77 | 0.85 | 61.8 | 0.73 | 3.63 | 0.04 | 14.54 | 0.36 | 12.02 | 0.42 | 51.31 | 0.56 | 💬 chat models (RLHF, DPO, IFT, ...) | Qwen2ForCausalLM | Original | bfloat16 | true | other | 72 | 5 | true | c6c7fdf70d8bf81364108975eb8ba78eecac83d4 | true | true | 2024-09-26 | 2024-09-19 | true | false | MaziyarPanahi/calme-2.2-qwen2.5-72b | 1 | Qwen/Qwen2.5-72B |
💬 | MaziyarPanahi/calme-2.2-llama3-70b | 37.98 | 82.08 | 0.82 | 48.57 | 0.64 | 22.96 | 0.23 | 12.19 | 0.34 | 15.3 | 0.44 | 46.74 | 0.52 | 💬 chat models (RLHF, DPO, IFT, ...) | LlamaForCausalLM | Original | bfloat16 | true | llama3 | 70 | 17 | true | 95366b974baedee4d95c1e841bc3d15e94753804 | true | true | 2024-06-26 | 2024-04-27 | true | false | MaziyarPanahi/calme-2.2-llama3-70b | 2 | meta-llama/Meta-Llama-3-70B |
🟢 | Qwen/Qwen2.5-72B | 37.94 | 41.37 | 0.41 | 54.62 | 0.68 | 36.1 | 0.36 | 20.69 | 0.41 | 19.64 | 0.48 | 55.2 | 0.6 | 🟢 pretrained | Qwen2ForCausalLM | Original | bfloat16 | true | other | 72 | 33 | true | 587cc4061cf6a7cc0d429d05c109447e5cf063af | true | true | 2024-09-19 | 2024-09-15 | false | true | Qwen/Qwen2.5-72B | 0 | Qwen/Qwen2.5-72B |
🔶 | VAGOsolutions/Llama-3-SauerkrautLM-70b-Instruct | 37.82 | 80.45 | 0.8 | 52.03 | 0.67 | 21.68 | 0.22 | 10.4 | 0.33 | 13.54 | 0.43 | 48.8 | 0.54 | 🔶 fine-tuned on domain-specific datasets | LlamaForCausalLM | Original | bfloat16 | true | other | 70 | 21 | true | 707cfd1a93875247c0223e0c7e3d86d58c432318 | true | true | 2024-06-26 | 2024-04-24 | true | false | VAGOsolutions/Llama-3-SauerkrautLM-70b-Instruct | 0 | VAGOsolutions/Llama-3-SauerkrautLM-70b-Instruct |
🌸 | Qwen/Qwen2-VL-72B-Instruct | 37.69 | 59.82 | 0.6 | 56.31 | 0.69 | 23.34 | 0.23 | 18.34 | 0.39 | 15.89 | 0.45 | 52.41 | 0.57 | 🌸 multimodal | Qwen2VLForConditionalGeneration | Original | bfloat16 | true | other | 73 | 137 | true | f400120e59a6196b024298b7d09fb517f742db7d | true | true | 2024-10-20 | 2024-09-17 | true | true | Qwen/Qwen2-VL-72B-Instruct | 0 | Qwen/Qwen2-VL-72B-Instruct |
🟢 | Qwen/Qwen2.5-32B | 37.54 | 40.77 | 0.41 | 53.95 | 0.68 | 32.85 | 0.33 | 21.59 | 0.41 | 22.7 | 0.5 | 53.39 | 0.58 | 🟢 pretrained | Qwen2ForCausalLM | Original | bfloat16 | true | apache-2.0 | 32 | 22 | true | ff23665d01c3665be5fdb271d18a62090b65c06d | true | true | 2024-09-19 | 2024-09-15 | false | true | Qwen/Qwen2.5-32B | 0 | Qwen/Qwen2.5-32B |
💬 | ssmits/Qwen2.5-95B-Instruct | 37.43 | 84.31 | 0.84 | 58.53 | 0.7 | 6.04 | 0.06 | 15.21 | 0.36 | 13.61 | 0.43 | 46.85 | 0.52 | 💬 chat models (RLHF, DPO, IFT, ...) | Qwen2ForCausalLM | Original | bfloat16 | true | other | 94 | 2 | true | 9c0e7df57a4fcf4d364efd916a0fc0abdd2d20a3 | true | true | 2024-09-26 | 2024-09-24 | true | false | ssmits/Qwen2.5-95B-Instruct | 1 | ssmits/Qwen2.5-95B-Instruct (Merge) |
🤝 | mlabonne/BigQwen2.5-52B-Instruct | 37.42 | 79.29 | 0.79 | 59.81 | 0.71 | 17.82 | 0.18 | 6.94 | 0.3 | 10.45 | 0.41 | 50.22 | 0.55 | 🤝 base merges and moerges | Qwen2ForCausalLM | Original | bfloat16 | false | apache-2.0 | 52 | 2 | true | 425b9bffc9871085cc0d42c34138ce776f96ba02 | true | true | 2024-09-25 | 2024-09-23 | true | true | mlabonne/BigQwen2.5-52B-Instruct | 1 | mlabonne/BigQwen2.5-52B-Instruct (Merge) |
💬 | NousResearch/Hermes-3-Llama-3.1-70B | 37.31 | 76.61 | 0.77 | 53.77 | 0.68 | 13.75 | 0.14 | 14.88 | 0.36 | 23.43 | 0.49 | 41.41 | 0.47 | 💬 chat models (RLHF, DPO, IFT, ...) | LlamaForCausalLM | Original | bfloat16 | true | llama3 | 70 | 82 | true | 093242c69a91f8d9d5b8094c380b88772f9bd7f8 | true | true | 2024-08-28 | 2024-07-29 | true | true | NousResearch/Hermes-3-Llama-3.1-70B | 1 | meta-llama/Meta-Llama-3.1-70B |
💬 | MaziyarPanahi/calme-2.3-llama3-70b | 36.84 | 80.1 | 0.8 | 48.01 | 0.64 | 21.9 | 0.22 | 11.74 | 0.34 | 12.57 | 0.43 | 46.72 | 0.52 | 💬 chat models (RLHF, DPO, IFT, ...) | LlamaForCausalLM | Original | bfloat16 | true | llama3 | 70 | 3 | true | bd17453eaae0e36d1e1e17da13fdd155fce91a29 | true | true | 2024-08-30 | 2024-04-27 | true | false | MaziyarPanahi/calme-2.3-llama3-70b | 2 | meta-llama/Meta-Llama-3-70B |
🔶 | ValiantLabs/Llama3-70B-Fireplace | 36.82 | 77.74 | 0.78 | 49.56 | 0.65 | 19.64 | 0.2 | 13.98 | 0.35 | 16.77 | 0.44 | 43.25 | 0.49 | 🔶 fine-tuned on domain-specific datasets | LlamaForCausalLM | Original | float16 | true | llama3 | 70 | 3 | true | 220079e4115733991eb19c30d5480db9696a665e | true | true | 2024-06-26 | 2024-05-09 | true | false | ValiantLabs/Llama3-70B-Fireplace | 0 | ValiantLabs/Llama3-70B-Fireplace |
🔶 | BAAI/Infinity-Instruct-7M-Gen-Llama3_1-70B | 36.79 | 73.35 | 0.73 | 52.5 | 0.67 | 21.07 | 0.21 | 16.78 | 0.38 | 16.97 | 0.45 | 40.08 | 0.46 | 🔶 fine-tuned on domain-specific datasets | LlamaForCausalLM | Original | bfloat16 | true | llama3.1 | 70 | 14 | true | 1ef63c4993a8c723c9695c827295c17080a64435 | true | true | 2024-09-26 | 2024-07-25 | true | false | BAAI/Infinity-Instruct-7M-Gen-Llama3_1-70B | 0 | BAAI/Infinity-Instruct-7M-Gen-Llama3_1-70B |
💬 | tenyx/Llama3-TenyxChat-70B | 36.54 | 80.87 | 0.81 | 49.62 | 0.65 | 22.66 | 0.23 | 6.82 | 0.3 | 12.52 | 0.43 | 46.78 | 0.52 | 💬 chat models (RLHF, DPO, IFT, ...) | LlamaForCausalLM | Original | bfloat16 | true | llama3 | 70 | 63 | true | a85d31e3af8fcc847cc9169f1144cf02f5351fab | true | true | 2024-08-04 | 2024-04-26 | true | false | tenyx/Llama3-TenyxChat-70B | 0 | tenyx/Llama3-TenyxChat-70B |
💬 | MaziyarPanahi/calme-2.2-llama3.1-70b | 36.39 | 85.93 | 0.86 | 54.21 | 0.68 | 2.11 | 0.02 | 9.96 | 0.32 | 17.07 | 0.45 | 49.05 | 0.54 | 💬 chat models (RLHF, DPO, IFT, ...) | LlamaForCausalLM | Original | bfloat16 | true | null | 70 | 2 | false | c81ac05ed2c2344e9fd366cfff197da406ef5234 | true | true | 2024-09-09 | 2024-09-09 | true | false | MaziyarPanahi/calme-2.2-llama3.1-70b | 2 | meta-llama/Meta-Llama-3.1-70B |
π€ | gbueno86/Brinebreath-Llama-3.1-70B | 36.29 | 55.33 | 0.55 | 55.46 | 0.69 | 29.98 | 0.3 | 12.86 | 0.35 | 17.49 | 0.45 | 46.62 | 0.52 | π€ base merges and moerges | LlamaForCausalLM | Original | bfloat16 | false | llama3.1 | 70 | 1 | true | c508ecf356167e8c498c6fa3937ba30a82208983 | true | true | 2024-08-29 | 2024-08-23 | true | false | gbueno86/Brinebreath-Llama-3.1-70B | 1 | gbueno86/Brinebreath-Llama-3.1-70B (Merge) |
π¬ | meta-llama/Meta-Llama-3-70B-Instruct | 36.18 | 80.99 | 0.81 | 50.19 | 0.65 | 23.34 | 0.23 | 4.92 | 0.29 | 10.92 | 0.42 | 46.74 | 0.52 | π¬ chat models (RLHF, DPO, IFT, ...) | LlamaForCausalLM | Original | bfloat16 | true | llama3 | 70 | 1,415 | true | 7129260dd854a80eb10ace5f61c20324b472b31c | true | true | 2024-06-12 | 2024-04-17 | true | true | meta-llama/Meta-Llama-3-70B-Instruct | 1 | meta-llama/Meta-Llama-3-70B |
💬 | Qwen/Qwen2.5-32B-Instruct | 36.17 | 83.46 | 0.83 | 56.49 | 0.69 | 0 | 0 | 11.74 | 0.34 | 13.5 | 0.43 | 51.85 | 0.57 | 💬 chat models (RLHF, DPO, IFT, ...) | Qwen2ForCausalLM | Original | bfloat16 | true | apache-2.0 | 32 | 99 | true | 70e8dfb9ad18a7d499f765fe206ff065ed8ca197 | true | true | 2024-09-19 | 2024-09-17 | true | true | Qwen/Qwen2.5-32B-Instruct | 1 | Qwen/Qwen2.5-32B |
🔶 | flammenai/Mahou-1.5-llama3.1-70B | 36.04 | 71.47 | 0.71 | 52.37 | 0.67 | 13.14 | 0.13 | 13.87 | 0.35 | 23.71 | 0.5 | 41.66 | 0.47 | 🔶 fine-tuned on domain-specific datasets | LlamaForCausalLM | Original | bfloat16 | true | llama3.1 | 70 | 4 | true | 49f45cc4c21e2ba7ed5c5e71f90ffd0bd9169e2d | true | true | 2024-10-14 | 2024-10-14 | true | false | flammenai/Mahou-1.5-llama3.1-70B | 1 | flammenai/Mahou-1.5-llama3.1-70B (Merge) |
🤝 | nisten/franqwenstein-35b | 35.94 | 37.99 | 0.38 | 52.23 | 0.66 | 30.29 | 0.3 | 20.47 | 0.4 | 22.12 | 0.49 | 52.56 | 0.57 | 🤝 base merges and moerges | Qwen2ForCausalLM | Original | float16 | true | mit | 34 | 5 | true | 7180aa73e82945a1d2ae0eb304508e21d57e4c27 | true | true | 2024-10-03 | 2024-10-03 | false | false | nisten/franqwenstein-35b | 1 | nisten/franqwenstein-35b (Merge) |
🔶 | rombodawg/Rombos-LLM-V2.6-Qwen-14b | 35.89 | 52.14 | 0.52 | 49.22 | 0.65 | 28.85 | 0.29 | 17 | 0.38 | 19.26 | 0.48 | 48.85 | 0.54 | 🔶 fine-tuned on domain-specific datasets | Qwen2ForCausalLM | Original | bfloat16 | true | apache-2.0 | 14 | 26 | true | 887910d75a1837b8b8c7c3e50a257517d286ec60 | true | true | 2024-10-13 | 2024-10-12 | false | false | rombodawg/Rombos-LLM-V2.6-Qwen-14b | 1 | rombodawg/Rombos-LLM-V2.6-Qwen-14b (Merge) |
🔶 | BAAI/Infinity-Instruct-3M-0625-Llama3-70B | 35.88 | 74.42 | 0.74 | 52.03 | 0.67 | 16.31 | 0.16 | 14.32 | 0.36 | 18.34 | 0.46 | 39.85 | 0.46 | 🔶 fine-tuned on domain-specific datasets | LlamaForCausalLM | Original | float16 | true | apache-2.0 | 70 | 3 | true | 6d8ceada57e55cff3503191adc4d6379ff321fe2 | true | true | 2024-08-30 | 2024-07-09 | true | false | BAAI/Infinity-Instruct-3M-0625-Llama3-70B | 0 | BAAI/Infinity-Instruct-3M-0625-Llama3-70B |
🔶 | KSU-HW-SEC/Llama3-70b-SVA-FT-1415 | 35.8 | 61.8 | 0.62 | 51.33 | 0.67 | 20.09 | 0.2 | 16.67 | 0.38 | 17.8 | 0.46 | 47.14 | 0.52 | 🔶 fine-tuned on domain-specific datasets | LlamaForCausalLM | Original | bfloat16 | true | null | 70 | 0 | false | 1c09728455567898116d2d9cfb6cbbbbd4ee730c | true | true | 2024-09-08 | null | false | false | KSU-HW-SEC/Llama3-70b-SVA-FT-1415 | 0 | Removed |
🔶 | failspy/llama-3-70B-Instruct-abliterated | 35.79 | 80.23 | 0.8 | 48.94 | 0.65 | 23.72 | 0.24 | 5.26 | 0.29 | 10.53 | 0.41 | 46.06 | 0.51 | 🔶 fine-tuned on domain-specific datasets | LlamaForCausalLM | Original | bfloat16 | true | llama3 | 70 | 87 | true | 53ae9dafe8b3d163e05d75387575f8e9f43253d0 | true | true | 2024-07-03 | 2024-05-07 | true | false | failspy/llama-3-70B-Instruct-abliterated | 0 | failspy/llama-3-70B-Instruct-abliterated |
💬 | dnhkng/RYS-Llama-3-Large-Instruct | 35.78 | 80.51 | 0.81 | 49.67 | 0.65 | 21.83 | 0.22 | 5.26 | 0.29 | 11.45 | 0.42 | 45.97 | 0.51 | 💬 chat models (RLHF, DPO, IFT, ...) | LlamaForCausalLM | Original | bfloat16 | true | mit | 73 | 1 | true | 01e3208aaf7bf6d2b09737960c701ec6628977fe | true | true | 2024-08-07 | 2024-08-06 | true | false | dnhkng/RYS-Llama-3-Large-Instruct | 0 | dnhkng/RYS-Llama-3-Large-Instruct |
🔶 | KSU-HW-SEC/Llama3-70b-SVA-FT-final | 35.78 | 61.65 | 0.62 | 51.33 | 0.67 | 20.09 | 0.2 | 16.67 | 0.38 | 17.8 | 0.46 | 47.14 | 0.52 | 🔶 fine-tuned on domain-specific datasets | LlamaForCausalLM | Original | bfloat16 | true | null | 70 | 0 | false | 391bbd94173b34975d1aa2c7356977a630253b75 | true | true | 2024-09-08 | null | false | false | KSU-HW-SEC/Llama3-70b-SVA-FT-final | 0 | Removed |
💬 | tanliboy/lambda-qwen2.5-32b-dpo-test | 35.75 | 80.84 | 0.81 | 54.41 | 0.68 | 0 | 0 | 14.21 | 0.36 | 13.33 | 0.43 | 51.74 | 0.57 | 💬 chat models (RLHF, DPO, IFT, ...) | Qwen2ForCausalLM | Original | bfloat16 | true | apache-2.0 | 32 | 3 | true | 675b60d6e859455a6139e6e284bbe1844b8ddf46 | true | true | 2024-09-30 | 2024-09-22 | true | false | tanliboy/lambda-qwen2.5-32b-dpo-test | 2 | Qwen/Qwen2.5-32B |
🔶 | flammenai/Llama3.1-Flammades-70B | 35.74 | 70.58 | 0.71 | 52.55 | 0.67 | 13.37 | 0.13 | 13.87 | 0.35 | 22.35 | 0.49 | 41.69 | 0.48 | 🔶 fine-tuned on domain-specific datasets | LlamaForCausalLM | Original | bfloat16 | true | llama3.1 | 70 | 1 | true | 48909a734460e667e3a7e91bd25f124ec3b2ba74 | true | true | 2024-10-13 | 2024-10-12 | true | false | flammenai/Llama3.1-Flammades-70B | 1 | flammenai/Llama3.1-Flammades-70B (Merge) |
🔶 | mlabonne/Hermes-3-Llama-3.1-70B-lorablated | 35.7 | 71.44 | 0.71 | 52.34 | 0.66 | 13.82 | 0.14 | 13.2 | 0.35 | 22.02 | 0.48 | 41.37 | 0.47 | 🔶 fine-tuned on domain-specific datasets | LlamaForCausalLM | Original | bfloat16 | false | null | 70 | 16 | true | 4303ff3b524418e9aa5e787d60595a44a6173b02 | true | true | 2024-10-12 | 2024-08-16 | true | true | mlabonne/Hermes-3-Llama-3.1-70B-lorablated | 1 | mlabonne/Hermes-3-Llama-3.1-70B-lorablated (Merge) |
🔶 | nbeerbower/Llama3.1-Gutenberg-Doppel-70B | 35.68 | 70.92 | 0.71 | 52.56 | 0.67 | 13.75 | 0.14 | 12.64 | 0.34 | 22.68 | 0.49 | 41.52 | 0.47 | 🔶 fine-tuned on domain-specific datasets | LlamaForCausalLM | Original | bfloat16 | true | llama3.1 | 70 | 3 | true | 5de156e97f776ce1b88ce5b2e2dc1e7709205a82 | true | true | 2024-10-12 | 2024-10-11 | true | false | nbeerbower/Llama3.1-Gutenberg-Doppel-70B | 1 | nbeerbower/Llama3.1-Gutenberg-Doppel-70B (Merge) |
🔶 | KSU-HW-SEC/Llama3-70b-SVA-FT-500 | 35.61 | 61.05 | 0.61 | 51.89 | 0.67 | 19.34 | 0.19 | 17.45 | 0.38 | 16.99 | 0.45 | 46.97 | 0.52 | 🔶 fine-tuned on domain-specific datasets | LlamaForCausalLM | Original | bfloat16 | true | null | 70 | 0 | false | 856a23f28aeada23d1135c86a37e05524307e8ed | true | true | 2024-09-08 | null | false | false | KSU-HW-SEC/Llama3-70b-SVA-FT-500 | 0 | Removed |
🔶 | cognitivecomputations/dolphin-2.9.2-qwen2-72b | 35.42 | 63.44 | 0.63 | 47.7 | 0.63 | 18.66 | 0.19 | 16 | 0.37 | 17.04 | 0.45 | 49.68 | 0.55 | 🔶 fine-tuned on domain-specific datasets | Qwen2ForCausalLM | Original | bfloat16 | true | other | 72 | 60 | true | e79582577c2bf2af304221af0e8308b7e7d46ca1 | true | true | 2024-10-20 | 2024-05-27 | true | true | cognitivecomputations/dolphin-2.9.2-qwen2-72b | 1 | Qwen/Qwen2-72B |
🔶 | cloudyu/Llama-3-70Bx2-MOE | 35.35 | 54.82 | 0.55 | 51.42 | 0.66 | 19.86 | 0.2 | 19.13 | 0.39 | 20.85 | 0.48 | 46.02 | 0.51 | 🔶 fine-tuned on domain-specific datasets | MixtralForCausalLM | Original | bfloat16 | true | llama3 | 126 | 1 | true | b8bd85e8db8e4ec352b93441c92e0ae1334bf5a7 | true | false | 2024-06-27 | 2024-05-20 | false | false | cloudyu/Llama-3-70Bx2-MOE | 0 | cloudyu/Llama-3-70Bx2-MOE |
🔶 | Sao10K/L3-70B-Euryale-v2.1 | 35.35 | 73.84 | 0.74 | 48.7 | 0.65 | 20.85 | 0.21 | 10.85 | 0.33 | 12.25 | 0.42 | 45.6 | 0.51 | 🔶 fine-tuned on domain-specific datasets | LlamaForCausalLM | Original | bfloat16 | true | cc-by-nc-4.0 | 70 | 115 | true | 36ad832b771cd783ea7ad00ed39e61f679b1a7c6 | true | true | 2024-07-01 | 2024-06-11 | true | false | Sao10K/L3-70B-Euryale-v2.1 | 0 | Sao10K/L3-70B-Euryale-v2.1 |
🤝 | allknowingroger/Qwenslerp2-14B | 35.32 | 50.07 | 0.5 | 50.3 | 0.66 | 27.95 | 0.28 | 15.77 | 0.37 | 18.88 | 0.47 | 48.92 | 0.54 | 🤝 base merges and moerges | Qwen2ForCausalLM | Original | bfloat16 | false | apache-2.0 | 14 | 1 | true | 38e902c114b5640509a8615fc2a2546e07a5fb3f | true | true | 2024-10-21 | 2024-10-19 | false | false | allknowingroger/Qwenslerp2-14B | 1 | allknowingroger/Qwenslerp2-14B (Merge) |
💬 | OpenBuddy/openbuddy-llama3.1-70b-v22.1-131k | 35.23 | 73.33 | 0.73 | 51.94 | 0.67 | 3.4 | 0.03 | 16.67 | 0.38 | 18.24 | 0.46 | 47.82 | 0.53 | 💬 chat models (RLHF, DPO, IFT, ...) | LlamaForCausalLM | Original | bfloat16 | true | other | 70 | 1 | true | 43ed945180174d79a8f6c68509161c249c884dfa | true | true | 2024-08-24 | 2024-08-21 | true | false | OpenBuddy/openbuddy-llama3.1-70b-v22.1-131k | 0 | OpenBuddy/openbuddy-llama3.1-70b-v22.1-131k |
🤝 | allknowingroger/Qwenslerp3-14B | 35.21 | 50.52 | 0.51 | 49.81 | 0.65 | 27.42 | 0.27 | 16.67 | 0.38 | 18.02 | 0.47 | 48.83 | 0.54 | 🤝 base merges and moerges | Qwen2ForCausalLM | Original | bfloat16 | false | apache-2.0 | 14 | 1 | true | ac60a6c4e224e5b52c42bebfd0cf81f920befdef | true | true | 2024-10-21 | 2024-10-19 | false | false | allknowingroger/Qwenslerp3-14B | 1 | allknowingroger/Qwenslerp3-14B (Merge) |
🔶 | migtissera/Llama-3-70B-Synthia-v3.5 | 35.2 | 60.76 | 0.61 | 49.12 | 0.65 | 18.96 | 0.19 | 18.34 | 0.39 | 23.39 | 0.49 | 40.65 | 0.47 | 🔶 fine-tuned on domain-specific datasets | LlamaForCausalLM | Original | float16 | true | llama3 | 70 | 5 | true | 8744db0bccfc18f1847633da9d29fc89b35b4190 | true | true | 2024-08-28 | 2024-05-26 | true | false | migtissera/Llama-3-70B-Synthia-v3.5 | 0 | migtissera/Llama-3-70B-Synthia-v3.5 |
💬 | OpenBuddy/openbuddy-llama3-70b-v21.2-32k | 35.18 | 70.1 | 0.7 | 49.97 | 0.65 | 18.05 | 0.18 | 12.3 | 0.34 | 18.05 | 0.46 | 42.58 | 0.48 | 💬 chat models (RLHF, DPO, IFT, ...) | LlamaForCausalLM | Original | bfloat16 | true | other | 70 | 1 | true | e79a2f16c052fc76eeafb5b51d16261b2b981d0f | true | true | 2024-09-05 | 2024-06-12 | true | false | OpenBuddy/openbuddy-llama3-70b-v21.2-32k | 0 | OpenBuddy/openbuddy-llama3-70b-v21.2-32k |
🟢 | Qwen/Qwen2-72B | 35.13 | 38.24 | 0.38 | 51.86 | 0.66 | 29.15 | 0.29 | 19.24 | 0.39 | 19.73 | 0.47 | 52.56 | 0.57 | 🟢 pretrained | Qwen2ForCausalLM | Original | bfloat16 | true | other | 72 | 189 | true | 87993795c78576318087f70b43fbf530eb7789e7 | true | true | 2024-06-26 | 2024-05-22 | false | true | Qwen/Qwen2-72B | 0 | Qwen/Qwen2-72B |
🔶 | Sao10K/L3-70B-Euryale-v2.1 | 35.11 | 72.81 | 0.73 | 49.19 | 0.65 | 20.24 | 0.2 | 10.85 | 0.33 | 12.05 | 0.42 | 45.51 | 0.51 | 🔶 fine-tuned on domain-specific datasets | LlamaForCausalLM | Original | float16 | true | cc-by-nc-4.0 | 70 | 115 | true | 36ad832b771cd783ea7ad00ed39e61f679b1a7c6 | true | true | 2024-06-26 | 2024-06-11 | true | false | Sao10K/L3-70B-Euryale-v2.1 | 0 | Sao10K/L3-70B-Euryale-v2.1 |
💬 | microsoft/Phi-3.5-MoE-instruct | 35.1 | 69.25 | 0.69 | 48.77 | 0.64 | 20.54 | 0.21 | 14.09 | 0.36 | 17.33 | 0.46 | 40.64 | 0.47 | 💬 chat models (RLHF, DPO, IFT, ...) | Phi3ForCausalLM | Original | bfloat16 | true | mit | 42 | 504 | true | 482a9ba0eb0e1fa1671e3560e009d7cec2e5147c | true | false | 2024-08-21 | 2024-08-17 | true | true | microsoft/Phi-3.5-MoE-instruct | 0 | microsoft/Phi-3.5-MoE-instruct |
💬 | Qwen/Qwen2-Math-72B-Instruct | 34.79 | 56.94 | 0.57 | 47.96 | 0.63 | 35.95 | 0.36 | 15.77 | 0.37 | 15.73 | 0.45 | 36.36 | 0.43 | 💬 chat models (RLHF, DPO, IFT, ...) | Qwen2ForCausalLM | Original | bfloat16 | true | other | 72 | 83 | true | 5c267882f3377bcfc35882f8609098a894eeeaa8 | true | true | 2024-08-19 | 2024-08-08 | true | true | Qwen/Qwen2-Math-72B-Instruct | 0 | Qwen/Qwen2-Math-72B-Instruct |
🔶 | aaditya/Llama3-OpenBioLLM-70B | 34.73 | 75.97 | 0.76 | 47.15 | 0.64 | 18.2 | 0.18 | 9.73 | 0.32 | 14.35 | 0.44 | 42.97 | 0.49 | 🔶 fine-tuned on domain-specific datasets | LlamaForCausalLM | Original | bfloat16 | true | llama3 | 70 | 340 | true | 5f79deaf38bc5f662943d304d59cb30357e8e5bd | true | true | 2024-08-30 | 2024-04-24 | true | false | aaditya/Llama3-OpenBioLLM-70B | 2 | meta-llama/Meta-Llama-3-70B |
💬 | abacusai/Smaug-Llama-3-70B-Instruct-32K | 34.72 | 77.61 | 0.78 | 49.07 | 0.65 | 21.22 | 0.21 | 6.15 | 0.3 | 12.43 | 0.42 | 41.83 | 0.48 | 💬 chat models (RLHF, DPO, IFT, ...) | LlamaForCausalLM | Original | bfloat16 | true | llama3 | 70 | 21 | true | 33840982dc253968f32ef3a534ee0e025eb97482 | true | true | 2024-08-06 | 2024-06-11 | true | true | abacusai/Smaug-Llama-3-70B-Instruct-32K | 0 | abacusai/Smaug-Llama-3-70B-Instruct-32K |
🔶 | dnhkng/RYS-XLarge2 | 34.7 | 49.02 | 0.49 | 51.55 | 0.66 | 25.38 | 0.25 | 16.55 | 0.37 | 17.05 | 0.45 | 48.65 | 0.54 | 🔶 fine-tuned on domain-specific datasets | Qwen2ForCausalLM | Original | bfloat16 | true | null | 77 | 0 | false | 3ce16c9427e93e09ce10a28fa644469d49a51113 | true | true | 2024-10-11 | null | true | false | dnhkng/RYS-XLarge2 | 0 | Removed |
🔶 | rombodawg/Rombos-LLM-V2.5-Qwen-14b | 34.52 | 58.4 | 0.58 | 49.39 | 0.65 | 15.63 | 0.16 | 16.22 | 0.37 | 18.83 | 0.47 | 48.62 | 0.54 | 🔶 fine-tuned on domain-specific datasets | Qwen2ForCausalLM | Original | bfloat16 | true | apache-2.0 | 14 | 5 | true | 834ddb1712ae6d1b232b2d5b26be658d90d23e43 | true | true | 2024-09-29 | 2024-10-06 | false | false | rombodawg/Rombos-LLM-V2.5-Qwen-14b | 1 | rombodawg/Rombos-LLM-V2.5-Qwen-14b (Merge) |
🔶 | BAAI/Infinity-Instruct-3M-0613-Llama3-70B | 34.47 | 68.21 | 0.68 | 51.33 | 0.66 | 14.88 | 0.15 | 14.43 | 0.36 | 16.53 | 0.45 | 41.44 | 0.47 | 🔶 fine-tuned on domain-specific datasets | LlamaForCausalLM | Original | bfloat16 | true | apache-2.0 | 70 | 5 | true | 9fc53668064bdda22975ca72c5a287f8241c95b3 | true | true | 2024-06-28 | 2024-06-27 | true | false | BAAI/Infinity-Instruct-3M-0613-Llama3-70B | 0 | BAAI/Infinity-Instruct-3M-0613-Llama3-70B |
💬 | dnhkng/RYS-Llama-3-Huge-Instruct | 34.37 | 76.86 | 0.77 | 49.07 | 0.65 | 21.22 | 0.21 | 1.45 | 0.26 | 11.93 | 0.42 | 45.66 | 0.51 | 💬 chat models (RLHF, DPO, IFT, ...) | LlamaForCausalLM | Original | bfloat16 | true | mit | 99 | 1 | true | cfe14a5339e88a7a89f075d9d48215d45f64acaf | true | true | 2024-08-07 | 2024-08-06 | true | false | dnhkng/RYS-Llama-3-Huge-Instruct | 0 | dnhkng/RYS-Llama-3-Huge-Instruct |
💬 | MaziyarPanahi/calme-2.1-llama3.1-70b | 34.34 | 84.34 | 0.84 | 48.55 | 0.64 | 1.44 | 0.01 | 10.4 | 0.33 | 13.72 | 0.44 | 47.58 | 0.53 | 💬 chat models (RLHF, DPO, IFT, ...) | LlamaForCausalLM | Original | bfloat16 | true | null | 70 | 4 | false | f39ad1c90b0f30379e80756d29c6533cf84c362a | true | true | 2024-07-24 | 2024-07-23 | true | false | MaziyarPanahi/calme-2.1-llama3.1-70b | 2 | meta-llama/Meta-Llama-3.1-70B |
💬 | nvidia/Llama-3.1-Nemotron-70B-Instruct-HF | 34.29 | 73.81 | 0.74 | 47.11 | 0.63 | 26.96 | 0.27 | 1.12 | 0.26 | 13.2 | 0.43 | 43.54 | 0.49 | 💬 chat models (RLHF, DPO, IFT, ...) | LlamaForCausalLM | Original | bfloat16 | true | llama3.1 | 70 | 1,038 | true | 250db5cf2323e04a6d2025a2ca2b94a95c439e88 | true | true | 2024-10-16 | 2024-10-12 | true | true | nvidia/Llama-3.1-Nemotron-70B-Instruct-HF | 2 | meta-llama/Meta-Llama-3.1-70B |
🔶 | nisten/franqwenstein-35b | 34.16 | 39.14 | 0.39 | 51.68 | 0.66 | 28.7 | 0.29 | 14.54 | 0.36 | 19.68 | 0.47 | 51.23 | 0.56 | 🔶 fine-tuned on domain-specific datasets | Qwen2ForCausalLM | Original | bfloat16 | true | mit | 34 | 5 | true | 901351a987d664a1cd7f483115a167d3ae5694ec | true | true | 2024-10-03 | 2024-10-03 | true | false | nisten/franqwenstein-35b | 1 | nisten/franqwenstein-35b (Merge) |
💬 | mistralai/Mixtral-8x22B-Instruct-v0.1 | 33.89 | 71.84 | 0.72 | 44.11 | 0.61 | 18.73 | 0.19 | 16.44 | 0.37 | 13.49 | 0.43 | 38.7 | 0.45 | 💬 chat models (RLHF, DPO, IFT, ...) | MixtralForCausalLM | Original | bfloat16 | true | apache-2.0 | 140 | 679 | true | b0c3516041d014f640267b14feb4e9a84c8e8c71 | true | false | 2024-06-12 | 2024-04-16 | true | true | mistralai/Mixtral-8x22B-Instruct-v0.1 | 1 | mistralai/Mixtral-8x22B-v0.1 |
🤝 | allknowingroger/Qwen2.5-slerp-14B | 33.87 | 49.28 | 0.49 | 49.79 | 0.65 | 20.47 | 0.2 | 15.66 | 0.37 | 19.37 | 0.47 | 48.66 | 0.54 | 🤝 base merges and moerges | Qwen2ForCausalLM | Original | bfloat16 | false | apache-2.0 | 14 | 0 | true | a44b0ea8291b62785152c2fe6ab336f5da672d1e | true | true | 2024-10-21 | 2024-10-17 | false | false | allknowingroger/Qwen2.5-slerp-14B | 1 | allknowingroger/Qwen2.5-slerp-14B (Merge) |
💬 | arcee-ai/SuperNova-Medius | 33.78 | 71.99 | 0.72 | 48.01 | 0.64 | 14.5 | 0.15 | 11.07 | 0.33 | 12.28 | 0.42 | 44.83 | 0.5 | 💬 chat models (RLHF, DPO, IFT, ...) | Qwen2ForCausalLM | Original | bfloat16 | false | apache-2.0 | 14 | 139 | true | e34fafcac2801be1ae5c7eb744e191a08119f2af | true | true | 2024-10-22 | 2024-10-02 | true | false | arcee-ai/SuperNova-Medius | 1 | arcee-ai/SuperNova-Medius (Merge) |
💬 | HuggingFaceH4/zephyr-orpo-141b-A35b-v0.1 | 33.77 | 65.11 | 0.65 | 47.5 | 0.63 | 18.35 | 0.18 | 17.11 | 0.38 | 14.72 | 0.45 | 39.85 | 0.46 | 💬 chat models (RLHF, DPO, IFT, ...) | MixtralForCausalLM | Original | float16 | true | apache-2.0 | 140 | 260 | true | a3be084543d278e61b64cd600f28157afc79ffd3 | true | true | 2024-06-12 | 2024-04-10 | true | true | HuggingFaceH4/zephyr-orpo-141b-A35b-v0.1 | 1 | mistral-community/Mixtral-8x22B-v0.1 |
🔶 | nbeerbower/Llama-3.1-Nemotron-lorablated-70B | 33.69 | 71.47 | 0.71 | 48.06 | 0.64 | 23.34 | 0.23 | 0.89 | 0.26 | 14.92 | 0.44 | 43.46 | 0.49 | 🔶 fine-tuned on domain-specific datasets | LlamaForCausalLM | Original | bfloat16 | false | llama3.1 | 70 | 2 | true | f335a582cdb7fb0e63a7343a908766ebd0ed9882 | true | true | 2024-10-18 | 2024-10-17 | true | false | nbeerbower/Llama-3.1-Nemotron-lorablated-70B | 1 | nbeerbower/Llama-3.1-Nemotron-lorablated-70B (Merge) |
🤝 | Lambent/qwen2.5-reinstruct-alternate-lumen-14B | 33.66 | 47.94 | 0.48 | 48.99 | 0.65 | 19.79 | 0.2 | 16.89 | 0.38 | 19.62 | 0.48 | 48.76 | 0.54 | 🤝 base merges and moerges | Qwen2ForCausalLM | Original | bfloat16 | true | null | 14 | 3 | false | dac3be334098338fb6c02636349e8ed53f18c4a4 | true | true | 2024-09-28 | 2024-09-23 | false | false | Lambent/qwen2.5-reinstruct-alternate-lumen-14B | 1 | Lambent/qwen2.5-reinstruct-alternate-lumen-14B (Merge) |
💬 | tanliboy/lambda-qwen2.5-14b-dpo-test | 33.52 | 82.31 | 0.82 | 48.45 | 0.64 | 0 | 0 | 14.99 | 0.36 | 12.59 | 0.43 | 42.75 | 0.48 | 💬 chat models (RLHF, DPO, IFT, ...) | Qwen2ForCausalLM | Original | bfloat16 | true | apache-2.0 | 14 | 6 | true | 96607eea3c67f14f73e576580610dba7530c5dd9 | true | true | 2024-09-20 | 2024-09-20 | true | false | tanliboy/lambda-qwen2.5-14b-dpo-test | 2 | Qwen/Qwen2.5-14B |
💬 | CohereForAI/c4ai-command-r-plus-08-2024 | 33.42 | 75.4 | 0.75 | 42.84 | 0.6 | 11.03 | 0.11 | 13.42 | 0.35 | 19.84 | 0.48 | 38.01 | 0.44 | 💬 chat models (RLHF, DPO, IFT, ...) | CohereForCausalLM | Original | float16 | true | cc-by-nc-4.0 | 103 | 152 | true | 2d8cf3ab0af78b9e43546486b096f86adf3ba4d0 | true | true | 2024-09-19 | 2024-08-21 | true | true | CohereForAI/c4ai-command-r-plus-08-2024 | 0 | CohereForAI/c4ai-command-r-plus-08-2024 |
🤝 | v000000/Qwen2.5-14B-Gutenberg-Instruct-Slerpeno | 33.39 | 48.55 | 0.49 | 49.74 | 0.65 | 19.71 | 0.2 | 15.21 | 0.36 | 18.43 | 0.47 | 48.68 | 0.54 | 🤝 base merges and moerges | Qwen2ForCausalLM | Original | bfloat16 | false | apache-2.0 | 14 | 4 | true | 1069abb4c25855e67ffaefa08a0befbb376e7ca7 | true | true | 2024-09-28 | 2024-09-20 | false | false | v000000/Qwen2.5-14B-Gutenberg-Instruct-Slerpeno | 1 | v000000/Qwen2.5-14B-Gutenberg-Instruct-Slerpeno (Merge) |
🤝 | zelk12/MT1-gemma-2-9B | 33.37 | 79.47 | 0.79 | 44.16 | 0.61 | 13.37 | 0.13 | 12.75 | 0.35 | 13.16 | 0.43 | 37.31 | 0.44 | 🤝 base merges and moerges | Gemma2ForCausalLM | Original | bfloat16 | true | null | 10 | 1 | false | 3a5e77518ca9c3c8ea2edac4c03bc220ee91f3ed | true | true | 2024-10-14 | 2024-10-12 | true | false | zelk12/MT1-gemma-2-9B | 1 | zelk12/MT1-gemma-2-9B (Merge) |
💬 | jpacifico/Chocolatine-14B-Instruct-DPO-v1.2 | 33.3 | 68.52 | 0.69 | 49.85 | 0.64 | 17.98 | 0.18 | 10.07 | 0.33 | 12.35 | 0.43 | 41.07 | 0.47 | 💬 chat models (RLHF, DPO, IFT, ...) | Phi3ForCausalLM | Original | float16 | true | mit | 13 | 10 | true | d34bbd55b48e553f28579d86f3ccae19726c6b39 | true | true | 2024-08-28 | 2024-08-12 | true | false | jpacifico/Chocolatine-14B-Instruct-DPO-v1.2 | 0 | jpacifico/Chocolatine-14B-Instruct-DPO-v1.2 |
🔶 | migtissera/Tess-v2.5.2-Qwen2-72B | 33.28 | 44.94 | 0.45 | 52.31 | 0.66 | 27.42 | 0.27 | 13.42 | 0.35 | 10.89 | 0.42 | 50.68 | 0.56 | 🔶 fine-tuned on domain-specific datasets | Qwen2ForCausalLM | Original | bfloat16 | true | other | 72 | 11 | true | 0435e634ad9bc8b1172395a535b78e6f25f3594f | true | true | 2024-08-10 | 2024-06-13 | true | false | migtissera/Tess-v2.5.2-Qwen2-72B | 0 | migtissera/Tess-v2.5.2-Qwen2-72B |
🤝 | zelk12/MT4-gemma-2-9B | 33.16 | 77.62 | 0.78 | 43.55 | 0.61 | 15.63 | 0.16 | 11.74 | 0.34 | 13 | 0.43 | 37.4 | 0.44 | 🤝 base merges and moerges | Gemma2ForCausalLM | Original | bfloat16 | true | null | 10 | 0 | false | 2167ea02baf9145a697a7d828a17c75b86e5e282 | true | true | 2024-10-20 | 2024-10-16 | true | false | zelk12/MT4-gemma-2-9B | 1 | zelk12/MT4-gemma-2-9B (Merge) |
💬 | TheTsar1209/qwen-carpmuscle-v0.2 | 33.11 | 52.57 | 0.53 | 48.18 | 0.64 | 25 | 0.25 | 14.09 | 0.36 | 12.75 | 0.43 | 46.08 | 0.51 | 💬 chat models (RLHF, DPO, IFT, ...) | Qwen2ForCausalLM | Original | bfloat16 | true | apache-2.0 | 14 | 0 | true | 081f6b067ebca9bc384af283f1d267880534b8e3 | true | true | 2024-10-19 | 2024-10-16 | true | false | TheTsar1209/qwen-carpmuscle-v0.2 | 3 | Qwen/Qwen2.5-14B |
🤝 | zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1-t0.25 | 33.06 | 77.07 | 0.77 | 43.85 | 0.61 | 14.12 | 0.14 | 12.42 | 0.34 | 13.13 | 0.43 | 37.78 | 0.44 | 🤝 base merges and moerges | Gemma2ForCausalLM | Original | bfloat16 | true | null | 10 | 1 | false | e652c9e07265526851dad994f4640aa265b9ab56 | true | true | 2024-10-04 | 2024-10-04 | true | false | zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1-t0.25 | 1 | zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1-t0.25 (Merge) |
🤝 | zelk12/MT2-gemma-2-9B | 33.03 | 78.86 | 0.79 | 44.17 | 0.61 | 13.22 | 0.13 | 12.98 | 0.35 | 11.54 | 0.42 | 37.43 | 0.44 | 🤝 base merges and moerges | Gemma2ForCausalLM | Original | bfloat16 | true | null | 10 | 1 | false | d20d7169ce0f53d586504c50b4b7dc470bf8a781 | true | true | 2024-10-15 | 2024-10-14 | true | false | zelk12/MT2-gemma-2-9B | 1 | zelk12/MT2-gemma-2-9B (Merge) |
💬 | microsoft/Phi-3-medium-4k-instruct | 32.67 | 64.23 | 0.64 | 49.38 | 0.64 | 16.99 | 0.17 | 11.52 | 0.34 | 13.05 | 0.43 | 40.84 | 0.47 | 💬 chat models (RLHF, DPO, IFT, ...) | Phi3ForCausalLM | Original | bfloat16 | true | mit | 13 | 211 | true | d194e4e74ffad5a5e193e26af25bcfc80c7f1ffc | true | true | 2024-06-12 | 2024-05-07 | true | true | microsoft/Phi-3-medium-4k-instruct | 0 | microsoft/Phi-3-medium-4k-instruct |