---
license: apache-2.0
datasets:
- argilla/OpenHermes2.5-dpo-binarized-alpha
language:
- en
pipeline_tag: text-generation
tags:
- merge
- dpo
- conversation
- text-generation-inference
- Kukedlc/NeuTrixOmniBe-7B-model-remix
---

# DPO Fine-tune of Kukedlc/NeuTrixOmniBe-7B-model-remix

This model is Kukedlc/NeuTrixOmniBe-7B-model-remix fine-tuned with DPO on the argilla/OpenHermes2.5-dpo-binarized-alpha dataset.

The Argilla DPO binarized pairs dataset was built on top of https://huggingface.co/datasets/teknium/OpenHermes-2.5 using https://github.com/argilla-io/distilabel. Thanks to the authors for the great data sources.

GGUF: https://huggingface.co/eren23/dpo-binarized-NeutrixOmnibe-7B-GGUF
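For readers unfamiliar with DPO, the pairwise objective behind this kind of fine-tuning can be sketched in a few lines. This is an illustrative, dependency-free sketch for a single chosen/rejected pair; the function name and the `beta=0.1` default are illustrative assumptions, not values taken from this model's actual training configuration:

```python
import math

def dpo_loss(policy_chosen_logp, policy_rejected_logp,
             ref_chosen_logp, ref_rejected_logp, beta=0.1):
    """Illustrative DPO loss for one preference pair (beta is assumed, not
    this model's actual training value).

    Each argument is the summed log-probability of a response under the
    policy or the frozen reference model.
    """
    # Implicit reward margin: beta times the difference of the
    # policy-vs-reference log-ratios for the chosen and rejected responses.
    margin = beta * ((policy_chosen_logp - ref_chosen_logp)
                     - (policy_rejected_logp - ref_rejected_logp))
    # Loss is -log(sigmoid(margin)), computed in a numerically stable way.
    if margin >= 0:
        return math.log1p(math.exp(-margin))
    return -margin + math.log1p(math.exp(margin))
```

When the policy agrees with the reference on both responses the margin is zero and the loss is log 2; raising the chosen response's log-probability relative to the reference shrinks the loss, which is the pressure DPO applies during fine-tuning.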