Technoculture/MT7Bi-alpha-dpo-v0.2

Tags: Adapters · Safetensors · English · llama · 4-bit precision · bitsandbytes
Datasets: argilla/distilabel-intel-orca-dpo-pairs, jondurbin/truthy-dpo-v0.1, argilla/distilabel-math-preference-dpo, argilla/distilabel-capybara-dpo-7k-binarized
License: mit
Branch: main
File: MT7Bi-alpha-dpo-v0.2/MT7Bi_alpha_dpo_v0_2.ipynb
Uploaded by dkshjn — commit 822794d ("Upload MT7Bi_alpha_dpo_v0_2.ipynb", verified), 4 months ago
Size: 2.87 MB