radariscs/iterative-DPO-model

Tags: Transformers · Safetensors
iterative-DPO-model · 30.9 MB
  • 1 contributor
  • History: 3 commits
  • Latest commit: c07f0ea (verified, 5 months ago) by radariscs · Upload tokenizer
  • .gitattributes (1.57 kB) · Upload tokenizer · 5 months ago
  • README.md (5.17 kB) · Upload model · 5 months ago
  • adapter_config.json (757 Bytes) · Upload model · 5 months ago
  • adapter_model.safetensors (13.6 MB) · Upload model · 5 months ago
  • special_tokens_map.json (325 Bytes) · Upload tokenizer · 5 months ago
  • tokenizer.json (17.2 MB) · Upload tokenizer · 5 months ago
  • tokenizer_config.json (54.6 kB) · Upload tokenizer · 5 months ago
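
Given the adapter_config.json and adapter_model.safetensors files in the listing above, this repository appears to hold a PEFT (LoRA-style) adapter rather than full model weights. Below is a minimal loading sketch, assuming the adapter targets a causal language model whose base checkpoint is named in adapter_config.json and is available from the Hub; the prompt and generation settings are illustrative only.

```python
# Minimal sketch: load the adapter on top of its base model and run generation.
# Assumes a causal-LM adapter; AutoPeftModelForCausalLM reads the base model
# name from adapter_config.json and downloads it automatically.
from peft import AutoPeftModelForCausalLM
from transformers import AutoTokenizer

repo_id = "radariscs/iterative-DPO-model"

model = AutoPeftModelForCausalLM.from_pretrained(repo_id)
tokenizer = AutoTokenizer.from_pretrained(repo_id)

# Illustrative prompt; adjust generation settings as needed.
inputs = tokenizer("Hello, how are you?", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```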