Llama3-8B-Classic-RM / config.json
{
  "architectures": [
    "FeedbackRewardModel"
  ],
  "base_model_name_or_path": "meta-llama/Meta-Llama-3-8B",
  "feedback_method": "vanilla",
  "torch_dtype": "bfloat16",
  "transformers_version": "4.40.2"
}
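A minimal sketch of reading this config programmatically. The JSON literal below is copied verbatim from the file above; note that `FeedbackRewardModel` is a custom architecture, so `transformers` alone cannot instantiate it (the repo's own modeling code would be needed) — here we only parse the declared fields.

```python
import json

# config.json contents, copied verbatim from the repo file above.
CONFIG_JSON = """
{
  "architectures": [
    "FeedbackRewardModel"
  ],
  "base_model_name_or_path": "meta-llama/Meta-Llama-3-8B",
  "feedback_method": "vanilla",
  "torch_dtype": "bfloat16",
  "transformers_version": "4.40.2"
}
"""

cfg = json.loads(CONFIG_JSON)

# The base checkpoint this reward model was initialized from,
# and the dtype the weights are stored in.
print(cfg["base_model_name_or_path"])  # meta-llama/Meta-Llama-3-8B
print(cfg["torch_dtype"])              # bfloat16
```

Because the architecture is not registered in `transformers`, loading the full model would typically go through the repository's custom code (e.g. via `trust_remote_code=True`) rather than a plain `AutoModel.from_pretrained` call — that loading path is an assumption, not stated in this file.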