psiyum committed on
Commit 630a9a7
1 Parent(s): 04a57a6

Update README.md

Files changed (1): README.md +2 -1
README.md CHANGED
@@ -3,6 +3,7 @@ library_name: transformers
 tags:
 - transformers
 - peft
+- arxiv:2406.08391
 license: llama2
 base_model: meta-llama/Llama-2-13b-hf
 datasets:
@@ -15,7 +16,7 @@ datasets:
 
 The model is fine-tuned (calibration-tuned) using a [dataset](https://huggingface.co/datasets/calibration-tuning/Llama-2-13b-hf-20k-choice) of *multiple-choice* generations from `meta-llama/Llama-2-13b-hf`, labeled for correctness.
 At test/inference time, the probability of correctness defines the confidence of the model in its answer.
-For full details, please see our [paper](https://arxiv.org/) and supporting [code](https://github.com/activatedgeek/calibration-tuning).
+For full details, please see our [paper](https://arxiv.org/abs/2406.08391) and supporting [code](https://github.com/activatedgeek/calibration-tuning).
 
 **Other Models**: We also release a broader collection of [Multiple-Choice CT Models](https://huggingface.co/collections/calibration-tuning/multiple-choice-ct-models-66043dedebf973d639090821).
 
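
For context on how the card's description translates into code, below is a minimal usage sketch: it loads the Llama-2 base model, attaches a calibration-tuned PEFT adapter, and reads a yes/no next-token probability as the model's confidence that a proposed answer is correct. The adapter repo id, prompt template, and yes/no confidence query are illustrative assumptions, not taken from this commit; see the linked paper and code for the exact recipe.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "meta-llama/Llama-2-13b-hf"
# Hypothetical adapter repo id used only for illustration.
adapter_id = "calibration-tuning/Llama-2-13b-hf-ct-choice"

tokenizer = AutoTokenizer.from_pretrained(base_id)
model = AutoModelForCausalLM.from_pretrained(
    base_id, torch_dtype=torch.float16, device_map="auto"
)
# Attach the calibration-tuned LoRA/PEFT adapter on top of the base weights.
model = PeftModel.from_pretrained(model, adapter_id)
model.eval()

# Assumed confidence query: ask the model whether a proposed answer is correct
# and treat the next-token probability of "yes" as the confidence estimate.
prompt = (
    "Question: What is the capital of France?\n"
    "Choices:\n(a) Paris\n(b) Rome\n(c) Madrid\n(d) Berlin\n"
    "Answer: (a) Paris\n"
    "Is the proposed answer correct? Answer yes or no: "
)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
with torch.no_grad():
    next_token_logits = model(**inputs).logits[0, -1]  # logits at the final position

yes_id = tokenizer("yes", add_special_tokens=False).input_ids[0]
no_id = tokenizer("no", add_special_tokens=False).input_ids[0]
confidence = torch.softmax(next_token_logits[[yes_id, no_id]], dim=-1)[0].item()
print(f"P(answer is correct): {confidence:.3f}")
```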