---
license: apache-2.0
tags:
  - generated_from_trainer
datasets:
  - pszemraj/HC3-textgen-qa
metrics:
  - accuracy
model-index:
  - name: pythia-6.9b-hc3-qa-assistant
    results:
      - task:
          name: Causal Language Modeling
          type: text-generation
        dataset:
          name: pszemraj/HC3-textgen-qa
        metrics:
          - name: Accuracy
            type: accuracy
            value: 0.6768941789814655
---

# HC3-textgen-qa-pythia-6.9b-deduped-r1

This model is a fine-tuned version of EleutherAI/pythia-6.9b-deduped on the pszemraj/HC3-textgen-qa dataset. It achieves the following results on the evaluation set:

- Loss: 1.2372
- Accuracy: 0.6769

## Model description

A text-generation model fine-tuned on the HC3 dataset of human questions paired with ChatGPT answers.
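
Below is a minimal inference sketch using 🤗 Transformers. The hub id `pszemraj/pythia-6.9b-HC3` is inferred from this repository, and the question/answer prompt format and generation settings are illustrative assumptions, not a documented interface.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Hub id inferred from this repository; adjust if your copy lives elsewhere.
model_id = "pszemraj/pythia-6.9b-HC3"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # ~14 GB of GPU memory for the 6.9B weights in fp16
    device_map="auto",          # requires `accelerate`
)

# HC3-style question prompt; the exact formatting here is an assumption.
prompt = "Q: What is the difference between a virus and a bacterium?\nA:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

with torch.no_grad():
    outputs = model.generate(
        **inputs,
        max_new_tokens=256,
        do_sample=True,
        temperature=0.7,
        top_p=0.95,
        pad_token_id=tokenizer.eos_token_id,
    )
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```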

## Intended uses & limitations

More information needed

## Training and evaluation data

The model was fine-tuned and evaluated on the pszemraj/HC3-textgen-qa dataset; the exact evaluation results are recorded in the model-index entry in this card's metadata.
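
To inspect the training data itself, the dataset id from the metadata can be passed straight to 🤗 Datasets. The split and column names are not documented here, so the sketch below simply prints whatever it finds.

```python
from datasets import load_dataset

# Dataset id taken from this card's metadata.
ds = load_dataset("pszemraj/HC3-textgen-qa")

print(ds)                           # available splits and their columns
first_split = next(iter(ds.values()))
print(first_split[0])               # peek at one raw example
```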

## Training procedure

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 1.2598        | 0.99  | 79   | 1.3291          | 0.6496   |
| 0.7446        | 1.99  | 158  | 1.2372          | 0.6769   |
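
Since training uses a causal language-modeling (cross-entropy) objective, the validation losses above can also be read as perplexities via `exp(loss)`; a quick back-of-the-envelope conversion:

```python
import math

# Validation losses copied from the table above.
for epoch, val_loss in [(0.99, 1.3291), (1.99, 1.2372)]:
    print(f"epoch {epoch}: validation perplexity ≈ {math.exp(val_loss):.2f}")
# epoch 0.99: validation perplexity ≈ 3.78
# epoch 1.99: validation perplexity ≈ 3.45
```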