
python-bytes-distilgpt2

This model is not affiliated with the Python Bytes podcast in any way.

This model is a version of distilgpt2 fine-tuned on Python Bytes show notes.

It achieves the following results on the evaluation set:

  • Loss: 3.0372
  • Accuracy: 0.3969
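
For context, assuming the loss is the mean per-token cross-entropy in nats (the usual convention for causal language model evaluation with the Hugging Face Trainer; the card itself does not say) and the accuracy is top-1 next-token accuracy, the loss corresponds to a perplexity of roughly 21:

```python
import math

# The eval loss reported above, taken as mean cross-entropy per token (nats).
eval_loss = 3.0372

# Perplexity is the exponential of the cross-entropy.
perplexity = math.exp(eval_loss)
print(f"perplexity ≈ {perplexity:.2f}")  # ≈ 20.85
```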

Model description

This model generates conversation between the show's two hosts (Michael Kennedy and Brian Okken); guests sometimes make an appearance too :).
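
As a sketch of how generation might look, assuming the model is published on the Hugging Face Hub (the repo id below is a placeholder, and the prompt is only a guess at the show's transcript style):

```python
from transformers import pipeline

# Placeholder repo id; substitute the actual path of this model on the Hub.
generator = pipeline("text-generation", model="your-username/python-bytes-distilgpt2")

prompt = "Michael: Welcome back to Python Bytes."
outputs = generator(
    prompt,
    max_new_tokens=80,
    do_sample=True,
    top_p=0.95,
    temperature=0.9,
)
print(outputs[0]["generated_text"])
```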

Intended uses & limitations

This model was trained purely for educational purposes and is intended to be used in the same spirit.

Training and evaluation data

The training and evaluation data are located on GitHub.

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a sketch mapping them onto transformers.TrainingArguments follows the list):

  • learning_rate: 5e-05
  • train_batch_size: 4
  • eval_batch_size: 4
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 3.0
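
A minimal sketch of these values as transformers.TrainingArguments; the output directory and any model/data wiring are assumptions, not taken from the card:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="python-bytes-distilgpt2",  # assumed name, not from the card
    learning_rate=5e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    seed=42,
    adam_beta1=0.9,  # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=3.0,
)
```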

Training results

Framework versions

  • Transformers 4.22.0.dev0
  • Pytorch 1.12.0+cu113
  • Datasets 2.4.0
  • Tokenizers 0.12.1
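
To check whether a local environment matches these versions (exact matches matter for strict reproducibility, not for inference), a quick sketch:

```python
import datasets
import tokenizers
import torch
import transformers

# Print the installed versions for comparison with the list above.
for mod in (transformers, torch, datasets, tokenizers):
    print(mod.__name__, mod.__version__)
```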