Checkpoints from experiments on training a character-level language model on enwik8 data. A loading sketch follows the links below.
- Originally forked from Karpathy's nanoGPT: https://github.com/karpathy/nanoGPT
- Original enwik8 dataset: https://huggingface.co/datasets/LTCB/enwik8
- Subclass categorization of the enwik8 data: https://huggingface.co/datasets/Shivamkak/enwik8-categories16
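A minimal sketch of how one of these checkpoints might be loaded, assuming it follows nanoGPT's standard `ckpt.pt` layout (a dict with `model` and `model_args` keys); `GPT` and `GPTConfig` come from `model.py` in the nanoGPT repo linked above, and the local path is hypothetical.

```python
import torch
from model import GPT, GPTConfig  # from the nanoGPT repo

ckpt_path = "ckpt.pt"  # hypothetical path to a downloaded checkpoint
device = "cuda" if torch.cuda.is_available() else "cpu"

# nanoGPT checkpoints store the weights and the model constructor args together.
checkpoint = torch.load(ckpt_path, map_location=device)
model = GPT(GPTConfig(**checkpoint["model_args"]))

# torch.compile prefixes parameter names with "_orig_mod."; strip it if present.
state_dict = {k.removeprefix("_orig_mod."): v for k, v in checkpoint["model"].items()}
model.load_state_dict(state_dict)
model.to(device).eval()
```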
The uploaded checkpoints achieved the following bits per character (BPC); a conversion sketch follows the list:
- BASE - 1.430
- DEEP - 1.359
- TIPA - 1.460
- SUBCLASS-SCIENCE - 1.075 (best result)
- POST-TRAIN-SCIENCE - 1.417
- SUBCLASS-PHILOSOPHY - 1.126
- POST-TRAIN-PHILOSOPHY - 1.390
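BPC is the average cross-entropy loss expressed in bits rather than nats. A minimal sketch of the conversion, assuming the training loss is the usual per-character cross-entropy in nats (the example value is illustrative, back-computed from the BASE score above):

```python
import math

def bpc_from_nats(nats_per_char: float) -> float:
    """Convert average cross-entropy (nats per character) to bits per character."""
    return nats_per_char / math.log(2)

print(bpc_from_nats(0.991))  # ~1.430, roughly the BASE checkpoint's BPC
```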