## Model Sources
- Repository: https://github.com/NathanGodey/headless-lm
- Paper: https://arxiv.org/abs/2309.08351
## Model Architecture and Objective
This model uses the Pythia-70m architecture. It was trained on OpenWebText-2 with the Contrastive Weight Tying objective, then briefly fine-tuned for language generation on the same dataset.
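Since the checkpoint was fine-tuned for causal language generation, it should load with the standard `transformers` causal-LM classes. Below is a minimal usage sketch, assuming a Pythia/GPT-NeoX-style checkpoint hosted on the Hub; the model identifier is a placeholder, substitute this card's actual repository name.

```python
# Minimal generation sketch (model identifier is a placeholder, not confirmed by this card).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "NathanGodey/headless-pythia-owt2-70m-ft"  # placeholder: replace with this card's repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

inputs = tokenizer("Headless language models are", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=30, do_sample=True, top_p=0.9)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```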
## Citation
BibTeX:
    @misc{godey2023headless,
        title={Headless Language Models: Learning without Predicting with Contrastive Weight Tying},
        author={Nathan Godey and Éric de la Clergerie and Benoît Sagot},
        year={2023},
        eprint={2309.08351},
        archivePrefix={arXiv},
        primaryClass={cs.CL}
    }
## Contact