---
license: mit
datasets:
  - omarmomen/babylm_10M
language:
  - en
metrics:
  - perplexity
library_name: transformers
---

# Model Card for omarmomen/ptb_filtered_lowcase_bpe_tokenizer_8

This tokenizer is part of the experiments in my master's thesis, "Linguistic Structure Induction from Language Models" (https://arxiv.org/abs/2403.09714).

"omarmomen/ptb_filtered_lowcase_bpe_tokenizer_8" is a RobertaTokenizer pretrained on the Penn Tree Bank Training dataset (uncased) with 8K tokens.

Paper: https://arxiv.org/abs/2403.09714