These binary files contain 8.4B tokens from the FineWeb "10B" sample, tokenized with TokenMonster using a subset of the English 100,256 vocabulary reduced to 50,256 tokens to match the size of the GPT-2 vocabulary.
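A minimal sketch of reading one of these token shards, assuming the llm.c / NanoGPT Speedrun shard layout (a 256-entry int32 header with a magic number, format version, and token count, followed by uint16 token IDs); verify the header values against the actual files before relying on this:

```python
import numpy as np

MAGIC = 20240520  # assumed llm.c-style magic number

def write_shard(path, tokens):
    # Build the assumed 256-int32 header: magic, version, token count.
    header = np.zeros(256, dtype=np.int32)
    header[0] = MAGIC
    header[1] = 1
    header[2] = len(tokens)
    with open(path, "wb") as f:
        f.write(header.tobytes())
        f.write(np.asarray(tokens, dtype=np.uint16).tobytes())

def read_shard(path):
    with open(path, "rb") as f:
        header = np.frombuffer(f.read(256 * 4), dtype=np.int32)
        assert header[0] == MAGIC, "unexpected magic number"
        n = int(header[2])
        tokens = np.frombuffer(f.read(n * 2), dtype=np.uint16)
    return tokens

# Round-trip a tiny synthetic shard; token IDs up to 50,255 fit in uint16.
write_shard("demo.bin", [1, 2, 50255])
toks = read_shard("demo.bin")
print(toks.tolist())  # [1, 2, 50255]
```

Because the vocabulary stays at 50,256 tokens, each token still fits in a uint16, so no changes to the Speedrun data loader's dtype should be needed.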

Motivation

You can use these tokens as a drop-in replacement for the FineWeb tokens in the NanoGPT Speedrun repository, for personal research only. Why use these tokens?

  • TokenMonster requires about 15% fewer tokens than tiktoken.
  • Experiments indicate ungreedy tokenization can be 20% more effective.

Combined, it's possible to achieve a training speed-up of almost 40%, as measured by performance on HellaSwag (completion-style), by switching to these tokens.

Vocabulary

The TokenMonster vocabulary file is named english-50256-balanced-v2.vocab and can be loaded with any version of the TokenMonster library.

This vocabulary was created by filtering down the English 100,256 balanced vocabulary, removing tokens that combine multiple words, along with other infrequent tokens that would likely be under-trained.

Disclaimer

WARNING: The original FineWeb dataset is technically infringing, like many other datasets on this platform. Fair dealing exceptions under international copyright law do not allow for dissemination to the public or redistribution. Since HuggingFace is the creator and distributor of the FineWeb dataset, as well as the main beneficiary of its communication to the public, they have chosen to take legal responsibility for its distribution. Please file legal issues against the original dataset(s) and address HuggingFace staff directly. Downloading any version of the dataset is infringement. If you have content in this dataset, you may be entitled to damages. The binary files here will be removed if and when HuggingFace takes down FineWeb. Due to the choice of format, technical restrictions, and the small number of tokens, this repository (by design) does not cause any damages beyond the original communication to the public of the upstream repository.
