# Reddit_Dirty_Writing_Prompts_ShareGPT

## Dataset Details
This is the Reddit_Dirty_Writing_Prompts dataset, which I further cleaned and sorted. The original dataset can be found at: https://huggingface.co/datasets/nothingiisreal/Reddit-Dirty-And-WritingPrompts
- Additional meticulous cleaning performed
- ShareGPT Format
- Each entry contains the number of tokens in both LLAMA1 and LLAMA3 tokenizers
- Each entry contains the number of characters
- Longest entry in tokens: TOKENS_LLAMA1: 7545 / TOKENS_LLAMA3: 6397
- Shortest entry in tokens: TOKENS_LLAMA1: 98 / TOKENS_LLAMA3: 83
- Total_TOKENS_LLAMA1: 12874614 (~12.9M), Total_TOKENS_LLAMA3: 10892913 (~10.9M)
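Since each entry carries its own token counts, you can filter the dataset to a target context window before training. Below is a minimal sketch of that idea; the `TOKENS_LLAMA1` / `TOKENS_LLAMA3` field names come from the stats above, but the exact per-entry JSON layout shown here is an assumption.

```python
# Minimal sketch: filter ShareGPT-style entries by precomputed token count.
# The entry layout (keys, nesting) is assumed, not taken from the dataset files.

sample_entries = [
    {
        "conversations": [
            {"from": "human", "value": "Write a story about ..."},
            {"from": "gpt", "value": "Once upon a time ..."},
        ],
        "TOKENS_LLAMA1": 7545,
        "TOKENS_LLAMA3": 6397,
    },
    {
        "conversations": [
            {"from": "human", "value": "A short prompt"},
            {"from": "gpt", "value": "A short reply"},
        ],
        "TOKENS_LLAMA1": 98,
        "TOKENS_LLAMA3": 83,
    },
]

def fits_context(entry, max_tokens=4096, token_key="TOKENS_LLAMA3"):
    """Keep only entries whose precomputed token count fits the window."""
    return entry[token_key] <= max_tokens

filtered = [e for e in sample_entries if fits_context(e)]
print(len(filtered))  # the 6397-token entry exceeds 4096 and is dropped
```

The same predicate works with either tokenizer's count; pick `token_key` to match the model family you are training.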
I hope this helps as many people as possible. Let's make AI with less slop, and make AI accessible to everyone 🤗