Locutusque committed on
Commit 12b1bb9
1 Parent(s): 94a2735

Update README.md

Files changed (1): README.md (+1 −1)
README.md CHANGED
@@ -18,7 +18,7 @@ A pre-trained language model, based on the Mistral 7B model, has been scaled dow
 This model should have a context length of around 32,768 tokens.
 
 During evaluation on InstructMix, this model achieved an average perplexity score of 6.3. More epochs are planned for this model on different datasets.
-# [Open LLM Leaderboard Evaluation Results](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)
+# [Open LLM Leaderboard Evaluation Results (outdated)](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)
 Detailed results can be found [here](https://huggingface.co/datasets/open-llm-leaderboard/details_Locutusque__TinyMistral-248m)
 
 | Metric | Value |