PY007 committed on
Commit
06ca973
1 Parent(s): 966b96c

Update README.md

Files changed (1): README.md (+1 -1)
README.md CHANGED
@@ -16,5 +16,5 @@ Memory optimization and training recipes to extrapolate language models' context
  This model is finetuned from h2oai/h2o-danube2-1.8b-base with EasyContext on a context length of 256K. Note that I keep max_position_embeddings in config.json at 4096 because HF llama creates a 2D causal mask during initialization; if it is set to 256K, the GPU will simply OOM. You can certainly use this model with a context length longer than 4096.

  <p align="center">
- <img src="./heatmap.png" width="500">
+ <img src="./heatmap.png" width="800">
  </p>
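The OOM note in the README text can be sanity-checked with back-of-the-envelope arithmetic. This is a sketch only: it assumes a dense float32 mask of shape (seq_len, seq_len), which is the worst case for the 2D causal mask mentioned above; the actual dtype and layout in a given transformers version may differ.

```python
# Rough size of a dense (seq_len x seq_len) causal attention mask.
# Assumption: one float32 (4 bytes) per entry.
def causal_mask_bytes(seq_len: int, bytes_per_elem: int = 4) -> int:
    """Bytes needed to materialize a full 2D causal mask."""
    return seq_len * seq_len * bytes_per_elem

GIB = 1024 ** 3

# max_position_embeddings = 4096: a 64 MiB mask, easily affordable.
print(causal_mask_bytes(4096) / GIB)        # 0.0625 GiB

# max_position_embeddings = 256K: hundreds of GiB, instant OOM on any GPU.
print(causal_mask_bytes(256 * 1024) / GIB)  # 256.0 GiB
```

This illustrates why the config keeps the 4096 value while inference can still feed longer sequences: the mask allocated at init scales quadratically with the configured length.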