Tags: Text Generation · Transformers · Safetensors · llama · Generated from Trainer · axolotl · conversational · Inference Endpoints · text-generation-inference
pabloce committed
Commit 91f0a52
1 Parent(s): f85c129

Update README.md

Files changed (1): README.md +1 -0
README.md CHANGED
@@ -23,6 +23,7 @@ This is our most spectacular outcome ever. FFT, all parameters, 16bit. 70.9 MML
 
 Although the max positional embeddings is 4k, we used rope theta of 1000000.0 and we trained with sequence length 12k. We plan to train on the upcoming 32k version as well.
 
+[![Discord](https://img.shields.io/discord/1156064224225808488?logo=Discord&logoColor=%23ffffff&label=Discord&link=https%3A%2F%2Fdiscord.gg%2FtCMkMDDHwm)](https://discord.gg/cognitivecomputations)
 Discord: https://discord.gg/cognitivecomputations
 
 <img src="https://cdn-uploads.huggingface.co/production/uploads/63111b2d88942700629f5771/ldkN1J0WIDQwU4vutGYiD.png" width="600" />
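The diff context mentions training with a rope theta of 1000000.0 and a 12k sequence length despite 4k max positional embeddings. A minimal sketch of why this works, assuming the standard RoPE formulation (not this model's exact training code): raising theta stretches every rotary wavelength, so positions well beyond the original 4k window still map to distinguishable rotations.

```python
import math

def rope_inv_freq(dim: int, theta: float) -> list[float]:
    # Standard RoPE inverse frequencies: theta^(-2i/dim) for each channel pair i.
    return [theta ** (-2 * i / dim) for i in range(dim // 2)]

# Compare the default base (10000) with the raised base described in the diff.
default = rope_inv_freq(128, 10000.0)
scaled = rope_inv_freq(128, 1000000.0)

# The longest wavelength (2*pi / smallest frequency) bounds how far apart two
# positions can be while remaining distinguishable; a larger theta stretches it.
longest_default = 2 * math.pi / default[-1]
longest_scaled = 2 * math.pi / scaled[-1]
print(longest_default < longest_scaled)  # the raised base covers longer contexts
```

The head dimension of 128 here is illustrative, not taken from the model's config; the effect is the same for any even dimension.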