
What is the context length of abacusai/Smaug-Llama-3-70B-Instruct?

#4 · opened by catworld1212


The same as Llama 3: 8,192 tokens. You can see it in the model's config.json, which contains "max_position_embeddings": 8192.
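
If it helps, here is a minimal sketch for checking this programmatically with the transformers library (assuming it is installed and you have access to the model repo on the Hub):

```python
from transformers import AutoConfig

# Download only the configuration (no weights) from the Hub
config = AutoConfig.from_pretrained("abacusai/Smaug-Llama-3-70B-Instruct")

# Llama-style configs expose the context length as max_position_embeddings
print(config.max_position_embeddings)  # expected: 8192
```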
