danaevan committed in 6b52f36 (1 parent: 1909a01)

Update README.md

Files changed (1)
  1. README.md +1 -1
README.md CHANGED

```diff
@@ -109,7 +109,7 @@ DeciLM 6B is a 5.7 billion parameter decoder-only text generation model. With a
 
 ### Model Description
 
-Deci developed and publically released the DeciLM 6B large language model (LLM), a pretrained, high-efficiency generative text model with 5.7 billion parameters. DeciLM 6B outpaces pretrained models in its class, with a throughput that's up to 15 times that of Llama 2 7B's. DeciLM-6B was further LoRA fine-tuned for instruction following on a subset of the OpenOrca dataset, creating DeciLM 6B Instruct
+Deci developed and publically released the DeciLM 6B large language model, a pretrained, high-efficiency generative text model with 5.7 billion parameters. DeciLM 6B outpaces pretrained models in its class, with a throughput that's up to 15 times that of Llama 2 7B's. DeciLM-6B was further LoRA fine-tuned for instruction following on a subset of the OpenOrca dataset, creating DeciLM 6B Instruct
 
 - **Developed by:** Deci
 - **Model type:** DeciLM is an auto-regressive language model using an optimized transformer decoder architecture that includes variable Grouped-Query Attention.
```
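The "variable Grouped-Query Attention" mentioned in the model description can be sketched roughly as follows. This is a minimal NumPy illustration of the general GQA idea, not Deci's implementation: all shapes, group counts, and the function name are assumptions for the example.

```python
import numpy as np

def grouped_query_attention(q, k, v, n_groups):
    """Minimal grouped-query attention sketch (not Deci's code):
    n_heads query heads share n_groups key/value heads,
    where n_heads must be divisible by n_groups."""
    n_heads, seq_len, d = q.shape          # q: (n_heads, seq_len, d)
    assert k.shape[0] == n_groups          # k, v: (n_groups, seq_len, d)
    repeat = n_heads // n_groups
    k = np.repeat(k, repeat, axis=0)       # expand each KV group to its query heads
    v = np.repeat(v, repeat, axis=0)
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over keys
    return weights @ v                     # (n_heads, seq_len, d)

# "Variable" GQA means the number of KV groups can differ per layer;
# the group counts below are hypothetical.
rng = np.random.default_rng(0)
q = rng.standard_normal((8, 4, 16))
for n_groups in (1, 2, 4, 8):
    k = rng.standard_normal((n_groups, 4, 16))
    v = rng.standard_normal((n_groups, 4, 16))
    out = grouped_query_attention(q, k, v, n_groups)
    print(n_groups, out.shape)             # output shape is independent of n_groups
```

With `n_groups` equal to the number of query heads this reduces to standard multi-head attention, and with `n_groups = 1` it reduces to multi-query attention; varying the group count per layer trades KV-cache size against quality, which is the lever the throughput claim above relies on.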