Tags: Text Generation · Transformers · Safetensors · gpt_bigcode · code · granite · Eval Results · Inference Endpoints · text-generation-inference
amezasor committed · commit c6cddd9 · parent 8acb183

model summary update

Files changed (1): README.md (+1 −1)
README.md CHANGED
@@ -251,7 +251,7 @@ model-index:
 # Granite-20B-Code-Base
 
 ## Model Summary
-**Granite-20B-Code-Base** is a decoder-only code model designed for code generative tasks (e.g., code generation, code explanation, code fixing, etc.). It is trained from scratch with a two-phase training strategy. In phase 1, our model is trained on 3 to 4 trillion tokens sourced from 116 programming languages, ensuring a comprehensive understanding of programming languages and syntax. In phase 2, our model is trained on 500 billion tokens with a carefully designed mixture of high-quality data from code and natural language domains to improve the models’ ability to reason and follow instructions.
+**Granite-20B-Code-Base** is a decoder-only code model designed for code generative tasks (e.g., code generation, code explanation, code fixing, etc.). It is trained from scratch with a two-phase training strategy. In phase 1, our model is trained on 3 trillion tokens sourced from 116 programming languages, ensuring a comprehensive understanding of programming languages and syntax. In phase 2, our model is trained on 500 billion tokens with a carefully designed mixture of high-quality data from code and natural language domains to improve the models’ ability to reason and follow instructions.
 
 - **Developers:** IBM Research
 - **GitHub Repository:** [ibm-granite/granite-code-models](https://github.com/ibm-granite/granite-code-models)
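
For reference, a minimal usage sketch of the model this summary describes, assuming the checkpoint is published on Hugging Face under the id `ibm-granite/granite-20b-code-base` (check the model card for the exact repository id; the dtype, device, and generation settings below are illustrative, not part of this commit):

```python
# Minimal sketch: load Granite-20B-Code-Base with Transformers and complete a code prompt.
# The repository id and runtime settings are assumptions, not part of this commit.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_path = "ibm-granite/granite-20b-code-base"  # assumed repository id

tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForCausalLM.from_pretrained(
    model_path,
    torch_dtype=torch.bfloat16,  # half precision keeps a 20B model's memory footprint manageable
    device_map="auto",           # spread layers across available accelerators
)
model.eval()

# A base (non-instruct) model continues the prompt rather than following chat instructions.
prompt = "def generate_fibonacci(n):"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
with torch.no_grad():
    output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```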