pcuenq and bmuskalla committed
Commit
59aaab1
1 Parent(s): 61f7354

Fix newline (#10)


- Fix newline (f1716f912f974220cd917552f4a4b9a1f9d4e3bd)


Co-authored-by: Benjamin Muskalla <bmuskalla@users.noreply.huggingface.co>

Files changed (1)
  1. README.md +2 -1
README.md CHANGED
@@ -121,7 +121,8 @@ All variants are available in sizes of 7B, 13B, 34B, and 70B parameters.
  **Out-of-Scope Uses** Use in any manner that violates applicable laws or regulations (including trade compliance laws). Use in languages other than English. Use in any other way that is prohibited by the Acceptable Use Policy and Licensing Agreement for Code Llama and its variants.

  ## Hardware and Software
- **Training Factors** We used custom training libraries. The training and fine-tuning of the released models have been performed Meta’s Research Super Cluster.\\**Carbon Footprint** In aggregate, training all 12 Code Llama models required 1400K GPU hours of computation on hardware of type A100-80GB (TDP of 350-400W). Estimated total emissions were 228.55 tCO2eq, 100% of which were offset by Meta’s sustainability program.
+ **Training Factors** We used custom training libraries. The training and fine-tuning of the released models have been performed Meta’s Research Super Cluster.
+ **Carbon Footprint** In aggregate, training all 12 Code Llama models required 1400K GPU hours of computation on hardware of type A100-80GB (TDP of 350-400W). Estimated total emissions were 228.55 tCO2eq, 100% of which were offset by Meta’s sustainability program.

  ## Evaluation Results
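For reference, the carbon figure in the edited paragraph is consistent with simple arithmetic on the numbers it states. A minimal back-of-envelope sketch, assuming power draw at the 400W end of the stated TDP range and a grid intensity of 0.408 kgCO2eq/kWh (the US national average used in Meta's Llama 2 paper; an assumption, not a value stated in this README):

```python
# Sanity check of the reported Code Llama carbon footprint.
# GPU_HOURS and TDP come from the README paragraph above;
# CARBON_INTENSITY is an assumed grid average (kgCO2eq per kWh).

GPU_HOURS = 1_400_000     # "1400K GPU hours" across all 12 models
TDP_KW = 0.400            # A100-80GB, upper end of the 350-400W TDP range
CARBON_INTENSITY = 0.408  # assumed kgCO2eq/kWh (US average, per Llama 2 paper)

energy_kwh = GPU_HOURS * TDP_KW                      # energy drawn at full TDP
emissions_t = energy_kwh * CARBON_INTENSITY / 1000   # kg -> metric tonnes

print(f"Energy: {energy_kwh:,.0f} kWh")       # -> 560,000 kWh
print(f"Emissions: {emissions_t:.2f} tCO2eq")  # -> 228.48 tCO2eq
```

The result lands within rounding of the reported 228.55 tCO2eq, which suggests the published estimate was computed at the full 400W TDP.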