prithivMLmods committed
Commit efddbae · verified · Parent: 01e9a97

Update README.md

Files changed (1): README.md (+3 −2)
README.md CHANGED

@@ -8,6 +8,8 @@ pipeline_tag: text-generation
 library_name: transformers
 tags:
 - reason
+- tiny
+- llama
 ---
 
 ![logo.png](https://cdn-uploads.huggingface.co/production/uploads/65bb837dbfb878f46c77de4c/Rqm-Qx8AvbHFFbFbVY93X.png)
@@ -65,5 +67,4 @@ Despite its capabilities, Bellatrix has some limitations:
 2. **Dependence on Training Data**: It is only as good as the quality and diversity of its training data, which may lead to biases or inaccuracies.
 3. **Computational Resources**: The model’s optimized transformer architecture can be resource-intensive, requiring significant computational power for fine-tuning and inference.
 4. **Language Coverage**: While multilingual, some languages or dialects may have limited support or lower performance compared to widely used ones.
-5. **Real-World Contexts**: It may struggle with understanding nuanced or ambiguous real-world scenarios not covered during training.
-
+5. **Real-World Contexts**: It may struggle with understanding nuanced or ambiguous real-world scenarios not covered during training.
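For reference, after this commit the model card's YAML front matter resolves to the following (reconstructed from the first hunk; any fields outside the hunk are assumed unchanged):

```yaml
pipeline_tag: text-generation
library_name: transformers
tags:
- reason
- tiny
- llama
```

The two new tags (`tiny`, `llama`) are what make the model discoverable under those filters on the Hub.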