sherazkhan committed on
Commit
fe18f92
1 Parent(s): d790059

Update README.md

Files changed (1)
  1. README.md +8 -0
README.md CHANGED
@@ -1,3 +1,11 @@
+---
+license: llama3
+language:
+- en
+library_name: transformers
+tags:
+- text-generation
+---
 # Mixllama-8x8b-Instruct-v0.1 based on LLaMA 3
 
 An experimental MoE (Mixture of Experts) model based on the LLaMA-3-8B.
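The added `---`-delimited block is YAML front matter that the Hugging Face Hub reads for model-card metadata (license, language, library, tags). As a rough illustration of how such a block is structured, the sketch below extracts the metadata with plain string handling — this is not the Hub's actual parser (which uses a full YAML library), just a minimal stand-in for the simple key/value and key/list fields used in this commit:

```python
# Minimal front-matter extractor (illustrative only; the Hugging Face Hub
# uses a real YAML parser, not this sketch).
def parse_front_matter(readme_text):
    """Return a dict of the simple key/value and key/list fields in the
    leading `---` block, or {} if the README has no front matter."""
    lines = readme_text.splitlines()
    if not lines or lines[0].strip() != "---":
        return {}
    meta, current_key = {}, None
    for line in lines[1:]:
        if line.strip() == "---":            # closing fence ends the block
            break
        if line.startswith("- ") and current_key:
            # list item under the most recent bare key (e.g. `tags:`)
            meta.setdefault(current_key, []).append(line[2:].strip())
        elif ":" in line:
            key, _, value = line.partition(":")
            current_key = key.strip()
            if value.strip():                # inline value, e.g. `license: llama3`
                meta[current_key] = value.strip()
    return meta

readme = """---
license: llama3
language:
- en
library_name: transformers
tags:
- text-generation
---
# Mixllama-8x8b-Instruct-v0.1 based on LLaMA 3
"""
print(parse_front_matter(readme))
# → {'license': 'llama3', 'language': ['en'],
#    'library_name': 'transformers', 'tags': ['text-generation']}
```

Since the metadata declares `library_name: transformers`, the Hub can surface the usual `AutoModelForCausalLM.from_pretrained(...)` loading snippet for this model on its page.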