ibivibiv committed
Commit
b6c0708
1 Parent(s): 040bcca

Update README.md

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -9,7 +9,7 @@ library_name: transformers
 
 ![img](./aegolius-acadicus.png)
 
-I like to call this model "The little professor". It is simply an MoE merge of LoRA-merged models across Llama 2 and Mistral. I am using this as a test case to move to larger models and get my gate discrimination set correctly. This model is best suited for knowledge-related use cases; I did not give it a specific workload target as I did with some of the other models in the "Owl Series".
+I like to call this model series "The little professor". I am funding this out of my own pocket, on rented hardware and RunPod, to create LoRA adapters and then assemble MoE models from them and others. Ultimately I hope to have every expert be a LoRA I have made myself. Whoever took the first one off the leaderboard, I would appreciate it if you would stop; this is no different from Mixtral, and I am working just as hard as the rest of you and spending my own money. It is simply an MoE merge of LoRA-merged models across Llama 2 and Mistral. I am using this as a test case to move to larger models and get my gate discrimination set correctly. This model is best suited for knowledge-related use cases; I did not give it a specific workload target as I did with some of the other models in the "Owl Series".
 
 In this particular run I am expanding the data sets and model count to see whether that helps or hurts. I am also moving to more of my own fine-tuned Mistrals.
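For readers unsure what the "gate discrimination" mentioned in the card refers to, below is a minimal, illustrative sketch of Mixtral-style top-2 routing in PyTorch. The expert blocks, layer sizes, and class name are placeholders made up for this sketch; it is not the author's merge code or this model's actual weights, only a picture of what the gate decides: per token, the router scores every expert and mixes the outputs of the top two.

```python
# Illustrative sketch only: Mixtral-style top-2 gating over placeholder experts.
# Sizes, expert definitions, and the class name are assumptions for the example.
import torch
import torch.nn as nn
import torch.nn.functional as F

class Top2MoE(nn.Module):
    def __init__(self, hidden_size: int, num_experts: int):
        super().__init__()
        # The gate is a small linear layer that scores each expert per token.
        self.gate = nn.Linear(hidden_size, num_experts, bias=False)
        # Each "expert" here stands in for one LoRA-merged source model's FFN.
        self.experts = nn.ModuleList([
            nn.Sequential(
                nn.Linear(hidden_size, 4 * hidden_size),
                nn.SiLU(),
                nn.Linear(4 * hidden_size, hidden_size),
            )
            for _ in range(num_experts)
        ])

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (tokens, hidden). Score every expert, keep the top 2 per token,
        # and mix their outputs weighted by a softmax over the two scores.
        scores = self.gate(x)                      # (tokens, num_experts)
        weights, picks = scores.topk(2, dim=-1)    # top-2 experts per token
        weights = F.softmax(weights, dim=-1)
        out = torch.zeros_like(x)
        for slot in range(2):
            for e, expert in enumerate(self.experts):
                mask = picks[:, slot] == e
                if mask.any():
                    out[mask] += weights[mask, slot, None] * expert(x[mask])
        return out

# Tiny smoke test with made-up sizes.
moe = Top2MoE(hidden_size=32, num_experts=4)
tokens = torch.randn(6, 32)
print(moe(tokens).shape)  # torch.Size([6, 32])
```

In a merge like the one described above, each expert slot would be filled by one of the LoRA-merged source models, and tuning the gate is what determines which experts each token is routed to.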