ibivibiv committed
Commit 0170033
1 Parent(s): f95d435

Update README.md

Files changed (1):
1. README.md (+3 -2)
README.md CHANGED
@@ -3,6 +3,8 @@ license: llama2
 language:
 - en
 pipeline_tag: conversational
+tags:
+- merge
 ---
 # Giant Macaroni 120b
 
@@ -37,5 +39,4 @@ Coming soon.
 [@alpindale](https://huggingface.co/alpindale) - for [Goliath-120B](https://huggingface.co/alpindale/goliath-120b?text=Hey+my+name+is+Thomas%21+How+are+you%3F) that started this crazy endeavor for us all
 [@nsfwthrowitaway69](https://huggingface.co/nsfwthrowitaway69) - for sharing the merge config for [Venus-120B](https://huggingface.co/nsfwthrowitaway69/Venus-120b-v1.1) and getting me off the starting block with some questions on mergekit and tokenizers
 
-Keep it open and keep sharing everyone! With Mixtral and MOE changes to mergekit coupled with these larger merged models? I think the sky is the limit for us all. I can only imagine what will happen if we took a group of these 120 models, fin tuned them each a bit and applied the MOE Mixtral merge method to them? I would also point out that if a clever VC came along and funded that work? You have the people you need right here on huggingface and all they need is the equipment to do it on.
-
+Keep it open and keep sharing everyone! With Mixtral and MOE changes to mergekit coupled with these larger merged models? I think the sky is the limit for us all. I can only imagine what will happen if we took a group of these 120 models, fin tuned them each a bit and applied the MOE Mixtral merge method to them? I would also point out that if a clever VC came along and funded that work? You have the people you need right here on huggingface and all they need is the equipment to do it on.
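For context on the "merge config" credited in the README above: mergekit frankenmerges of this kind are described by a YAML file of layer slices. The sketch below is illustrative only, showing the passthrough (layer-interleaving) style of config; the model names and layer ranges are placeholders, not the actual Giant Macaroni 120b or Venus-120B recipe.

```yaml
# Minimal illustrative mergekit config for a passthrough "frankenmerge".
# Model names and layer ranges are placeholders, not this repo's recipe.
slices:
  - sources:
      - model: org/donor-model-a-70b   # hypothetical donor model
        layer_range: [0, 20]
  - sources:
      - model: org/donor-model-b-70b   # hypothetical donor model
        layer_range: [10, 30]
  - sources:
      - model: org/donor-model-a-70b
        layer_range: [20, 40]
merge_method: passthrough              # concatenate the slices in order
dtype: float16
```

A config like this is typically run with mergekit's `mergekit-yaml` command (for example, `mergekit-yaml config.yml ./merged-model`), which writes the stitched-together checkpoint and tokenizer to the output directory.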