Anthony committed · Commit 43bf196 · Parent: a2fb218

Update README.md

README.md CHANGED
```diff
@@ -6,14 +6,14 @@ tags:
 - mergekit
 - lazymergekit
 - beowolx/MistralHermes-CodePro-7B-v1
--
+- anthonylx/Prox-MistralHermes-7B
 ---
 
 # Proximus-2x7B-v1
 
 Proximus-2x7B-v1 is a Mixture of Experts (MoE) made with the following models using [LazyMergekit](https://colab.research.google.com/drive/1obulZ1ROXHjYLn6PPZJwRR6GzgQogxxb?usp=sharing):
 * [beowolx/MistralHermes-CodePro-7B-v1](https://huggingface.co/beowolx/MistralHermes-CodePro-7B-v1)
-* [
+* [anthonylx/Prox-MistralHermes-7B](https://huggingface.co/anthonylx/Prox-MistralHermes-7B)
 
 ## 🧩 Configuration
 
@@ -27,7 +27,7 @@ experts:
     - "javascript"
     - "programming"
     - "algorithm"
-  - source_model:
+  - source_model: anthonylx/Prox-MistralHermes-7B
     positive_prompts:
     - "cybersecurity"
     - "information security"
@@ -45,7 +45,7 @@ from transformers import AutoTokenizer
 import transformers
 import torch
 
-model = "
+model = "anthonylx/Proximus-2x7B-v1"
 
 tokenizer = AutoTokenizer.from_pretrained(model)
 pipeline = transformers.pipeline(
```
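The two configuration hunks touch only a fragment of the mergekit MoE definition. A sketch of what the full config plausibly looks like, assuming LazyMergekit's usual template — the `base_model` choice, `gate_mode: hidden`, and any prompt not visible in the diff (e.g. `"code"`) are assumptions, not taken from the commit:

```yaml
# Sketch of a LazyMergekit MoE config; fields not shown in the diff are assumptions.
base_model: beowolx/MistralHermes-CodePro-7B-v1   # assumption: first expert reused as base
gate_mode: hidden                                  # assumption: LazyMergekit's default gate mode
experts:
  - source_model: beowolx/MistralHermes-CodePro-7B-v1
    positive_prompts:
    - "code"                                       # assumption; not visible in the hunk
    - "javascript"
    - "programming"
    - "algorithm"
  - source_model: anthonylx/Prox-MistralHermes-7B
    positive_prompts:
    - "cybersecurity"
    - "information security"
```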
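The usage hunk ends mid-call at `transformers.pipeline(`. Before that call runs, the chat history has to be rendered into a single prompt string. MistralHermes/OpenHermes-family models typically use the ChatML format; whether Proximus-2x7B-v1 inherits it is an assumption, and `build_chatml_prompt` below is an illustrative helper, not part of the original snippet:

```python
# Minimal sketch of the prompt-formatting step that feeds the truncated
# pipeline call. ChatML tags are an assumption based on the parent models.

def build_chatml_prompt(messages):
    """Render a list of {role, content} dicts as a ChatML string,
    ending with an open assistant header so the model continues from there."""
    parts = []
    for m in messages:
        parts.append(f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>")
    parts.append("<|im_start|>assistant\n")
    return "\n".join(parts)

messages = [{"role": "user", "content": "Explain what a Mixture of Experts is."}]
prompt = build_chatml_prompt(messages)
# out = pipeline(prompt, max_new_tokens=256)  # with the pipeline from the snippet above
print(prompt)
```

In practice `tokenizer.apply_chat_template(messages, tokenize=False)` would do this rendering from the model's own template, which is the safer choice when the chat format is uncertain.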