louisbrulenaudet committed (verified)
Commit b83e92a · 1 Parent(s): f3c4910

Update README.md

Files changed (1): README.md (+37 -4)
README.md CHANGED
@@ -1,5 +1,5 @@
  ---
- license: apache-2.0
  tags:
  - moe
  - frankenmoe
@@ -8,17 +8,33 @@ tags:
  - lazymergekit
  - deepseek-ai/deepseek-coder-6.7b-instruct
  - defog/sqlcoder-7b-2
  base_model:
  - deepseek-ai/deepseek-coder-6.7b-instruct
  - defog/sqlcoder-7b-2
  ---

- # DevPearl-2x7B

- DevPearl-2x7B is a Mixure of Experts (MoE) made with the following models :
  * [deepseek-ai/deepseek-coder-6.7b-instruct](https://huggingface.co/deepseek-ai/deepseek-coder-6.7b-instruct)
  * [defog/sqlcoder-7b-2](https://huggingface.co/defog/sqlcoder-7b-2)

  ## Configuration

  ```yaml
@@ -56,4 +72,21 @@ messages = [{"role": "user", "content": "Explain what a Mixture of Experts is in
  prompt = pipeline.tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
  outputs = pipeline(prompt, max_new_tokens=256, do_sample=True, temperature=0.7, top_k=50, top_p=0.95)
  print(outputs[0]["generated_text"])
- ```
  ---
+ license: cc-by-sa-4.0
  tags:
  - moe
  - frankenmoe
@@ -8,17 +8,33 @@ tags:
  - lazymergekit
  - deepseek-ai/deepseek-coder-6.7b-instruct
  - defog/sqlcoder-7b-2
+ - Python
+ - Javascript
+ - sql
  base_model:
  - deepseek-ai/deepseek-coder-6.7b-instruct
  - defog/sqlcoder-7b-2
+ language:
+ - en
+ library_name: transformers
+ pipeline_tag: text-generation
  ---
+ <center><img src='https://i.imgur.com/0xFTuAX.png' width='450px'></center>

+ # DevPearl-2x7B, an extraordinary Mixture of Experts (MoE) for development

+ DevPearl-2x7B is a Mixture of Experts (MoE) made with the following models:
  * [deepseek-ai/deepseek-coder-6.7b-instruct](https://huggingface.co/deepseek-ai/deepseek-coder-6.7b-instruct)
  * [defog/sqlcoder-7b-2](https://huggingface.co/defog/sqlcoder-7b-2)

+ A Mixture of Experts (MoE) model combines several specialized models behind a routing layer, so that a single system can draw on the strengths of each expert for different kinds of requests. In DevPearl-2x7B, the two experts cover complementary areas of software development: general-purpose coding and SQL.
+
+ The first expert, deepseek-coder-6.7b-instruct, is an instruction-tuned code model with a solid grasp of programming languages such as Python and JavaScript, algorithms, and software-engineering practice. It handles coding questions, debugging assistance, and general development tasks.
+
+ The second expert, sqlcoder-7b-2, specializes in SQL: translating natural-language questions into queries, reasoning over database schemas, and helping with query debugging and optimization.
+
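To make the routing idea concrete, below is a minimal top-k gating layer in PyTorch. This is an illustrative sketch, not the DevPearl-2x7B implementation (the actual merge is built with mergekit/lazymergekit and served through transformers); the toy experts, hidden size, and class name are assumptions chosen only to show how a gate scores each token and blends the outputs of the selected experts.

```python
# Illustrative top-k gating over two toy experts; not the real model's weights or code.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ToyMoELayer(nn.Module):
    """A didactic MoE layer: a gate routes each token to its top-k experts."""

    def __init__(self, hidden_size: int = 16, num_experts: int = 2, top_k: int = 2):
        super().__init__()
        self.top_k = top_k
        # One feed-forward "expert" per source model (e.g. a code expert and a SQL expert).
        self.experts = nn.ModuleList([nn.Linear(hidden_size, hidden_size) for _ in range(num_experts)])
        # The gate scores each token against every expert.
        self.gate = nn.Linear(hidden_size, num_experts)

    def forward(self, hidden_states: torch.Tensor) -> torch.Tensor:
        scores = self.gate(hidden_states)                        # (num_tokens, num_experts)
        weights, indices = torch.topk(scores, self.top_k, dim=-1)
        weights = F.softmax(weights, dim=-1)                     # normalise over the selected experts
        output = torch.zeros_like(hidden_states)
        for slot in range(self.top_k):
            for expert_id, expert in enumerate(self.experts):
                mask = indices[:, slot] == expert_id             # tokens routed to this expert in this slot
                if mask.any():
                    output[mask] += weights[mask, slot].unsqueeze(-1) * expert(hidden_states[mask])
        return output


tokens = torch.randn(4, 16)         # four toy token embeddings
print(ToyMoELayer()(tokens).shape)  # torch.Size([4, 16])
```

With two experts and top_k=2, every token passes through both experts and the gate only decides how to weight their contributions, which is effectively what a two-expert frankenMoE does at each MoE layer.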
  ## Configuration

  ```yaml
@@ -56,4 +72,21 @@ messages = [{"role": "user", "content": "Explain what a Mixture of Experts is in
  prompt = pipeline.tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
  outputs = pipeline(prompt, max_new_tokens=256, do_sample=True, temperature=0.7, top_k=50, top_p=0.95)
  print(outputs[0]["generated_text"])
+ ```
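Only the tail of the usage snippet survives in the hunk above. The following is a hedged, end-to-end sketch of the same transformers pipeline pattern: the model ID is taken from the citation URL on this card, while the example prompt and the dtype setting are illustrative assumptions rather than values recovered from the elided lines.

```python
# Illustrative end-to-end version of the usage snippet; only its last three lines
# appear in the diff above. The prompt content and dtype are assumptions.
import torch
import transformers
from transformers import AutoTokenizer

model_id = "louisbrulenaudet/DevPearl-2x7B"  # taken from the citation URL on this card

tokenizer = AutoTokenizer.from_pretrained(model_id)
pipeline = transformers.pipeline(
    "text-generation",
    model=model_id,
    tokenizer=tokenizer,
    model_kwargs={"torch_dtype": torch.float16},
)

# Build a chat-formatted prompt and generate, as in the snippet shown in the hunk.
messages = [{"role": "user", "content": "Write a SQL query returning the ten most recent orders."}]
prompt = pipeline.tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
outputs = pipeline(prompt, max_new_tokens=256, do_sample=True, temperature=0.7, top_k=50, top_p=0.95)
print(outputs[0]["generated_text"])
```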
+
+ ## Citing & Authors
+
+ If you use this code in your research, please use the following BibTeX entry.
+
+ ```BibTeX
+ @misc{louisbrulenaudet2024,
+   author = {Louis Brulé Naudet},
+   title = {DevPearl-2x7B, an extraordinary Mixture of Experts (MoE) for development},
+   year = {2024},
+   howpublished = {\url{https://huggingface.co/louisbrulenaudet/DevPearl-2x7B}},
+ }
+ ```
+
+ ## Feedback
+
+ If you have any feedback, please reach out at [louisbrulenaudet@icloud.com](mailto:louisbrulenaudet@icloud.com).