---
license: cc-by-sa-4.0
tags:
- moe
- merge
- mergekit
- lazymergekit
- deepseek-ai/deepseek-coder-6.7b-instruct
- defog/sqlcoder-7b-2
- Python
- Javascript
- sql
base_model:
- deepseek-ai/deepseek-coder-6.7b-instruct
- defog/sqlcoder-7b-2
language:
- en
library_name: transformers
pipeline_tag: text-generation
---
# DevPearl-2x7B, an extraordinary Mixture of Experts (MoE) for development

DevPearl-2x7B is a Mixture of Experts (MoE) made with the following models:
* [deepseek-ai/deepseek-coder-6.7b-instruct](https://huggingface.co/deepseek-ai/deepseek-coder-6.7b-instruct)
* [defog/sqlcoder-7b-2](https://huggingface.co/defog/sqlcoder-7b-2)

A Mixture of Experts (MoE) model combines several specialized models into a single architecture and routes each input to the expert best suited to handle it. For a development-oriented model like this one, combining expertise across two distinct domains - general-purpose programming (Python, JavaScript, Java) and SQL - helps it return more precise answers over a wider range of coding questions.

## Configuration

```yaml
base_model: codellama/CodeLlama-7b-Instruct-hf
experts:
  - source_model: deepseek-ai/deepseek-coder-6.7b-instruct
    positive_prompts:
      - "python"
      - "javascript"
      - "java"
  - source_model: defog/sqlcoder-7b-2
    positive_prompts:
      - "SQL"
```

## Usage

```python
# Install dependencies (notebook syntax; drop the leading "!" when running in a shell)
!pip install -qU transformers bitsandbytes accelerate

from transformers import AutoTokenizer
import transformers
import torch

model = "louisbrulenaudet/DevPearl-2x7B"

# Load the tokenizer and build a 4-bit text-generation pipeline
tokenizer = AutoTokenizer.from_pretrained(model)
pipeline = transformers.pipeline(
    "text-generation",
    model=model,
    model_kwargs={"torch_dtype": torch.float16, "load_in_4bit": True},
)

messages = [{"role": "user", "content": "Explain what a Mixture of Experts is in less than 100 words."}]
prompt = pipeline.tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
outputs = pipeline(prompt, max_new_tokens=256, do_sample=True, temperature=0.7, top_k=50, top_p=0.95)

print(outputs[0]["generated_text"])
```

## Citing & Authors

If you use this code in your research, please use the following BibTeX entry.

```BibTeX
@misc{louisbrulenaudet2024,
  author = {Louis Brulé Naudet},
  title = {DevPearl-2x7B, an extraordinary Mixture of Experts (MoE) for development},
  year = {2024},
  howpublished = {\url{https://huggingface.co/louisbrulenaudet/DevPearl-2x7B}},
}
```

## Feedback

If you have any feedback, please reach out at [louisbrulenaudet@icloud.com](mailto:louisbrulenaudet@icloud.com).