# Llama-3-ELYZA-hermes-2x8B / mergekit_moe_config.yml
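#
# mergekit MoE config that combines two Llama-3-8B fine-tunes
# (Llama-3-ELYZA-JP-8B and Hermes-2-Theta-Llama-3-8B) into a single
# 2x8B Mixture-of-Experts model.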
base_model: /content/Llama-3-ELYZA-JP-8B
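# cheap_embed: initialize the router gates from the raw token embeddings of the
# positive prompts below, so no forward passes through the models are needed.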
gate_mode: cheap_embed
dtype: bfloat16
experts:
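  # Expert 1: ELYZA-JP-8B, targeted by general chat and question-answering prompts.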
  - source_model: /content/Llama-3-ELYZA-JP-8B
    positive_prompts:
      - "How do you"
      - "Explain the concept of"
      - "Give an overview of"
      - "Compare and contrast between"
      - "Provide information about"
      - "Help me understand"
      - "Summarize"
      - "Make a recommendation on"
      - "Answer this question"
      - "[Mode: Chat]"
  - source_model: /content/Hermes-2-Theta-Llama-3-8B
    positive_prompts:
      - "Write a program to solve this problem"
      - "Modify this function to improve its performance"
      - "Refactor this code to enhance readability"
      - "Create a custom function for this specific use case"
      - "Optimize this algorithm to reduce computational complexity"
      - "Implement this feature by extending existing codebase"
      - "Integrate this API call into the application"
      - "Help me troubleshoot and fix this bug"
      - "Review and test this code snippet before deployment"
      - "Analyze this error log to identify potential issues"
      - "Generate a set of unit tests for this module"
      - "Evaluate different approaches to solving this problem"
      - "Do a web search for"
      - "Use the plugin to"
      - "[Mode: Writing]"
tokenizer_source: union
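#
# Usage sketch (assumes the mergekit-moe CLI; the output path is illustrative):
#   mergekit-moe mergekit_moe_config.yml ./Llama-3-ELYZA-hermes-2x8B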