---
tags:
- merge
- mergekit
- lazymergekit
- microsoft/Phi-3-mini-128k-instruct
- NexaAIDev/Octopus-v4
base_model:
- microsoft/Phi-3-mini-128k-instruct
- NexaAIDev/Octopus-v4
license: mit
language:
- en
library_name: transformers
pipeline_tag: text-generation
---
# Phi-3-Instruct-Bloated
Phi-3-Instruct-Bloated is a merge of the following models using [LazyMergekit](https://colab.research.google.com/drive/1obulZ1ROXHjYLn6PPZJwRR6GzgQogxxb?usp=sharing):
* [microsoft/Phi-3-mini-128k-instruct](https://huggingface.co/microsoft/Phi-3-mini-128k-instruct)
* [NexaAIDev/Octopus-v4](https://huggingface.co/NexaAIDev/Octopus-v4)
## 🧩 Configuration
```yaml
slices:
  - sources:
      - model: microsoft/Phi-3-mini-128k-instruct
        layer_range: [0, 32]
      - model: NexaAIDev/Octopus-v4
        layer_range: [0, 32]
merge_method: slerp
base_model: microsoft/Phi-3-mini-128k-instruct
parameters:
  t:
    - filter: self_attn
      value: [0, 0.5, 0.3, 0.7, 1]
    - filter: mlp
      value: [1, 0.5, 0.7, 0.3, 0]
    - value: 0.5
dtype: bfloat16
```
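
The `slerp` merge method interpolates each pair of weight tensors along the arc between them rather than along a straight line, with the interpolation factor `t` varying per layer group according to the gradients above (self-attention layers lean one way, MLP layers the other). The snippet below is a minimal illustrative sketch of that idea in NumPy, not mergekit's actual implementation; the `slerp` helper and its colinearity fallback threshold are simplified assumptions.

```python
import numpy as np

def slerp(t: float, w0: np.ndarray, w1: np.ndarray, eps: float = 1e-8) -> np.ndarray:
    """Spherically interpolate between two weight tensors at factor t (illustrative only)."""
    v0, v1 = w0.ravel(), w1.ravel()
    n0 = v0 / (np.linalg.norm(v0) + eps)
    n1 = v1 / (np.linalg.norm(v1) + eps)
    dot = np.clip(np.dot(n0, n1), -1.0, 1.0)
    if 1.0 - abs(dot) < 1e-6:
        # Nearly colinear directions: fall back to plain linear interpolation
        return (1 - t) * w0 + t * w1
    omega = np.arccos(dot)  # angle between the two weight directions
    s0 = np.sin((1 - t) * omega) / np.sin(omega)
    s1 = np.sin(t * omega) / np.sin(omega)
    return (s0 * v0 + s1 * v1).reshape(w0.shape)

# t follows the gradients in the config: e.g. [0, 0.5, 0.3, 0.7, 1] for self_attn means
# early layers stay close to the base model (t ≈ 0) while late layers lean toward Octopus-v4 (t ≈ 1).
```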
## 💻 Usage
```python
# Installation
!pip install -qU transformers accelerate
# Imports
from transformers import AutoTokenizer, AutoModelForCausalLM
import torch
# Loading
tokenizer = AutoTokenizer.from_pretrained("MrOvkill/Phi-3-Instruct-Bloated")
model = AutoModelForCausalLM.from_pretrained("MrOvkill/Phi-3-Instruct-Bloated")
# Completion function
def infer(prompt, **kwargs):
    inputs = tokenizer(prompt, return_tensors="pt")
    with torch.no_grad():
        outputs = model.generate(**inputs, **kwargs)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)
# Some silliness
infer("<|user|>\nBen is going to the store for some Ice Cream. So is Jerry. They mix up the ice cream at the store. Is the ice cream: (a. Ben's (b. Jerry's (c. Ben and Jerry's <|end|>\n<|assistant|>\nMy answer is (", max_new_tokens=1024)
# A proper test
infer(
    """<|user|>
Explain what a Mixture of Experts is in less than 100 words.<|end|>
<|assistant|>
""",
    max_new_tokens=1024,
    do_sample=False,  # greedy decoding; sampling knobs (temperature/top_k/top_p) would be ignored
)
```
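
If the merged tokenizer inherits Phi-3's chat template (an assumption worth verifying via `tokenizer.chat_template`), you can let `apply_chat_template` build the prompt instead of writing the `<|user|>`/`<|end|>`/`<|assistant|>` markers by hand:

```python
# Sketch: build the prompt from the tokenizer's chat template (assumes it was inherited from Phi-3).
messages = [
    {"role": "user", "content": "Explain what a Mixture of Experts is in less than 100 words."}
]
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
print(infer(prompt, max_new_tokens=256, do_sample=False))
```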