TinyYi-7B-Test
This model is a SLERP merge of the following models:
- Yash21/DeepYi-Base
- Yash21/DeepYi-Second
🧩 Configuration
```yaml
slices:
  - sources:
      - model: Yash21/DeepYi-Base
        layer_range: [0, 32]
      - model: Yash21/DeepYi-Second
        layer_range: [0, 32]
merge_method: slerp
base_model: Yash21/DeepYi-Base
parameters:
  t:
    - filter: self_attn
      value: [0, 0.5, 0.3, 0.7, 1]
    - filter: mlp
      value: [1, 0.5, 0.7, 0.3, 0]
    - value: 0.5
dtype: bfloat16
```
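For intuition, `merge_method: slerp` combines corresponding weight tensors from the two models by spherical linear interpolation rather than a plain weighted average, with the interpolation factor `t` varying across layers (the `value` lists above are anchor points; `t = 0` keeps the base model's weights, `t = 1` the second model's). A minimal NumPy sketch of the interpolation itself (the function name `slerp` and the fallback to linear interpolation for near-parallel vectors are illustrative assumptions, not the merge tool's exact implementation):

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    # Spherical linear interpolation between two flattened weight vectors.
    v0n = v0 / np.linalg.norm(v0)
    v1n = v1 / np.linalg.norm(v1)
    dot = np.clip(np.dot(v0n, v1n), -1.0, 1.0)
    omega = np.arccos(dot)            # angle between the two directions
    if omega < eps:                   # nearly parallel: plain lerp is fine
        return (1 - t) * v0 + t * v1
    so = np.sin(omega)
    return (np.sin((1 - t) * omega) / so) * v0 + (np.sin(t * omega) / so) * v1

# t = 0 returns the first vector, t = 1 the second; t = 0.5 lands on the
# arc midway between them, preserving magnitude along the sphere.
a = np.array([1.0, 0.0])
b = np.array([0.0, 1.0])
print(slerp(0.5, a, b))  # midpoint on the unit circle: [0.7071..., 0.7071...]
```

With orthogonal unit vectors, `slerp(0.5, a, b)` stays on the unit circle, whereas a plain average `(a + b) / 2` would shrink the norm to about 0.707; this norm preservation is the usual motivation for SLERP merges.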
Please reach out to maratheyash108@gmail.com if you want to support me in fine-tuning these models and creating more exciting models like this one.