Update README.md

README.md CHANGED
@@ -10,23 +10,23 @@ tags:
 - mistral
 - optimized
 ---
-# NeuralPipe-7B-slerp
+# NeuralPipe-7B-slerp
 
-This is
+This is a merge of pre-trained language models created using [LazyMergekit](https://colab.research.google.com/drive/1obulZ1ROXHjYLn6PPZJwRR6GzgQogxxb?usp=sharing), combining the capabilities of OpenPipe's optimized Mistral and NeuralHermes through an efficient SLERP fusion.
 
 ## About Me
 
-I'm David Soeiro-Vuong, a third-year Computer Science student working as an apprentice at TW3 Partners, a company specializing in Generative AI. Passionate about artificial intelligence and language model optimization, I focus on creating efficient model merges that balance performance and
+I'm David Soeiro-Vuong, a third-year Computer Science student working as an apprentice at TW3 Partners, a company specializing in Generative AI. Passionate about artificial intelligence and language model optimization, I focus on creating efficient model merges that balance performance and capabilities.
 
 🔗 [Connect with me on LinkedIn](https://www.linkedin.com/in/david-soeiro-vuong-a28b582ba/)
 
-## Model Size Optimization
-The reduction from 7B to 3B parameters was achieved through:
-- Layer reduction from 32 to 12 layers
-- Conversion to bfloat16 format (half precision)
-- Selective layer range implementation
-- SLERP merge method optimization
-
 ## Merge Details
+### Merge Method
+This model uses SLERP (Spherical Linear Interpolation) with carefully tuned parameters:
+- Optimized attention layer fusion patterns
+- Balanced MLP layer transitions
+- bfloat16 format for efficient memory usage
+- Full layer utilization for maximum capability retention
+
 ### Models Merged
 * [OpenPipe/mistral-ft-optimized-1218](https://huggingface.co/OpenPipe/mistral-ft-optimized-1218)
 * [mlabonne/NeuralHermes-2.5-Mistral-7B](https://huggingface.co/mlabonne/NeuralHermes-2.5-Mistral-7B)
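
SLERP, the merge method named in the updated card, interpolates between the two parent models' weight tensors along the arc of a great circle rather than along a straight line, which keeps the interpolated weights from collapsing in norm. For interpolation factor $t \in [0, 1]$ and angle $\Omega$ between weight vectors $w_1$ and $w_2$:

$$
\mathrm{slerp}(w_1, w_2; t) = \frac{\sin((1-t)\,\Omega)}{\sin \Omega}\, w_1 + \frac{\sin(t\,\Omega)}{\sin \Omega}\, w_2,
\qquad
\Omega = \arccos\left(\frac{\langle w_1, w_2 \rangle}{\lVert w_1 \rVert\, \lVert w_2 \rVert}\right)
$$

A minimal PyTorch sketch of that interpolation for a single pair of tensors; the `slerp` function below is an illustrative stand-in, not LazyMergekit's actual implementation:

```python
import torch

def slerp(t: float, a: torch.Tensor, b: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Spherical linear interpolation between two weight tensors of the same shape."""
    a_flat = a.flatten().float()
    b_flat = b.flatten().float()
    # Angle between the two weight directions.
    cos_omega = torch.clamp(
        torch.dot(a_flat, b_flat) / (a_flat.norm() * b_flat.norm() + eps), -1.0, 1.0
    )
    omega = torch.arccos(cos_omega)
    sin_omega = torch.sin(omega)
    if sin_omega.abs() < eps:
        # Nearly (anti)parallel directions: fall back to linear interpolation.
        merged = (1.0 - t) * a_flat + t * b_flat
    else:
        merged = (
            torch.sin((1.0 - t) * omega) / sin_omega * a_flat
            + torch.sin(t * omega) / sin_omega * b_flat
        )
    # The card stores weights in bfloat16 for memory efficiency.
    return merged.reshape(a.shape).to(torch.bfloat16)

# Illustrative use on one tensor pair; a real merge iterates over every parameter,
# typically with a different t per layer and per module type (attention vs. MLP).
w1 = torch.randn(4096, 4096)
w2 = torch.randn(4096, 4096)
merged = slerp(0.5, w1, w2)
```

In mergekit-style SLERP configs, `t` usually varies per layer and is set separately for self-attention and MLP modules; that per-module scheduling is presumably what the card's "attention layer fusion patterns" and "MLP layer transitions" bullets refer to.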
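
Once published, the merged model loads like any other Hugging Face causal LM. A hedged usage sketch; the repo id below is a placeholder, since the card does not state the final Hub path:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder repo id: substitute the actual Hub path of this merge.
model_id = "<username>/NeuralPipe-7B-slerp"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the card's bfloat16 storage format
    device_map="auto",
)

prompt = "Explain what a SLERP model merge is."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```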