---
library_name: transformers
license: apache-2.0
datasets:
- netcat420/MFANN
---
All future releases of the MFANN experiment will use Llama-3 as the base model; I may continue fine-tuning Mistral-7B every other release.

This model uses Meta's Llama-3 as its base. Benchmarks are pending.
![image/png](https://cdn-uploads.huggingface.co/production/uploads/6435f27b2d0ed796668ffd8b/VlqyDezfgqoujwIdiNfYB.png)