---
license: cc-by-nc-2.0
---

by David, Fernando and Eric

An experiment in 'lasering' each expert to denoise the weights and enhance model capabilities.

This model is half the size of Mixtral 8x7B Instruct and delivers essentially the same level of performance (we are working to improve its MMLU score).

It follows the laserRMT implementation at https://github.com/cognitivecomputations/laserRMT.

Here, we inspect the layers to identify those with the lowest signal-to-noise ratios (i.e., the layers most affected by noise) and apply LASER interventions to them, using the Marchenko-Pastur law to estimate that ratio. A sketch of this idea is shown below.
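
The snippet below is a minimal sketch of that idea, not the laserRMT code: it assumes PyTorch, a heuristic noise-scale estimate taken from the median singular value, and a hypothetical helper name `marchenko_pastur_snr`. It estimates a weight matrix's SNR from the Marchenko-Pastur edge of its singular-value spectrum and returns a rank-reduced ("lasered") copy.

```python
import torch


def marchenko_pastur_snr(weight: torch.Tensor) -> tuple[float, torch.Tensor]:
    """Estimate a layer's signal-to-noise ratio from the Marchenko-Pastur edge
    of its singular values and return a rank-reduced ('lasered') copy."""
    W = weight.float()
    m, n = W.shape
    if m > n:
        # Work in the wide orientation so that m <= n below.
        W = W.T
        m, n = n, m

    U, S, Vh = torch.linalg.svd(W, full_matrices=False)

    # Heuristic noise-scale estimate (assumption): treat the median singular
    # value as sitting inside the Marchenko-Pastur bulk.
    sigma = S.median() / ((m * n) ** 0.25)

    # Upper edge of the MP bulk for singular values: sigma * (sqrt(m) + sqrt(n)).
    threshold = sigma * (m ** 0.5 + n ** 0.5)

    signal = S[S > threshold]
    noise = S[S <= threshold]
    snr = (signal.pow(2).sum() / noise.pow(2).sum().clamp(min=1e-12)).item()

    # LASER-style intervention: keep only the components above the noise edge.
    k = signal.numel()
    lasered = (U[:, :k] * S[:k]) @ Vh[:k]
    if weight.shape[0] > weight.shape[1]:
        lasered = lasered.T
    return snr, lasered.to(weight.dtype)
```

In this spirit, one would compute the SNR for each candidate weight matrix across the experts, then apply the truncation only to the layers with the lowest ratios.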

We intend this to be the first of a family of experiments carried out at Cognitive Computations.