---
model-index:
- name: Mistral-7B-v0.1
  results:
  - task:
      type: text-generation
    dataset:
      name: Wikitext
      type: wikitext
    metrics:
    - type: perplexity (BASELINE)
      value: 8.041976819283537
    - type: perplexity (BASIC)
      value: 221.02403769073769
---
|
This is a d-Matrix functional reference of the MISTRAL-7B-V0.1 model.

The reference provides the following functional *configurations*:
|
Configuration | Explanation
:-- | :--
**`BASELINE`** | a reference functionally equivalent to the original model
**`BASIC`** | all linear algebraic operands quantized to `MXINT8-64`, and all other operations transformed to approximated kernel simulations (see the toy sketch below)
|
|
|
|
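To make the `BASIC` row concrete, here is a minimal toy sketch of `MXINT8-64`-style block quantization: each block of 64 values shares one power-of-two scale, and each value keeps an 8-bit signed mantissa. This is an illustration under assumed rounding and format details, not the actual d-Matrix kernel.

```python
# Toy sketch of MXINT8-64-style block quantization (illustrative only; the
# rounding mode and mantissa layout are assumptions, not the d-Matrix kernel).
import torch

def fake_mxint8_64(x: torch.Tensor, block_size: int = 64) -> torch.Tensor:
    flat = x.flatten().float()
    pad = (-flat.numel()) % block_size              # pad to a multiple of the block size
    blocks = torch.nn.functional.pad(flat, (0, pad)).view(-1, block_size)

    # One shared power-of-two scale per block, set by the block's largest magnitude.
    max_abs = blocks.abs().amax(dim=1, keepdim=True).clamp_min(1e-30)
    scale = 2.0 ** torch.floor(torch.log2(max_abs))

    # Quantize to an 8-bit signed mantissa with 6 fractional bits, then dequantize.
    mantissa = torch.clamp(torch.round(blocks / scale * 64), -128, 127)
    return (mantissa / 64 * scale).flatten()[: x.numel()].view_as(x)

w = torch.randn(4, 128)
print((w - fake_mxint8_64(w)).abs().max())          # quantization error stays small
```

Because every value in a block shares one exponent, elements much smaller than the block maximum lose relative precision, which is one intuition for why the `BASIC` perplexity above differs from `BASELINE`.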
|
### Usage

Install d-Matrix [Dmx_Compressor](https://github.com/d-matrix-ai/dmx-compressor) first:
|
```sh
pip install dmx_compressor
```
|
The following is an example of evaluating the model with EleutherAI's lm-evaluation-harness. First, install the harness:
|
```sh
git clone https://github.com/EleutherAI/lm-evaluation-harness
cd lm-evaluation-harness
pip install -e .
```
|
```python
from dmx.compressor.modeling import DmxModel
import lm_eval

model_args = "pretrained='d-matrix/Mistral',trust_remote_code=True"
task = "wikitext"  # assign the desired task here

lm = lm_eval.api.registry.get_model("hf").create_from_arg_string(model_args, {"batch_size": 1})

# Transform the model with DMX
lm._model = DmxModel.from_torch(lm._model).to_basic_model()  # using the BASIC configuration

eval_results = lm_eval.evaluate(lm, lm_eval.tasks.get_task_dict([task]))
```
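`eval_results` is a nested dictionary keyed first by `"results"` and then by task name; the exact metric keys (for example `word_perplexity,none`) vary with the harness version. A hedged way to inspect whatever was computed:

```python
# Print every metric the harness reported for the chosen task.
# Metric key names vary across lm-evaluation-harness versions.
for metric, value in eval_results["results"][task].items():
    print(f"{metric}: {value}")
```

With the `BASIC` transform applied, the wikitext perplexity should land near the value reported in the model-index header above.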