![](https://i.imgur.com/0xFTuAX.png)
# mlx-community/Pearl-3x7B

This model was converted to MLX format from louisbrulenaudet/Pearl-3x7B using mlx-lm version 0.16.1.
Refer to the original model card for more details on the model.
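A conversion like this one can typically be reproduced with the mlx-lm conversion utility. The sketch below is illustrative only: the output directory and quantization setting are assumptions, not the exact options used to produce this repository.

```python
# Minimal sketch of converting the original checkpoint to MLX format with mlx-lm.
# The output path and quantization choice are illustrative assumptions.
from mlx_lm import convert

convert(
    "louisbrulenaudet/Pearl-3x7B",  # original repository on the Hugging Face Hub
    mlx_path="Pearl-3x7B-mlx",      # local directory for the converted weights
    quantize=False,                 # set True for a 4-bit quantized conversion
)
```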
## Use with mlx

Install the MLX text-generation package:

```bash
pip install -U mlx-lm
```

Generate from the command line:

```bash
python -m mlx_lm.generate --model mlx-community/Pearl-3x7B --prompt "hello" --max-tokens 100 --temp 0.0
```
Or use the Python API:

```python
from mlx_lm import load, generate

# Load the converted weights and tokenizer from the Hugging Face Hub.
model, tokenizer = load("mlx-community/Pearl-3x7B")

# Generate a completion for a simple prompt; verbose=True prints the output as it is produced.
response = generate(model, tokenizer, prompt="hello", verbose=True)
```
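For chat-style use, prompts are typically wrapped in the tokenizer's chat template when one is available. A minimal sketch, assuming the converted tokenizer ships a chat template (not verified for this repository); the example prompt is purely illustrative:

```python
from mlx_lm import load, generate

model, tokenizer = load("mlx-community/Pearl-3x7B")

# Assumption: the converted tokenizer bundles a chat template.
# Fall back to the raw prompt if it does not.
messages = [{"role": "user", "content": "Write a pandas one-liner that drops duplicate rows."}]
if tokenizer.chat_template is not None:
    prompt = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, tokenize=False
    )
else:
    prompt = messages[0]["content"]

response = generate(model, tokenizer, prompt=prompt, max_tokens=200, verbose=True)
```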
## Citing & Authors

If you use this code in your research, please use the following BibTeX entry:
```bibtex
@misc{louisbrulenaudet2024,
  author = {Louis Brulé Naudet},
  title = {Pearl-3x7B, an xtraordinary Mixture of Experts (MoE) for data science},
  year = {2024},
  howpublished = {\url{https://huggingface.co/mlx-community/Pearl-3x7B}},
}
```
## Feedback
If you have any feedback, please reach out at louisbrulenaudet@icloud.com.