# mlx-community/plamo-2-1b
The model mlx-community/plamo-2-1b was converted to MLX format from pfnet/plamo-2-1b using mlx-lm version 0.22.0.
## Use with mlx
```shell
# numba is required for the new PLaMo tokenizer
pip install mlx numba 'mlx-lm>=0.22.0'

python -m mlx_lm.generate \
  --model mlx-community/plamo-2-1b \
  --prompt '็พๅณใใใซใฌใผใฎไฝใๆนใ็ดนไปใใพใใ' \
  --ignore-chat-template \
  --max-tokens 1024 \
  --extra-eos-token '<|plamo:bos|>' \
  --temp 0.7 \
  --seed 0
```
==========
From how to use spices to how to make curry, this guide explains everything in detail.
## How to Make Curry
**โ  Prepare the ingredients**
Adding the right amount of salt brings out a spicy aroma.
Using spices such as cumin and coriander whets the appetite.
**โก Roast the spices**
Add only a small amount of spices at first, then add the rest gradually.
**โข Sauté only the onions until they brown**
Sautéing the onions draws out their sweetness.
**โฃ Sauté the meat**
Slice the meat thinly and mix it with the spices.
**โค Sauté the vegetables**
Sauté carrots, potatoes, green beans, and the like together. Cut the vegetables before sautéing them.
**โฅ Add water**
**โฆ Add the curry roux**
**โง Finish**
## How to Season the Curry
**โ  Sauté the meat**
Slice the meat thinly and mix it with the spices.
**โก Sauté the vegetables**
Cut the vegetables before sautéing them.
**โข Add water**
**โฃ Add the curry roux**
## Add โโ to Your Curry
**โ  Honey**
Adding honey gives the curry a mellow, rich flavor.
**โก Garlic**
Adding garlic boosts the aroma and overall flavor.
**โข Honey**
Adding honey lends a spicy flavor.
**โฃ Canned tomatoes**
Adding canned tomatoes introduces acidity and gives the flavor more depth.
**โค Yogurt**
Adding yogurt increases the richness.
## Recommended Curry Ingredients
**โ  Meat**
Try your favorite meat, such as beef, pork, or chicken.
**โก Vegetables**
Try your favorite vegetables, such as carrots, potatoes, or green beans.
**โข Seafood**
Try seafood such as shrimp or squid.
**โฃ Cheese**
Adding cheese gives the curry a creamy, rich flavor.
## How to Make Delicious Curry
**โ  Prepare the ingredients**
Adding the right amount of salt brings out a spicy aroma.
Using spices such as cumin and coriander whets the appetite.
**โก Roast the spices**
Add only a small amount of spices at first, then add the rest gradually.
**โข Sauté only the onions until they brown**
Sautéing the onions draws out their sweetness.
**โฃ Sauté the meat**
Slice the meat thinly and mix it with the spices.
**โค Sauté the vegetables**
Sauté carrots, potatoes, green beans, and the like together. Cut the vegetables before sautéing them.
**โฅ Add water**
**โฆ Add the curry roux**
**โง Finish**
## Summary
The combination of spices and vegetables is essential for deepening the flavor of curry.
Try making delicious curry yourself, using the recipe introduced here.
==========
Prompt: 6 tokens, 87.012 tokens-per-sec
Generation: 496 tokens, 52.861 tokens-per-sec
Peak memory: 5.317 GB
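
The `--temp 0.7` flag in the command above controls sampling randomness by dividing the logits by the temperature before the softmax. A minimal pure-Python sketch of that idea (an illustration, not mlx-lm's actual sampling code):

```python
import math

def softmax_with_temperature(logits, temp):
    """Convert logits to probabilities; temp < 1 sharpens, temp > 1 flattens."""
    scaled = [x / temp for x in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in scaled]
    total = sum(exps)
    return [e / total for e in exps]

logits = [2.0, 1.0, 0.1]
sharp = softmax_with_temperature(logits, 0.7)  # more probability mass on the top token
flat = softmax_with_temperature(logits, 1.5)   # closer to uniform
```

As the temperature approaches 0 this converges to greedy decoding; values above 1 make rarer tokens more likely.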
You can also use this model from Python:

```python
from mlx_lm import load, generate

model, tokenizer = load("mlx-community/plamo-2-1b")
prompt = "็พๅณใใใซใฌใผใฎไฝใๆนใฎใฌใทใใ็ดนไปใใพใใ"
response = generate(model, tokenizer, prompt=prompt, verbose=True)
```
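
The `--extra-eos-token '<|plamo:bos|>'` flag in the CLI example registers an additional stop token: generation halts as soon as any stop token is produced. A simplified sketch of that stopping logic, using a hypothetical helper rather than mlx-lm's internal code:

```python
def truncate_at_stop(tokens, stop_tokens):
    """Return the tokens up to (and excluding) the first stop token, if any."""
    out = []
    for tok in tokens:
        if tok in stop_tokens:
            break  # a stop token ends generation immediately
        out.append(tok)
    return out

# Example token stream where the model emits the extra stop token mid-stream.
stream = ["Curry", " recipe", ":", "<|plamo:bos|>", " leftover"]
kept = truncate_at_stop(stream, {"<|plamo:bos|>"})  # ["Curry", " recipe", ":"]
```

Without the extra stop token, everything after `<|plamo:bos|>` would be kept and the model could ramble past the intended end of its answer.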