mlx-community/plamo-2-1b-bf16
The model mlx-community/plamo-2-1b-bf16 was converted to MLX format from pfnet/plamo-2-1b using mlx-lm version 0.21.5.
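The conversion itself can be reproduced with mlx-lm's convert entry point. The exact flags used for this repository are not recorded here, so the command below is only a sketch; the output path and the --dtype flag are assumptions based on recent mlx-lm releases:
python -m mlx_lm.convert \
--hf-path pfnet/plamo-2-1b \
--mlx-path plamo-2-1b-bf16 \
--dtype bfloat16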
Use with mlx
pip install mlx numba # numba is required for the new PLaMo tokenizer
pip install 'git+https://github.com/ml-explore/mlx-examples.git@main#egg=mlx-lm&subdirectory=llms'
python -m mlx_lm.generate \
--model mlx-community/plamo-2-1b-bf16 \
--prompt '## 美味しいカレーの作り方:\n' \
--ignore-chat-template \
--max-tokens 1024 \
--extra-eos-token '<|plamo:bos|>' \
--temp 0.7 \
--seed 0
Fetching 8 files: 100%|██████████| 8/8 [00:00<00:00, 68338.97it/s]
==========
### Step 1: Buying and preparing the ingredients
Gather the ingredients for the curry:
- 2 potatoes
- 2 onions
- 1 bell pepper
- 3 cloves of garlic
- Spices (cumin, coriander, garam masala, etc.)
- Curry powder
- Water
- Salt
- Oil
- Curry roux
### Step 2: Preparing the vegetables
Wash all the ingredients and cut them up:
- Wash the potatoes well and cut them into quarters.
- Wash the onions well and slice them.
- Wash the bell pepper and slice it.
- Thinly slice the garlic.
### Step 3: Preparing the spices
Get the spices ready:
- Measure out cumin, coriander, garam masala, and each of the other spices in the amounts you need.
### Step 4: Sautéing the vegetables
Sauté the vegetables:
- Heat a large pot over medium heat and warm the oil.
- Keep the heat at medium so the oil warms without getting too hot.
- Add the onions and sliced bell pepper and sauté, stirring occasionally, until they are soft.
### Step 5: Frying the spices
Add the spices:
- Add spices such as cumin, coriander, and garam masala, and stir well until they dissolve.
### Step 6: Frying the potatoes
Fry the potatoes:
- Gently add the potatoes to the hot oil and stir occasionally so they do not burn on the bottom.
### Step 7: Combining the ingredients
Combine the ingredients:
- Return the cooked potatoes to the pot together with the spices, then add the water and about 1/2 tablespoon of salt.
### Step 8: Simmering
Simmer the curry:
- Lower the heat and let everything simmer gently, stirring as you go.
- Simmer for about 10 minutes, stirring occasionally, until most of the liquid has cooked down.
- Reduce the water by about 1/2 tablespoon and add the remaining water.
### Step 9: Adjusting the seasoning
Adjust the seasoning:
- Sprinkle salt, butter, and turmeric over the vegetables.
- Once the curry is done, thicken it with flour or cornstarch.
### Step 10: Serving
Serve the curry:
- Pour the finished curry onto a plate and it is ready to serve.
By following these steps, you can make a delicious homemade Indian curry.
### Frequently asked questions about this Indian curry recipe
Q1: What are the ingredients of curry?
A1: The ingredients of curry are vegetables, aromatics, spices, water, salt, butter, flour, oil, and so on. Most curries are made with these ingredients.
Q2: How can I make a curry taste better?
A2: What makes a curry taste good is using a good balance of spices and the way you cook it. For example, cumin, coriander, turmeric, cardamom, cloves, fenugreek, and garam masala are commonly used.
Q3: Why make curry?
A3: Because curry is delicious and healthy. By combining different spices, herbs, and seasonings, you can get a wide range of flavors and benefits.
Q4: Is curry steamed?
A4: No, steaming curry is not common, but when you are short on time, or when serving it as a dessert, you can also make it by steaming vegetables and fruit.
Q5: Please tell me how to make curry.
A5: Heat the spices, cook the ingredients properly, then add the spices together with water, salt, butter, flour, oil, cream, or paste and simmer everything.
==========
Prompt: 8 tokens, 156.478 tokens-per-sec
Generation: 738 tokens, 86.598 tokens-per-sec
Peak memory: 2.741 GB
You can also use the model directly from Python:
from mlx_lm import load, generate
model, tokenizer = load("mlx-community/plamo-2-1b-bf16")
prompt = "美味しいカレーの作り方のレシピを紹介します。"
response = generate(model, tokenizer, prompt=prompt, verbose=True)
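If you prefer to stream tokens as they are generated, mlx-lm also exposes a streaming interface. The snippet below is a sketch based on the mlx-lm 0.21 API: the stream_generate and make_sampler helpers, and the .text field on the yielded responses, are assumptions about that API, and the prompt simply reuses the one from the CLI example above.
from mlx_lm import load, stream_generate
from mlx_lm.sample_utils import make_sampler  # assumed helper location in mlx-lm 0.21.x

model, tokenizer = load("mlx-community/plamo-2-1b-bf16")
prompt = "## 美味しいカレーの作り方:\n"  # same prompt as the CLI example above
sampler = make_sampler(temp=0.7)  # mirrors --temp 0.7 from the CLI example
for response in stream_generate(model, tokenizer, prompt, max_tokens=1024, sampler=sampler):
    print(response.text, end="", flush=True)  # each response carries the newly generated text
print()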