---
language:
- ko
- en
license: other
library_name: transformers
tags:
- pytorch
- mlx
license_name: gemma-terms-of-use
license_link: https://ai.google.dev/gemma/terms
pipeline_tag: text-generation
---

# sosoai/beomi-gemma-2b-ko-mlx

The model [sosoai/beomi-gemma-2b-ko-mlx](https://huggingface.co/sosoai/beomi-gemma-2b-ko-mlx) was converted to MLX format from [beomi/gemma-ko-2b](https://huggingface.co/beomi/gemma-ko-2b) using mlx-lm version **0.13.0**.

## Use with mlx

```bash
pip install mlx-lm
```

```python
from mlx_lm import load, generate

model, tokenizer = load("sosoai/beomi-gemma-2b-ko-mlx")
response = generate(model, tokenizer, prompt="hello", verbose=True)
```
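
Since the base model is a Korean/English Gemma variant, a Korean prompt is a more representative test than `"hello"`. Below is a minimal sketch of a slightly fuller call; it assumes the installed mlx-lm version's `generate` accepts a `max_tokens` argument (check your version's signature), and the prompt text is only an illustrative example.

```python
from mlx_lm import load, generate

# Load the converted MLX weights and tokenizer from the Hugging Face Hub.
model, tokenizer = load("sosoai/beomi-gemma-2b-ko-mlx")

# Illustrative Korean prompt ("What is the capital of South Korea?").
prompt = "대한민국의 수도는 어디인가요?"

# max_tokens caps the completion length; this argument name is assumed to match
# the installed mlx-lm release and may need adjusting for other versions.
response = generate(model, tokenizer, prompt=prompt, max_tokens=128, verbose=True)
print(response)
```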