---
language:
- en
license: apache-2.0
library_name: transformers
tags:
- gpt
- llm
- large language model
- h2o-llmstudio
- mlx
datasets:
- HuggingFaceH4/ultrafeedback_binarized
- Intel/orca_dpo_pairs
- argilla/distilabel-math-preference-dpo
- Open-Orca/OpenOrca
- OpenAssistant/oasst2
- HuggingFaceH4/ultrachat_200k
- meta-math/MetaMathQA
thumbnail: https://h2o.ai/etc.clientlibs/h2o/clientlibs/clientlib-site/resources/images/favicon.ico
widget:
- text: <|prompt|>Why is drinking water so healthy?<|answer|>
---

# mlx-community/h2o-danube-1.8b-chat-4bit-mlx

This model was converted to MLX format from [`h2oai/h2o-danube-1.8b-chat`](https://huggingface.co/h2oai/h2o-danube-1.8b-chat).
Refer to the [original model card](https://huggingface.co/h2oai/h2o-danube-1.8b-chat) for more details on the model.

## Use with mlx

```bash
pip install mlx-lm
```

```python
from mlx_lm import load, generate

model, tokenizer = load("mlx-community/h2o-danube-1.8b-chat-4bit-mlx")
response = generate(model, tokenizer, prompt="hello", verbose=True)
```
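The widget entry in the metadata above shows the chat prompt format used by the base model (`<|prompt|>...<|answer|>`). Below is a minimal sketch of passing a prompt formatted that way to `generate`; it assumes the template string from the widget is the one the tokenizer expects, and the `question` text and `max_tokens` value are illustrative.

```python
from mlx_lm import load, generate

model, tokenizer = load("mlx-community/h2o-danube-1.8b-chat-4bit-mlx")

# Wrap the user message in the prompt format shown in the widget metadata above
# (assumption: this matches the template the model was fine-tuned with).
question = "Why is drinking water so healthy?"
prompt = f"<|prompt|>{question}<|answer|>"

response = generate(model, tokenizer, prompt=prompt, max_tokens=256, verbose=True)
print(response)
```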