---
license: apache-2.0
language:
- en
pipeline_tag: text-generation
tags:
- chat
---
# SmolLM2-1.7B-Instruct-MNN

## Introduction
This is a 4-bit quantized MNN model exported from [SmolLM2-1.7B-Instruct](https://huggingface.co/HuggingFaceTB/SmolLM2-1.7B-Instruct) using [llm-export](https://github.com/wangzhaode/llm-export).
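
To try the quantized weights locally, the MNN artifacts in this repository can be fetched with `huggingface_hub`. The snippet below is a minimal sketch; the repo id is an assumption based on this card's title, so replace `<namespace>` with the actual namespace of this repository.

```python
# Minimal sketch: download the quantized MNN files from this repository.
# The repo id below is an assumption inferred from this card's title;
# substitute the real "<namespace>/SmolLM2-1.7B-Instruct-MNN" id.
from huggingface_hub import snapshot_download

local_dir = snapshot_download(
    repo_id="<namespace>/SmolLM2-1.7B-Instruct-MNN",  # assumed repo id
    local_dir="SmolLM2-1.7B-Instruct-MNN",            # where to place the MNN files
)
print(f"MNN model files downloaded to: {local_dir}")
```

The downloaded directory can then be pointed to by an MNN LLM runtime for inference.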