---
license: other
license_name: lfm1.0
license_link: LICENSE
language:
- en
- ar
- zh
- fr
- de
- ja
- ko
- es
pipeline_tag: text-generation
tags:
- liquid
- lfm2
- edge
- llama.cpp
- gguf
base_model:
- LiquidAI/LFM2-350M
---
| | |
<center>
<div style="text-align: center;">
<img
src="https://cdn-uploads.huggingface.co/production/uploads/61b8e2ba285851687028d395/2b08LKpev0DNEk6DlnWkY.png"
alt="Liquid AI"
style="width: 100%; max-width: 100%; height: auto; display: inline-block; margin-bottom: 0.5em; margin-top: 0.5em;"
/>
</div>
<div style="display: flex; justify-content: center; gap: 0.5em;">
<a href="https://playground.liquid.ai/"><strong>Try LFM</strong></a> • <a href="https://docs.liquid.ai/lfm"><strong>Documentation</strong></a> • <a href="https://leap.liquid.ai/"><strong>LEAP</strong></a>
</div>
</center>

# LFM2-350M-GGUF

LFM2 is a new generation of hybrid models developed by [Liquid AI](https://www.liquid.ai/), designed specifically for edge AI and on-device deployment. It sets a new standard in quality, speed, and memory efficiency.

Find more details in the original model card: https://huggingface.co/LiquidAI/LFM2-350M

## 🏃 How to run LFM2

Example usage with [llama.cpp](https://github.com/ggml-org/llama.cpp):

```
llama-cli -hf LiquidAI/LFM2-350M-GGUF
```
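
llama.cpp also ships `llama-server`, which downloads the same GGUF with `llama-server -hf LiquidAI/LFM2-350M-GGUF` and exposes an OpenAI-compatible HTTP API. A minimal sketch of querying it from Python with only the standard library (this assumes the server is running locally on llama-server's default port 8080; the helper function name and defaults here are illustrative, not part of the model card):

```python
import json
import urllib.request


def build_chat_request(prompt: str, max_tokens: int = 256) -> dict:
    """Build an OpenAI-compatible chat completion payload for llama-server."""
    return {
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
    }


if __name__ == "__main__":
    payload = build_chat_request("What is edge AI?")
    req = urllib.request.Request(
        "http://localhost:8080/v1/chat/completions",  # llama-server default address
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        reply = json.loads(resp.read())
        print(reply["choices"][0]["message"]["content"])
```

Any OpenAI-compatible client library can be pointed at the same endpoint instead of hand-rolling requests.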