---
library_name: transformers
tags:
- llama-factory
- llama3
license: llama3
datasets:
- teknium/OpenHermes-2.5
- mudler/function-call-localai-glaive
---
# OpenVINO IR model with int8 quantization
Model definition for LocalAI:
```
name: mirai-nova
backend: transformers
parameters:
  model: fakezeta/Mirai-Nova-Llama3-LocalAI-8B-v0.1-ov-int8
context_size: 8192
type: OVModelForCausalLM
template:
  use_tokenizer_template: true
```
To run the model directly with LocalAI:
```
local-ai run huggingface://fakezeta/Mirai-Nova-Llama3-LocalAI-8B-v0.1-ov-int8/model.yaml
```
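The OpenVINO IR model can also be loaded outside LocalAI with Optimum Intel. Below is a minimal sketch, assuming `optimum[openvino]` and `transformers` are installed; the prompt and generation parameters are illustrative only.
```
# Minimal sketch: load the int8 OpenVINO IR model with Optimum Intel.
# Assumes `optimum[openvino]` and `transformers` are installed.
from optimum.intel import OVModelForCausalLM
from transformers import AutoTokenizer

model_id = "fakezeta/Mirai-Nova-Llama3-LocalAI-8B-v0.1-ov-int8"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = OVModelForCausalLM.from_pretrained(model_id)

# Build a chat prompt using the tokenizer's chat template.
messages = [{"role": "user", "content": "What does 'Mirai' mean in Japanese?"}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
)

outputs = model.generate(inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```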
[![local-ai-banner.png](https://cdn-uploads.huggingface.co/production/uploads/647374aa7ff32a81ac6d35d4/bXvNcxQqQ-wNAnISmx3PS.png)](https://localai.io)
## Mirai Nova
![image/png](https://cdn-uploads.huggingface.co/production/uploads/647374aa7ff32a81ac6d35d4/SKuXcvmZ_6oD4NCMkvyGo.png)
Mirai Nova: "Mirai" means "future" in Japanese, and "Nova" refers to a star that shows a sudden, large increase in brightness.
A set of models oriented toward function calling, but generalist and with enhanced reasoning capability, fine-tuned from Llama 3.
Mirai Nova works particularly well with LocalAI, leveraging its function-calling-with-grammars feature out of the box.
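As a rough illustration of that flow, here is a minimal sketch using the OpenAI Python client against a LocalAI instance, assuming LocalAI is running on its default port (8080) with the model configured under the name `mirai-nova`; the `get_weather` tool is a hypothetical example.
```
# Minimal sketch of OpenAI-style function calling against LocalAI.
# Assumptions: LocalAI is running at http://localhost:8080 with the model
# configured as "mirai-nova"; the get_weather tool is a hypothetical example.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8080/v1", api_key="not-needed")

tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get the current weather for a city",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    }
]

response = client.chat.completions.create(
    model="mirai-nova",
    messages=[{"role": "user", "content": "What's the weather in Tokyo?"}],
    tools=tools,
)

# The model is expected to answer with a tool call such as get_weather(city="Tokyo").
print(response.choices[0].message.tool_calls)
```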
GGUF quants: https://huggingface.co/mudler/Mirai-Nova-Llama3-LocalAI-8B-v0.1-GGUF
To run the GGUF version with LocalAI:
```
local-ai run huggingface://mudler/Mirai-Nova-Llama3-LocalAI-8B-v0.1-GGUF/localai.yaml
```