Create README.md

---
license: apache-2.0
---

## Overview

mistralai developed and released the [Mistral-Nemo](https://huggingface.co/mistralai/Mistral-Nemo-Instruct-2407) family of large language models (LLMs).

## Variants

| No | Variant | Cortex CLI command |
| --- | --- | --- |
| 1 | [gguf](https://huggingface.co/cortexso/mistral-nemo/tree/gguf) | `cortex run mistral-nemo:gguf` |
| 2 | [main/default](https://huggingface.co/cortexso/mistral-nemo/tree/main) | `cortex run mistral-nemo` |

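For example, to fetch the GGUF build listed above and then start an interactive session with it, the commands look like this (a minimal sketch; it assumes the `mistral-nemo:gguf` tag resolves on your install and that your Cortex CLI build provides `cortex pull`):

```
# Download the GGUF variant, then chat with it interactively
cortex pull mistral-nemo:gguf
cortex run mistral-nemo:gguf
```
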
## Use it with Jan (UI)

1. Install **Jan** using the [Quickstart](https://jan.ai/docs/quickstart)
2. Use in Jan model Hub:
```
cortexso/mistral-nemo
```

## Use it with Cortex (CLI)

1. Install **Cortex** using the [Quickstart](https://cortex.jan.ai/docs/quickstart)
2. Run the model with the command:
```
cortex run mistral-nemo
```

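Cortex also serves the model through a local, OpenAI-compatible HTTP API, so you can call it from scripts instead of the interactive CLI. The request below is a minimal sketch: it assumes the server listens on the default address `http://127.0.0.1:39281` and that the model id is `mistral-nemo`; adjust both to match your installation.

```
# Chat-completion request against the local Cortex server (OpenAI-compatible schema)
curl http://127.0.0.1:39281/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
        "model": "mistral-nemo",
        "messages": [
          {"role": "user", "content": "Summarize Mistral-Nemo in one sentence."}
        ]
      }'
```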

## Credits

- **Author:** MistralAI
- **Converter:** [Homebrew](https://www.homebrew.ltd/)
- **Original License:** [Apache 2.0](https://www.apache.org/licenses/LICENSE-2.0)
- **Papers:** [Mistral-Nemo Blog](https://mistral.ai/news/mistral-nemo/)