Update README.md
README.md CHANGED
@@ -1,12 +1,12 @@
----
-license: apache-2.0
-language:
-- en
-base_model:
-- allenai/OLMoE-1B-7B-0125
-pipeline_tag: text-generation
-library_name: transformers
----
+---
+license: apache-2.0
+language:
+- en
+base_model:
+- allenai/OLMoE-1B-7B-0125
+pipeline_tag: text-generation
+library_name: transformers
+---
 
 
 
@@ -15,7 +15,7 @@ _The **G**eneral **R**easoning **A**gent (for) **P**roject **E**xploration_
 # The GRaPE Family
 | Attribute | Size | Modalities | Domain |
 | :--- | :--- | :--- | :--- |
-| **GRaPE** | 10B A2.7B | Text + Image + Video in, Text out | Complex Reasoning Tasks |
+| **GRaPE Pro** | 10B A2.7B | Text + Image + Video in, Text out | Complex Reasoning Tasks |
 | **GRaPE Flash** (this model) | 7B A1B | Text in, Text out | High-Speed Applications |
 | **GRaPE Mini** | 1.7B | Text in, Text out | Edge Deployment |
 
@@ -29,7 +29,7 @@ The GRaPE Family was trained on about **14 billion** tokens of data after pre-tr
 
 # Architecture
 
-* GRaPE: Built on the `Qwen3 VL MoE` Architecture, allowing for long-context understanding and reasoning over visual tasks, as well as any text-based task. Allowing for deep understanding.
+* GRaPE Pro: Built on the `Qwen3 VL MoE` Architecture, allowing for long-context understanding and reasoning over visual tasks, as well as any text-based task. Allowing for deep understanding.
 
 * GRaPE Flash: Built on the `OlMoE` Architecture, allowing for incredibly fast speeds where it matters. Allows for retaining factual information, but lacks in logical tasks.
 
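The card's metadata (`library_name: transformers`, `pipeline_tag: text-generation`) implies the model loads through the standard Hugging Face text-generation pipeline. A minimal sketch follows; the Hub repo id is a placeholder assumption, since this diff does not show the model's actual repository path, and the decoding defaults are illustrative, not from the card.

```python
# Minimal usage sketch for GRaPE Flash via the `transformers` text-generation
# pipeline, as suggested by the card's `library_name` and `pipeline_tag`.
# NOTE: "grape-org/GRaPE-Flash" is a placeholder repo id (assumption), not
# the model's real Hub path -- substitute the actual one.

MODEL_ID = "grape-org/GRaPE-Flash"  # placeholder Hub id

# Illustrative decoding defaults for a speed-oriented, factual-recall model.
GENERATION_KWARGS = {
    "max_new_tokens": 128,
    "do_sample": False,  # greedy decoding; Flash targets fast factual lookups
}


def generate(prompt: str) -> str:
    """Complete `prompt` with the model (downloads weights on first call)."""
    from transformers import pipeline  # imported lazily: heavy dependency

    generator = pipeline("text-generation", model=MODEL_ID)
    out = generator(prompt, **GENERATION_KWARGS)
    return out[0]["generated_text"]


if __name__ == "__main__":
    print(generate("The capital of France is"))
```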