Upload 3 files
LICENSE
ADDED
@@ -0,0 +1,35 @@
# XLPHY Amethyst Core License Notice

This repository distributes quantized GGUF model artifacts derived from the
Google Gemma family for Project: XLPHY AI. These models are released under
dual licensing depending on the base model.

## Model Artifact Terms

### Gemma 3 Models (Amethyst Arc 1B)

Amethyst Arc models derived from Gemma 3 are subject to the Gemma Terms of Use.
You may use, copy, modify, distribute, and deploy these artifacts only as
permitted by those terms.

**Official Terms:** https://ai.google.dev/gemma/terms
**Upstream Reference:** https://huggingface.co/google/gemma-3-1b-it

### Gemma 4 Models (Amethyst Beam E2B)

Amethyst Beam models derived from Gemma 4 are released under the Apache License 2.0.
You may use, copy, modify, distribute, and deploy these artifacts in accordance
with the terms of the Apache License 2.0.

**Official Terms:** https://www.apache.org/licenses/LICENSE-2.0
**Upstream Reference:** https://huggingface.co/google/gemma-4-e2b-it

## No Additional Rights

Except where explicitly stated above, this repository does not grant rights
beyond the respective licenses for each model artifact.

## Attribution

Google, Gemma, and related marks are trademarks of Google LLC.
No endorsement by Google DeepMind is implied.
NOTICE
ADDED
@@ -0,0 +1,35 @@
XLPHY Amethyst Core Model Distribution Notice

This distribution package contains rebranded and quantized GGUF model files:

Gemma 3 Models (Amethyst Arc):
- amethyst-arc-1b-Q4_K_M.gguf
- amethyst-arc-1b-Q5_K_M.gguf
- amethyst-arc-1b-Q6_K.gguf

Gemma 4 Models (Amethyst Beam):
- amethyst-beam-e2b-Q4_K_M.gguf
- amethyst-beam-e2b-Q5_K_M.gguf
- amethyst-beam-e2b-Q6_K.gguf

These files are derivative/packaged artifacts based on upstream Gemma models:

- google/gemma-3-1b-it (licensed under the Gemma Terms of Use)
- google/gemma-4-e2b-it (licensed under the Apache License 2.0)

Upstream model provider: Google DeepMind.

Dual Licensing

This package distributes models under dual licensing:
- Gemma 3 derivatives are redistributed under the Gemma Terms of Use
- Gemma 4 derivatives are redistributed under the Apache License 2.0

See the LICENSE file in this directory for detailed license information:
https://ai.google.dev/gemma/terms (Gemma Terms of Use)
https://www.apache.org/licenses/LICENSE-2.0 (Apache License 2.0)

Attribution and notices are provided here for transparency and compliance.
This NOTICE file does not modify the applicable license terms.

No endorsement by the upstream provider is implied.
README.md
CHANGED
@@ -1,3 +1,81 @@
---
license:
- gemma
- apache-2.0
language:
- en
base_model:
- google/gemma-3-1b-it
- google/gemma-4-e2b-it
tags:
- gemma-3
- gemma-4
- gguf
- quantized
- vision
- text-generation
- edge-ai
- local-first
- xlphy
- codexcon
- 1b
- e2b
---

# 🔮 XLPHY Amethyst (Gemma Series) for Project: XLPHY AI

XLPHY Amethyst is a suite of high-efficiency, local-first AI models optimized specifically for the Project: XLPHY AI ecosystem. These models are repackaged and quantized to provide a premium, low-latency, and multimodal experience for autonomous agents and sovereign AI applications.

> **Developer Note:** These are optimized derivatives of the Google Gemma 3 and Gemma 4 series, rebranded and tuned for seamless integration within the Project: XLPHY AI autonomous agent architecture.

## 🧠 Model Selection

The Amethyst series is built for Project: XLPHY AI and is divided into two "Gemstone Tiers." Each tier is available in `Q4_K_M`, `Q5_K_M`, and `Q6_K` quantization levels.

| File Name (Template) | Tier Identity | Base Engine | Primary Purpose | License | Available Quants |
| --- | --- | --- | --- | --- | --- |
| `amethyst-arc-1b-[quant].gguf` | Arc | Gemma 3 1B IT | Ultra-fast local execution and IoT. | Gemma | `Q4_K_M`, `Q5_K_M`, `Q6_K` |
| `amethyst-beam-e2b-[quant].gguf` | Beam | Gemma 4 E2B IT | Main driver with vision support. | Apache 2.0 | `Q4_K_M`, `Q5_K_M`, `Q6_K` |
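The file-name templates in the table above can be resolved mechanically. The helper below is a hypothetical sketch (not part of the XLPHY tooling); the tier and quant names come straight from the table.

```python
# Hypothetical helper: resolves the table's file-name templates into
# concrete GGUF artifact names, rejecting unknown tiers or quant levels.
TIERS = {
    "arc": "amethyst-arc-1b-{quant}.gguf",
    "beam": "amethyst-beam-e2b-{quant}.gguf",
}
QUANTS = ("Q4_K_M", "Q5_K_M", "Q6_K")

def artifact_name(tier: str, quant: str) -> str:
    """Return the GGUF file name for a tier/quant pair, validating both inputs."""
    if quant not in QUANTS:
        raise ValueError(f"unknown quant level: {quant}")
    key = tier.lower()
    if key not in TIERS:
        raise ValueError(f"unknown tier: {tier}")
    return TIERS[key].format(quant=quant)

print(artifact_name("arc", "Q5_K_M"))  # amethyst-arc-1b-Q5_K_M.gguf
```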
## 📦 Quantization Guide

- **Q4_K_M (Low):** Fastest and most memory-efficient. Ideal for mobile and entry-level hardware.
- **Q5_K_M (Medium):** The "sweet spot" for Amethyst, with minimal quality loss relative to the original model.
- **Q6_K (High):** Near-lossless performance for users who prioritize maximum accuracy.
## 🛠️ Implementation & Runtime

Designed for the Project: XLPHY AI "Offline-First" philosophy. Best executed via:

- **XLPHY Desktop App** (native integration)
- **`llama.cpp` / `llama-cli`**
- Any GGUF-compatible inference engine supporting Gemma 3 and Gemma 4
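As a minimal sketch of the `llama-cli` route, the snippet below assembles a command line for one of the Arc artifacts. The launcher function and file name are illustrative, not XLPHY's actual integration; `-m`, `-p`, and `-n` are the standard llama.cpp flags for model path, prompt, and token count.

```python
# Illustrative sketch: build (and optionally launch) a llama-cli invocation
# for an Amethyst GGUF file. Assumes llama.cpp is built and on PATH.
import subprocess

def build_llama_cli_cmd(model_path: str, prompt: str, n_predict: int = 128) -> list[str]:
    """Assemble a llama-cli command; -m/-p/-n are standard llama.cpp flags."""
    return ["llama-cli", "-m", model_path, "-p", prompt, "-n", str(n_predict)]

cmd = build_llama_cli_cmd("amethyst-arc-1b-Q4_K_M.gguf", "Hello from XLPHY")
print(" ".join(cmd))
# subprocess.run(cmd, check=True)  # uncomment once llama-cli is on PATH
```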
## 🔐 Checksums (SHA256)

To ensure file integrity during the XLPHY automated download process:

| Tier | Q4_K_M | Q5_K_M | Q6_K |
| --- | --- | --- | --- |
| Arc (1B) | `12bf0fff8815d5f73a3c9b586bd8fee8e7b248c935de70dec367679873d0f29d` | `59a10a3c8dc8a9c0bda2c8882198073b1cfebbb2b443aa2fc4cfca4f92eeb805` | `ccad0cb14e9008f699f4b820110b899cf81983a987c40a05a8a1128d2fb713fb` |
| Beam (E2B) | `cded614c9b24be92e5a868d2ba38fb24e15dfea34fc650193c475a6debc233a7` | `43b6d9cfc1108e172b9ff99759ce7c2052bbed5dd7c4b4675ca63a04b6ed8dfc` | `b4c977371027c423ba6e36c7ca6e31e11803853224046f62d94a24a827e4f041` |
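A downloaded file can be checked against the table above with Python's standard `hashlib`; this is a generic verification sketch, not XLPHY's downloader. Hashing is streamed in chunks so multi-gigabyte GGUF files never need to fit in memory.

```python
# Generic SHA-256 verification sketch for downloaded GGUF artifacts.
import hashlib

def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    """Stream a file through SHA-256 one chunk at a time and return the hex digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def matches(path: str, expected_hex: str) -> bool:
    """Compare a file's digest to an expected hash from the checksum table."""
    return sha256_of(path) == expected_hex.strip().lower()
```

For example, `matches("amethyst-arc-1b-Q4_K_M.gguf", "12bf0fff…")` should return `True` for an intact download of that file.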
## ⚖️ Attribution & Licensing

These files are redistributed/repackaged quantized derivatives of the Google Gemma family.

- **Original Architecture:** Developed by Google DeepMind
- **Optimization:** Repackaged by CodexCon Digital Solutions for Project: XLPHY AI
- **Amethyst Arc (Gemma 3):** Gemma Terms of Use
- **Amethyst Beam (Gemma 4):** Apache License 2.0

## ⚠️ Limitations & Safety

- **Hallucinations:** Like all LLMs, these models may produce incorrect information.
- **Human-in-the-loop:** Always validate technical outputs, especially for vision-based tasks or critical code.
- **Non-Critical Use:** Not intended for medical, legal, or other high-stakes safety-critical applications.

---

Developed by **CodexCon** | Lead Founder: **Cid Cruz**