---
license: apache-2.0
datasets:
- anon8231489123/Omegle_logs_dataset
language:
- en
library_name: transformers
pipeline_tag: text-generation
tags:
- not-for-all-audiences
---
<a href="https://www.buymeacoffee.com/acrastt" target="_blank"><img src="https://cdn.buymeacoffee.com/buttons/v2/default-yellow.png" alt="Buy Me A Coffee" style="height: 60px !important;width: 217px !important;" ></a>
This is [Xander Boyce](https://huggingface.co/takeraparterer)'s [OmegLLaMA LoRA](https://huggingface.co/takeraparterer/Omegllama) merged with [OpenLLama 3B](https://huggingface.co/openlm-research/open_llama_3b).
Prompt format:
```
Interests: {interests}
Conversation:
You: {prompt}
Stranger:
```
For multiple interests, separate them with spaces. For multi-turn conversations, repeat the `You:` and `Stranger:` lines; the `Interests:` and `Conversation:` lines are effectively the system prompt.
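For illustration, the prompt format above can be assembled with a small helper. This is a hypothetical sketch, not code shipped with the model:

```python
def build_prompt(interests, turns, await_stranger=True):
    """Build an OmegLLaMA prompt (hypothetical helper).

    interests: list of interest keywords, joined with spaces.
    turns: list of (speaker, text) pairs; speaker is "You" or "Stranger".
    If await_stranger is True, the prompt ends with "Stranger:" so the
    model generates the stranger's next reply.
    """
    lines = [f"Interests: {' '.join(interests)}", "Conversation:"]
    for speaker, text in turns:
        lines.append(f"{speaker}: {text}")
    if await_stranger:
        lines.append("Stranger:")
    return "\n".join(lines)

# Single-turn example: pass the result to your generation pipeline.
prompt = build_prompt(["music", "games"], [("You", "hi there")])
print(prompt)
```

This produces `Interests: music games`, `Conversation:`, `You: hi there`, and a trailing `Stranger:` line for the model to complete.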
q4_0 GGML and GGUF quantizations are available [here](https://huggingface.co/Aryanne/OmegLLaMA-3B-ggml-and-gguf).
This model is quite good at NSFW ERP and sexting (for a 3B model). I recommend using it with [Faraday.dev](https://faraday.dev/) for those use cases.
# [Open LLM Leaderboard Evaluation Results](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)
Detailed results can be found [here](https://huggingface.co/datasets/open-llm-leaderboard/details_acrastt__OmegLLaMA-3B).
| Metric | Value |
|-----------------------|---------------------------|
| Avg. | 33.55 |
| ARC (25-shot) | 40.36 |
| HellaSwag (10-shot) | 66.13 |
| MMLU (5-shot) | 28.0 |
| TruthfulQA (0-shot) | 33.31 |
| Winogrande (5-shot) | 61.64 |
| GSM8K (5-shot) | 0.23 |
| DROP (3-shot) | 5.17 |