---
language:
- en
license: apache-2.0
tags:
- text-generation-inference
- transformers
- unsloth
- gemma
- trl
base_model: unsloth/gemma-2b-it-bnb-4bit
pipeline_tag: summarization
---
# Uploaded model
- **What's this?**
- VSCD stands for Vega Summarization Controlling Dataset.
- VSCD-2b is a large language model trained on search queries and their summarized versions.
- VSCD was trained for 9 epochs on a synthetic dataset of 300 entries generated with the Gemma-2b-IT model. It was built to summarize a user's query quickly, making search algorithms faster.
- **Developed by:** ElMater06
- **License:** apache-2.0
- **Finetuned from model :** unsloth/gemma-2b-it-bnb-4bit
This Gemma model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Hugging Face's TRL library.
[<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
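The card describes a query-summarization workflow but does not show how to call the model. Below is a minimal sketch using the `transformers` `pipeline` API. The repository id `ElMater06/VSCD-2b`, the prompt wording, and the use of Gemma's chat turn markers are assumptions, since the card does not document the expected input format.

```python
def build_prompt(query: str) -> str:
    """Wrap a raw search query in Gemma's instruction-chat turn format.

    The exact instruction text is an assumption; adjust it to match
    whatever prompt the model was actually fine-tuned on.
    """
    return (
        "<start_of_turn>user\n"
        f"Summarize this search query: {query}<end_of_turn>\n"
        "<start_of_turn>model\n"
    )


if __name__ == "__main__":
    # Heavy step: downloads and loads the model weights.
    # "ElMater06/VSCD-2b" is an assumed Hub repo id.
    from transformers import pipeline

    summarizer = pipeline("text-generation", model="ElMater06/VSCD-2b")
    prompt = build_prompt(
        "best budget laptops for college students 2024 under 500 dollars"
    )
    result = summarizer(prompt, max_new_tokens=32)
    print(result[0]["generated_text"])
```

Keeping the prompt-building step in a separate function makes it easy to batch-summarize many queries with one loaded pipeline, which matches the card's stated goal of speeding up search.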