---
language:
  - en
license: apache-2.0
tags:
  - text-generation-inference
  - transformers
  - unsloth
  - gemma
  - trl
base_model: unsloth/gemma-2b-it-bnb-4bit
pipeline_tag: summarization
---

# Uploaded model

- **What's this?** VSCD stands for Vega Summarization Controlling Dataset. VSCD-2b is a large language model fine-tuned on search queries paired with summarized versions of those queries.
- VSCD-2b was fine-tuned for 9 epochs on a synthetic dataset of 300 entries generated with the Gemma-2b-IT model. Its purpose is to summarize a user's query quickly, so that downstream search algorithms can run faster.

- **Developed by:** ElMater06
- **License:** apache-2.0
- **Finetuned from model:** unsloth/gemma-2b-it-bnb-4bit

This Gemma model was trained 2x faster with Unsloth and Hugging Face's TRL library.
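
A minimal inference sketch for the query-summarization use case described above. The repo id `ElMater06/VSCD-2b` is assumed from this card's location, and the prompt wording is illustrative, not the exact training format; adjust both to match how the model was actually trained.

```python
def build_prompt(query: str) -> str:
    """Wrap a raw search query in the Gemma instruction-tuned chat format.

    The task phrasing ("Summarize this search query:") is an assumption,
    not taken from the training data described in this card.
    """
    return (
        "<start_of_turn>user\n"
        f"Summarize this search query: {query}<end_of_turn>\n"
        "<start_of_turn>model\n"
    )


def summarize(query: str, model_id: str = "ElMater06/VSCD-2b") -> str:
    """Generate a short summary of a search query with the model.

    The default model_id is assumed from this card's repo name. transformers
    is imported lazily so build_prompt() works without it installed.
    """
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id)

    inputs = tokenizer(build_prompt(query), return_tensors="pt")
    out = model.generate(**inputs, max_new_tokens=32)
    # Decode only the newly generated tokens, skipping the prompt.
    return tokenizer.decode(
        out[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
    )
```

Because the base model is a 4-bit Unsloth quantization of Gemma-2b-IT, loading it may additionally require `bitsandbytes`; check the base model's card for its exact loading requirements.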